IEEE Transactions on Automatic Control, Vol.45, No.5, 993-998, 2000
Observation control for discrete-continuous stochastic systems
This paper presents a theoretical framework for the optimization of observations in partially observable linear discrete-continuous stochastic systems. The problem of achieving the best estimation quality with discrete and continuous observations is reduced to a deterministic one with impulse or generalized control. An approach based on a discontinuous time transformation is presented and used to reduce the original optimization problem with impulsive control to a standard one with bounded controls. The existence and characterization of the optimal generalized observation process are also discussed. An illustrative example of optimal observation control is presented.
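The reduction described above rests on a standard fact: for a linear system with Gaussian noise, the estimation error covariance evolves deterministically (via a Riccati equation) regardless of the actual measurements, so choosing *when* to observe is a deterministic control problem. The sketch below illustrates this for an assumed scalar model (the system `dx = a·x dt + √q dw` with discrete observations of variance `r`); all parameter names and values are illustrative assumptions, not taken from the paper.

```python
# Assumed illustrative model (not from the paper): scalar linear diffusion
#   dx = a*x dt + sqrt(q) dw,
# with optional discrete observations y_k = x(t_k) + v_k, Var(v_k) = r.
# The error covariance P(t) evolves deterministically, so comparing
# observation schedules requires no simulation of the state itself.

def terminal_covariance(obs_times, a=-0.5, q=1.0, r=0.2, p0=1.0, T=1.0, dt=1e-3):
    """Integrate dP/dt = 2aP + q between observations (Euler scheme),
    applying the discrete Kalman update P <- P*r/(P + r) at each
    observation instant. Returns the covariance at the horizon T."""
    pending = sorted(obs_times)
    t, P = 0.0, p0
    while t < T:
        if pending and t >= pending[0]:
            P = P * r / (P + r)        # discrete observation update
            pending.pop(0)
        P += (2 * a * P + q) * dt      # covariance (Riccati) ODE step
        t += dt
    return P

# Deterministic comparison of two one-observation schedules: because the
# covariance regrows after an update, observing near the horizon yields a
# smaller terminal error than observing early.
print(terminal_covariance([0.1]))
print(terminal_covariance([0.9]))
```

With a stable system (`a < 0`), the covariance relaxes back toward its steady-state value after each update, so a later observation leaves less time for the error to regrow before the horizon; an optimal schedule trades this off against whatever running cost the objective imposes.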