IEEE Transactions on Automatic Control, Vol.43, No.3, 351-372, 1998
Adaptive control of linear systems with Markov perturbations
The stochastic model under consideration is a linear jump diffusion process X whose coefficients and jump processes depend on a Markov chain Z with finite state space. First, we study the optimal filtering and control problem for these systems with non-Gaussian initial conditions, given noisy observations of the state X and perfect measurements of Z. Under technical assumptions, it is proved that the conditional characteristic function of X is parametrically determined by the output of a finite number of stochastic differential equations (i.e., the filter is finite-dimensional). Next, we derive a new sufficient condition which ensures the existence and uniqueness of the solution of the nonlinear stochastic differential equations satisfied by the output of the filter, extending the result of Haussmann [19]. We study a quadratic control problem and show that the separation principle holds, widening the class of linear systems for which this principle holds. We give the form of the controller, which can be explicitly calculated in terms of the optimal filter. The gain of the controller depends on a system of modified coupled Riccati equations, and the existence of its solution is proved.

Secondly, we investigate an adaptive control problem for a state process X defined by a linear diffusion whose coefficients depend on a Markov chain (i.e., a subclass of the previous models), the processes X and Z being observed in independent white noises. Suboptimal estimates for the processes X and Z and an approximate control law are investigated for a large class of probability distributions of the initial state X_0. Asymptotic properties of these filters and of this control law are obtained, and upper bounds for the corresponding error are given.
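The abstract states that the controller gain is obtained from a system of coupled Riccati equations, one per mode of the Markov chain, coupled through the chain's transition probabilities. As a rough illustration only: the sketch below solves the *discrete-time* coupled Riccati equations of a Markov jump linear quadratic problem by fixed-point iteration. This is an assumed standard analogue for illustration, not the paper's continuous-time "modified" equations; all names (`coupled_riccati`, the example matrices) are hypothetical.

```python
import numpy as np

def coupled_riccati(A, B, Q, R, Pr, iters=1000, tol=1e-12):
    """Fixed-point iteration for the discrete-time coupled Riccati equations
    of a Markov jump linear quadratic problem (illustrative sketch).

    A, B, Q, R : lists of per-mode matrices, one entry per state of the chain
    Pr         : transition probability matrix of the Markov chain
    Returns the per-mode cost matrices P and feedback gains K."""
    m, n = len(A), A[0].shape[0]
    P = [np.eye(n) for _ in range(m)]
    for _ in range(iters):
        # Coupling term: expected cost-to-go over the next mode,
        # E_i(P) = sum_j Pr[i, j] * P[j]
        E = [sum(Pr[i, j] * P[j] for j in range(m)) for i in range(m)]
        P_new = []
        for i in range(m):
            G = R[i] + B[i].T @ E[i] @ B[i]
            K = np.linalg.solve(G, B[i].T @ E[i] @ A[i])  # mode-i gain
            P_new.append(Q[i] + A[i].T @ E[i] @ (A[i] - B[i] @ K))
        if max(np.max(np.abs(P_new[i] - P[i])) for i in range(m)) < tol:
            P = P_new
            break
        P = P_new
    # Recover the per-mode feedback gains at the fixed point
    K = []
    for i in range(m):
        E_i = sum(Pr[i, j] * P[j] for j in range(m))
        K.append(np.linalg.solve(R[i] + B[i].T @ E_i @ B[i],
                                 B[i].T @ E_i @ A[i]))
    return P, K
```

The coupling through `E_i(P)` is what distinguishes these equations from m independent Riccati equations: the cost matrix of each mode depends on those of every mode reachable in one transition.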