IEEE Transactions on Automatic Control, Vol.48, No.2, 261-266, 2003
Stochastic nonlinear minimax dynamic games with noisy measurements
This note is concerned with nonlinear stochastic minimax dynamic games subject to noisy measurements. The minimizing players are control inputs, while the maximizing players are square-integrable stochastic processes. The minimax dynamic game is formulated using an information state, which depends on the paths of the observed processes. The information state satisfies a partial differential equation of the Hamilton-Jacobi-Bellman (HJB) type. The HJB equation is employed to characterize the dissipation properties of the system, to derive a separation theorem between the design of the estimator and the controller, and to introduce a certainty-equivalence principle along the lines of Whittle. Finally, the separation theorem and the certainty-equivalence principle are applied to solve the linear-quadratic-Gaussian minimax game. The results of this note generalize the L2-gain of deterministic systems to stochastic analogs; they are related to the controller design of stochastic systems that employ risk-sensitive performance criteria, and to the controller design of deterministic systems that employ minimax performance criteria.
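As a rough illustration of the setting (notation is assumed here, not taken from the note itself), the deterministic L2-gain property being generalized is the dissipation inequality, and the minimax game trades a regulated output against disturbance energy:

```latex
% Deterministic L2-gain with level \gamma: there exists a storage
% function V \ge 0 such that, along trajectories with disturbance w
% and regulated output z,
%   V(x(T)) - V(x(0)) \le \int_0^T \left( \gamma^2 \|w(t)\|^2 - \|z(t)\|^2 \right) dt .
%
% The associated minimax (soft-constrained) game, stated here in a
% generic stochastic form with expectation over the noise:
%   J^* = \inf_{u} \sup_{w} \; \mathbb{E}\!\left[ \int_0^T
%          \left( \|z(t)\|^2 - \gamma^2 \|w(t)\|^2 \right) dt \right],
% where u is the minimizing control and w the maximizing
% square-integrable disturbance process.
```

In the partially observed case treated by the note, the control u may depend only on the measurement path, which is what motivates the information state and the resulting HJB-type equation.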