Automatica, Vol.45, No.8, 1847-1853, 2009
Stochastic minimax control for stabilizing uncertain quasi-integrable Hamiltonian systems
A procedure is proposed for designing feedback controls that stabilize, asymptotically and with probability one, quasi-integrable Hamiltonian systems subject to bounded uncertain parametric disturbances. First, the partially averaged Itô stochastic differential equations are derived from the given system by using the stochastic averaging method for quasi-integrable Hamiltonian systems. Second, based on the principle of optimality, the Hamilton-Jacobi-Isaacs (HJI) equation for the ergodic control problem of the averaged system, with a performance index whose cost function is left undetermined, is established; solving this equation yields the worst disturbances and the associated optimal controls. Third, the asymptotic Lyapunov stability with probability one of the optimally controlled system under the worst disturbances is analyzed by evaluating the maximal Lyapunov exponent of the fully averaged Itô equations. Finally, the cost function and the feedback control are determined by the requirement that the worst-disturbed system be stabilized. A simple example is worked out to illustrate the application of the proposed procedure and the effect of the optimal control on stabilizing the uncertain system. (C) 2009 Elsevier Ltd. All rights reserved.
Keywords: Quasi-integrable Hamiltonian system; Feedback stabilization; Uncertain parametric disturbance; Stochastic minimax control; Stochastic averaging; Stochastic differential game
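As an illustration of the third step, the maximal Lyapunov exponent of a fully averaged Itô equation can be estimated by Monte Carlo simulation. The sketch below is not the paper's system: it assumes a hypothetical scalar averaged equation dH = aH dt + bH dW with constant (made-up) coefficients a and b, for which the Lyapunov exponent of the amplitude √H is known in closed form, λ = (a − b²/2)/2, so the numerical estimate can be checked; a negative λ indicates asymptotic stability with probability one.

```python
import numpy as np

def lyapunov_exponent(a, b, h0=1.0, dt=1e-3, n_steps=200_000, seed=0):
    """Estimate the maximal Lyapunov exponent of the amplitude sqrt(H)
    for the hypothetical averaged Ito equation dH = a*H dt + b*H dW,
    using lambda = lim (1/2t) ln(H(t)/H(0)) along one Euler-Maruyama path."""
    rng = np.random.default_rng(seed)
    h = h0
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))  # Wiener increment over dt
        h += a * h * dt + b * h * dw       # Euler-Maruyama step
    t = n_steps * dt
    return np.log(h / h0) / (2.0 * t)

# Stability check against the closed-form exponent (a - b**2/2)/2 of this
# linear test equation; a, b are illustrative values, not from the paper.
a, b = -0.5, 0.3
est = lyapunov_exponent(a, b)
exact = (a - b**2 / 2) / 2
print(f"estimated lambda = {est:.4f}, exact = {exact:.4f}")
```

In the paper's setting the drift and diffusion of the averaged Hamiltonian depend on the chosen control and the worst disturbance; the same path-wise estimator applies once those coefficients are substituted.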