SIAM Journal on Control and Optimization, Vol. 39, No. 6, pp. 1779-1816, 2001
Optimal control problems for stochastic reaction-diffusion systems with non-Lipschitz coefficients
Using the dynamic programming approach, we study a control problem for a class of stochastic reaction-diffusion systems with coefficients of polynomial growth. The cost functional contains a non-Lipschitz term, which allows us to treat the quadratic case, of interest in applications. The corresponding Hamilton-Jacobi-Bellman equation is first solved by a fixed point argument on a small time interval and then extended to arbitrary time intervals by suitable a priori estimates. The main ingredient in the proof is the smoothing effect of the transition semigroup associated with the uncontrolled system.
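For orientation only, the display below is a schematic sketch of the type of control problem and Hamilton-Jacobi-Bellman equation described in the abstract; the symbols A, F, g, k, phi and the operator notation are generic placeholders and are not the paper's exact formulation, which should be consulted for the precise state space, assumptions, and Hamiltonian.

\[
\begin{aligned}
& dX(t) = \bigl[A X(t) + F(X(t)) + z(t)\bigr]\,dt + dW(t), \qquad X(0) = x,\\[2pt]
& J(x;z) = \mathbb{E}\int_0^T \bigl[g(X(t)) + k(z(t))\bigr]\,dt \;+\; \mathbb{E}\,\varphi(X(T)),\\[2pt]
& \frac{\partial v}{\partial t}(t,x) = \mathcal{L}\,v(t,x) - K\bigl(Dv(t,x)\bigr) + g(x), \qquad v(0,x) = \varphi(x),
\end{aligned}
\]

where \(\mathcal{L}\) denotes the Kolmogorov operator (the generator of the transition semigroup) of the uncontrolled system and \(K\) is the Legendre transform of the control cost \(k\); the quadratic case corresponds to \(k(z) = \tfrac12 |z|^2\), for which \(K(p) = \tfrac12 |p|^2\) is locally Lipschitz but not globally Lipschitz, which is the source of the non-Lipschitz term mentioned above.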
Keywords: stochastic reaction-diffusion systems; Hamilton-Jacobi-Bellman equations in infinite dimension; stochastic optimal control problems