SIAM Journal on Control and Optimization, Vol.45, No.6, 2224-2256, 2007
Pathwise stochastic control problems and stochastic HJB equations
In this paper we study a class of pathwise stochastic control problems in which the optimality is allowed to depend on the paths of exogenous noise (or information). Such a phenomenon can be illustrated by considering a particular investor who wants to take advantage of certain extra information, but in a completely legal manner. We show that such a control problem may not even have a "minimizing sequence," but that the (Bellman) dynamic programming principle nevertheless still holds. We then show that the corresponding Hamilton-Jacobi-Bellman equation is a stochastic partial differential equation, as was predicted by Lions and Souganidis [C. R. Acad. Sci. Paris Ser. I Math., 327 (1998), pp. 735-741]. Our main device is a Doss-Sussmann-type transformation introduced in our previous work [Stochastic Process. Appl., 93 (2001), pp. 181-204] and [Stochastic Process. Appl., 93 (2001), pp. 205-228]. With the help of such a transformation we reduce the pathwise control problem to a more standard relaxed control problem, from which we are able to verify that the value function of the pathwise stochastic control problem is the unique stochastic viscosity solution to this stochastic partial differential equation, in the sense of [Stochastic Process. Appl., 93 (2001), pp. 181-204] and [Stochastic Process. Appl., 93 (2001), pp. 205-228].
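The Doss-Sussmann-type transformation mentioned above can be sketched schematically; the notation here is illustrative (the functions $F$, $g$ and the symbols $u$, $v$, $\eta$ are assumptions for exposition, not the paper's exact setting). For a stochastic PDE driven by a Brownian motion $B$,

```latex
\mathrm{d}u(t,x) \;=\; F\!\bigl(t,x,u,Du,D^{2}u\bigr)\,\mathrm{d}t
\;+\; g\bigl(u(t,x)\bigr)\circ \mathrm{d}B_{t},
```

one first solves the ordinary differential equation in the "noise" variable,

```latex
\frac{\partial \eta}{\partial z}(z,y) \;=\; g\bigl(\eta(z,y)\bigr),
\qquad \eta(0,y) \;=\; y,
```

and then seeks a solution of the form $u(t,x) = \eta\bigl(B_{t}, v(t,x)\bigr)$. Formally, the stochastic integral term is absorbed by the flow $\eta$, so $v$ solves a partial differential equation with random (pathwise) coefficients but no stochastic integral, to which more standard (here, relaxed) control and viscosity-solution arguments can be applied.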