SIAM Journal on Control and Optimization, Vol. 36, No. 2, pp. 609-653, 1998
Existence of Markov controls and characterization of optimal Markov controls
Given a solution of a controlled martingale problem, it is shown under general conditions that there exists a solution with Markov controls that has the same cost as the original solution. This result is then used to show that the original stochastic control problem is equivalent to a linear program over a space of measures under a variety of optimality criteria. The existence and characterization of optimal Markov controls then follow. An extension of Echeverria's theorem characterizing stationary distributions of (uncontrolled) Markov processes is obtained as a corollary. In particular, this extension covers diffusion processes with discontinuous drift and diffusion coefficients.
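For orientation only, the following is a minimal sketch of the kind of linear program over measures referred to above, written for an ergodic (long-run average) cost criterion and in notation of our own choosing; the state space E, action space U, controlled extended generator A, and running cost c are generic placeholders rather than the paper's definitions:
\[
\begin{aligned}
\text{minimize}\quad & \int_{E\times U} c(x,u)\,\mu(dx\,du)\\
\text{subject to}\quad & \int_{E\times U} (Af)(x,u)\,\mu(dx\,du)=0 \quad\text{for every } f \text{ in the domain of } A,\\
& \mu \ \text{a probability measure on } E\times U.
\end{aligned}
\]
Disintegrating an optimal \(\mu\) as \(\mu(dx\,du)=\eta(dx)\,v(du\mid x)\) yields a candidate Markov control \(v\); this is the sense in which existence and characterization of optimal Markov controls can be read off from such a measure-valued linear program.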