Applied Mathematics and Optimization, Vol. 60, No. 1, pp. 105-132, 2009
Approximate Controllability for Linear Stochastic Differential Equations in Infinite Dimensions
The objective of this paper is to investigate the approximate controllability property of a linear stochastic control system with values in a separable real Hilbert space. As a first step, we prove existence and uniqueness of the solution of the dual linear backward stochastic differential equation. This equation has the particularity that, in addition to an unbounded operator acting on the Y-component of the solution, another unbounded operator acts on the Z-component. With the help of this dual equation we then deduce the duality between approximate controllability and observability. Finally, under the assumption that the unbounded operator acting on the state process of the forward equation is the infinitesimal generator of an exponentially stable semigroup, we show that the generalized Hautus test provides a necessary condition for approximate controllability. The paper generalizes earlier results by Buckdahn, Quincampoix and Tessitore (Stochastic Partial Differential Equations and Applications, Series of Lecture Notes in Pure and Appl. Math., vol. 245, pp. 253-260, Chapman and Hall, London, 2006) and Goreac (Applied Analysis and Differential Equations, pp. 153-164, World Scientific, Singapore, 2007) from the finite-dimensional to the infinite-dimensional case.
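
The abstract refers to a forward linear control system, its dual backward stochastic differential equation, and the duality between approximate controllability and observability. The LaTeX sketch below records one standard formulation of these objects as an illustration only; the specific operators A, B, C, D, the scalar Wiener process, and the exact form of the equations are assumptions modelled on the cited finite-dimensional literature, not statements taken from the paper itself.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Sketch of a standard formulation (an assumption, not the paper's exact setting).
% Forward controlled system on a separable real Hilbert space $H$, with control
% $u$ taking values in a Hilbert space $U$ and a real Wiener process $W$:
\begin{equation}
  dX_t = \bigl(AX_t + Bu_t\bigr)\,dt + \bigl(CX_t + Du_t\bigr)\,dW_t,
  \qquad X_0 = x \in H,
\end{equation}
% where $A$ (and possibly $C$) are unbounded linear operators and $B$, $D$ are bounded.

% Dual backward stochastic differential equation; note that unbounded operators
% act on both components $(Y,Z)$ of the solution, as described in the abstract:
\begin{equation}
  dY_t = -\bigl(A^{*}Y_t + C^{*}Z_t\bigr)\,dt + Z_t\,dW_t,
  \qquad Y_T = \xi .
\end{equation}

% Duality in the form used in the finite-dimensional case: the forward system is
% approximately controllable on $[0,T]$ if and only if the dual pair is observable,
\begin{equation}
  B^{*}Y_t + D^{*}Z_t = 0 \ \text{for a.e.\ } t\in[0,T],\ \mathbb{P}\text{-a.s.}
  \quad\Longrightarrow\quad Y \equiv 0 .
\end{equation}

\end{document}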