IEEE Transactions on Automatic Control, Vol.39, No.4, 780-792, 1994
Risk-Sensitive Control and Dynamic Games for Partially Observed Discrete-Time Nonlinear Systems
In this paper we solve a finite-horizon partially observed risk-sensitive stochastic optimal control problem for discrete-time nonlinear systems and obtain its small noise and small risk limits. The small noise limit is interpreted as a deterministic partially observed dynamic game, yielding new insights into the optimal solution of such game problems; the small risk limit is a standard partially observed risk-neutral stochastic optimal control problem. Both the risk-sensitive stochastic control problem and the deterministic dynamic game problem are solved using information states, dynamic programming, and associated separated policies. A certainty equivalence principle is also discussed. Our results have implications for the nonlinear robust stabilization problem.
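The central computational object in both formulations is the information state, an unnormalized conditional measure that carries the accumulated exponential-of-cost weight and reduces the partially observed problem to one with full "state" information. As a purely illustrative sketch, not the paper's algorithm, the toy Python script below checks for a made-up, uncontrolled finite-state partially observed chain that the unnormalized risk-sensitive information-state recursion reproduces the exponential-of-sum criterion J = E[exp(theta * sum_k L(x_k))]; the model data and the symbols pi, A, B, L, theta, M are assumptions chosen only for this example, and the controlled/dynamic-programming aspects of the paper are not reproduced here.

# Minimal sketch (not from the paper): risk-sensitive information state
# for an uncontrolled finite-state, partially observed Markov chain.
# It verifies that summing the terminal mass of the unnormalized
# information-state recursion over all observation records equals the
# exponential-of-sum cost computed by brute-force path enumeration.
# All model data below are invented for illustration.

import itertools
import numpy as np

n_x, n_y, M = 3, 2, 4          # states, observation symbols, horizon
theta = 0.7                    # risk-sensitivity parameter

pi = np.array([0.5, 0.3, 0.2])                 # initial distribution
A = np.array([[0.8, 0.1, 0.1],                 # A[i, j] = P(x' = j | x = i)
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])
B = np.array([[0.9, 0.1],                      # B[i, y] = P(y | x = i)
              [0.4, 0.6],
              [0.2, 0.8]])
L = np.array([0.0, 1.0, 2.0])                  # running cost L(x)

# 1) Brute force over state paths: J = sum_paths P(path) * exp(theta * cost)
J_direct = 0.0
for path in itertools.product(range(n_x), repeat=M):
    p = pi[path[0]]
    for k in range(M - 1):
        p *= A[path[k], path[k + 1]]
    J_direct += p * np.exp(theta * L[list(path)].sum())

# 2) Information-state route: for each observation record y_0..y_{M-1},
#    propagate the unnormalized measure
#        sigma_{k+1}(j) = B[j, y_{k+1}] * sum_i sigma_k(i) exp(theta L(i)) A[i, j]
#    and recover J by summing the terminal weighted mass over all records.
J_info = 0.0
for ys in itertools.product(range(n_y), repeat=M):
    sigma = pi * B[:, ys[0]]                   # sigma_0
    for k in range(M - 1):
        sigma = B[:, ys[k + 1]] * (A.T @ (sigma * np.exp(theta * L)))
    J_info += float(sigma @ np.exp(theta * L))  # cost at the final state

print(J_direct, J_info)        # the two computations agree

In the controlled setting of the paper, a dynamic program is run over this information state, and the separated policies referred to in the abstract feed back on sigma rather than on the unobserved state; the small noise limit replaces the summation over paths by a maximization, giving the deterministic dynamic game interpretation.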