Dynamic Optimization: Deterministic and Stochastic Models
Feedback Strategies for Partially Observable Stochastic Systems (Lecture Notes in Control and Information Sciences, 48)
Discrete Systems: Analysis, Control and Optimization
Stochastic Control in Discrete and Continuous Time
Discrete-Time Markov Control Processes: Basic Optimality Criteria
Finite Approximations in Discrete-Time Stochastic Control: Quantized Models and Asymptotic Optimality
Extensions of Linear-Quadratic Control Theory (Lecture Notes in Control and Information Sciences, 27)
Stochastic Multi-Stage Optimization: At the Crossroads between Discrete Time Stochastic Control and Stochastic Programming
Optimization of Discrete Time Systems: The Upper Boundary Approach (Lecture Notes in Control and Information Sciences, 51)
Introduction to Mathematical Systems Theory: Linear Systems, Identification and Control