AAE 56800

Applied Optimal Control and Estimation


The main objective of this course is to study analysis and synthesis methods for optimal controllers and estimators for stochastic dynamical systems. Optimal control is a time-domain method that computes the control input to a dynamical system that minimizes a cost function. The dual problem, optimal estimation, computes estimates of the states of a system subject to stochastic disturbances by minimizing the error between the true and estimated states. Combining the two leads to optimal stochastic control, whose applications are found throughout science, economics, and engineering. The course presents a review of the mathematical background, optimal control and estimation, duality, and optimal stochastic control.
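The estimation idea above — minimizing the error between the true and estimated states — can be made concrete with a minimal sketch. The following scalar discrete-time Kalman filter is illustrative only (not course material); the random-walk model and the noise variances q and r are assumptions chosen for the example:

```python
# Minimal sketch (illustrative): a scalar Kalman filter tracking a
# random-walk state x_{k+1} = x_k + w_k from measurements y_k = x_k + v_k.
import numpy as np

rng = np.random.default_rng(0)
q, r = 0.01, 1.0              # assumed process / measurement noise variances
x_true, x_hat, P = 0.0, 0.0, 1.0

for _ in range(200):
    x_true += rng.normal(0.0, np.sqrt(q))     # true state evolves
    y = x_true + rng.normal(0.0, np.sqrt(r))  # noisy measurement
    P += q                                    # predict: covariance grows
    K = P / (P + r)                           # Kalman gain
    x_hat += K * (y - x_hat)                  # update estimate toward y
    P *= (1.0 - K)                            # posterior covariance shrinks
```

The posterior covariance P converges to the steady-state solution of the corresponding scalar Riccati equation, so the estimate tracks the true state to within roughly sqrt(P).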

Credit Hours: 3
Offered: Spring
Pre-Requisite: AAE 564
Co-Requisite: None
Instructor: Professor Hwang
URL: https://engineering.purdue.edu/AAE/Academics/Courses/aae590w/
Text: Lecture notes
Assessment: Homework, exams, and a course project (subject to modification)
  1. Review of some mathematical background
    • Matrices, random variables, dynamical system models
  2. Optimal control
    • Pontryagin's Maximum/Minimum principle
    • Hamilton-Jacobi-Bellman equation
    • Dynamic Programming
    • Linear Quadratic Regulator (LQR) problems
  3. Classical estimation
    • Minimum variance unbiased estimation
    • Least squares estimation
    • Maximum likelihood estimation
  4. Stochastic optimal control and estimation
    • Discrete/continuous-time state space models
    • Stochastic dynamic programming
    • Kalman Filter: discrete/continuous-time filters
    • Duality of LQR with Kalman filter (LQE)
    • Linear Quadratic Gaussian (LQG)
  5. Advanced estimation techniques
    • Nonlinear estimation, adaptive estimation, etc.
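As a small illustration of the LQR material in the outline (a sketch, not course material), the finite-horizon discrete-time LQR problem can be solved by a backward Riccati recursion. The double-integrator model, weights, and horizon below are assumptions chosen for the example:

```python
# Minimal sketch (illustrative): finite-horizon discrete-time LQR via the
# backward Riccati recursion, applied to a double integrator.
import numpy as np

def lqr_gains(A, B, Q, R, Qf, N):
    """Feedback gains K_0..K_{N-1} minimizing
    sum_k (x'Qx + u'Ru) + x_N'Qf x_N  s.t.  x_{k+1} = A x_k + B u_k."""
    P = Qf
    gains = []
    for _ in range(N):                       # sweep backward from k = N-1
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)        # Riccati recursion
        gains.append(K)
    return gains[::-1]                       # reindex forward in time

# Double integrator with unit sampling time (assumed example system)
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[0.5], [1.0]])
Q, R, Qf = np.eye(2), np.array([[1.0]]), np.eye(2)

K = lqr_gains(A, B, Q, R, Qf, N=50)
x = np.array([[1.0], [0.0]])
for k in range(50):
    u = -K[k] @ x                            # optimal feedback u_k = -K_k x_k
    x = A @ x + B @ u                        # closed-loop state drives to 0
```

By the duality covered in topic 4, the same Riccati recursion run forward in time yields the Kalman filter covariance update.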

Prepared by: I. Hwang
Date: November 7, 2014