Applied Optimal Control And Estimation

AAE56800

Credit Hours:

3

Learning Objective:

1. Study analysis and synthesis methods for optimal controllers and estimators of deterministic and stochastic dynamical systems.
2. Review probability and random processes, the calculus of variations, dynamic programming, maximum principles, optimal control and estimation, duality, and optimal stochastic control.
3. Through class projects, learn how to formulate a problem, solve it, and communicate ideas effectively.

Description:

This course introduces students to analysis and synthesis methods for optimal controllers and estimators of deterministic and stochastic dynamical systems. Optimal control is a time-domain method that computes the control input to a dynamical system by minimizing a cost function. The dual problem is optimal estimation, which estimates the states of a system subject to stochastic disturbances by minimizing the error between the true and estimated states. Combining the two leads to optimal stochastic control, which finds applications throughout science, economics, and engineering. The course presents a review of the mathematical background, optimal control and estimation, duality, and optimal stochastic control.
Spring 2020 Syllabus
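
For concreteness, one standard finite-horizon linear quadratic (LQ) instance of this pairing, written in generic notation that is assumed here rather than taken from the course notes, is

\[
\min_{u_0,\dots,u_{N-1}} \; J \;=\; x_N^\top S\,x_N \;+\; \sum_{k=0}^{N-1}\bigl(x_k^\top Q\,x_k + u_k^\top R\,u_k\bigr),
\qquad x_{k+1} = A x_k + B u_k,
\]

while the dual estimation problem, given noisy measurements \(y_k = C x_k + v_k\) of the disturbed dynamics \(x_{k+1} = A x_k + w_k\), chooses estimates \(\hat{x}_k\) that minimize the mean-square error \(\mathbb{E}\,\lVert x_k - \hat{x}_k \rVert^2\). Combining the two yields the Linear Quadratic Gaussian (LQG) controller listed under the topics below.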

Topics Covered:

Review of some mathematical background: matrices, random variables, dynamical system models;
Optimal control: Pontryagin's maximum/minimum principle, Hamilton-Jacobi-Bellman equation, dynamic programming, linear quadratic regulator (LQR) problems;
Classical estimation: minimum variance unbiased estimation, least squares estimation, maximum likelihood estimation;
Stochastic optimal control and estimation: discrete/continuous-time state-space models, stochastic dynamic programming, Kalman filter (discrete/continuous-time filters), duality of LQR with the Kalman filter (LQE), Linear Quadratic Gaussian (LQG) control;
Advanced estimation techniques: nonlinear estimation, adaptive estimation, etc.
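
As a concrete illustration of the LQR/Kalman-filter duality listed above, the following minimal Python sketch (not course material) computes both steady-state gains from the same discrete algebraic Riccati equation; the double-integrator model, weights, and noise covariances are assumed purely for illustration.

import numpy as np
from scipy.linalg import solve_discrete_are

# Assumed example system: a discretized double integrator
# x_{k+1} = A x_k + B u_k (+ w_k),   y_k = C x_k + v_k
dt = 0.1
A = np.array([[1.0, dt],
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],
              [dt]])
C = np.array([[1.0, 0.0]])

# Assumed LQR weights and noise covariances
Q = np.diag([1.0, 0.1])     # state weight
R = np.array([[0.01]])      # input weight
W = np.diag([1e-4, 1e-3])   # process-noise covariance
V = np.array([[1e-2]])      # measurement-noise covariance

# LQR: steady-state gain from the control Riccati equation in (A, B, Q, R)
P = solve_discrete_are(A, B, Q, R)
K_lqr = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# LQE (Kalman filter): the dual Riccati equation in (A^T, C^T, W, V)
S = solve_discrete_are(A.T, C.T, W, V)
L_kf = S @ C.T @ np.linalg.inv(C @ S @ C.T + V)   # measurement-update gain

print("LQR gain K:", K_lqr)
print("Kalman gain L:", L_kf.ravel())

Swapping (A, B, Q, R) for (A^T, C^T, W, V) in the Riccati equation is exactly the duality referred to above.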

Prerequisites:

Graduate Standing. Understanding of basic concepts and techniques used in the analysis and control design of linear systems.

Applied / Theory:

30 / 70

Web Address:

https://mycourses.purdue.edu

Web Content:

Links to the current course website, syllabus, grades, lecture notes, homework assignments, solutions, message board, and email.

Homework:

There will be a few homework assignments.

Projects:

Required. The project is a team project, nominally with two people (the team size can vary). It may be an extension of existing methods from the literature or, preferably, involve original research ideas related to your own research interests.

Exams:

No exams.

Textbooks:

Official textbook information is now listed in the Schedule of Classes. NOTE: Textbook information is subject to change at any time at the discretion of the faculty member. If you have questions or concerns, please contact the academic department.
Tentative: no textbook; lecture notes and class materials.

Computer Requirements:

ProEd Minimum Requirements and MATLAB.
