# Teaching

Term offered: each semester

Description:
Modeling and analysis of dynamical systems with aerospace applications. Laplace transforms, transfer functions, block diagrams. Transient and steady-state response of dynamical systems. Root-locus, Bode, and Nyquist methods for control-system analysis. Introduction to controller design.

Objectives:
The objectives of this course are to (1) teach tools for the analysis of dynamical systems, and (2) give an introduction to the design of simple control systems.

• Analysis: evaluation of the behavior of a dynamical system. Examples:
  • Predicting where a spacecraft will land on the moon.
  • Determining the pointing accuracy of a space telescope.
  • Predicting the stock market.
  • Determining the altitude of an aircraft.
• Control systems: “mechanisms” used to enable the operation of a dynamical system, or to improve its behavior. Examples:
  • Computer systems that regulate the temperature in modern buildings.
  • The flight computer that controls the pointing direction of a space telescope.
  • The autopilot that holds the altitude of an aircraft.
  • An automobile cruise controller.

Main Topics

1. Basic mathematical tools
Complex variables and functions.

Laplace transformation.

Linear, time-invariant, differential equations.
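To make the connection between these tools concrete, here is a minimal sketch (the specific system is my own illustration, not from the course notes): the first-order linear, time-invariant ODE x'(t) + a·x(t) = u(t) has transfer function G(s) = 1/(s + a), and for a unit-step input the Laplace-domain output 1/(s(s + a)) inverts to x(t) = (1 − e^(−at))/a. The code checks this closed-form answer against direct numerical integration of the ODE.

```python
import math

# Illustrative first-order system (constants chosen here for the example):
#   x'(t) + a*x(t) = u(t)  <-->  G(s) = 1/(s + a)
# Unit-step response via inverse Laplace transform: x(t) = (1 - exp(-a*t))/a.

def step_response_exact(a, t):
    """Closed-form step response obtained via the inverse Laplace transform."""
    return (1.0 - math.exp(-a * t)) / a

def step_response_euler(a, t_end, dt=1e-4):
    """Forward-Euler integration of x' = -a*x + 1, x(0) = 0, for comparison."""
    x, t = 0.0, 0.0
    while t < t_end:
        x += dt * (-a * x + 1.0)
        t += dt
    return x

a = 2.0
exact = step_response_exact(a, 1.0)
approx = step_response_euler(a, 1.0)
print(exact, approx)  # the two values agree to a few decimal places
```

The agreement between the transform-based solution and the time-stepped one is exactly the point of the Laplace machinery: it converts a differential equation into algebra.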

2. Mathematical modeling of dynamic systems

3. Control systems analysis
Time-domain analysis

Frequency domain analysis
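As a small frequency-domain illustration (again an invented example, not course material): for G(s) = 1/(s + 1), substituting s = jω gives |G(jω)| = 1/√(1 + ω²), which a Bode plot reports in decibels. The sketch below evaluates the gain at a few frequencies and shows the familiar −3 dB point at the corner frequency ω = 1.

```python
import math

# Bode magnitude of the illustrative first-order system G(s) = 1/(s + 1):
#   |G(j*omega)| = 1/sqrt(1 + omega**2), plotted in decibels.

def gain_db(omega):
    """Magnitude of G(j*omega) in dB for G(s) = 1/(s + 1)."""
    magnitude = 1.0 / math.sqrt(1.0 + omega**2)
    return 20.0 * math.log10(magnitude)

for omega in (0.1, 1.0, 10.0):
    print(omega, round(gain_db(omega), 2))  # gain rolls off ~20 dB/decade past omega = 1
```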

4. Control systems design
Root-locus method

Frequency domain methods
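The root-locus idea can be previewed with a toy case (my own example, chosen so the algebra stays by hand): for the open loop L(s) = K/(s(s + 1)) under unity feedback, the closed-loop poles are the roots of s² + s + K = 0, and tracking them as K varies is precisely the root locus.

```python
import math

# Closed-loop poles of the illustrative loop L(s) = K/(s*(s + 1)) under unity
# feedback: roots of the characteristic equation s^2 + s + K = 0.

def closed_loop_poles(K):
    """Return the two roots of s^2 + s + K = 0 (real pair or complex pair)."""
    disc = 1.0 - 4.0 * K
    if disc >= 0.0:
        r = math.sqrt(disc)
        return ((-1.0 + r) / 2.0, (-1.0 - r) / 2.0)
    r = math.sqrt(-disc)
    return (complex(-0.5, r / 2.0), complex(-0.5, -r / 2.0))

# As K grows past 1/4 the two real poles meet at s = -1/2 and break away
# into a complex-conjugate pair whose real part stays fixed at -1/2.
for K in (0.1, 0.25, 1.0):
    print(K, closed_loop_poles(K))
```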

Jointly offered by

Inseok Hwang, School of Aeronautics and Astronautics
Jianghai Hu, School of Electrical and Computer Engineering

Term offered: Spring, 2006

Prerequisite: Linear system theory and linear algebra.

Hybrid systems are dynamical systems with both continuous and discrete dynamics. Developed jointly by the computer science and control communities, they are finding increasing application in a variety of engineering fields, and even in scientific fields such as biology. This course will cover the basic aspects of hybrid systems that are important in applying them to engineering problems, including their modeling, reachability and stability analysis, controller synthesis, optimization, and simulation tools.

The revolution in digital technology has fueled a need for design techniques that can guarantee the safety and performance specifications of embedded systems, i.e., systems that couple discrete logic with an analog physical environment. Such systems can be modeled as hybrid systems: dynamical systems that combine continuous-time dynamics modeled by differential equations with discrete-event dynamics modeled by finite automata. Important applications of hybrid systems include CAD, real-time software, robotics and automation, mechatronics, aeronautics, air and ground transportation systems, process control, and biological systems. Recently, hybrid systems have been at the center of intense research activity in the control theory, computer-aided verification, and artificial intelligence communities, and methodologies have been developed to model hybrid systems, to analyze their behaviors, and to synthesize controllers that guarantee closed-loop safety and performance specifications. These advances have been complemented by computational tools for the automatic verification and simulation of hybrid systems.

This course will present recent advances in the modeling, analysis, control, and verification of hybrid systems. Topics covered include the following aspects of hybrid systems: continuous-time and discrete-event models; reachability analysis; safety specifications and model checking; optimal control and differential games; (Lyapunov) stability analysis and verification tools; stochastic hybrid systems; numerical simulations; and a range of engineering applications.
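The flavor of such models can be conveyed with the classic thermostat automaton, a standard introductory hybrid-system example (the constants below are invented for illustration): two discrete modes, heater ON and OFF, each carrying its own differential equation for the room temperature, with guard conditions triggering the mode switches.

```python
# Thermostat hybrid automaton (illustrative constants, not from the course):
#   OFF mode:  T' = -a * T         (room cools toward 0)
#   ON mode:   T' = -a * (T - H)   (room heats toward H)
# Guards T <= 19 and T >= 21 trigger the discrete transitions between modes.

def simulate_thermostat(T0=20.0, a=0.1, H=30.0, dt=0.01, steps=20000):
    T, mode = T0, "OFF"
    trace = [(mode, T)]
    for _ in range(steps):
        # continuous flow in the current mode (forward Euler)
        if mode == "OFF":
            T += dt * (-a * T)
        else:
            T += dt * (-a * (T - H))
        # discrete transition when a guard is enabled
        if mode == "OFF" and T <= 19.0:
            mode = "ON"
        elif mode == "ON" and T >= 21.0:
            mode = "OFF"
        trace.append((mode, T))
    return trace

trace = simulate_thermostat()
temps = [T for _, T in trace]
print(min(temps), max(temps))  # the trajectory stays in a band around [19, 21]
```

The interesting behavior, a temperature trajectory confined to a band, emerges only from the interaction of the continuous flows and the discrete switching, which is exactly what makes hybrid models both useful and hard to analyze.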

Course Content:

1. Continuous-time and discrete-event models
2. Reachability analysis, safety specifications, and model checking
3. Optimal control, estimation, and differential games
4. (Lyapunov) stability analysis and verification tools
5. Stochastic hybrid systems
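On the discrete-event side, the simplest instance of reachability analysis and safety checking is graph search over a finite transition relation. The toy automaton below is my own invented example: the safety specification "the fault state is never reached" is verified by computing the reachable set.

```python
from collections import deque

# Reachability analysis on the discrete-event part of a hybrid model:
# compute all states reachable from an initial state, then check a safety
# specification.  The transition relation here is invented for illustration.

def reachable(transitions, init):
    """Breadth-first search over a finite transition relation."""
    seen, frontier = {init}, deque([init])
    while frontier:
        q = frontier.popleft()
        for q_next in transitions.get(q, ()):
            if q_next not in seen:
                seen.add(q_next)
                frontier.append(q_next)
    return seen

# Toy automaton: states mapped to their nondeterministic successors.
transitions = {
    "idle":    ["active"],
    "active":  ["idle", "braking"],
    "braking": ["idle"],
    "fault":   [],          # not a successor of anything reachable from "idle"
}

states = reachable(transitions, "idle")
print(sorted(states))             # ['active', 'braking', 'idle']
print("fault" not in states)      # safety spec holds: True
```

For genuinely hybrid models the reachable set also involves the continuous state, and computing or over-approximating it is where most of the technical difficulty of the course topics lies.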

Term offered: Spring 2007

Prerequisite: Linear System Theory

The main objective of this course is to study methods for the analysis and synthesis of optimal controllers and estimators for stochastic dynamical systems. Optimal control is a time-domain method that computes the control input to a dynamical system so as to minimize a cost function. The dual problem is optimal estimation, which computes the estimated states of a system subject to stochastic disturbances by minimizing the error between the true and estimated states. Combining the two leads to optimal stochastic control, whose applications are found in science, economics, and engineering. The course presents a review of the mathematical background, optimal control and estimation, duality, and optimal stochastic control.
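The estimation side can be previewed with the smallest possible Kalman filter (my own illustrative setup, with hand-picked numbers): an unknown constant x observed through noisy measurements y_k = x + v_k. The filter recursively blends each measurement with the current estimate using the gain that minimizes the mean-square estimation error.

```python
# Scalar Kalman filter for the trivial system  x_{k+1} = x_k  (an unknown
# constant) observed as y_k = x_k + v_k, with measurement-noise variance R.
# The measurement values and variances below are made up for illustration.

def kalman_constant(measurements, R=1.0, x0=0.0, P0=100.0):
    """Kalman filter for x_{k+1} = x_k, y_k = x_k + v_k."""
    x, P = x0, P0                 # state estimate and its error variance
    for y in measurements:
        K = P / (P + R)           # optimal (Kalman) gain
        x = x + K * (y - x)       # measurement update of the estimate
        P = (1.0 - K) * P         # updated error variance
    return x, P

# Hand-chosen noisy observations of the constant 5.0.
ys = [5.3, 4.8, 5.1, 4.9, 5.2, 4.7]
x_hat, P = kalman_constant(ys)
print(round(x_hat, 3), round(P, 3))
```

With a large initial variance P0 the filter behaves almost like a running average, and the error variance P shrinks monotonically as measurements arrive, which is the scalar shadow of the general duality between optimal estimation and optimal control studied in the course.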