Description:
Modeling and analysis of dynamical systems with aerospace applications. Complex variables and functions. Linear, time-invariant differential equations. Time-domain analysis. Frequency-domain analysis. Root-locus method. Frequency-domain methods.
Jointly offered by Inseok Hwang and Jianghai Hu, School of Electrical and Computer Engineering.
Term offered: Spring 2006
Prerequisite: Linear system theory and linear algebra.
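As an illustrative sketch of the time-domain analysis listed above (this example and all its parameter values are hypothetical, not course material), the step response of a standard second-order linear, time-invariant system can be computed with a simple forward-Euler integration:

```python
# Step response of a second-order LTI system
#   x'' + 2*zeta*wn*x' + wn^2 * x = wn^2 * u,  with unit step input u = 1,
# integrated by forward Euler. Parameters are illustrative only.

def step_response(zeta=0.5, wn=2.0, dt=1e-3, t_end=10.0):
    x, v = 0.0, 0.0          # position and velocity states
    samples = []
    t = 0.0
    while t < t_end:
        a = wn**2 * (1.0 - x) - 2.0 * zeta * wn * v  # acceleration, u = 1
        x += dt * v
        v += dt * a
        samples.append((t, x))
        t += dt
    return samples

resp = step_response()
peak = max(x for _, x in resp)
final = resp[-1][1]
print(peak, final)
```

For an underdamped system (zeta < 1) the response overshoots the steady-state value 1.0 before settling, which is exactly the kind of behavior time-domain analysis quantifies.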
Hybrid systems are dynamical systems with both continuous and discrete dynamics. Developed jointly by the computer science and control communities, they are finding increasing applications in a variety of engineering fields, and even in scientific fields such as biological systems. This course will cover basic aspects of hybrid systems that are important in applying them to engineering problems, including modeling, reachability and stability analysis, controller synthesis, optimization, and simulation tools.
The revolution in digital technology has fueled a need for design techniques that can guarantee the safety and performance specifications of embedded systems, i.e., systems that couple discrete logic with an analog physical environment. Such systems can be modeled as hybrid systems: dynamical systems that combine continuous-time dynamics modeled by differential equations with discrete-event dynamics modeled by finite automata. Important applications of hybrid systems include CAD, real-time software, robotics and automation, mechatronics, aeronautics, air and ground transportation systems, process control, and biological systems. Recently, hybrid systems have been at the center of intense research activity in the control theory, computer-aided verification, and artificial intelligence communities, and methodologies have been developed to model hybrid systems, to analyze their behaviors, and to synthesize controllers that guarantee closed-loop safety and performance specifications. These advances have been complemented by computational tools for the automatic verification and simulation of hybrid systems.
This course will present recent advances in the modeling, analysis, control, and verification of hybrid systems. Topics covered in this course include the following aspects of hybrid systems: continuous-time and discrete-event models; reachability analysis; safety specifications and model checking; optimal control and differential games; (Lyapunov) stability analysis and verification tools; stochastic hybrid systems; numerical simulations; and a range of engineering applications.
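To make the combination of continuous flow and discrete events concrete, here is a minimal sketch (illustrative only, not from the course) of a classic hybrid system: a bouncing ball, with free-fall dynamics given by an ODE and a discrete bounce event that resets the velocity.

```python
# Bouncing ball: a classic hybrid system combining continuous-time
# dynamics (free fall, an ODE) with a discrete event (the bounce,
# which reverses and damps the velocity). Parameters are illustrative.

G = 9.81   # gravitational acceleration (m/s^2)
E = 0.8    # coefficient of restitution at each bounce

def simulate(h0=1.0, v0=0.0, dt=1e-4, t_end=2.0):
    h, v = h0, v0
    bounces = 0
    t = 0.0
    while t < t_end:
        # continuous flow: h' = v, v' = -g
        h += dt * v
        v -= dt * G
        # discrete transition (guard: ball at ground while falling)
        if h <= 0.0 and v < 0.0:
            h = 0.0
            v = -E * v   # reset map: reverse and damp velocity
            bounces += 1
        t += dt
    return h, v, bounces

h, v, n = simulate()
print(n)   # number of discrete bounce events observed
```

In the idealized model the bounce times accumulate at a finite limit (Zeno behavior), which is one reason simulation tools for hybrid systems need careful event handling.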
Term offered: Spring 2007
Prerequisite: Linear System Theory
The main objective of this course is to study analysis and synthesis methods for optimal controllers and estimators for stochastic dynamical systems. Optimal control is a time-domain method that computes the control input to a dynamical system by minimizing a cost function. The dual problem is optimal estimation, which computes estimates of the states of a system subject to stochastic disturbances by minimizing the error between the true states and the estimated states. The combination of the two leads to optimal stochastic control. Applications of optimal stochastic control are found in science, economics, and engineering. The course presents a review of mathematical background, optimal control and estimation, duality, and optimal stochastic control.
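As a minimal sketch of the "minimize a cost function" idea above (an illustrative scalar example, not course material): for a discrete-time linear system x[k+1] = a*x[k] + b*u[k] with quadratic cost, the optimal feedback gains follow from the standard backward Riccati recursion.

```python
# Finite-horizon discrete-time LQR for a scalar system
#   x[k+1] = a*x[k] + b*u[k],
# minimizing sum_k (q*x[k]^2 + r*u[k]^2).
# Scalar, illustrative example of the backward Riccati recursion.

def lqr_gains(a, b, q, r, horizon):
    p = q                 # terminal cost-to-go weight
    gains = []
    for _ in range(horizon):
        k = (b * p * a) / (r + b * p * b)   # optimal gain at this stage
        p = q + a * p * a - a * p * b * k   # Riccati update
        gains.append(k)
    gains.reverse()       # order gains as k = 0 .. N-1
    return gains

def simulate(a, b, x0, gains):
    x = x0
    for k in gains:
        u = -k * x        # optimal state feedback u = -K x
        x = a * x + b * u
    return x

gains = lqr_gains(a=1.2, b=1.0, q=1.0, r=1.0, horizon=20)
x_final = simulate(a=1.2, b=1.0, x0=5.0, gains=gains)
print(x_final)
```

Even though the open-loop system is unstable (|a| > 1), the feedback computed by minimizing the cost drives the state toward the origin; the dual estimation problem (e.g., the Kalman filter) has the same Riccati structure.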