ECE 69500 - Inference and Learning in Generative Models
Course Details
Lecture Hours: 3
Credits: 3
Areas of Specialization:
- Communications, Networking, Signal & Image Processing
- Computer Engineering
Counts as:
- EE Elective
- CMPE Complementary Selective
Normally Offered:
Spring - even years
Campus/Online:
On-campus only
Requisites:
MA 26100 and MA 26500
Requisites by Topic:
Multivariable calculus, linear algebra, and basic probability theory.
Catalog Description:
Generative models are a powerful alternative to discriminative models: when properly specified, they estimate their parameters more efficiently, can generate samples from the distribution of their input data, and can also be used (like discriminative models) to infer features or labels from their inputs. However, the generative and inferential faculties typically come at each other's expense. This course covers five different attempts at finessing this trade-off, and the resulting learning algorithms: exact inference in directed graphical models (the EM algorithm); sampling-based methods (MCMC, particle filtering); deterministic approximate inference (variational EM); RBM-like architectures (contrastive-divergence learning); and generative adversarial networks (adversarial training).
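The first of the five approaches above, exact inference via the EM algorithm, can be sketched for the simplest interesting case, a two-component one-dimensional Gaussian mixture. This is a minimal illustration, not course material; the data, initialization, and iteration count are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D data from two well-separated Gaussians (illustrative values).
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(5.0, 1.0, 300)])

# Initialize the mixture parameters (K = 2 components).
pi = np.array([0.5, 0.5])    # mixing weights
mu = np.array([-1.0, 1.0])   # component means
var = np.array([1.0, 1.0])   # component variances

def gauss(x, mu, var):
    """Gaussian density, broadcast over components."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: posterior responsibilities p(z = k | x_n) under current parameters.
    r = pi * gauss(x[:, None], mu, var)          # shape (N, K)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from responsibility-weighted statistics.
    Nk = r.sum(axis=0)
    pi = Nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / Nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk

print(np.sort(mu))  # estimated means, roughly [0, 5]
```

Each iteration provably does not decrease the data log-likelihood, which is what makes EM an *exact* inference scheme for this model class.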
Required Text(s):
- Pattern Recognition and Machine Learning, C. Bishop, Springer, 2006, ISBN No. 9780387310732
Recommended Text(s):
None.
Lecture Outline:
Weeks | Major Topics |
---|---|
3 | Preliminaries: directed and undirected graphical models; generative vs. discriminative models; supervised vs. unsupervised learning; KL divergence, cross entropy, mutual information; exponential families; conjugate priors; Bayes' theorem; the multivariate Gaussian |
1 | Supervised Learning of Generative Models: Naive Bayes; linear & quadratic discriminant analysis; generalized linear models |
2.5 | Exact Inference in Directed Graphical Models: Gaussian mixture models; hidden Markov models; filtering and smoothing (forward-backward algorithm); factor analysis; linear-Gaussian dynamical systems; Kalman filtering and smoothing; the EM algorithm; EM algorithms for GMMs, HMMs, etc. |
1.5 | Sampling-Based Inference in Directed Graphical Models: basic sampling algorithms, MCMC, Gibbs sampling; particle filtering |
1 | Variational Inference: variational EM; sparse coding (and ICA); variational autoencoders |
1 | The Exponential-Family Harmonium: the RBM and EFH; contrastive-divergence learning; deep belief networks; recurrent EFHs |
1 | Generative Adversarial Networks |
1 | Autoregressive Models: encoder-decoder models; PixelRNN; Transformers (GPT) |
2 | Current state of the art |
1 | Exams and project presentations |
Assessment Method:
- Exams
- Homework
- Projects