August 8, 2022
Fall 2022 Course: CS 59200-PML: Probabilistic Machine Learning
I am a faculty member in Purdue CS. This fall semester I will be teaching CS 59200-PML: Probabilistic Machine Learning (TTh 12:00 - 1:15p), which may be of interest to graduate students in ECE.
Assistant Professor, Department of Computer Science
If you have ever wondered: How can a machine learn from data? How should we construct appropriate models? How do we design algorithms for inference? What are the fundamental design principles of data-driven learning and decision-making?
This course will help you answer these fundamental questions in machine learning from a probabilistic perspective, and will introduce techniques and research topics in the field.
For more information, please see below or the attached syllabus. Hope to see you in class!
CS 59200 – Probabilistic Machine Learning
Semester: Fall 2022
Lecture: TTh 12:00 - 1:15p @ LWSN B134
Instructor: Ruqi Zhang
Prerequisites: Knowledge of basic probability and machine learning
How can a machine learn from data? How should we construct appropriate models? How do we design algorithms for inference? What are the fundamental design principles of data-driven learning and decision-making?
This course will help you answer these fundamental questions in machine learning from a probabilistic perspective. You will read and discuss research papers in probabilistic machine learning, including but not limited to the following topics:
- Probabilistic models: Bayesian neural networks, variational autoencoders, normalizing flows, score-based generative models, diffusion models, and Gaussian processes.
- Probabilistic inference: Markov chain Monte Carlo (MCMC), variational inference, the Laplace approximation, Monte Carlo dropout, and deep ensembles.
- Evaluation and applications: uncertainty estimation, calibration, distribution shift, out-of-distribution detection, and adversarial attacks.
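As a small taste of the inference topics listed above (not part of the syllabus), here is a minimal random-walk Metropolis-Hastings sketch, the simplest MCMC method, sampling from a standard normal target. All function and variable names are illustrative.

```python
import math
import random

def metropolis_hastings(log_prob, n_samples, step=1.0, x0=0.0, seed=0):
    """Random-walk Metropolis: propose x' ~ N(x, step^2) and accept
    with probability min(1, p(x') / p(x))."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept or reject using the log target-density ratio
        if math.log(rng.random()) < log_prob(proposal) - log_prob(x):
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal, log p(x) = -x^2 / 2 up to a constant
samples = metropolis_hastings(lambda x: -0.5 * x * x, 20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

With enough samples, the empirical mean and variance approach the target's 0 and 1; the course covers why this works and when such chains mix slowly.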
This course introduces techniques and research topics in probabilistic machine learning. Students at all levels are welcome. PhD, master's, and undergraduate students who are interested in doing a research project in related fields are particularly encouraged to take this course.
The class will be a mix of lectures and student presentations. Students will present research papers from top-tier ML venues, followed by breakout discussions of the material. I will also give lectures on the foundations of probabilistic machine learning.
By the end of the semester, you should be able to:
- Think about any ML problem from a probabilistic perspective.
- Be familiar with common probabilistic models and inference methods.
- Know the style of academic writing in ML.
- Know how to critically assess an ML paper.
- Formulate and carry out a short-term research project.
Learning Resources, Technology & Texts
There is no required textbook for the course. Some recommended textbooks:
- Probabilistic Machine Learning (a book series), Kevin Murphy
- Pattern Recognition and Machine Learning, Christopher Bishop
- Information Theory, Inference, and Learning Algorithms, David MacKay
- Deep Learning, Ian Goodfellow, Yoshua Bengio and Aaron Courville
Grading
Paper presentation (25%)
Final project (55%)
Discussion participation (5%)