BME 646 and ECE 60146 by Avi Kak and Charles Bouman Spring 2025
In order to do well in this class, you must be proficient in programming with Python. Beyond that, the only enrollment requirement is that you be a graduate student in engineering, computer science, quantitative psychology, mathematics, or a related discipline.
Week 1
Tuesday, Jan 14:
Course Intro (Bouman) [Slides],
Theory Lecture Syllabus (Bouman) [Slides],
and Python OO for DL (Kak) [Slides] [Python OO updated: Jan 16, 2024]
Thursday, Jan 16: (Bouman) [slides] What is machine learning? Single-layer neural networks and the MSE loss function
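If you want a concrete preview of the Thursday topic, the following minimal PyTorch sketch (illustrative only, not taken from the lecture; all sizes and data are made up) shows a single-layer network, the MSE loss, and the gradients that result:

```python
import torch
import torch.nn as nn

# A single "layer" of weights mapping 4 inputs to 1 output.
model = nn.Linear(4, 1)
loss_fn = nn.MSELoss()

x = torch.randn(16, 4)      # a batch of 16 feature vectors
y = torch.randn(16, 1)      # their (random) target values

pred = model(x)
loss = loss_fn(pred, y)     # mean squared error over the batch
loss.backward()             # gradients land in model.weight.grad
print(loss.item(), model.weight.grad.shape)
```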
Week 2
Tuesday, Jan 21: (Kak) [slides]
Torchvision and Random Tensors [updated: January 16, 2024]
Thursday, Jan 23: (Bouman) [Slides] Gradient descent optimization; calculation of the gradient; matrix interpretation of the gradient
(Some of the Torchvision-related material during this week will be illustrated with the functionality built into the YOLOLogic module that you can access by clicking here.)
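As a taste of the Tuesday material, here is a small, purely illustrative sketch of creating a random image-like tensor and passing it through a Torchvision transform (the sizes and normalization values are made up):

```python
import torch
import torchvision.transforms as tvt

fake_image = torch.rand(3, 64, 64)   # random "RGB image", values in [0, 1)
normalize = tvt.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5])

out = normalize(fake_image)          # values now roughly in [-1, 1]
print(out.min().item(), out.max().item())
```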
Week 3
Tuesday, Jan 28: (Kak) [Slides]
Autograd for Automatic Differentiation and Auto-Construction of Computational Graphs [updated: January 28, 2024]
Thursday, Jan 30: (Bouman) [slides] Intro to ML: Tensors; GD for single-layer NNs; local and global minima
(Several of the key ideas used for automatic differentiation in Autograd will be explained with the help of the ComputationalGraphPrimer module that you can access by clicking here.) [updated to Version 1.1.4: January 28, 2024]
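The essence of what Autograd does can be previewed with this short, illustrative sketch (not drawn from the ComputationalGraphPrimer itself):

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
y = torch.tensor(3.0, requires_grad=True)

z = x * y + x ** 2   # Autograd records the computational graph as it is built
z.backward()         # walk the graph backwards, accumulating gradients

print(x.grad)        # dz/dx = y + 2x = 7.0
print(y.grad)        # dz/dy = x = 2.0
```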
Week 4
Tuesday, Feb 4: (Kak) [slides]
A First Introduction to Torch.nn for Designing Deep Networks and to DLStudio for Experimenting with Them [updated: January 25, 2024]
Thursday, Feb 6: (Bouman) [Slides] Optimization of deep functions; GD on acyclic graphs; general loss functions
(We will talk about torch.nn with the help of the DLStudio module that you can access by clicking here.)
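Here is a minimal, illustrative torch.nn sketch of the kind of network this lecture is about; the layer sizes are arbitrary and the class is not from DLStudio:

```python
import torch
import torch.nn as nn

class TwoLayerNet(nn.Module):
    """A small fully connected network built from torch.nn components."""
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(8, 32),
            nn.ReLU(),
            nn.Linear(32, 2),
        )

    def forward(self, x):
        return self.layers(x)

net = TwoLayerNet()
print(net(torch.randn(5, 8)).shape)   # torch.Size([5, 2])
```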
Week 5
Tuesday, Feb 11: (Kak) [slides]
Demystifying the Convolutions in PyTorch [updated: February 6, 2024]
Thursday, Feb 13: (Bouman) [slides] Convolutional NNs; adjoint gradient for CNNs
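A quick, illustrative sketch of what a PyTorch convolution layer does to tensor shapes (all numbers are made up):

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3, padding=1)

x = torch.randn(1, 3, 32, 32)   # one RGB image, 32x32
y = conv(x)
print(y.shape)                  # torch.Size([1, 8, 32, 32]); padding=1 preserves H, W
```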
Week 6
Tuesday, Feb 18: (Kak) [slides]
Using Skip Connections to Mitigate the Problem of Vanishing Gradients, and Using Batch, Instance, and Layer Normalizations for Improved SGD in Deep Networks [updated: February 17, 2024]
Thursday, Feb 20: (Bouman) [slides] Probability and estimation; frequentist vs. Bayesian estimation; bias-variance tradeoff
(The material related to Kak's lecture on skip connections will be explained with the help of an in-class demo based on the inner class SkipConnections of the DLStudio module that you can access by clicking here.)
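For a preview, here is a minimal, illustrative residual block (not the SkipConnections class from DLStudio) showing how a skip connection and batch normalization fit together:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two conv layers whose output is added back to the input (the skip path)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)   # "+ x" gives gradients a short path backwards

x = torch.randn(8, 64, 32, 32)
print(ResidualBlock(64)(x).shape)    # torch.Size([8, 64, 32, 32])
```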
Week 7
Tuesday, Feb 25: (Kak) [slides]
Multi-Instance Object Detection -- Image Cells and Anchor Boxes [updated: February 20, 2024]
Thursday, Feb 27: (Bouman) [Slides] Training and generalization; regularization and dropout methods
(Multi-instance object detection and localization is a much more difficult problem than detecting a single object in an image. Such problems are solved with the help of image cells and anchor boxes. This lecture introduces you to these concepts. My code for explaining these ideas is in Version 2.1.1 of my YOLOLogic module that you can access by clicking here.)
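The image-cell and anchor-box bookkeeping can be previewed with this toy sketch; the grid size, box, and anchor shapes are invented for illustration and are not from YOLOLogic:

```python
import torch

image_size = 256
grid_size = 8                        # an 8x8 grid of image cells
cell_size = image_size / grid_size   # 32 pixels per cell

# Ground-truth box as (cx, cy, w, h) in pixels; the cell containing the
# box center is made responsible for predicting this object.
box = torch.tensor([96.0, 140.0, 60.0, 80.0])
cell_col = int(box[0] // cell_size)
cell_row = int(box[1] // cell_size)
print(cell_row, cell_col)            # cell (4, 3) owns this object

# Anchor boxes: candidate (w, h) shapes attached to every cell; the object is
# assigned to the anchor with the highest IoU against the ground-truth shape
# (IoU computed here for concentric boxes).
anchors = torch.tensor([[32., 32.], [64., 32.], [32., 64.]])
inter = torch.min(anchors, box[2:]).prod(1)
union = anchors.prod(1) + box[2:].prod() - inter
print(int((inter / union).argmax()))  # index of the best-matching anchor
```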
Week 8
Tuesday, March 4: (Kak) [slides]
Transpose Convolutions and the Encoder-Decoder Architectures for Semantic Segmentation of Images [updated: September 19, 2024]
Thursday, March 6: (Bouman) Mid-Term Test 1 (exams)
(The material related to semantic segmentation is based on the mUnet network, which is my implementation of the Unet. You will find the code for mUnet in my DLStudio module that you can access by clicking here.)
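A small, illustrative sketch (not the mUnet code) of how a transpose convolution undoes the downsampling of a strided convolution in an encoder-decoder:

```python
import torch
import torch.nn as nn

down = nn.Conv2d(16, 32, kernel_size=2, stride=2)          # halves H and W
up = nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2)   # doubles H and W

x = torch.randn(1, 16, 64, 64)
encoded = down(x)
decoded = up(encoded)
print(encoded.shape, decoded.shape)   # (1, 32, 32, 32) then (1, 16, 64, 64)
```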
Week 9
Tuesday, March 11: (Kak) [slides]
Metric Learning with Deep Neural Networks [posted: March 5, 2024]
Thursday, March 13: (Bouman) [slides] Stochastic gradient descent; batches and epochs; learning rate and momentum; ADAM optimization
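Here is a tiny, illustrative training loop (made-up data, arbitrary hyperparameters) showing batches, epochs, and the ADAM optimizer from the Thursday lecture:

```python
import torch

# Toy regression: fit y = 3x with a single learnable weight.
xs = torch.linspace(-1, 1, 64).unsqueeze(1)
ys = 3 * xs
w = torch.zeros(1, 1, requires_grad=True)
opt = torch.optim.Adam([w], lr=0.1)

for epoch in range(50):
    for i in range(0, 64, 16):          # batches of 16 -> 4 steps per epoch
        xb, yb = xs[i:i+16], ys[i:i+16]
        loss = ((xb @ w - yb) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

print(w.item())   # close to 3.0
```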
Week 10
Spring Break
Week 11
Tuesday, March 25: (Kak) [Slides]
Generative Data Modeling with Networks Based on Adversarial Learning and Denoising Diffusion [updated: March 26, 2024]
Thursday, March 27: (Bouman) [slides] Batch normalization; positional encoding; NeRFs
(The lecture by Kak on generative modeling will be explained with the help of demos based on the code in DLStudio's AdversarialLearning and Diffusion modules that you can access by clicking here.)
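As a preview of the denoising-diffusion half of the lecture, here is an illustrative sketch of the standard forward-noising step (the schedule values are typical DDPM defaults, not taken from DLStudio):

```python
import torch

# Forward diffusion: blend a clean sample x0 with Gaussian noise at step t.
T = 1000
betas = torch.linspace(1e-4, 0.02, T)
alpha_bars = torch.cumprod(1.0 - betas, dim=0)

x0 = torch.randn(3, 64, 64)   # stand-in for a clean image
t = 500
eps = torch.randn_like(x0)
xt = alpha_bars[t].sqrt() * x0 + (1 - alpha_bars[t]).sqrt() * eps
# A denoising network is then trained to predict eps from (xt, t).
print(xt.shape)
```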
Week 12
Tuesday, April 1: (Kak) [Slides]
Recurrent Neural Networks for Text Classification and Data Prediction [updated: April 2, 2024]
Thursday, April 3: (Bouman) [slides] Recurrent neural networks, LSTM, GRU; unsupervised learning and autoencoders
(The material related to text classification is based on the TEXTnet, TEXTnetOrder2, and GRUnet networks of the DLStudio module that you can access by clicking here.)
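A minimal, illustrative GRU-based text classifier (not TEXTnet or GRUnet from DLStudio; the vocabulary size and dimensions are invented):

```python
import torch
import torch.nn as nn

class GRUClassifier(nn.Module):
    def __init__(self, vocab_size=5000, embed_dim=64, hidden=128, classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, classes)

    def forward(self, token_ids):
        _, h = self.gru(self.embed(token_ids))   # h: final hidden state
        return self.fc(h[-1])                    # classify from last state

tokens = torch.randint(0, 5000, (4, 20))   # batch of 4 sequences, length 20
print(GRUClassifier()(tokens).shape)       # torch.Size([4, 2])
```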
Week 13
Tuesday, April 8: (Kak) [Slides]
Word Embeddings and Sequence-to-Sequence Learning [updated: April 2, 2024]
Thursday, April 10: (Bouman) [slides] Adversarial learning; generators and discriminators; generative adversarial networks; Nash equilibrium
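Word embeddings from the Tuesday lecture can be previewed with this illustrative sketch (the toy vocabulary is invented):

```python
import torch
import torch.nn as nn

# word -> integer index -> learned dense vector
vocab = {"cat": 0, "dog": 1, "car": 2}
embed = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

ids = torch.tensor([vocab["cat"], vocab["dog"]])
vecs = embed(ids)   # shape (2, 8); rows are trainable embedding vectors
sim = nn.functional.cosine_similarity(vecs[0], vecs[1], dim=0)
print(vecs.shape, sim.item())
```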
Week 14
Tuesday, April 15: (Kak) [Slides]
Transformers: Learning with Purely Attention Based Networks [updated: May 4, 2024]
Thursday, April 17: (Bouman) [slides] GAN convergence, theory and practice; Wasserstein GANs; conditional GANs
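The core attention computation can be previewed with this short, illustrative sketch of scaled dot-product attention (the shapes are arbitrary):

```python
import torch

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.size(-1)
    scores = Q @ K.transpose(-2, -1) / d ** 0.5
    return torch.softmax(scores, dim=-1) @ V

Q = torch.randn(2, 5, 16)   # (batch, query positions, head dim)
K = torch.randn(2, 7, 16)   # (batch, key positions, head dim)
V = torch.randn(2, 7, 16)
print(attention(Q, K, V).shape)   # torch.Size([2, 5, 16])
```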
Week 15
Tuesday, April 22: (Kak) [slides]
Transformer Based Learning for BERT and GPT Language Models [updated: November 5, 2024]
Thursday, April 24: (Bouman) Mid-Term Test 2 (exams)
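One key difference between BERT-style encoders and GPT-style decoders is the causal attention mask used by the latter; here is an illustrative sketch of such a mask (sizes invented):

```python
import torch

# Causal (autoregressive) mask: position i may attend only to positions <= i.
seq_len = 5
mask = torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))

scores = torch.randn(seq_len, seq_len)                # raw attention scores
scores = scores.masked_fill(~mask, float("-inf"))     # block future positions
attn = torch.softmax(scores, dim=-1)
print(attn)   # upper triangle is exactly zero
```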
Week 16
Tuesday, April 29: (Kak) [Slides]
Reinforcement Learning: Incorporating Human Preferences in the Fine-Tuning of Large Language Models [updated: April 23, 2024]
Thursday, May 1: (Bouman) [slides] Generative diffusion models and DALL-E
Links to the documentation pages you will be visiting frequently:
Recommended Books:
Recommended Supplementary Course Material: