BME 646 and ECE 60146 by Avi Kak and Charles Bouman, Spring 2025
In order to do well in this class, you must be proficient in programming with Python. Beyond that, all that is required for you to enroll in this class is that you be a graduate student in engineering, computer science, quantitative psychology, mathematics, etc.
Week 1
Tuesday, Jan 9:
Course Intro (Bouman) [Slides],
Theory Lecture Syllabus (Bouman) [Slides],
and Python OO for DL (Kak) [Slides] [Python OO updated: Jan 16, 2024]
Thursday, Jan 11: (Bouman) [slides]
What is machine learning? Single layer neural networks and the MSE loss function
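Since the Tuesday lecture builds everything on Python OO, here is a minimal sketch (illustrative only, not from the lecture notes) of the OO machinery that PyTorch's nn.Module relies on:

```python
# A minimal sketch of the Python OO ideas PyTorch builds on (made-up classes):
# inheritance, the method resolution order, and the __call__ dunder method.
class Layer:
    def __init__(self, name):
        self.name = name

    def __call__(self, x):          # makes instances callable, like nn.Module
        return self.forward(x)

class Doubler(Layer):               # subclass inherits __init__ and __call__
    def forward(self, x):
        return 2 * x

layer = Doubler("double")
print(layer(21))                    # 42: __call__ dispatches to forward()
print(Doubler.__mro__)              # lookup order: Doubler -> Layer -> object
```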
Week 2
Tuesday, Jan 16: (Kak) [slides]
Torchvision and Random Tensors [updated: January 16, 2024]
Thursday, Jan 18: (Bouman) [Slides]
Gradient descent optimization; Calculation of gradient; matrix interpretation of gradient
(Some of the Torchvision-related material during this week will be illustrated with the functionality built into the YOLOLogic module that you can access by clicking here.)
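As a concrete companion to this week's two themes, here is a minimal sketch, with made-up sizes, of reproducible random tensors and a hand-coded gradient-descent loop for a single-layer network with the MSE loss:

```python
# A minimal sketch (not from the lecture slides): reproducible random tensors,
# then gradient descent on a single-layer network with the MSE loss.
import torch

torch.manual_seed(0)                      # make the random tensors reproducible
X = torch.rand(100, 3)                    # 100 samples, 3 features, uniform in [0,1)
w_true = torch.tensor([[1.0], [-2.0], [0.5]])
y = X @ w_true + 0.01 * torch.randn(100, 1)   # noisy targets

w = torch.zeros(3, 1)                     # single-layer weights to be learned
lr = 0.1
for _ in range(200):
    y_hat = X @ w                         # forward pass of the single layer
    grad = 2.0 / X.shape[0] * X.T @ (y_hat - y)   # analytic MSE gradient
    w = w - lr * grad                     # one gradient-descent update
print(w.squeeze())                        # should approach w_true
```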
Week 3
Tuesday, Jan 23: (Kak) [Slides]
Autograd for Automatic Differentiation and Auto-Construction of Computational Graphs [updated: January 28, 2024]
Thursday, Jan 25: (Bouman) [slides]
Intro to ML: Tensors; GD for single layer NNs; Local and global minima
(Several of the key ideas used for automatic differentiation in Autograd will be explained with the help of the ComputationalGraphPrimer module that you can access by clicking here.) [updated to Version 1.1.4: January 28, 2024]
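The core idea of the Tuesday lecture can be previewed in a few lines; this is a minimal sketch with made-up scalars, not code from the ComputationalGraphPrimer:

```python
# Autograd constructs the computational graph as the forward expressions run,
# then differentiates it in reverse mode when backward() is called.
import torch

x = torch.tensor(2.0, requires_grad=True)   # leaf node of the graph
w = torch.tensor(3.0, requires_grad=True)
y = w * x + x ** 2                          # graph built during this computation
y.backward()                                # reverse-mode automatic differentiation
print(x.grad)    # dy/dx = w + 2x = 7
print(w.grad)    # dy/dw = x = 2
```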
Week 4
Tuesday, Jan 30: (Kak) [slides]
A First Introduction to Torch.nn for Designing Deep Networks and to DLStudio for Experimenting with Them [updated: January 25, 2024]
Thursday, Feb 1: (Bouman) [Slides]
Optimization of deep functions; GD on acyclic graphs; General loss functions
(We will talk about torch.nn with the help of the DLStudio module that you can access by clicking here.)
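For a taste of what torch.nn gives you, here is a minimal sketch with hypothetical layer sizes (not DLStudio code): a small network, a forward pass, and one SGD step:

```python
# A minimal torch.nn sketch: define a network, push a batch through it,
# and take one optimizer step.  All sizes and labels are made up.
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Linear(8, 16),      # input layer: 8 features to 16 hidden units
    nn.ReLU(),
    nn.Linear(16, 2),      # output layer: 2 classes
)
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()

x = torch.randn(4, 8)                  # a batch of 4 random samples
labels = torch.tensor([0, 1, 0, 1])    # made-up class labels
loss = criterion(net(x), labels)
optimizer.zero_grad()
loss.backward()                        # gradients for every parameter in net
optimizer.step()                       # one parameter update
```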
Week 5
Tuesday, Feb 6: (Kak) [slides]
Demystifying the Convolutions in PyTorch [updated: February 6, 2024]
Thursday, Feb 8: (Bouman) [slides]
Convolutional NNs; Adjoint gradient for CNNs
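The shape arithmetic of PyTorch convolutions, which the Tuesday lecture demystifies, can be checked directly; the sizes below are made up:

```python
# For input size H, kernel k, padding p, stride s, a 2D convolution yields
# an output of size floor((H + 2p - k) / s) + 1 in each spatial dimension.
import torch
import torch.nn as nn

conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, stride=2, padding=1)
x = torch.randn(1, 3, 32, 32)      # one RGB image, 32x32
y = conv(x)
print(y.shape)                     # torch.Size([1, 16, 16, 16]): (32 + 2 - 3)//2 + 1 = 16
```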
Week 6
Tuesday, Feb 13: (Kak) [slides]
Using Skip Connections to Mitigate the Problem of Vanishing Gradients, and Using Batch, Instance, and Layer Normalizations for Improved SGD in Deep Networks [updated: February 17, 2024]
Thursday, Feb 15: (Bouman) [slides]
Probability and estimation; Frequentist vs Bayesian estimation; bias-variance tradeoff
(The material related to Kak's lecture on skip connections will be explained with the help of an in-class demo based on the inner class SkipConnections of the DLStudio module that you can access by clicking here.)
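A residual block shows both ideas from the Tuesday lecture in one place; this is a minimal sketch, not DLStudio's SkipConnections class:

```python
# The skip path gives gradients a direct route around the convolutions,
# and BatchNorm2d normalizes activations for better SGD behavior.
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    def __init__(self, ch):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(ch)
        self.conv2 = nn.Conv2d(ch, ch, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(ch)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)     # the skip connection: add the input back

x = torch.randn(2, 8, 16, 16)
print(ResBlock(8)(x).shape)            # shape is preserved: [2, 8, 16, 16]
```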
Week 7
Tuesday, Feb 20: (Kak) [slides]
Multi-Instance Object Detection -- Image Cells and Anchor Boxes [updated: February 20, 2024]
Thursday, Feb 22: (Bouman) [Slides]
Training and Generalization; Regularization and dropout methods
(Multi-instance object detection and localization is a much more difficult problem than the case in which you have a single object in an image. Such problems are solved with the help of image cells and anchor boxes, and this lecture introduces you to those concepts. My code for explaining these ideas is in Version 2.1.1 of my YOLOLogic module that you can access by clicking here.)
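The image-cell idea can be sketched in a few lines; this toy example (illustrative only, not YOLOLogic code, which matches anchors more carefully) assigns a ground-truth box to a grid cell and picks the closest anchor aspect ratio:

```python
# Divide the image into an SxS grid; a box belongs to the cell containing
# its center, and anchor templates of different aspect ratios compete for it.
import torch

S, img_size = 8, 256                   # 8x8 grid over a 256x256 image (made up)
cell = img_size / S
box = (96.0, 140.0, 40.0, 60.0)        # a box as (center_x, center_y, w, h)
i, j = int(box[1] // cell), int(box[0] // cell)
print(f"box assigned to cell (row={i}, col={j})")     # row 4, col 3

anchors = torch.tensor([[1.0, 1.0], [1.0, 2.0], [2.0, 1.0]])  # w:h templates
ratio = box[2] / box[3]                                       # 40/60
best = torch.argmin((anchors[:, 0] / anchors[:, 1] - ratio).abs())
print(f"best-matching anchor index: {best.item()}")   # the tall 1:2 anchor
```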
Week 8
Tuesday, Feb 27: (Kak) [slides]
Transpose Convolutions and the Encoder-Decoder Architectures for Semantic Segmentation of Images [updated: September 19, 2024]
Thursday, Feb 29: (Bouman) Mid-Term Test 1 (exams)
(The material related to semantic segmentation is based on the mUnet network, which is my implementation of the Unet. You will find the code for mUnet in my DLStudio module that you can access by clicking here.)
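Transpose convolutions are what let the decoder half of such networks recover spatial resolution; a minimal sketch with made-up sizes:

```python
# For stride s, padding p, kernel k, a transpose convolution yields an
# output of size (in - 1)*s - 2p + k (+ output_padding).
import torch
import torch.nn as nn

down = nn.Conv2d(1, 8, kernel_size=3, stride=2, padding=1)        # encoder step
up = nn.ConvTranspose2d(8, 1, kernel_size=3, stride=2, padding=1,
                        output_padding=1)                          # decoder step
x = torch.randn(1, 1, 64, 64)
z = down(x)
print(z.shape)         # [1, 8, 32, 32]: spatial resolution halved
print(up(z).shape)     # [1, 1, 64, 64]: resolution restored by the transpose conv
```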
Week 9
Tuesday, March 5: (Kak) [slides]
Metric Learning with Deep Neural Networks [posted: March 5, 2024]
Thursday, March 7: (Bouman) [slides]
Stochastic gradient descent; Batches and epochs; Learning rate and momentum; ADAM optimization
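The Thursday vocabulary (epochs, mini-batches, learning rate, momentum, Adam) in one minimal sketch with made-up data:

```python
# One epoch = one pass over the data; the DataLoader serves mini-batches.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

X, y = torch.randn(256, 4), torch.randn(256, 1)
loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)

model = nn.Linear(4, 1)
# Swap in torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9) to compare.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

for epoch in range(5):
    for xb, yb in loader:
        optimizer.zero_grad()
        loss = criterion(model(xb), yb)
        loss.backward()
        optimizer.step()               # Adam's per-parameter adaptive update
```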
Week 10
Spring Break
Week 11
Tuesday, March 19: (Kak) [Slides]
Generative Data Modeling with Networks Based on Adversarial Learning and Denoising Diffusion [updated: March 26, 2024]
Thursday, March 21: (Bouman) [slides]
Batch normalization; Positional encoding; NeRFs
(The lecture by Kak on Generative Modeling will be explained with the help of demos based on the code in DLStudio's AdversarialLearning and Diffusion modules that you can access by clicking here.)
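The adversarial game itself fits in a short sketch; the tiny networks below are made up and are not DLStudio's AdversarialLearning code:

```python
# The discriminator is trained to separate real from fake samples, and the
# generator is trained to fool the discriminator.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))   # noise -> sample
D = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 1))   # sample -> logit
bce = nn.BCEWithLogitsLoss()
opt_G = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_D = torch.optim.Adam(D.parameters(), lr=1e-3)

real = torch.randn(32, 2) + 3.0              # stand-in for real data
fake = G(torch.randn(32, 8))

# Discriminator step: real -> 1, fake -> 0 (fake detached so G is untouched).
d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
opt_D.zero_grad()
d_loss.backward()
opt_D.step()

# Generator step: make D label the fakes as real.
g_loss = bce(D(fake), torch.ones(32, 1))
opt_G.zero_grad()
g_loss.backward()
opt_G.step()
```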
Week 12
Tuesday, March 26: (Kak) [Slides]
Recurrent Neural Networks for Text Classification and Data Prediction [updated: April 2, 2024]
Thursday, March 28: (Bouman) [slides]
Recurrent Neural Networks, LSTM, GRU; Unsupervised Learning and Autoencoders
(The material related to text classification is based on the TEXTnet, TEXTnetOrder2, and GRUnet networks of the DLStudio that you can access by clicking here.)
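A minimal sketch of GRU-based text classification with a hypothetical vocabulary and hypothetical sizes (not the TEXTnet/GRUnet code):

```python
# Embed token ids, run the GRU over the sequence, and classify from the
# final hidden state.
import torch
import torch.nn as nn

vocab_size, embed_dim, hidden_dim, num_classes = 1000, 32, 64, 2
embed = nn.Embedding(vocab_size, embed_dim)
gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
classify = nn.Linear(hidden_dim, num_classes)

tokens = torch.randint(0, vocab_size, (4, 20))    # batch of 4 sequences, length 20
out, h_n = gru(embed(tokens))                     # h_n: final hidden state
logits = classify(h_n[-1])                        # [4, 2] class scores
print(logits.shape)
```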
Week 13
Tuesday, April 2: (Kak) [Slides]
Word Embeddings and Sequence-to-Sequence Learning
[updated: April 2, 2024]
Thursday, April 4: (Bouman) [slides]
Adversarial Learning, generators and discriminators, generative adversarial networks, and Nash equilibrium
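Word embeddings, the starting point of the Tuesday lecture, reduce to a lookup table of learned vectors; a minimal sketch with a made-up vocabulary:

```python
# Each token id indexes a learned dense vector; cosine similarity compares words.
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab = {"cat": 0, "dog": 1, "car": 2}
embed = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)

cat = embed(torch.tensor([vocab["cat"]]))   # shape [1, 8]
dog = embed(torch.tensor([vocab["dog"]]))
# Before training the similarity is arbitrary; training a sequence model
# (e.g., seq2seq) shapes these vectors so related words end up close.
print(F.cosine_similarity(cat, dog, dim=1).item())
```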
Week 14
Tuesday, April 9: (Kak) [Slides]
Transformers: Learning with Purely Attention Based Networks [updated: May 4, 2024]
Thursday, April 11: (Bouman) [slides]
GAN convergence, theory and practice; Wasserstein GANs; and Conditional GANs
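The single equation at the heart of the Tuesday lecture is scaled dot-product attention; a minimal single-head sketch with made-up sizes:

```python
# Attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V.  Real transformers run
# many heads of this in parallel.
import math
import torch
import torch.nn.functional as F

B, T, d = 2, 5, 16                      # batch, sequence length, head dimension
Q = torch.randn(B, T, d)
K = torch.randn(B, T, d)
V = torch.randn(B, T, d)

scores = Q @ K.transpose(1, 2) / math.sqrt(d)   # [B, T, T] pairwise similarities
weights = F.softmax(scores, dim=-1)             # each row sums to 1
out = weights @ V                               # [B, T, d] attention output
print(out.shape)
```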
Week 15
Tuesday, April 16: (Kak) [slides]
Transformer Based Learning for BERT and GPT Language Models [updated: October 23, 2024]
Thursday, April 18: (Bouman) Mid-Term Test 2 (exams)
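The practical difference between the two model families is the attention mask; a minimal sketch with made-up sizes:

```python
# A GPT-style decoder masks out future positions (causal mask), while a
# BERT-style encoder attends bidirectionally.
import math
import torch
import torch.nn.functional as F

T, d = 5, 16
Q, K, V = (torch.randn(T, d) for _ in range(3))
scores = Q @ K.T / math.sqrt(d)

causal = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
gpt_weights = F.softmax(scores.masked_fill(causal, float("-inf")), dim=-1)
bert_weights = F.softmax(scores, dim=-1)     # no mask: every token sees all tokens
print(gpt_weights[0])                        # row 0 attends only to position 0
```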
Week 16
Tuesday, April 23: (Kak) [Slides]
Reinforcement Learning: Incorporating Human Preferences in the Fine-Tuning of Large Language Models [updated: April 23, 2024]
Thursday, April 25: (Bouman) [slides]
Generative diffusion models, and DALL-E generative diffusion models
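The forward (noising) half of a denoising diffusion model has a closed form that fits in a few lines; a minimal sketch with a made-up schedule, not DLStudio's Diffusion code:

```python
# Forward process in closed form:
#   x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps
import torch

T = 1000
betas = torch.linspace(1e-4, 0.02, T)          # common linear noise schedule
alpha_bar = torch.cumprod(1.0 - betas, dim=0)  # cumulative signal fraction

x0 = torch.randn(1, 3, 32, 32)                 # stand-in for a clean image
t = 500
eps = torch.randn_like(x0)
x_t = alpha_bar[t].sqrt() * x0 + (1 - alpha_bar[t]).sqrt() * eps
# The network is trained to predict eps from (x_t, t); sampling runs it in reverse.
print(x_t.shape)
```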
Links to the documentation pages you will be visiting frequently:
Recommended Books:
Recommended Supplementary Course Material: