BME 646 and ECE 60146

Deep Learning


Avi Kak   and   Charles Bouman

Next Offering: Spring 2023


In order to do well in this class, you must be proficient in programming with Python. Beyond that, the only requirement for enrolling is that you be a graduate student in engineering, computer science, quantitative psychology, mathematics, or a related field.

Piazza: link

Click Here for the Homework Problems and Their Two Best Solutions for Previous Offerings of This Class

Click Here for Old Exams and Solutions

Week 1   Tuesday,  Jan 11: Course Intro (Bouman) [Slides] and Theory Lecture Syllabus (Bouman) [Slides],
  and Python OO for DL (Kak) [Slides]    [OO updated: April 19, 2022]

  Thursday, Jan 13: (Bouman) [slides] What is machine learning? Single layer neural networks
  and the MSE loss function

Week 2   Tuesday,  Jan 18: (Kak) [slides] Torchvision and Random Tensors    [updated: April 19, 2022]

  Thursday, Jan 20: (Bouman) [Slides] Gradient descent optimization; Calculation of gradient;
  matrix interpretation of gradient

  (Some of the Torchvision related material during this week will be illustrated with the functionality built into the
  RegionProposalGenerator module that you can access by clicking here.)
Week 3   Tuesday,  Jan 25:  (Kak) [Slides] Autograd for Automatic Differentiation and Auto-Construction
  of Computational Graphs    [updated: May 2, 2022]

  Thursday, Jan 27: (Bouman) [slides] Intro to ML: Tensors; GD for single layer NNs; Local and
  global minima

  (Several of the key ideas used for automatic differentiation in Autograd will be explained with the help of the
  ComputationalGraphPrimer module that you can access by clicking here.)      [updated to Version 1.0.8: February 2, 2022]
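The central idea behind Autograd, recording a computational graph as the forward pass executes and then walking that graph in reverse to accumulate gradients, can be sketched in a few lines of pure Python. This is only an illustrative toy (the `Value` class and its methods are invented for this sketch, not taken from ComputationalGraphPrimer or torch.autograd):

```python
# A tiny reverse-mode autodiff sketch: each arithmetic operation records
# its parents in the graph and a closure that propagates gradients back.

class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():          # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():          # d(ab)/da = b, d(ab)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph recorded during the forward pass,
        # then apply the chain rule from the output back to the leaves.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x, y = Value(2.0), Value(3.0)
z = x * y + x            # z = xy + x
z.backward()
print(x.grad, y.grad)    # dz/dx = y + 1 = 4.0,  dz/dy = x = 2.0
```

Note that the graph is built dynamically, as a side effect of evaluating the forward expression, which is the same define-by-run behavior you will see in PyTorch.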
Week 4   Tuesday,   Feb 1: (Kak) [slides] A First Introduction to Torch.nn for Designing Deep Networks
  and to DLStudio for Experimenting with Them    [updated: April 18, 2022]

  Thursday, Feb 3:  (Bouman) [Slides] Optimization of deep functions; GD on acyclic graphs;
  General loss functions

  (We will talk about torch.nn with the help of the DLStudio module that you can access by clicking here.)
Week 5   Tuesday,  Feb 8: (Kak) [slides] Demystifying the Convolutions in PyTorch    [updated: April 18, 2022]

  Thursday, Feb 10: (Bouman) [slides] Convolutional NNs; Adjoint gradient for CNNs
Week 6   Tuesday,  Feb 15: (Kak) [slides] Using Skip Connections to Mitigate the Problem of Vanishing Gradients,
  and Using Batch, Instance, and Layer Normalizations for Improved SGD in Deep Networks     [updated: April 18, 2022]

  Thursday, Feb 17: (Bouman) [slides] Probability and estimation; Frequentist vs Bayesian estimation;
  bias variance tradeoff

  (The material related to Kak's lecture on skip connections will be explained with the help of an in-class demo based on the
  inner class SkipConnections of the DLStudio module that you can access by clicking here.)
Week 7   Tuesday,  Feb 22: (Kak) [Slides] Object Detection and Localization with Deep Networks    [updated: April 18, 2022]

  Thursday, Feb 24: (Bouman) [Slides] Training and Generalization; Regularization and dropout methods

  (The material related to object detection and localization will be explained with the help of an in-class demo based on the inner class
  DetectAndLocalize of the DLStudio module that you can access by clicking here.)
Week 8   Tuesday,  March 1:   (Kak) [slides] Multi-Instance Object Detection -- Anchor Boxes and Region
  Proposals    [updated: September 5, 2022]

  Thursday, March 3: Mid-Term Test 1 (exams)

  (You must solve the problem of multi-instance object detection and localization when an image is allowed to contain multiple objects
  of interest. In such cases, the input/output relationship for a neural network is complicated by the presence of multiple bounding
  boxes and multiple class labels in the same image. This problem has been solved with the help of region proposals and anchor boxes.
  The goal of this lecture is to introduce you to these concepts. My code for explaining these ideas is in Version 2.0.8 of my
  RegionProposalGenerator module that you can access by clicking here.)
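Since the note above mentions how anchor boxes help with multi-instance detection, here is a minimal sketch of how a set of anchor boxes can be generated over an image grid. The function name and the default grid size, scales, and aspect ratios are illustrative choices for this sketch, not values taken from RegionProposalGenerator:

```python
# Generate candidate boxes at every grid-cell center, at several scales
# and aspect ratios; a detector then classifies and refines each one.

def generate_anchors(img_size=256, grid=8, scales=(32, 64), ratios=(0.5, 1.0, 2.0)):
    """Return (cx, cy, w, h) anchor boxes centered on each grid cell."""
    cell = img_size / grid
    anchors = []
    for i in range(grid):
        for j in range(grid):
            cx, cy = (j + 0.5) * cell, (i + 0.5) * cell
            for s in scales:
                for r in ratios:
                    # keep the area s*s while the ratio r = w/h varies
                    w, h = s * (r ** 0.5), s / (r ** 0.5)
                    anchors.append((cx, cy, w, h))
    return anchors

anchors = generate_anchors()
print(len(anchors))   # 8 x 8 grid, 2 scales, 3 ratios: 384 anchors
```

Each anchor preserves the area s&sup2; of its scale while the aspect ratio redistributes that area between width and height; the network's job is then reduced to scoring each anchor and regressing small offsets to the true box.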
Week 9   Tuesday,  March 8: (Kak) [slides] Encoder-Decoder Architectures for Semantic Segmentation of
  Images    [updated: April 18, 2022]

  Thursday, March 10:   (Bouman) [slides] Stochastic gradient descent; Batches and epochs;
  Learning rate and momentum; ADAM optimization

  (The material related to semantic segmentation is based on the mUnet network, which is my implementation of the Unet. You will find the
  code for mUnet in my DLStudio module that you can access by clicking here.)
Week 10   Spring Break
Week 11   Tuesday,  March 22: (Kak) [Slides] Generative Adversarial Networks for Data Modeling     [updated: April 18, 2022]

  Thursday, March 24: (Bouman) [slides] Vanishing gradients; Batch normalization;
   Transfer learning and data augmentation

  (The lecture by Kak on Adversarial Learning for data modeling will be explained with the help of demos based on the code in
  DLStudio's AdversarialLearning class that you can access by clicking here.)
Week 12   Tuesday,  March 29: (Kak) [Slides] Recurrent Neural Networks for Text Classification and Data Prediction
  [updated: April 18, 2022]

  Thursday, March 31: (Bouman) [slides] Recurrent Neural Networks: LSTM; GRU

  (The material related to text classification is based on the TEXTnet, TEXTnetOrder2, and GRUnet networks of the DLStudio
   that you can access by clicking here.)
Week 13   Tuesday,  April 5: (Kak) [Slides] Word Embeddings and Sequence-to-Sequence Learning     [updated: April 18, 2022]

  Thursday, April 7: (Bouman) [slides] Unsupervised Training: Autoencoders; Self-supervised and zero-shot learning
Week 14   Tuesday,  April 12: (Kak) [Slides] Transformers: Learning with Purely Attention Based Networks
  [updated: April 18, 2022]

  Thursday, April 14: (Bouman) [slides] Adversarial Learning: Nash Equilibrium; Zero-sum games;
  and GANs
Week 15   Tuesday,  April 19: (Bouman) Generative Adversarial Networks (GAN): GANs; Conditional GANs;
   Wasserstein distance

  Thursday, April 21: Mid-Term Test 2 (exams)
Week 16   Tuesday,  April 26: (Kak) [Slides] Reinforcement Learning with Discrete and Continuous State
  Spaces     [updated: April 28, 2022]

  Thursday, April 28: (Bouman) Reinforcement learning

Links to the documentation pages you will be visiting frequently:

- DLStudio    [updated: March 5, 2022]
- ComputationalGraphPrimer    [updated: February 2, 2022]
- RegionProposalGenerator    [updated: April 16, 2022]
- Master documentation page for PyTorch
- A direct link to the torch.nn module
- Torchvision Datasets
- Master documentation page for Torchvision
- A direct link to Torchvision Transforms
- Master documentation page for Torchtext
- A useful summary of many of the most basic operations on PyTorch Tensors
- The homepage for CIFAR-10 and CIFAR-100 image datasets

Recommended Books:

- The Principles of Deep Learning Theory
- Deep Learning
- Neural Networks and Deep Learning
- Data Science from Scratch
- Python Machine Learning
- Pytorch tutorials

Recommended Supplementary Course Material:

- CMU deep learning
- Stanford Class by FeiFei and Karpathy. For videos
- Udacity Deep Learning nano-degree
