BME 69500DL

(cross-listed as ECE 695)

Deep Learning


Avi Kak   and   Charles Bouman

Spring 2020


This class is substantially self-contained. All you need in order to enroll for this class is that you be a graduate student in engineering, computer science, quantitative psychology, mathematics, etc., and that you possess at least a rudimentary knowledge of programming in Python.

Online Lectures: Video Lectures

Week 1   Tuesday,  Jan 14: Course Intro: (slides)

  Thursday, Jan 16: (Bouman) (slides) Intro to ML: What is machine learning?; single layer
  neural networks; the MSE loss function

  (Link to the Piazza class web site, which can be used for asking and answering questions.)
Week 2   Tuesday,  Jan 21: (Kak) (slides) Python OO for DL

  Thursday, Jan 23: (Bouman) (slides) Intro to ML: Gradient descent optimization;
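Thursday's topic, gradient descent, can be previewed with a minimal pure-Python sketch (a toy of our own, not code from the course materials): a single-neuron linear model y = w*x + b trained with the MSE loss on a hypothetical dataset.

```python
# Minimal gradient-descent sketch for a single "neuron" y_hat = w*x + b
# trained with the MSE loss.  The data and hyperparameters are made up
# for illustration only.

# toy data generated from y = 2x + 1
data = [(x, 2.0 * x + 1.0) for x in [0.0, 1.0, 2.0, 3.0]]

w, b = 0.0, 0.0          # initial parameters
lr = 0.05                # learning rate

for step in range(2000):
    # accumulate the gradient of the MSE loss over the whole dataset
    dw = db = 0.0
    for x, y in data:
        err = (w * x + b) - y            # prediction error
        dw += 2.0 * err * x / len(data)  # d(MSE)/dw
        db += 2.0 * err / len(data)      # d(MSE)/db
    w -= lr * dw                         # gradient-descent update
    b -= lr * db

print(round(w, 2), round(b, 2))          # converges toward w=2, b=1
```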
Week 3   Tuesday,  Jan 28: (Kak) (slides) Image and text datasets for DL research, Torchvision, Torchtext

  Thursday, Jan 30: (Bouman) (slides) Intro to ML: Tensors; GD for single layer NNs

  (Some of the Torchvision related material during this week will be illustrated with the functionality built into the
  RegionProposalGenerator module that you can access by clicking here.)
Week 4   Tuesday,  Feb 4:  (Kak) (slides) Autograd for automatic differentiation and computational graphs

  Thursday, Feb 6:  (Bouman) (slides) Intro to ML: Optimization of deep functions; GD on acyclic
  graph structures; general loss functions

  (Several of the key ideas used for automatic differentiation in Autograd will be explained with the help of the
  ComputationalGraphPrimer module that you can access by clicking here.)
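The core idea behind Autograd, reverse-mode automatic differentiation over a recorded computational graph, can be sketched in a few dozen lines of pure Python. This is an illustrative toy in the spirit of what torch.autograd does under the hood (the `Value` class and its methods are our own invention, not PyTorch's or ComputationalGraphPrimer's API):

```python
# A bare-bones reverse-mode automatic differentiation sketch.

class Value:
    """A scalar node in a computational graph that records its parents
    and a closure for propagating gradients backward."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward():
            self.grad += out.grad                 # d(a+b)/da = 1
            other.grad += out.grad                # d(a+b)/db = 1
        out._backward = backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward():
            self.grad += other.data * out.grad    # d(a*b)/da = b
            other.grad += self.data * out.grad    # d(a*b)/db = a
        out._backward = backward
        return out

    def backward(self):
        # topologically order the graph, then apply the chain rule
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

# f(x, y) = x*y + x  =>  df/dx = y + 1,  df/dy = x
x, y = Value(3.0), Value(4.0)
f = x * y + x
f.backward()
print(x.grad, y.grad)   # 5.0 3.0
```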
Week 5   Tuesday,   Feb 11: (Kak) (slides) A first introduction to torch.nn for designing CNNs

  Thursday, Feb 13: (Bouman) (slides) Intro to NNs: Convolutional NNs; adjoint gradient for CNNs

  (We will talk about torch.nn with the help of the new DLStudio module that you can access by clicking here.)
Week 6   Tuesday,  Feb 18: (Kak) (slides) Demystifying the Convolutions in PyTorch

  Thursday, Feb 20: (Bouman) (slides) Intro to ML: Probability and estimation; Frequentist
  and Bayesian estimation; the bias variance tradeoff
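Tuesday's lecture on demystifying convolutions can be previewed with a naive pure-Python version of the sliding-window operation. Note that what nn.Conv2d computes is technically a cross-correlation (no kernel flip); the function below is our own toy (single channel, stride 1, no padding), not PyTorch code:

```python
# Naive 2-D "convolution" (cross-correlation, as in nn.Conv2d) for a
# single channel, stride 1, no padding.

def conv2d(image, kernel):
    H, W = len(image), len(image[0])
    kH, kW = len(kernel), len(kernel[0])
    # output size: (H - kH + 1) x (W - kW + 1)
    out = []
    for i in range(H - kH + 1):
        row = []
        for j in range(W - kW + 1):
            s = 0.0
            for u in range(kH):
                for v in range(kW):
                    s += image[i + u][j + v] * kernel[u][v]
            row.append(s)
        out.append(row)
    return out

# 4x4 image, 3x3 kernel -> 2x2 output
img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
ker = [[0, 0, 0],
       [0, 1, 0],
       [0, 0, 0]]          # identity kernel: picks the center pixel
print(conv2d(img, ker))    # [[6.0, 7.0], [10.0, 11.0]]
```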
Week 7   Tuesday,  Feb 25: (Kak) (slides) Using Skip Connections to Mitigate the Problem of Vanishing
  Gradients in Deep Networks.

  Thursday, Feb 27: Mid-Term Test 1 ( exam , exam solution )

  (The material related to skip connections will be explained with the help of an in-class demo based on the inner class SkipConnections
  of version 1.0.6 of the DLStudio module that you can access by clicking here.)
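A back-of-the-envelope calculation shows why skip connections mitigate vanishing gradients (the numbers below are a hypothetical toy of our own, not code from DLStudio). If each layer's local derivative is a small constant a, backpropagating through L plain layers multiplies the gradient by a^L, whereas an identity shortcut around each layer makes the per-layer factor 1 + a, which cannot shrink toward zero:

```python
# Plain chain of L layers, each with local derivative a = -0.05,
# versus the same chain with an identity skip around each layer.
a, L = -0.05, 20

plain = abs(a) ** L              # |a|^L: the gradient has vanished
with_skip = abs(1 + a) ** L      # |1+a|^L: stays O(1)

print(plain, with_skip)          # ~1e-26 versus ~0.36
```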
Week 8   Tuesday,  March 3: (Kak) (slides) Object Detection and Localization with Deep Networks

  Thursday, March 5: (Bouman) (slides) Intro to Optimization: stochastic gradient descent;
  momentum; Adam optimization

  (The material related to object detection and localization will be explained with the help of an in-class demo based on the inner class
  DetectAndLocalize of version 1.0.7 of the DLStudio module that you can access by clicking here.)
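Thursday's three update rules (plain SGD, SGD with momentum, Adam) can be sketched on a 1-D quadratic f(w) = w^2, whose gradient is 2w. This is our own toy setup with the usual default hyperparameters, not course code:

```python
# Plain SGD, SGD with momentum, and Adam minimizing f(w) = w^2.
import math

def grad(w):
    return 2.0 * w

def sgd(w, lr=0.1, steps=200):
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def sgd_momentum(w, lr=0.1, beta=0.9, steps=200):
    v = 0.0
    for _ in range(steps):
        v = beta * v + grad(w)             # exponentially weighted velocity
        w -= lr * v
    return w

def adam(w, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=200):
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g          # first-moment estimate
        v = b2 * v + (1 - b2) * g * g      # second-moment estimate
        m_hat = m / (1 - b1 ** t)          # bias correction
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

for opt in (sgd, sgd_momentum, adam):
    print(opt.__name__, round(opt(5.0), 4))   # all approach the minimum at 0
```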
Week 9   Tuesday,  March 10:   (Kak) (slides) Graph Based Algorithms for Generating Region Proposals for
  Object Detection

  Thursday, March 12:   (Bouman) (slides) Training Techniques: vanishing gradient; ResNet; U-Net;
  Transfer Learning; Data Augmentation;

  (Forming region proposals is critical to object detection. While the more recent frameworks use CNNs for generating region proposals,
   using the older approach based on the more traditional algorithms is still important for many problem domains. The traditional approach
   will be illustrated with the RegionProposalGenerator module that you can access by clicking here.)
Week 10               Spring Break
Week 11   Tuesday,  March 24: (Kak) (slides) Semantic Segmentation of Images with Fully Convolutional Networks

  Thursday, March 26: (Bouman) Intro to Optimization: non-linear back propagation and the chain rule;
  forward/backward propagation; automatic differentiation.

  (The material related to semantic segmentation is based on the mUnet network, which is my implementation of the Unet. You will find the
  code for mUnet in version 1.1.1 of the DLStudio module that you can access by clicking here.)
Week 12   Tuesday,  March 31: (Kak) Further Discussion of Semantic Segmentation (see previous lecture slides)

  Thursday, April 2: (Bouman) Advanced Optimization: convex versus non-convex optimization;
   momentum and Adam optimizer
Week 13   Tuesday,  April 7: (Kak) (slides) Recurrent Neural Networks for Text Classification

  Thursday, April 9: (Bouman) Advanced Optimization: batch normalization; regularization methods;
  transfer learning.

  (The material related to text classification is based on the TEXTnet, TEXTnetOrder2, and GRUnet networks in Version 1.1.4 of the DLStudio
   that you can access by clicking here.)
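The recurrence at the heart of Tuesday's lecture can be shown with a scalar vanilla RNN cell in pure Python (our own sketch, loosely in the spirit of the TEXTnet-style networks mentioned above, but not their actual code): the hidden state is updated from the previous hidden state and the current input at each time step.

```python
# A minimal scalar "RNN cell": h_new = tanh(w_xh*x + w_hh*h + b).
import math

def rnn_step(x, h, w_xh, w_hh, b):
    # combine the current input with the carried-over hidden state
    return math.tanh(w_xh * x + w_hh * h + b)

# run a short "sequence" through the cell, reusing the same weights
# at every time step (made-up weight values, for illustration only)
h = 0.0
for x in [1.0, -0.5, 0.25]:
    h = rnn_step(x, h, w_xh=0.8, w_hh=0.5, b=0.0)

print(h)   # the final hidden state summarizes the whole sequence
```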
Week 14   Tuesday,  April 14:     (Kak) (slides) Using Word Embeddings for Text Search and Retrieval

  Thursday, April 16:     (Bouman) Advanced Optimization: minimax optimization; saddle points and local
  minima; hyperparameter optimization.
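The retrieval idea from Tuesday's word-embedding lecture can be illustrated with a toy of our own (hypothetical 3-d embedding vectors, not course code): words become dense vectors, and search ranks candidates by cosine similarity in that vector space.

```python
# Cosine similarity between made-up low-dimensional word embeddings.
import math

emb = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.85, 0.75, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return dot / (nu * nv)

# semantically similar words end up with a higher cosine similarity
print(cosine(emb["king"], emb["queen"]) > cosine(emb["king"], emb["apple"]))  # True
```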
Week 15   Tuesday,  April 21: (Bouman) The theory underlying reinforcement learning

  Thursday, April 23: Mid-Term Test 2 ( exam solution )
Week 16   Tuesday,  April 28: (Kak) (slides) Reinforcement Learning with Discrete and Continuous State Spaces
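The tabular Q-learning update at the core of discrete-state reinforcement learning, Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)), can be run on a tiny two-state chain. The environment and all constants below are a hypothetical toy of our own, not the course's RL code:

```python
# Tabular Q-learning on a deterministic 2-state chain: in state 0,
# action 1 moves to state 1 (reward 0); in state 1, action 1 ends the
# episode with reward 1; action 0 stays in the current state.
import random

alpha, gamma = 0.5, 0.9
Q = {(s, a): 0.0 for s in (0, 1) for a in (0, 1)}

def step(s, a):
    if s == 0 and a == 1:
        return 1, 0.0, False       # next state, reward, done
    if s == 1 and a == 1:
        return None, 1.0, True
    return s, 0.0, False

random.seed(0)
for _ in range(500):
    s = 0
    for _ in range(20):                      # cap episode length
        a = random.choice((0, 1))            # pure exploration
        s2, r, done = step(s, a)
        target = r if done else r + gamma * max(Q[(s2, 0)], Q[(s2, 1)])
        Q[(s, a)] += alpha * (target - Q[(s, a)])
        if done:
            break
        s = s2

print(round(Q[(1, 1)], 2), round(Q[(0, 1)], 2))   # ~1.0 and ~0.9
```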

  Thursday, April 30: (Bouman) Variational autoencoders; variational and conditional GANs.

Recommended Books:

- Deep Learning
- Neural Networks and Deep Learning
- Data Science from Scratch
- Python Machine Learning
- PyTorch tutorials

Links to documentation pages you will frequently be visiting:

- Master documentation page for PyTorch
- A direct link to the torch.nn module
- Master documentation page for Torchvision
- A direct link to Torchvision Transforms
- Master documentation page for Torchtext
- A useful summary of many of the most basic operations on PyTorch Tensors
- The homepage for CIFAR-10 and CIFAR-100 image datasets

Recommended Supplementary Course Material:

- CMU deep learning
- Stanford class by Fei-Fei and Karpathy (with videos)
- Udacity Deep Learning nano-degree
