BME 69500DL

(cross-listed as ECE 695)

Deep Learning

by

Avi Kak   and   Charles Bouman


Spring 2020

PREREQUISITES FOR THIS CLASS

This class is substantially self-contained. All you need in order to enroll in this class is to be a graduate student in engineering, computer science, quantitative psychology, mathematics, etc., and to possess at least a rudimentary knowledge of programming in Python.



Week 1   Tuesday,  Jan 14: Course Intro: (slides)

  Thursday, Jan 16: (Bouman) (slides) Intro to ML: What is machine learning?; single layer
  neural networks; the MSE loss function
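
  (A minimal NumPy sketch of a single-layer network and its MSE loss; the data and dimensions below are
   illustrative assumptions, not taken from the lecture slides.)

      import numpy as np

      # Single-layer network: y_hat = X @ W + b
      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 5))        # 100 samples, 5 features
      y = rng.normal(size=(100, 1))        # targets
      W = rng.normal(size=(5, 1))
      b = np.zeros((1,))

      y_hat = X @ W + b                    # network predictions
      mse = np.mean((y_hat - y) ** 2)      # MSE loss: mean of the squared errors
      print(mse)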
Week 2   Tuesday,  Jan 21: (Kak) Python OO for DL

  Thursday, Jan 23: (Bouman) (slides) Intro to ML: Gradient descent optimization


  (Python OO will be taught with the help of the slides that you can download by clicking here.)
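
  (A minimal sketch of gradient descent on the MSE loss of a single-layer model, in plain Python/NumPy; the
   data, learning rate, and step count are illustrative assumptions.)

      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 5))
      w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
      y = X @ w_true + 0.1 * rng.normal(size=100)

      w = np.zeros(5)
      lr = 0.1
      for step in range(200):
          y_hat = X @ w
          grad = 2.0 / len(y) * X.T @ (y_hat - y)   # gradient of the MSE loss
          w -= lr * grad                            # step along the negative gradient
      print(w)                                      # should approach w_true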
Week 3   Tuesday,  Jan 28: (Kak) Image and text datasets for DL research, Torchvision, Torchtext

  Thursday, Jan 30: (Bouman) (slides) Intro to ML: Local and global minima; training
  and generalization

  (Some of the Torchvision related material during this week will be illustrated with the functionality built into the
  RegionProposalGenerator module that you can access by clicking here.)
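
  (As a quick taste of the Torchvision dataset interface, here is a minimal sketch that loads CIFAR-10; the
   normalization constants and batch size are illustrative assumptions.)

      import torch
      import torchvision
      import torchvision.transforms as tvt

      transform = tvt.Compose([
          tvt.ToTensor(),
          tvt.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
      ])
      train_set = torchvision.datasets.CIFAR10(root="./data", train=True,
                                               download=True, transform=transform)
      train_loader = torch.utils.data.DataLoader(train_set, batch_size=4, shuffle=True)

      images, labels = next(iter(train_loader))
      print(images.shape)                  # torch.Size([4, 3, 32, 32])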
Week 4   Tuesday,  Feb 4:  (Kak) Autograd for automatic differentiation and computational graphs

  Thursday, Feb 6:  (Bouman) (slides) Intro to ML: Optimization of deep functions;
  alternative loss functions

  (Several of the key ideas used for automatic differentiation in Autograd will be explained with the help of the
  ComputationalGraphPrimer module that you can access by clicking here.)
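
  (A minimal sketch of Autograd in action; the function and values below are illustrative assumptions, not
   taken from the ComputationalGraphPrimer module.)

      import torch

      # Autograd records a computational graph as operations execute and
      # back-propagates through it when .backward() is called.
      x = torch.tensor(2.0, requires_grad=True)
      w = torch.tensor(3.0, requires_grad=True)
      y = w * x ** 2 + x            # graph: y = w*x^2 + x

      y.backward()                  # reverse-mode automatic differentiation
      print(x.grad)                 # dy/dx = 2*w*x + 1 = 13.0
      print(w.grad)                 # dy/dw = x^2 = 4.0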
Week 5   Tuesday,   Feb 11: (Kak) torch.nn for designing neural and convolutional networks

  Thursday, Feb 13: (Bouman) (slides) Intro to ML: Probability and estimation; Frequentist
  and Bayesian estimation; the bias-variance tradeoff
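
  (Apropos of the Tuesday lecture: a minimal sketch of a convolutional network in the torch.nn style; the layer
   sizes assume 3x32x32 inputs, as in CIFAR-10, and are otherwise illustrative assumptions.)

      import torch
      import torch.nn as nn
      import torch.nn.functional as F

      class SmallCNN(nn.Module):
          def __init__(self):
              super().__init__()
              self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)
              self.pool = nn.MaxPool2d(2, 2)
              self.fc = nn.Linear(16 * 16 * 16, 10)

          def forward(self, x):
              x = self.pool(F.relu(self.conv1(x)))   # -> (N, 16, 16, 16)
              x = x.view(x.size(0), -1)              # flatten for the linear layer
              return self.fc(x)

      net = SmallCNN()
      print(net(torch.randn(4, 3, 32, 32)).shape)    # torch.Size([4, 10])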
Week 6   Tuesday,  Feb 18: (Kak) Residual learning with skip connections; the ResNet architecture

  Thursday, Feb 20: (Bouman) Intro to NNs: Why do NNs work? Deep versus shallow;
                                NNs versus tree classifiers; space invariance; fast nearest neighbor
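
  (Relating to the Tuesday lecture: a minimal sketch of a ResNet-style residual block; the channel count is an
   illustrative assumption.)

      import torch
      import torch.nn as nn
      import torch.nn.functional as F

      # The skip connection adds the input back to the convolutional path,
      # giving gradients a shortcut around the convolutions.
      class ResidualBlock(nn.Module):
          def __init__(self, channels):
              super().__init__()
              self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
              self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)

          def forward(self, x):
              out = F.relu(self.conv1(x))
              out = self.conv2(out)
              return F.relu(out + x)      # skip connection: identity + residual

      block = ResidualBlock(16)
      print(block(torch.randn(1, 16, 32, 32)).shape)   # shape is preserved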
Week 7   Tuesday,  Feb 25: (Kak) Using dropout and data normalization to improve generalization

  Thursday, Feb 27: Mid-Term Test 1
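
  (Relating to the Tuesday lecture: a minimal sketch of dropout and normalization layers in torch.nn; the
   layer sizes and dropout probability are illustrative assumptions.)

      import torch
      import torch.nn as nn

      model = nn.Sequential(
          nn.Linear(128, 64),
          nn.BatchNorm1d(64),     # normalizes activations across the batch
          nn.ReLU(),
          nn.Dropout(p=0.5),      # randomly zeroes activations during training
          nn.Linear(64, 10),
      )

      model.train()               # dropout active, BatchNorm uses batch statistics
      y_train = model(torch.randn(32, 128))
      model.eval()                # dropout disabled, BatchNorm uses running statistics
      y_eval = model(torch.randn(32, 128))
      print(y_train.shape, y_eval.shape)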
Week 8   Tuesday,  March 3: (Kak) Generating region proposals for object detection

  Thursday, March 5: (Bouman) Intro to Optimization: gradient descent; preconditioning;
                                stochastic gradient descent.


  (Forming region proposals is critical to object detection. While the more recent frameworks use CNNs to generate
   region proposals, the older approach based on more traditional algorithms is still important in many
   problem domains. The traditional approach will be illustrated with the RegionProposalGenerator module that you can
   access by clicking here.)
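
  (To fix ideas, here is a toy sliding-window proposal generator in plain Python: traditional, non-CNN
   approaches enumerate and score candidate boxes rather than regress them. This is purely illustrative and is
   not how the RegionProposalGenerator module works.)

      # Enumerate square candidate boxes at several scales over the image grid.
      def sliding_window_proposals(img_w, img_h, scales=(32, 64, 128), stride=16):
          boxes = []
          for s in scales:
              for x in range(0, img_w - s + 1, stride):
                  for y in range(0, img_h - s + 1, stride):
                      boxes.append((x, y, x + s, y + s))   # (x1, y1, x2, y2)
          return boxes

      print(len(sliding_window_proposals(256, 256)))       # number of candidate boxes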
Week 9   Tuesday,  March 10:   (Kak) Convolutional networks for object detection

  Thursday, March 12:   (Bouman) Theoretical characterizations of detectors with ROC
                                    curves, etc.; Intro to Optimization: matrix representation of DAGs; linear
                                    back propagation and the chain rule; computational analysis
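
  (For the ROC characterization in the Thursday lecture, here is a minimal NumPy sketch that sweeps the
   detector's decision threshold to trace out the ROC curve; the scores below are synthetic.)

      import numpy as np

      rng = np.random.default_rng(0)
      scores = np.concatenate([rng.normal(1.0, 1.0, 500),    # detector scores, positives
                               rng.normal(0.0, 1.0, 500)])   # detector scores, negatives
      labels = np.concatenate([np.ones(500), np.zeros(500)])

      order = np.argsort(-scores)                 # sweep the threshold from high to low
      tpr = np.cumsum(labels[order] == 1) / 500   # true-positive rate at each threshold
      fpr = np.cumsum(labels[order] == 0) / 500   # false-positive rate at each threshold
      auc = np.trapz(tpr, fpr)                    # area under the ROC curve
      print(f"AUC = {auc:.3f}")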
Week 10   Tuesday,  March 24: (Kak) Recurrent Networks for language modeling and seq2seq learning

  Thursday, March 26: (Bouman) Intro to Optimization: non-linear back propagation and the
                                    chain rule; forward/backward propagation; automatic differentiation.
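
  (Relating to the Tuesday lecture: a minimal sketch of next-token prediction with torch.nn.RNN; the vocabulary
   size, dimensions, and one-hot encoding are illustrative assumptions.)

      import torch
      import torch.nn as nn

      vocab, hidden = 100, 32
      rnn = nn.RNN(input_size=vocab, hidden_size=hidden, batch_first=True)
      to_logits = nn.Linear(hidden, vocab)

      one_hot = torch.eye(vocab)[torch.randint(vocab, (4, 10))]  # (batch, seq, vocab)
      out, h_n = rnn(one_hot)        # out holds the hidden state at every time step
      logits = to_logits(out)        # next-token scores at every position
      print(logits.shape)            # torch.Size([4, 10, 100])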
Week 11   Tuesday,  March 31: (Kak) seq2seq learning: GRU and LSTM for adaptation and for dealing with
                                    the vanishing-gradient problem

  Thursday, April 2: (Bouman) Advanced Optimization: convex versus non-convex
                                    optimization; momentum and Adam optimizer
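
  (A minimal sketch showing that nn.GRU and nn.LSTM are near drop-in replacements for nn.RNN; the gating in
   GRU/LSTM gives gradients an additive path through time, which mitigates the vanishing-gradient problem of
   plain RNNs. The dimensions are illustrative assumptions.)

      import torch
      import torch.nn as nn

      x = torch.randn(4, 50, 16)                 # (batch, seq_len, features)

      gru = nn.GRU(16, 32, batch_first=True)
      out_g, h_g = gru(x)                        # h_g: final hidden state

      lstm = nn.LSTM(16, 32, batch_first=True)
      out_l, (h_l, c_l) = lstm(x)                # the LSTM also carries a cell state
      print(out_g.shape, out_l.shape)            # both torch.Size([4, 50, 32])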
Week 12   Tuesday,  April 7: (Kak) Word embeddings for text classification, torch.nn.Embedding

  Thursday, April 9:     (Bouman) Advanced Optimization: batch normalization; regularization
                                    methods; transfer learning.
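
  (Relating to the Tuesday lecture: a minimal sketch of torch.nn.Embedding, which maps integer token indices to
   dense learned vectors; the vocabulary size and embedding dimension are illustrative assumptions.)

      import torch
      import torch.nn as nn

      embed = nn.Embedding(num_embeddings=5000, embedding_dim=100)

      token_ids = torch.tensor([[12, 7, 431], [3, 3, 999]])   # (batch=2, seq=3)
      vectors = embed(token_ids)                              # differentiable lookup
      print(vectors.shape)                                    # torch.Size([2, 3, 100])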
Week 13   Tuesday,  April 14:     (Kak) GANs for domain adaptation and domain repair

  Thursday, April 16:     (Bouman) Advanced Optimization: minimax optimization; saddle points
                                    and local minima; hyperparameter optimization.
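
  (Tying the two lectures together: a minimal sketch of one step of the GAN minimax game on toy 1-D data, where
   the discriminator ascends and the generator descends the same binary cross-entropy objective. Network sizes
   and learning rates are illustrative assumptions.)

      import torch
      import torch.nn as nn

      G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
      D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
      opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
      opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
      bce = nn.BCELoss()

      real = torch.randn(64, 1) * 2 + 3       # "real" samples from a shifted Gaussian
      fake = G(torch.randn(64, 8))            # generated samples

      # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
      d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
      opt_d.zero_grad(); d_loss.backward(); opt_d.step()

      # Generator step: push D(fake) toward 1.
      g_loss = bce(D(fake), torch.ones(64, 1))
      opt_g.zero_grad(); g_loss.backward(); opt_g.step()
      print(d_loss.item(), g_loss.item())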
Week 14   Tuesday,  April 21: (Bouman) The theory underlying reinforcement learning

  Thursday, April 23: (Kak) Reinforcement learning with the Double-Q algorithm and
                                    Deep Q-Networks (DQN)
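
  (Relating to the Thursday lecture: a minimal sketch of the tabular double-Q update, in which one Q table
   selects the greedy action and the other evaluates it, reducing overestimation bias; all quantities below are
   illustrative assumptions.)

      import numpy as np

      n_states, n_actions = 5, 2
      Q1 = np.zeros((n_states, n_actions))
      Q2 = np.zeros((n_states, n_actions))
      alpha, gamma = 0.1, 0.99

      def double_q_update(s, a, r, s_next):
          if np.random.rand() < 0.5:
              a_star = np.argmax(Q1[s_next])   # Q1 selects the action...
              Q1[s, a] += alpha * (r + gamma * Q2[s_next, a_star] - Q1[s, a])  # ...Q2 evaluates it
          else:
              a_star = np.argmax(Q2[s_next])   # and symmetrically the other way
              Q2[s, a] += alpha * (r + gamma * Q1[s_next, a_star] - Q2[s, a])

      double_q_update(s=0, a=1, r=1.0, s_next=2)
      print(Q1[0], Q2[0])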
Week 15   Tuesday,  April 28: Mid-Term Test 2

  Thursday, April 30: (Bouman) Variational autoencoders; variational and conditional GANs.
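
  (A minimal sketch of the VAE reparameterization trick: sampling z = mu + sigma * eps with eps ~ N(0, I) keeps
   the sample differentiable with respect to the encoder outputs. The tensors below stand in for encoder outputs
   and are illustrative assumptions.)

      import torch

      mu = torch.randn(4, 20, requires_grad=True)       # stand-in for the encoder mean
      logvar = torch.randn(4, 20, requires_grad=True)   # stand-in for the encoder log-variance

      std = torch.exp(0.5 * logvar)
      eps = torch.randn_like(std)
      z = mu + eps * std                                # differentiable sample

      # Closed-form KL divergence of a diagonal Gaussian from N(0, I).
      kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
      kl.backward()
      print(z.shape, mu.grad is not None)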
Week 16   Tuesday,  May 5: (Kak) Is deep learning really DEEP learning?

  Thursday, May 7: (Bouman) Limitations of DL


Recommended Books:

- Grokking Deep Learning by Andrew Trask (https://www.manning.com/books/grokking-deep-learning)
- Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
- Neural Networks and Deep Learning by Michael Nielsen
- Data Science from Scratch by Joel Grus
- Python Machine Learning by Sebastian Raschka
- Deep learning courses at machinelearningmastery.com (http://machinelearningmastery.com/deep-learning-courses/)
- PyTorch tutorials

Links to documentation pages you will frequently be visiting:

- Master documentation page for PyTorch
- A direct link to the torch.nn module
- Master documentation page for Torchvision
- A direct link to Torchvision Transforms
- Master documentation page for Torchtext
- A useful summary of many of the most basic operations on PyTorch Tensors
- The homepage for CIFAR-10 and CIFAR-100 image datasets

Recommended Supplementary Course Material:

- CMU deep learning
- Stanford class (CS231n) by Fei-Fei Li and Andrej Karpathy, with lecture videos
- Udacity Deep Learning Nanodegree
