Avi Kak and Charles Bouman
This class is substantially self-contained. To enroll, you need only be a graduate student in engineering, computer science, quantitative psychology, mathematics, etc., and possess at least a rudimentary knowledge of programming in Python.
Tuesday, Jan 14:
Course Intro: (slides)
Thursday, Jan 16: (Bouman) (slides) Intro to ML: What is machine learning?; single layer
neural networks; the MSE loss function
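(The single-layer network and MSE loss introduced in this lecture can be previewed with the short pure-Python sketch below. The data and parameter values are invented for illustration; in the course itself you would use PyTorch tensors.)

```python
# Minimal sketch of a single-layer "neural network": y_hat = w*x + b,
# scored with the mean-squared-error (MSE) loss. Pure Python; the
# data and parameter values are made up for illustration.

def predict(w, b, xs):
    """Apply a single linear unit to each input."""
    return [w * x + b for x in xs]

def mse_loss(y_hat, ys):
    """Mean squared error between predictions and targets."""
    return sum((p - t) ** 2 for p, t in zip(y_hat, ys)) / len(ys)

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]        # generated by y = 2x + 1

# A perfect parameter setting gives zero loss ...
print(mse_loss(predict(2.0, 1.0, xs), ys))   # -> 0.0
# ... and any other setting gives a positive loss.
print(mse_loss(predict(1.0, 0.0, xs), ys))   # -> 7.5
```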
Tuesday, Jan 21: (Kak) Python OO for DL
Thursday, Jan 23: (Bouman) (slides) Intro to ML: Gradient descent optimization
(Python OO will be taught with the help of the slides that you can download by clicking here.)
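(Gradient descent itself takes only a few lines. The sketch below fits a one-parameter model y = w*x to toy data by repeatedly stepping against the gradient of the MSE loss; the data, learning rate, and iteration count are invented for illustration.)

```python
# Sketch of gradient descent on the MSE loss for a one-parameter
# model y_hat = w * x. Data and hyperparameters are invented.

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]             # perfectly fit by w = 2

def loss(w):
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def grad(w):
    # dL/dw = (2/n) * sum_i x_i * (w * x_i - y_i)
    return (2.0 / len(xs)) * sum(x * (w * x - y) for x, y in zip(xs, ys))

w = 0.0                          # initial guess
lr = 0.05                        # learning rate (step size)
for _ in range(200):
    w -= lr * grad(w)            # step against the gradient

print(round(w, 4))               # -> 2.0
```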
Tuesday, Jan 28: (Kak) Image and text datasets for DL research, Torchvision, Torchtext
Thursday, Jan 30: (Bouman) (slides) Intro to ML: Local and global minima; training
(Some of the Torchvision related material during this week will be illustrated with the functionality built into the
RegionProposalGenerator module that you can access by clicking here.)
Tuesday, Feb 4: (Kak) Autograd for automatic differentiation and computational graphs
Thursday, Feb 6: (Bouman) (slides) Intro to ML: Optimization of deep functions;
alternative loss functions
(Several of the key ideas used for automatic differentiation in Autograd will be explained with the help of the
ComputationalGraphPrimer module that you can access by clicking here.)
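(To convey the idea behind reverse-mode automatic differentiation, here is a from-scratch toy sketch of a computational graph with a backward pass. It is not PyTorch's actual implementation; among other simplifications, it does not order the nodes topologically before propagating. It only shows how local derivatives stored at each node get chained together.)

```python
# Toy reverse-mode automatic differentiation. Each operation records
# its parent nodes and the local derivative with respect to each
# parent; backward() chains these local derivatives together.

class Node:
    def __init__(self, value, parents=(), grad_fns=()):
        self.value = value          # forward value
        self.parents = parents      # upstream nodes in the graph
        self.grad_fns = grad_fns    # local derivative w.r.t. each parent
        self.grad = 0.0

    def __mul__(self, other):
        return Node(self.value * other.value,
                    (self, other),
                    (lambda g, o=other: g * o.value,
                     lambda g, s=self: g * s.value))

    def __add__(self, other):
        return Node(self.value + other.value,
                    (self, other),
                    (lambda g: g, lambda g: g))

    def backward(self):
        # Seed the output gradient and propagate backwards.
        # (A real implementation would visit nodes in reverse
        # topological order; this naive traversal suffices here.)
        self.grad = 1.0
        stack = [self]
        while stack:
            node = stack.pop()
            for parent, fn in zip(node.parents, node.grad_fns):
                parent.grad += fn(node.grad)
                stack.append(parent)

x = Node(3.0)
y = Node(4.0)
z = x * y + x          # z = x*y + x, so dz/dx = y + 1, dz/dy = x
z.backward()
print(x.grad, y.grad)  # -> 5.0 3.0
```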
Tuesday, Feb 11: (Kak) torch.nn for designing neural and convolutional networks
Thursday, Feb 13: (Bouman) (slides) Intro to ML: Probability and estimation; Frequentist
and Bayesian estimation; the bias-variance tradeoff
Tuesday, Feb 18: (Kak) Residual Learning with skipped connections; ResNet architecture
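(The core of residual learning can be stated in one line: a block outputs x + F(x) rather than F(x), so the layers only need to learn a correction to the identity. The toy sketch below, with an invented stand-in for F, illustrates the skip connection.)

```python
# Sketch of the residual (skip-connection) idea: a block outputs
# x + F(x) instead of F(x). Toy numbers, pure Python; in a real
# ResNet, F is a stack of convolutional layers.

def F(x):
    # Invented stand-in for the block's learned transformation.
    return [0.1 * v for v in x]

def residual_block(x):
    fx = F(x)
    return [a + b for a, b in zip(x, fx)]   # skip connection: x + F(x)

out = residual_block([1.0, 2.0, 3.0])
print([round(v, 2) for v in out])           # -> [1.1, 2.2, 3.3]
```

Note that if F outputs all zeros, the block reduces to the identity map, which is one intuition for why very deep stacks of such blocks remain trainable.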
Thursday, Feb 20: (Bouman) Intro to NNs: Why do NNs work? Deep versus shallow;
NNs versus tree classifiers; space invariance; fast nearest neighbor
Tuesday, Feb 25: (Kak) Using dropout and data normalization to improve generalization
Thursday, Feb 27: Mid-Term Test 1
Tuesday, March 3: (Kak) Generating region proposals for object detection
Thursday, March 5: (Bouman) Intro to Optimization: gradient descent; preconditioning;
stochastic gradient descent.
(Forming region proposals is critical to object detection. While the more recent frameworks use CNNs for generating
region proposals, using the older approach based on the more traditional algorithms is still important for many
problem domains. The traditional approach will be illustrated with the RegionProposalGenerator module that you can
access by clicking here.)
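(As a minimal illustration of the idea, the sketch below generates proposals by sliding a fixed-size window over an image grid. Real traditional algorithms, such as selective search, are far more sophisticated; the image and window sizes here are invented.)

```python
# Toy region-proposal generation by sliding a fixed-size window over
# an image grid -- the simplest possible traditional (non-CNN) scheme.

def sliding_window_proposals(img_w, img_h, win, stride):
    """Return (x, y, w, h) candidate boxes covering the image."""
    boxes = []
    for y in range(0, img_h - win + 1, stride):
        for x in range(0, img_w - win + 1, stride):
            boxes.append((x, y, win, win))
    return boxes

boxes = sliding_window_proposals(8, 8, 4, 2)
print(len(boxes))        # 3 positions per axis -> 9 boxes
print(boxes[0], boxes[-1])
```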
Tuesday, March 10: (Kak) Convolutional networks for object detection
Thursday, March 12: (Bouman) Theoretical characterizations of detectors with ROC,
etc.; Intro to Optimization: matrix representation of DAGs; linear
back propagation and the chain rule; computational analysis
Tuesday, March 24: (Kak) Recurrent Networks for language modeling and seq2seq learning
Thursday, March 26: (Bouman) Intro to Optimization: non-linear back propagation and the
chain rule; forward/backward propagation; automatic differentiation.
Tuesday, March 31: (Kak) seq2seq learning: GRU and LSTM for adaptation and dealing with
vanishing gradients
Thursday, April 2: (Bouman) Advanced Optimization: convex versus non-convex
optimization; momentum and Adam optimizer
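(The momentum and Adam updates can be compared on a toy one-dimensional quadratic loss. The sketch below uses the common Adam defaults for the decay constants; the loss, learning rates, and iteration counts are invented for illustration.)

```python
# Momentum and Adam updates on the toy convex loss L(w) = (w - 3)^2.
import math

def grad(w):
    return 2.0 * (w - 3.0)

# --- gradient descent with momentum ---
w, v, lr, beta = 0.0, 0.0, 0.1, 0.9
for _ in range(300):
    v = beta * v + grad(w)        # velocity accumulates past gradients
    w -= lr * v
momentum_w = w

# --- Adam ---
w, m, s = 0.0, 0.0, 0.0
lr, b1, b2, eps = 0.1, 0.9, 0.999, 1e-8   # common Adam defaults
for t in range(1, 501):
    g = grad(w)
    m = b1 * m + (1 - b1) * g             # first-moment estimate
    s = b2 * s + (1 - b2) * g * g         # second-moment estimate
    m_hat = m / (1 - b1 ** t)             # bias correction
    s_hat = s / (1 - b2 ** t)
    w -= lr * m_hat / (math.sqrt(s_hat) + eps)

print(round(momentum_w, 3), round(w, 3))  # both approach 3.0
```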
Tuesday, April 7: (Kak) Word embeddings for text classification, torch.nn.Embedding
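(At its core, torch.nn.Embedding is a learnable lookup table that maps integer token IDs to dense vectors. The pure-Python sketch below shows just the lookup; the vocabulary and weight values are invented, and in PyTorch the table's rows are parameters updated during training.)

```python
# Sketch of an embedding layer as a lookup table: token -> integer
# ID -> dense vector. Toy vocabulary and weight values.

vocab = {"the": 0, "cat": 1, "sat": 2}
# One row per vocabulary word; randomly initialized and then learned
# in a real network, fixed toy values here.
weights = [
    [0.1, 0.2, 0.3, 0.4],    # "the"
    [0.5, 0.6, 0.7, 0.8],    # "cat"
    [0.9, 1.0, 1.1, 1.2],    # "sat"
]

def embed(tokens):
    """Map a token sequence to its sequence of embedding vectors."""
    return [weights[vocab[t]] for t in tokens]

print(embed(["cat", "sat"]))
```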
Thursday, April 9: (Bouman) Advanced Optimization: batch normalization; regularization
methods; transfer learning.
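(The batch-normalization computation itself is brief: normalize each feature across the batch to zero mean and unit variance, then apply a learnable scale and shift. The sketch below uses invented values and pure Python.)

```python
# Sketch of the batch-normalization computation for one feature.
import math

def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize over the batch, then scale by gamma and shift by beta."""
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta
            for x in batch]

out = batch_norm([1.0, 2.0, 3.0, 4.0])
print([round(v, 3) for v in out])   # -> [-1.342, -0.447, 0.447, 1.342]
```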
Tuesday, April 14: (Kak) GANs for domain adaptation and domain repair
Thursday, April 16: (Bouman) Advanced Optimization: minimax optimization; saddle points
and local minima; hyperparameter optimization.
Tuesday, April 21: (Bouman) The theory underlying reinforcement learning
Thursday, April 23: (Kak) Reinforcement learning with the Double Q algorithm
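(The distinguishing feature of Double Q-learning is its use of two Q tables: one selects the greedy next action and the other evaluates it, which reduces the overestimation bias of standard Q-learning. The sketch below applies the update on an invented two-state problem.)

```python
# Sketch of the Double Q-learning update rule on a toy problem
# (2 states, 2 actions, fixed reward; all values invented).

n_states, n_actions = 2, 2
QA = [[0.0] * n_actions for _ in range(n_states)]
QB = [[0.0] * n_actions for _ in range(n_states)]
alpha, gamma = 0.5, 0.9       # learning rate and discount factor
step = 0

def double_q_update(s, a, r, s_next):
    global step
    # The algorithm flips a fair coin to pick which table to update;
    # we alternate deterministically here so the run is reproducible.
    if step % 2 == 0:
        best = max(range(n_actions), key=lambda x: QA[s_next][x])
        target = r + gamma * QB[s_next][best]   # QB evaluates QA's pick
        QA[s][a] += alpha * (target - QA[s][a])
    else:
        best = max(range(n_actions), key=lambda x: QB[s_next][x])
        target = r + gamma * QA[s_next][best]   # QA evaluates QB's pick
        QB[s][a] += alpha * (target - QB[s][a])
    step += 1

# Repeatedly apply the update for one transition (s=0, a=0, r=1, s'=1);
# state 1's values stay 0, so both estimates converge toward r = 1.
for _ in range(50):
    double_q_update(0, 0, 1.0, 1)
print(round(QA[0][0], 4), round(QB[0][0], 4))
```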
Tuesday, April 28: Mid-Term Test 2
Thursday, April 30: (Bouman) Variational autoencoders; variational and conditional GANs.
Tuesday, May 5: (Kak) Is deep learning really DEEP learning?
Thursday, May 7: (Bouman) Limitations of DL
Links to documentation pages you will frequently be visiting:
Recommended Supplementary Course Material: