ECE 59500 - Introduction to Deep Learning
Course Details
Lecture Hours: 3
Credits: 3
Counts as:
- EE Elective
- CMPE Selective - Special Content
Normally Offered:
Each Fall
Campus/Online:
On-campus and online
Requisites:
ECE 30200 and MA 26500
Requisites by Topic:
Probability, Linear Algebra
Catalog Description:
This course provides focused training on deep learning algorithms; students will acquire a principled understanding of the techniques that have a proven record of success in solving important engineering problems. In addition, hands-on experimental training is provided through the course projects.
Required Text(s):
None.
Recommended Text(s):
- Deep Learning (available online; link provided in Brightspace), 1st Edition, Ian Goodfellow, Yoshua Bengio, and Aaron Courville, MIT Press, 2016, ISBN No. 0262035618
Learning Outcomes:
A student who successfully fulfills the course requirements will have demonstrated an ability to:
- Justify the development of state-of-the-art deep learning algorithms with a clear historical perspective.
- Make design choices regarding the construction of deep learning algorithms.
- Implement, optimize, and tune state-of-the-art deep neural network architectures.
- Identify and address the security aspects of state-of-the-art deep learning algorithms.
- Examine open research problems in deep learning and the approaches proposed in the literature to tackle them.
Lecture Outline:
Week | Lecture Topics |
---|---|
1 | Introduction to Deep Learning; Non-Linearity and Complexity of the Hypothesis Space; Components of a Learning Algorithm |
2 | Gradient-Based Learning; Sigmoidal Output Units for Bernoulli Distributions; Softmax Output Units for Multinoulli Distributions |
3 | Hidden Unit Activation Functions: Rectified Linear Units (ReLU); Variants of ReLU Activation (see the sketch after this outline) |
4 | Universal Approximation Theorem, Impact of Depth, and Introduction to Backpropagation |
5 | Backpropagation in a Fully Connected Multi-Layer Perceptron; Introduction to Regularization for Deep Learning |
6 | L2 and L1 Regularization for Deep Learning |
7 | Norm Penalties vs. Explicit Constraints, Dataset Augmentation, and Noise Injection; Adding Noise to Weights/Outputs, Discriminative and Generative Regularization; Early Stopping |
8 | Ensemble Methods; Introduction to Dropout (see the sketch after this outline) |
9 | Introduction to Optimization for Training Deep Models |
10 | Stochastic Gradient Descent: Local Minima and Saddle Points, Cliffs and Flat Regions, Learning Rate, Convergence, Momentum, Nesterov Momentum (see the sketch after this outline); Parameter Initialization |
11 | Adaptive Learning Rates, Adam Optimizer, Introduction to Approximate Second-Order Methods; Steepest Descent, Conjugate Gradients, BFGS; Quasi-Newton Methods, L-BFGS, Batch Normalization, Polyak Averaging |
12 | Greedy Supervised Pretraining, FitNets, Curriculum Learning; Introduction to Convolutional Neural Networks (CNN); Pooling, CNN Example |
13 | Convolution and Pooling as Priors, Variants of Convolutional Layers; Backpropagation Through Convolutional Layers, Data Types and Variably Sized Inputs; Introduction to Recurrent Neural Networks (RNN) |
14 | Teacher Forcing, Computing Gradients in RNN, Statistical Dependence Relationships; Bidirectional RNN, Determining the Sequence Length, Deep RNN; Sequence-to-Sequence Architectures, Recursive Networks, Challenges of Long-Term Dependencies |
15 | Echo State Networks (Reservoir Computing), Skip Connections, Leaky Units; Long Short-Term Memory (LSTM) Cells, Gradient Propagation Issues in RNN; Neural Networks with Explicit Memory |
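To make a few of the outlined topics concrete, the sketches below give minimal NumPy illustrations; they are not official course materials, and all function names and default parameter values are illustrative. First, the ReLU hidden-unit activation from week 3 and one of its variants:

```python
import numpy as np

def relu(x):
    # Rectified linear unit: elementwise max(0, x)
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU variant: a small slope alpha on negative inputs
    # keeps the gradient nonzero and avoids "dead" units
    return np.where(x > 0, x, alpha * x)
```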
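Week 8's dropout can likewise be sketched as a masking operation. A minimal version, assuming the common "inverted dropout" convention (names illustrative):

```python
import numpy as np

def dropout(h, p_keep=0.8, train=True):
    # Inverted dropout: during training, zero each activation with
    # probability 1 - p_keep, rescaling by 1 / p_keep so that the
    # expected activation matches the test-time (no-op) behavior
    if not train:
        return h
    mask = (np.random.rand(*h.shape) < p_keep) / p_keep
    return h * mask
```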
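Finally, the momentum and Nesterov-momentum updates from week 10 reduce to two short update rules. A sketch assuming a parameter array w, a velocity v, and a callable grad that returns the loss gradient (all illustrative):

```python
import numpy as np

def momentum_step(w, v, grad, lr=0.01, mu=0.9):
    # Classical momentum: accumulate a decaying velocity, then step
    v = mu * v - lr * grad(w)
    return w + v, v

def nesterov_step(w, v, grad, lr=0.01, mu=0.9):
    # Nesterov momentum: evaluate the gradient at the look-ahead
    # point w + mu * v before applying the update
    v = mu * v - lr * grad(w + mu * v)
    return w + v, v

# Example: minimize f(w) = ||w||^2, whose gradient is 2w
w, v = np.ones(3), np.zeros(3)
for _ in range(100):
    w, v = nesterov_step(w, v, lambda w: 2.0 * w)
```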
Engineering Design Content:
- Establishment of Objectives and Criteria
- Synthesis
- Analysis
- Construction
- Testing
- Evaluation
Engineering Design Consideration(s):
- Economic
- Environmental
- Ethical
- Health/Safety
- Social
Assessment Method:
Quizzes, projects, exams. (3/2022)