ECE 59500 - Machine Learning

Lecture Hours: 3 Credits: 3

Area of Specialization(s):

Communications, Networking, Signal & Image Processing

Counts as:
CMPE Special Content Elective
EE Elective

Experimental Course Offered: Spring 2019, Spring 2020, Spring 2021

Requisites:
ECE 26400, ECE 30200, MA 26500

Requisites by Topic:
programming, probability, linear algebra

Catalog Description:
An introductory course on machine learning, with a focus on supervised learning using linear models. The course has four parts: (1) mathematical background on linear algebra, probability, and optimization; (2) classification methods, including Bayesian decision rules, linear regression, logistic regression, and support vector machines; (3) classifier robustness and adversarial examples; (4) learning theory, covering the feasibility of learning, VC dimension, complexity analysis, and bias-variance analysis. Suitable for senior undergraduates and graduate students with a background in probability, linear algebra, and programming.

Required Text(s): None.

Recommended Text(s):
  1. Learning from Data, 1st Edition, Abu-Mostafa, Yaser, AMLBook, 2012, ISBN No. 1600490069.
  2. Pattern Recognition and Machine Learning, Bishop, Chris, Springer, 2011, ISBN No. 978-0387310732.

Lecture Outline:

Week 1: Course Overview; Background 1: Linear Regression, Loss Functions, Regularization, Minima, Convexity, Optimality, Gradient Descent, Step Size, Acceleration (see the illustrative sketch following this outline)
Week 2: PCA, Transforms, Convolution and Deep Features, Linear Separability, Concept of Margin, MAP Rule, Nonlinear Cases, Formulation, Probabilistic Meaning, One-Layer Perceptron, Deep Networks, Hard and Soft Margins
Week 3: Dealing with Noisy Labels, Unbalanced Data, and Missing Data; Knowledge Transfer; Adversarial Attacks; Attack and Defense Methods; Analyzing Attacks; Robustness-Accuracy Trade-off
Week 4: Probability Review, Learning Feasibility, Probability Inequalities, Probably Approximately Correct (PAC) Learning, Shattering, Growth Function and VC Dimension, Model and Sample Complexity, Interpreting the Generalization Bound, Bias and Variance, Learning Curves, Overfitting, Regularization, Validation
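
Illustrative Sketch (Week 1 Topics):

The Week 1 topics tie together loss functions, regularization, gradient descent, and step size. The sketch below is a minimal NumPy illustration of that thread, not course material: it assumes a made-up synthetic dataset and placeholder values for the step size and regularization strength, fits a ridge-regularized linear regression by batch gradient descent, and checks the result against the closed-form solution.

    # Minimal sketch (assumed synthetic data and hyperparameters), illustrating
    # the Week 1 thread: loss function, regularization, gradient descent, step size.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: y = X @ w_true + noise (placeholder problem: n samples, d features).
    n, d = 200, 5
    X = rng.normal(size=(n, d))
    w_true = rng.normal(size=d)
    y = X @ w_true + 0.1 * rng.normal(size=n)

    # Regularized least-squares loss: L(w) = (1/n) * ||X w - y||^2 + lam * ||w||^2.
    lam = 0.1    # regularization strength (assumed value)
    step = 0.05  # gradient-descent step size (assumed value)

    w = np.zeros(d)
    for _ in range(500):
        residual = X @ w - y
        grad = (2.0 / n) * (X.T @ residual) + 2.0 * lam * w  # gradient of L(w)
        w -= step * grad                                      # gradient-descent update

    # Closed-form ridge solution: (X^T X + n*lam*I)^{-1} X^T y, for comparison.
    w_closed = np.linalg.solve(X.T @ X + n * lam * np.eye(d), X.T @ y)
    print("gradient descent:", np.round(w, 3))
    print("closed form:     ", np.round(w_closed, 3))

With a sufficiently small step size the iterates converge to the same weights as the closed-form ridge solution, which is the connection between convexity, optimality, and step size highlighted in Week 1.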