ECE 59500 - Machine Learning I
Lecture Hours: 3 Credits: 3
This is an experiential learning course.
CMPE Special Content Elective
Experimental Course Offered: Spring 2019, Spring 2020
Requisites: ECE 26400, ECE 30200, MA 26500
Requisites by Topic:
programming, probability, linear algebra
An introductory course on machine learning, focusing on supervised learning with linear models. The course has four parts: (1) mathematical background on linear algebra, probability, and optimization; (2) classification methods, including Bayesian decision theory, linear regression, logistic regression, and support vector machines; (3) robustness of classifiers and adversarial examples; (4) learning theory, covering the feasibility of learning, VC dimension, complexity analysis, and bias-variance analysis. Suitable for senior undergraduates and graduate students with a background in probability, linear algebra, and programming.
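To give a flavor of the supervised-learning workflow covered in parts (1) and (2), here is a minimal, illustrative sketch of logistic regression trained by gradient descent on toy data. This is not course material, just a small NumPy example; all names and parameters in it are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: logistic regression via batch gradient descent.
# Model: p(y=1 | x) = sigmoid(w.x + b); loss: average cross-entropy.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Fit weights w and bias b by gradient descent on the logistic loss."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)      # predicted probabilities
        grad_w = X.T @ (p - y) / n  # gradient of cross-entropy w.r.t. w
        grad_b = np.mean(p - y)     # gradient w.r.t. b
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy linearly separable data: the label is the sign of the first feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] > 0).astype(float)

w, b = fit_logistic(X, y)
acc = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

Because the toy labels are linearly separable, the learned decision boundary should recover them with high training accuracy; the same fit/predict pattern applies to the other linear classifiers listed above.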
Required Text(s): None.
Recommended Text(s):
- Learning from Data, 1st Edition, Abu-Mostafa, Yaser, AMLBook, 2012, ISBN No. 1600490069.
- Pattern Recognition and Machine Learning, Bishop, Chris, Springer, 2011, ISBN No. 978-0387310732.
Learning Outcomes: A student who successfully fulfills the course requirements will have demonstrated an ability to:
- classify data using statistical learning methods and understand the limitations of those methods.
- estimate model parameters using regression methods.
- apply optimization algorithms to accomplish statistical learning tasks.
- evaluate and interpret results generated by different machine learning algorithms. [1,6]
- apply machine learning algorithms to solve complex engineering problems. [1,6,7]
Lecture Outline:

| Week | Lecture Topics |
|------|----------------|
| 1 | Course overview; Background 1: Linear Algebra |
| 2 | Background 2: Probability; Background 3: Optimization Part 1 |
| 3 | Background 4: Optimization Part 2; Linear Discriminant Analysis |
| 4 | Multi-dimensional Gaussian; Bayesian Optimal Classifier |
| 5 | Parameter Estimation; Linear Regression |
| 6 | Logistic Regression; Bayesian vs. Linear vs. Logistic |
| 7 | Perceptron Loss; Perceptron Algorithm |
| 8 | Concept of Max-Margin; Support Vector Machine |
| 9 | Nonlinear Transformation; Midterm |
| 10 | Attacking Classifiers; From Linear to Nonlinear Attacks |
| 11 | Can Random Noise Attack?; Is Learning Feasible? |
| 12 | Probably Approximately Correct; VC Dimension |
| 13 | Sample and Model Complexity; Bias and Variance |
| 14 | The Learning Curve; Overfitting and Regularization |