ECE 595: Course Information
Professor Stanley Chan, Purdue University, Spring 2019
Lectures and Tutorials
Lectures
TuTh 4:30pm - 5:45pm
Room: WALC 1055
Tutorials
Wed 6pm - 7pm
Room: EE 117
Tutorials are run by the teaching assistants and are optional.
Course Description
A good subtitle for this course is "supervised learning using linear models."
We will cover the methodologies, followed by the theory. The course is structured in four parts.
Mathematical preliminaries. Matrices, vectors, Lp norm, geometry of the
norms, symmetry, positive definiteness, eigendecomposition. Unconstrained
optimization, gradient descent, convex functions, Lagrange multipliers,
linear least squares. Probability space, random variables, joint
distributions, multidimensional Gaussians.
Linear Classifiers. Linear discriminant analysis, separating hyperplane,
multiclass classification, Bayesian decision rule, geometry of Bayesian
decision rule, linear regression, logistic regression, perceptron algorithms,
support vector machines, nonlinear transformations.
Robustness. Adversarial attack, targeted and untargeted attack, minimum
distance attack, maximum allowable attack, regularization-based attack.
Perturbation through noise. Robustness of SVM.
Learning Theory. Bias and variance, training and testing, generalization,
PAC framework, Hoeffding inequality, VC dimension.
Textbook and References
There is no official textbook for this course. Please refer to the lecture
notes section of the website for our lecture materials.
A few good reference books for this course are:
Pattern Classification, by Duda, Hart and Stork, Wiley-Interscience, 2nd edition, 2000.
Learning from Data, by Abu-Mostafa, Magdon-Ismail and Lin, AMLBook, 2012.
Elements of Statistical Learning, by Hastie, Tibshirani and Friedman, Springer, 2nd edition, 2009.
Pattern Recognition and Machine Learning, by Bishop, Springer, 2006.
Grades
All students will be graded by the following rubric. Graduate students and
undergraduate students will be graded on two different curves.
Homework Assignments (30%). Homeworks are approximately biweekly. Please
submit your homework to the dropbox located at MSEE 330 by 4:30pm on the due
date. Late homework will not be accepted. You are encouraged to work in
small groups, but you have to write / type your own solution. We highly
encourage you to type your solution using the LaTeX template provided on the
course website, although we accept handwritten solutions. All programming
answers should be typed.
Midterm (30%). The midterm will likely take place before Spring break. The
exact date will be announced later.
Final (40%). We will post the exam date later.
These weights are approximate. We reserve the right to change them later.
Prerequisites
We expect students to have a good working knowledge of the following three subjects:
Linear Algebra (as in the material covered by G. Strang's Linear Algebra
Textbook). A good course at Purdue is MA 511 Linear Algebra.
Optimization (as in the material covered by any mainstream undergraduate
textbook on Calculus). A good course at Purdue (perhaps slightly overkill)
is ECE 580 Optimization.
Probability (as in the material covered by D. Bertsekas's Probability
Textbook). A good course at Purdue is ECE 302 Probability. Advanced
probability such as ECE 600 is helpful but not essential.
To help you determine whether you have adequate prerequisites, we encourage
you to try Homework 0, posted in the homework section. If the problems are
significantly beyond your comfort level, we suggest taking ECE 595 at a
later time.
Programming
We will primarily use Python. As such, we expect students to have elementary
programming skills, e.g., writing a hello-world program. More information
and resources on how to use Python can be found in the programming section
of this website.
Besides Python, we use optimization packages to solve optimization problems.
Of particular importance is CVX.
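(CVX itself is a MATLAB modeling package; for Python, its counterpart is CVXPY. Independent of any package, the sketch below gives a plain-NumPy taste of the kind of problem these tools solve: a linear least-squares problem, solved both in closed form and by gradient descent. The data are synthetic and all names are illustrative, not part of any course assignment.)

```python
import numpy as np

# Toy linear least-squares problem: minimize ||A x - b||^2.
# Synthetic data, for illustration only.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.standard_normal(50)

# Closed-form solution via NumPy's least-squares solver.
x_closed, *_ = np.linalg.lstsq(A, b, rcond=None)

# The same problem by gradient descent: the gradient of
# f(x) = ||A x - b||^2 is 2 A^T (A x - b).
x = np.zeros(3)
step = 0.005  # small fixed step size for this well-conditioned toy problem
for _ in range(5000):
    x = x - step * 2.0 * (A.T @ (A @ x - b))

# Both approaches should agree on the minimizer.
```

A package such as CVXPY becomes valuable once the problem has constraints or a non-smooth objective, where no closed-form solution exists.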
FAQ
Am I ready to take the course?
There is no official prerequisite for the course (e.g., a required prior course), although we expect students to have a good background in linear algebra, optimization and probability.
Please check out the information about prerequisites to see if you are ready for the course.
What is the difference between ECE 595 and other machine learning courses on campus?
We focus on only one topic: supervised learning using linear models.
Our goal is to provide an in-depth discussion of the subject, rather than glancing superficially through many different topics.
We put significant emphasis on understanding the mathematics behind the algorithms.
We have plenty of hands-on programming exercises.
What will I learn after taking ECE 595?
You will know what a linear model is, such as the Bayesian decision rule, the perceptron algorithm, logistic regression, support vector machines, etc.
You will know how to understand a linear classifier from a geometric perspective.
You will know how to attack a classifier.
You will know what a machine learning algorithm can do, and what it cannot do.
You will know how to implement machine learning algorithms using Python and CVX.
When will it be offered again?
While the general expectation is to offer this course regularly, at the moment it is still an experimental course under the code ECE 595.
Being an experimental course, any offering is subject to approval of the graduate and undergraduate committees in the sponsoring department and the college.
A sister course, Machine Learning 2, is being planned (though not yet approved) for Fall 2019.
Can I audit the class?
Yes. However, priority for our teaching resources will be given to students who have registered for the course.
If the classroom is full, we kindly ask you to offer your seat to those who are registered.
Where can I get help for programming problems?
