Syllabus
Readings are primarily from the course textbook Machine Learning: A Probabilistic Perspective by Kevin Murphy. Readings from this textbook are prefixed with "M"; for example, M1 means Murphy chapter 1. Unless otherwise noted, skip sections that have a * in the title; these are optional.
For those who prefer Chris Bishop's Pattern Recognition and Machine Learning, I will list the corresponding readings from that textbook, prefixed with "B". Optional readings will be written in italics.
Recitation sections are led by the TAs and are optional.
Date 
Topics 
Readings 

Supervised Learning: Classifiers 

M 8/31 
Machine Learning: Overview Background of the field 
M1 PDF 

W 9/2 
Machine Learning: Foundations Introduces key concepts 


F 9/4 
Recitation: Probability Events, random variables, probabilities, pdf, pmf, cdf, mean, mode, median, variance, multivariate distributions, marginals, conditionals, Bayes' theorem, independence 

M 9/7 
NO CLASS 


W 9/9 
Regression 1 Linear regression 
M7, excluding M7.4 and M7.6 

F 9/11 
Recitation: Math Review Basic properties of matrices, eigenvalue decompositions, singular value decompositions 
CS229's linear algebra notes


M 9/14 
Regression 2 

W 9/16 
Classification 1 Introduces logistic regression and classification 
M8 (stop at M8.3.7), M13.3 

F 9/18 
Recitation: TBD 

M 9/21 
Classification 2 

W 9/23 
Data Geometry: Support Vector Machines Max-margin classification and optimization 
M14.5 

F 9/25 
Recitation: TBD 

M 9/28 
Data Geometry: Kernel Methods Dual optimization, kernel trick 
M14.1, M14.2 

W 9/30 
Perceptron Online learning algorithms 
M8.5  
F 10/2 
Recitation: PyTorch Intro 

M 10/5 
Deep Learning: Shallow Learning 
M16.5, M27.7, M28 

W 10/7 
Deep Learning: Backprop 

F 10/9 
Recitation: Midterm Review 

M 10/12 
Deep Learning: The Details 

W 10/14 
Decision Trees Construction, pruning, overfitting 
M2.8, M16.2 

F 10/16 
Recitation: Midterm (Required Attendance) 

M 10/19 
Boosting Ensemble methods 
M16.4, M16.6 

Unsupervised Learning: Core Methods 

W 10/21 
Clustering K-means 
M25.1, M11 (stop at M11.4) 

F 10/23 
FALL BREAK 

M 10/26 
Expectation Maximization 1 
M11.4 (stop at M11.4.8) 

W 10/28 
Expectation Maximization 2 

F 10/30 
Recitation: TBD 

M 11/2 
Graphical Models 1 Bayesian networks and conditional independence 
M10 

W 11/4 
Graphical Models 2 MRFs and exact inference 
M19 (stop at M19.4), M19.5 

F 11/6 
FALL BREAK 

M 11/9 
Graphical Models 3 Inference 
M20 (stop at M20.3) 

W 11/11 
Graphical Models 4 Max-Sum and Max-Product 

F 11/13 
Recitation: TBD 

M 11/16 
Structured Prediction 1 Margin based methods, HMMs, CRFs 
M17 (stop at M17.6), M19.6, M19.7 

W 11/18 
Structured Prediction 2 Recurrent Neural Networks 

F 11/20 
Recitation: TBD 

M 11/30 
Dimensionality reduction PCA 
M12.2 

W 12/2 
Fairness, Accountability, Transparency and Ethics of ML 

F 12/4 
Recitation 

M 12/7 
Practical Machine Learning 

W 12/9 
Recitation: Final Review 

F 12/11 
NO CLASS 

TBD 
Final Exam TBD (75 minutes) 


