Readings are primarily from the course textbook Machine Learning: A Probabilistic Perspective by Kevin Murphy. Readings from this textbook will be prefixed by "M"; for example, M1 means Murphy chapter 1. Unless otherwise noted, skip sections that have a * in the title; these are optional.

For those who prefer Chris Bishop's Pattern Recognition and Machine Learning, I will also list the corresponding readings from that textbook. These readings will be prefixed by "B". Optional readings will be written in italics.

Recitation sections are led by the TAs and are optional.

Date | Topics | Readings

Supervised Learning: Classifiers

Th 8/29

Machine Learning: Overview

Introduces key concepts

M1
B1
If you need to review relevant math, do it now.

W 9/4

Linear Regression

Our first algorithm

M7, excluding M7.4 and M7.6
B3
Those without a probability background should read M2.1-2.4.1
Other probability background readings: B2, B Appendix B, Andrew Moore Tutorial, Tom Minka's nuances of probability

F 9/6

Recitation: Probability

Events, random variables, probabilities, pdf, pmf, cdf, mean, mode, median, variance, multivariate distributions, marginals, conditionals, Bayes' theorem, independence

M 9/9

Linear Regression: continued

W 9/11

Logistic Regression

Introduces classification methods

M8 (stop at M8.3.7), M13.3
B4

F 9/13

Linear Algebra Review

Basic properties of matrices, eigenvalue decompositions, singular value decompositions

M 9/16

Perceptron

Online learning algorithms (see the sketch below)

M8.5
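
For students who want a concrete preview of this lecture, below is a minimal NumPy sketch of the perceptron's mistake-driven update. It is illustrative only; the function name and defaults are mine, not taken from M8.5.

    import numpy as np

    def perceptron(X, y, epochs=10):
        # Online perceptron: labels y must be in {-1, +1}.
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(epochs):
            for x_i, y_i in zip(X, y):
                # Update only on a mistake (wrong sign, or on the boundary).
                if y_i * (np.dot(w, x_i) + b) <= 0:
                    w += y_i * x_i
                    b += y_i
        return w, b

Note that the weights change only when a prediction is wrong, which is what makes the algorithm "online": it processes one example at a time and never revisits correct decisions.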

W 9/18

Support Vector Machines

Max-margin classification and optimization

M14.5
Andrew Ng's notes on SVMs
B7.1

F 9/20

Recitation: Math Review

Linear algebra, calculus, optimization

CS229's linear algebra notes
Bishop's appendix on Lagrange Multipliers

M 9/23

Kernel Methods

Dual optimization, kernel trick

M14.1, M14.2
B6.1, B6.2

W 9/25

Decision Trees

Construction, pruning, over-fitting

M2.8, M16.2
Tom Mitchell's notes on decision trees

F 9/27

Recitation: Recap + TBD

M 9/30

Boosting

Ensemble methods

M16.4, M16.6
A Short Introduction to Boosting
B14.1, B14.2, B14.3

W 10/2

Deep Learning 1

M16.5, M27.7, M28
B5.1, B5.2, B5.3, B5.5

F 10/4

Recitation: PyTorch Intro

M 10/7

Deep Learning 2

W 10/9

Deep Learning 3

F 10/11

Recitation: Midterm Review

Unsupervised Learning: Core Methods

M 10/14

Clustering

K-means (see the sketch below)

M25.1, M11 (stop at M11.4)
B9
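
As a preview of this lecture, here is a minimal NumPy sketch of K-means (Lloyd's algorithm). The interface and defaults are my own choices for illustration, not something from the readings.

    import numpy as np

    def kmeans(X, k, iters=100, seed=0):
        # Lloyd's algorithm: alternate nearest-center assignment and mean updates.
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            # Assignment step: label each point with its nearest center.
            dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            # Update step: move each center to the mean of its assigned points.
            new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                    else centers[j] for j in range(k)])
            if np.allclose(new_centers, centers):
                break  # converged: assignments can no longer change
            centers = new_centers
        return centers, labels

Each iteration can only decrease the within-cluster sum of squared distances, which is why the alternation is guaranteed to converge (though only to a local optimum).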

W 10/16

Expectation Maximization 1

M11.4 (stop at M11.4.8)
Andrew Ng's notes on EM

F 10/18

Fall Break

M 10/21

Midterm

W 10/23

Expectation Maximization 2

K-means

F 10/25

Recitation: TBD

M 10/28

Graphical Models 1

Bayesian networks and conditional independence

M10
B8.1, B8.2

W 10/30

Graphical Models 2

MRFs and exact inference

M19.1 (stop at M19.4), M19.5
B8.3, B8.4

F 11/1

Recitation

M 11/4

Graphical Models 3

Inference

M20 (stop at M20.3)

W 11/6

Graphical Models 4

Max-sum and max-product algorithms

F 11/8

Recitation: TBD

M 11/11

Structured Prediction 1

Margin-based methods, HMMs, CRFs

M17 (stop at M17.6), M19.6, M19.7
Sutton and McCallum's CRF tutorial
B13.1, B13.2

W 11/13

Structured Prediction 2

Recurrent Neural Networks

F 11/15

Recitation

M 11/18

Dimensionality Reduction

PCA (see the sketch below)

M12.2
B12.1, B12.2, B12.3
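
As a preview, here is a minimal NumPy sketch of PCA computed via the SVD of the centered data matrix. The interface is my own and not taken from M12.2.

    import numpy as np

    def pca(X, n_components):
        # Center the data; principal directions are defined for centered data.
        X_centered = X - X.mean(axis=0)
        # Rows of Vt are the principal directions, ordered by singular value.
        U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
        components = Vt[:n_components]
        # Project onto the top directions to get the low-dimensional scores.
        return X_centered @ components.T, components

Using the SVD rather than an eigendecomposition of the covariance matrix is numerically more stable and gives the same principal directions.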

W 11/20

Fairness, Accountability, Transparency and Ethics of ML

F 11/22

Recitation

M 12/2

Practical Machine Learning

W 12/4

TBD

F 12/6

Final Review

W 12/11

Final Exam

Wednesday, December 11, 9:00-10:15 AM (75 minutes)