Readings are primarily from the course textbook Machine Learning: A Probabilistic Perspective by Kevin Murphy. Readings from this textbook are prefixed with "M"; for example, M1 means Murphy chapter 1. Unless otherwise noted, skip sections that have a * in the title; these are optional.

For those who prefer Chris Bishop's Pattern Recognition and Machine Learning, I will list the corresponding readings from that textbook, prefixed with "B". Optional readings are written in italics.

Recitation sections are led by the TAs and are optional.

Each entry below gives the date, the topic (with a brief description), and the assigned readings.

Supervised Learning: Classifiers

Th 8/30

Machine Learning: Overview

Introduces key concepts

M1 PDF
B1
If you need to review relevant math, do it now.

W 9/5

Linear Regression

Our first algorithm

M7, excluding M7.4 and M7.6
B3
Those without a probability background should read M2.1-2.4.1
Other probability background readings: B2, B Appendix B, Andrew Moore Tutorial, Tom Minka's nuances of probability
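
Optional, for intuition alongside the reading: a minimal least-squares sketch in NumPy. The toy data and all numbers are made up.

    import numpy as np

    # Toy data: y is roughly 2*x + 1 plus noise.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, size=50)
    y = 2 * x + 1 + 0.1 * rng.standard_normal(50)

    # Design matrix with a bias column; solve the least-squares problem
    # w = argmin ||Xw - y||^2.
    X = np.column_stack([x, np.ones_like(x)])
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(w)  # approximately [2.0, 1.0] (slope, intercept)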

F 9/7

Recitation: Probability

Events, random variables, probabilities, pdf, pmf, cdf, mean, mode, median, variance, multivariate distributions, marginals, conditionals, Bayes' theorem, independence
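
As a warm-up for the recitation, here is Bayes' theorem worked out on a hypothetical diagnostic test (all numbers invented):

    # Bayes' theorem: P(D | +) = P(+ | D) P(D) / P(+).
    # Hypothetical numbers: 1% prevalence, 90% sensitivity, 5% false-positive rate.
    p_d = 0.01          # P(disease)
    p_pos_d = 0.90      # P(positive | disease)
    p_pos_nd = 0.05     # P(positive | no disease)

    # Marginal P(positive) by the law of total probability.
    p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)
    p_d_pos = p_pos_d * p_d / p_pos
    print(round(p_d_pos, 3))  # about 0.154: a positive test is still probably a false alarm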

M 9/10

Linear Regression: continued

W 9/12

Logistic Regression

Introduces classification methods

M8 (stop at M8.3.7), M13.3
B4
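
An optional, illustrative sketch of logistic regression fit by batch gradient descent (NumPy; toy data, no regularization):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy data with labels in {0, 1}.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)

    # Batch gradient descent on the negative log-likelihood.
    w = np.zeros(2)
    lr = 0.1
    for _ in range(500):
        p = sigmoid(X @ w)
        grad = X.T @ (p - y) / len(y)
        w -= lr * grad
    print(w)  # both weights positive and roughly equal, by symmetry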

F 9/14

Linear Algebra Review

Basic properties of matrices, eigenvalue decompositions, singular value decompositions
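
If you like to check the review material numerically, NumPy exposes both decompositions directly; a quick sketch:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # Eigendecomposition of a symmetric matrix: A = Q diag(lam) Q^T.
    lam, Q = np.linalg.eigh(A)
    print(lam)  # [1. 3.]

    # Singular value decomposition: A = U diag(s) Vt.
    U, s, Vt = np.linalg.svd(A)
    print(np.allclose(A, U @ np.diag(s) @ Vt))  # True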

M 9/17

Perceptron

Online learning algorithms

M8.5
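
A minimal sketch of the mistake-driven perceptron update (toy separable data; labels in {-1, +1}):

    import numpy as np

    # Toy separable data.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 2))
    y = np.sign(X[:, 0] - X[:, 1])

    # Online perceptron: update only when a point is misclassified.
    w = np.zeros(2)
    for _ in range(10):               # passes over the data
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:    # mistake (or on the boundary)
                w += yi * xi
    print(w)  # roughly proportional to (1, -1)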

W 9/19

Decision Trees

Construction, pruning, over-fitting

M2.8, M16.2
Tom Mitchell's notes on decision trees
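
A small sketch of the entropy / information-gain computation behind greedy tree construction (binary labels, one Boolean feature; the numbers are made up):

    import numpy as np

    def entropy(y):
        """Shannon entropy of a binary label vector, in bits."""
        p = np.mean(y)
        if p in (0.0, 1.0):
            return 0.0
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    # Made-up labels and one Boolean feature to split on.
    y = np.array([1, 1, 1, 0, 0, 0, 1, 0])
    x = np.array([1, 1, 1, 1, 0, 0, 0, 0])

    # Information gain = H(y) minus the weighted child entropies.
    left, right = y[x == 1], y[x == 0]
    gain = entropy(y) - (len(left) * entropy(left)
                         + len(right) * entropy(right)) / len(y)
    print(gain)  # about 0.19 bits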

F 9/21

Recitation: Math Review

Linear algebra, calculus, optimization

CS229's linear algebra notes
Bishop's appendix on Lagrange Multipliers

M 9/24

Support Vector Machines

Max-margin classification and optimization

M14.5
Andrew Ng's notes on SVMs
B7.1
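
For intuition only: this is the hinge-loss view of the soft-margin SVM, minimized by subgradient descent on a toy problem, not the QP derivation in the readings.

    import numpy as np

    # Toy data with labels in {-1, +1}.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 2))
    y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1.0, -1.0)

    # Minimize (lam/2)||w||^2 + mean(max(0, 1 - y * Xw)) by subgradient descent.
    w, lam, lr = np.zeros(2), 0.01, 0.1
    for _ in range(1000):
        margins = y * (X @ w)
        active = margins < 1                       # points inside the margin
        grad = lam * w - (X[active].T @ y[active]) / len(y)
        w -= lr * grad
    print(w)  # roughly along (1, 0.5), the true separating direction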

W 9/26

Kernel Methods

Dual optimization, kernel trick

M14.1, M14.2
B6.1, B6.2
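
The kernel trick replaces inner products with kernel evaluations; computing an RBF Gram matrix is a short sketch (the gamma value here is arbitrary):

    import numpy as np

    def rbf_kernel(X, Z, gamma=1.0):
        """Gram matrix K[i, j] = exp(-gamma * ||x_i - z_j||^2)."""
        sq = (np.sum(X**2, axis=1)[:, None]
              + np.sum(Z**2, axis=1)[None, :]
              - 2 * X @ Z.T)
        return np.exp(-gamma * sq)

    X = np.random.default_rng(0).standard_normal((5, 3))
    K = rbf_kernel(X, X)
    print(K.shape, np.allclose(np.diag(K), 1.0))  # (5, 5) True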

F 9/28

Recitation: Recap + TBD

M 10/1

Boosting

Ensemble methods

M16.4, M16.6
A Short Introduction to Boosting
B14.1, B14.2, B14.3
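
A compact, illustrative AdaBoost loop with exhaustive threshold stumps, to make the example-reweighting step concrete (toy data; not tuned for efficiency):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 2))
    y = np.where(X[:, 0] > 0.2, 1.0, -1.0)         # labels in {-1, +1}

    def best_stump(X, y, w):
        """Exhaustive search for the weighted-error-minimizing threshold stump."""
        best = None
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1.0, -1.0):
                    pred = sign * np.where(X[:, j] > thr, 1.0, -1.0)
                    err = np.sum(w[pred != y])
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        return best

    w = np.full(len(y), 1.0 / len(y))              # example weights
    ensemble = []
    for _ in range(10):
        err, j, thr, sign = best_stump(X, y, w)
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        pred = sign * np.where(X[:, j] > thr, 1.0, -1.0)
        w *= np.exp(-alpha * y * pred)             # upweight the mistakes
        w /= w.sum()
        ensemble.append((alpha, j, thr, sign))

    # Final classifier: sign of the weighted vote.
    F = sum(a * s * np.where(X[:, j] > t, 1.0, -1.0) for a, j, t, s in ensemble)
    print(np.mean(np.sign(F) == y))                # training accuracy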

W 10/3

Deep Learning 1

M16.5, M27.7, M28
B5.1, B5.2, B5.3, B5.5

F 10/5

Recitation: PyTorch Intro
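
If you want to install PyTorch and poke at it beforehand, the core idea is tensors plus autograd; a tiny sketch:

    import torch

    # A tensor that tracks gradients.
    w = torch.tensor([1.0, -2.0], requires_grad=True)
    x = torch.tensor([3.0, 4.0])

    # A scalar loss; backward() fills w.grad with d(loss)/dw.
    loss = (w @ x - 1.0) ** 2
    loss.backward()
    print(w.grad)  # 2 * (w @ x - 1) * x = tensor([-36., -48.])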

M 10/8

Deep Learning 2

W 10/10

Deep Learning 3

F 10/12

Recitation: Midterm Review

Unsupervised Learning: Core Methods

M 10/15

Clustering

K-means

M25.1, M11 (stop at M11.4)
B9
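
A bare-bones sketch of Lloyd's algorithm for K-means (NumPy; toy blobs; a real implementation would also guard against empty clusters):

    import numpy as np

    rng = np.random.default_rng(0)
    # Two toy blobs.
    X = np.vstack([rng.standard_normal((50, 2)) + [3, 3],
                   rng.standard_normal((50, 2)) - [3, 3]])

    k = 2
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(20):
        # Assignment step: nearest center for every point.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        z = d.argmin(axis=1)
        # Update step: each center moves to the mean of its points.
        centers = np.array([X[z == j].mean(axis=0) for j in range(k)])
    print(centers)  # near (3, 3) and (-3, -3)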

W 10/17

Expectation Maximization 1

M11.4 (stop at M11.4.8)
Andrew Ng's notes on EM
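
To preview the E- and M-steps, here is a 1-D, two-component Gaussian mixture fit by EM (toy data; the initialization is arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    # 1-D mixture: half the points near -2, half near +2.
    x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(2, 1, 200)])

    # Initialize means, variances, and mixing weights.
    mu, var, pi = np.array([-1.0, 1.0]), np.array([1.0, 1.0]), np.array([0.5, 0.5])

    def gauss(x, mu, var):
        return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

    for _ in range(50):
        # E-step: responsibilities r[i, k] = P(z_i = k | x_i).
        r = pi * gauss(x[:, None], mu, var)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted maximum-likelihood updates.
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / len(x)
    print(mu)  # near [-2, 2]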

F 10/19

Fall Break

M 10/22

Midterm

W 10/24

Expectation Maximization 2

K-means as a special case of EM

F 10/26

Graphical Models 1

Bayesian networks and conditional independence

M10
B8.1, B8.2

M 10/29

Graphical Models 2

MRFs and exact inference

M19.1 (stop at M19.4), M19.5
B8.3, B8.4

W 10/31

Recitation: Review Midterm Answers

F 11/2

NO RECITATION

M 11/5

Graphical Models 3

Inference

M20 (stop at M20.3)

W 11/7

Graphical Models 4

Max-sum and max-product algorithms

F 11/9

Recitation: TBD

M 11/12

Structured Prediction 1

Margin-based methods, HMMs, CRFs

M17 (stop at M17.6), M19.6, M19.7
Sutton, McCallum CRF tutorial
B13.1, B13.2
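
To make HMM decoding concrete, a small Viterbi sketch in log space (the transition and emission numbers are invented):

    import numpy as np

    # Hypothetical 2-state HMM over a binary observation alphabet.
    pi = np.array([0.6, 0.4])                    # initial state probabilities
    A = np.array([[0.7, 0.3], [0.4, 0.6]])       # transitions A[i, j]
    B = np.array([[0.9, 0.1], [0.2, 0.8]])       # emissions B[state, obs]
    obs = [0, 0, 1, 1, 1]

    # Viterbi: delta[t, j] = best log-prob of any state path ending in j at time t.
    T, S = len(obs), len(pi)
    delta = np.zeros((T, S))
    back = np.zeros((T, S), dtype=int)
    delta[0] = np.log(pi) + np.log(B[:, obs[0]])
    for t in range(1, T):
        scores = delta[t - 1][:, None] + np.log(A)   # scores[i, j]: come from i, land in j
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + np.log(B[:, obs[t]])

    # Backtrace the best state sequence.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    print(path[::-1])  # [0, 0, 1, 1, 1]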

W 11/14

Structured Prediction 2

Recurrent Neural Networks

F 11/16

TBD

M 11/26

Dimensionality Reduction

PCA

M12.2
B12.1, B12.2, B12.3
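
A short sketch of PCA via the SVD of the centered data matrix (toy data):

    import numpy as np

    rng = np.random.default_rng(0)
    # Toy data stretched along the first coordinate.
    X = rng.standard_normal((200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

    # Center, then take the top right singular vectors as principal axes.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt          # rows are principal directions
    scores = Xc @ Vt.T       # projections onto the principal axes
    print(components[0])     # roughly (±1, 0): the stretched direction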

W 11/28

Fairness, Accountability, Transparency and Ethics of ML

F 11/30

Recitation: Graphical Models

M 12/3

Practical Machine Learning

W 12/5

Final Review

F 12/7

TBD

Th 12/13, 9am-12pm

Final Exam