What is machine learning?

*"Machine learning is a subfield of artificial intelligence (AI) concerned with algorithms that allow computers to learn. What this means in most cases, is that an algorithm is given a set of data and infers information about the properties of the data--and that information allows it to make predictions about other data it might see in the future. This is possible because almost all nonrandom data contains patterns, and these patterns allow the machine to generalize. In order to generalize, it trains a model with what it determines are the important aspects of the data."*

--Toby Segaran, Programming Collective Intelligence

A more concise definition:

*Machine learning allows computers to observe input and produce a desired output, either by example or through identifying latent patterns in the input.*

This course takes an application-driven approach to current topics in machine learning. The course covers supervised learning (classification, structured prediction, regression) and unsupervised learning (dimensionality reduction, Bayesian modeling, clustering). The course will also consider challenges that arise in learning applications. We will cover popular algorithms (naive Bayes, SVM, perceptron, HMM, k-means, maximum entropy) and will focus on how statistical learning algorithms are applied to real-world applications. Students in the course will implement several learning algorithms and apply machine learning to an application as a final project.
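To make "learning from data" concrete, here is a toy illustration (not course material) using the perceptron, one of the algorithms listed above. All data and parameters are made up: the model learns the logical AND function from four labeled examples.

```python
def train_perceptron(samples, labels, epochs=10, lr=1.0):
    """Learn weights w and bias b so that sign(w.x + b) matches the labels."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Predict with the current weights, then nudge them toward the error.
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            error = y - pred
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy training set: logical AND outputs 1 only when both inputs are 1.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = train_perceptron(X, y)
print([predict(w, b, x) for x in X])  # -> [0, 0, 0, 1]
```

The model "generalizes" in Segaran's sense: it compresses the four examples into a weight vector and a bias, which is the trained model.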

Goals

The course has several goals:

- Students will learn the fundamentals of machine learning
- Students will learn to implement machine learning algorithms
- Students will learn to evaluate how to apply machine learning to different settings

Requirements

Students are expected to have:

- Strong programming skills in Python. There will be considerable programming required for the homeworks.
- Comfort with relevant mathematical topics (linear algebra, multivariate calculus, probability)

Grading

- **Homeworks**: 50%
- **Midterm**: 25%
- **Project**: 25%

Homework

Since the focus of the course is on practical applications of machine learning, the bulk of the final grade comes from homework. Homeworks consist of both written problems and programming projects. Homeworks are to be turned in electronically; instructions will be provided when homeworks are assigned. There will be about six homeworks during the semester.

Late Policy

Late homework assignments will be accepted up to 24 hours past the due date for a 25% reduction in grade. Exceptions will only be given in extreme cases. However, every student is permitted to hand in homeworks late, penalty free, using a 72-hour grace period for the entire semester. This means that you can choose to hand in the first homework 70 hours late and the second homework 2 hours late, but then every other homework must be on time for the rest of the semester. You may divide these 72 hours as you see fit, but once you have used up all of the time, you will be given no more. I will round up to the hour (minutes don't count).
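One reading of the policy above, sketched as code (the function name and the combination of grace hours with the 24-hour penalty window are my assumptions, not official rules):

```python
from math import ceil

GRACE_BUDGET = 72  # hours per semester, shared across all homeworks

def apply_late_policy(score, hours_late, grace_left):
    """Return (adjusted_score, remaining_grace). hours_late may be fractional."""
    hours = ceil(hours_late)        # minutes don't count: round up to the hour
    used = min(hours, grace_left)   # spend grace hours first
    remaining = hours - used
    if remaining == 0:
        return score, grace_left - used
    if remaining <= 24:
        return score * 0.75, grace_left - used  # 25% reduction
    return 0.0, grace_left - used               # not accepted past 24 hours

# The example from the text: 70 hours late, then 2 hours late, exhausts the budget.
s1, g = apply_late_policy(100, 70, GRACE_BUDGET)  # s1 = 100, g = 2
s2, g = apply_late_policy(100, 2, g)              # s2 = 100, g = 0
s3, g = apply_late_policy(100, 1, g)              # no grace left: s3 = 75.0
```

When the stated rules are ambiguous (for example, whether grace hours and the penalized 24-hour window can be combined on one assignment), ask the instructor rather than relying on this sketch.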

Final Project

A significant part of the final grade will be based on a project. Projects test your knowledge of machine learning by going beyond what is covered in the course. There are two options for projects:

- **Machine Learning system**: Students must implement a machine learning solution for an application of interest. This requires demonstrating knowledge learned in the class, not a black-box application of a machine learning software package. A writeup describing the project is required. Projects of this type must be done by teams of 2 students.
- **Survey**: A tutorial that surveys the state of the art in a particular area.

There are three project reports.

- **Proposal**: A 1-page description of the proposed project. This will be due halfway through the course.
- **Progress report**: A brief (1 paragraph) update on the progress of your project.
- **Final report**: The final writeup of the project. Length depends on the chosen project type.

Textbook

In previous years, we used the Bishop book. Since this book first appeared there has been an explosion in new machine learning books. Many are focused on specific methods or applications.

**The official book for this class will be:** Kevin Murphy, Machine Learning: A Probabilistic Perspective. There is currently only one edition of the book, but there are multiple printings. As of September 2013, the latest printing is the fourth printing. New printings fix many errors found in earlier printings. Page numbering can differ between printings, but section numbers (which we will be using) are the same. Online errata can be found here.

Many machine learning courses select readings from multiple sources without providing a single official book. I prefer to select a single book since it presents every topic using the same style, approach and notation. The consistency in presentation across multiple topics aids learning and provides a resource for exploring topics in more depth.

In addition to the official book, I will provide other readings on specific topics that offer a different (and hopefully better) presentation of the material. These readings will be available freely online.

Switching textbooks from Bishop to Murphy comes at a cost. I selected Murphy because it is more up to date, covers more topics, and uses notation common in the machine learning community. However, Bishop has its own advantages; I think its presentation of some topics is superior to Murphy's. Also, since the class was originally designed to follow Bishop, the in-class presentation of some topics will follow Bishop's ordering rather than Murphy's. The reading in Murphy will jump around between chapters. If you encounter a topic in the reading with which you are not familiar, you may want to go back to an earlier chapter in the book where the topic is first explained.

Overall, I think the change to Murphy will be an improvement. However, for those who prefer to use Bishop, I will list the corresponding readings in Bishop as optional.

A few other relevant textbooks:

- Trevor Hastie, Robert Tibshirani, Jerome Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction. 2009. This book covers a large amount of material and is ideal for students with some prior experience with statistics.
- Tom Mitchell. Machine Learning. 1997. This book used to be the standard for machine learning, but is out of date. It covers in depth some topics that more recent books overlook, and its presentation is excellent.
- Ethem Alpaydin. Introduction to Machine Learning. 2004. Comparable in quality and coverage to the Bishop book, but not often used. A nice supplement for difficult topics.

Cheating

I take cheating very seriously. I expect every student to have read the Department of Computer Science Academic Integrity Code and will hold students accountable to it. So that course policies are clear, here is a review of the relevant rules (in addition to the integrity code).

- Every exam, project, homework and any other work completed during this course must be entirely your own. Copying any material from other students or the web is expressly prohibited.
- All exams are closed book unless otherwise stated. This means that students may not reference any material during an exam that is not provided as part of the exam.
- Any collaboration between students during an exam will be considered cheating.
- If a student copies your work, even without your knowledge, you are cheating. It is your responsibility to ensure that no one has access to your work.
- There is no statute of limitations on punishing cheating. Even if I find on the last day of the semester that you cheated on a homework, you will be punished.
- Talking with other students to understand homework and course material is strongly encouraged. However, discussing an assignment and cheating are very different things. If you copy someone else's work, you are cheating. If you let someone copy your work, you are cheating. If someone tells you the answer, you are cheating. Everything you hand in must be in your own words, based on your understanding of the solution.

**Cheating**

- Copying any part of a homework from someone else.
- Verbally telling someone the answer to a homework question.
- Looking at someone else's code or solution.
- Obtaining any part of your solution or code from any online resource or software library.

**Not-Cheating**

- Explaining the homework question to someone else.
- Discussing the homework at a high level.
- Helping someone think through a problem.
- Directing someone to a section of the textbook, reading, or online resource that helps explain a concept.

I am aware that many of the programming assignments will ask students to implement algorithms already available online. I will try to avoid direct duplication when possible. However, you are not permitted to copy any part of your code from other libraries.

What happens when you cheat?

I will be carefully examining homeworks and exams for signs of cheating. If you cheat, at a minimum you will be given a 0 for the assignment or exam. More likely, you will have the total value of the homework or exam *subtracted* from your grade, i.e., if you cheat on an exam worth 15% of your grade, you will get a 0 on the exam and have an additional 15% of your grade deducted. In some cases, cheating will be reported to the appropriate university board, which can result in failing the class, suspension, or expulsion.

Remember:

**DO** help each other understand the lectures, readings and homeworks.

**DO NOT** complete each other's homework.