Course Description

This course covers the theoretical foundations of modern machine learning. We will analyze the inherent difficulty of learning problems and study the theoretical guarantees one can prove about a learning algorithm's ability to generalize. To accomplish this, we will rely on tools from probability, statistics, game theory, and convex analysis. Most importantly, we will see how a solid theoretical foundation can help us design better learning algorithms. Topics include empirical risk minimization (ERM) and sample complexity (VC theory, Rademacher complexity), PAC-Bayes analysis, algorithmic stability, online regret minimization, and online-to-batch conversion techniques, with applications to support vector machines, boosting, and online learning.
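To give a sense of the flavor of guarantees studied in the course, here is a representative VC-type generalization bound (an illustrative example only; the exact constants vary across textbooks): for a hypothesis class $\mathcal{H}$ of VC dimension $d$ and an i.i.d. sample of size $n$, with probability at least $1-\delta$, every $h \in \mathcal{H}$ satisfies

\[
R(h) \;\le\; \widehat{R}_n(h) + \sqrt{\frac{8\left(d\ln\frac{2en}{d} + \ln\frac{4}{\delta}\right)}{n}},
\]

where $R(h)$ denotes the true risk of $h$ and $\widehat{R}_n(h)$ its empirical risk on the sample.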

Prerequisites

This course assumes a background in basic probability theory and linear algebra. Key mathematical concepts will be reviewed before they are used, but a certain level of mathematical maturity is expected. The course is theoretical and does not involve any programming.

Grading

The course can be taken either for a grade or for credit/no-credit. Grades will be based on homework assignments, lecture notes, and class participation.

Announcements

Mar 2, 2011

Homework exercise 3 posted.

Jan 27, 2011

Homework exercises 1 and 2 posted.

Jan 7, 2011

Please sign up to take lecture notes.

Jan 6, 2011

More information on the generalized definition of expectation can be found in the new references section of this website.