Syllabus for CPSC 531H: Machine Learning Theory

Prerequisites

·         A solid background in algorithm design is necessary.
CPSC 420 or CPSC 500 would be ideal. CPSC 320 might not be adequate.

·         A solid background in probability or randomized algorithms would be very helpful.
MATH 418, STAT 547C or CPSC 536N would be ideal. MATH 302 or STAT 302 might be adequate.

·         A solid grasp of linear algebra would be very helpful.
MATH 310 would be ideal. MATH 223 or MATH 307 might be adequate.

·         Some familiarity with complexity theory may be useful.
CPSC 421/501 would be useful background, but it is not required.

·         No background in machine learning is required.
CPSC 540 and CPSC 340 have little overlap with this course, but they may provide useful background.

·         No background in statistics is required.
STAT 447B has some overlap with this course, and it may provide useful background.

To determine whether your background is appropriate for this class, please look at the lecture notes here, here, here or here. This course will be quite similar. If you can follow those notes, then your background is a good fit for this course.


In exceptional circumstances, undergraduate students may also be admitted into the class.


Topics

Some topics we will cover include:

·         PAC learning

·         Sample complexity: VC dimension, Rademacher complexity

·         Boosting

·         Support vector machines, kernels

·         Online learning: Perceptron, Winnow, Weighted majority, Regression…

·         Spectral partitioning and clustering

·         Non-negative matrix factorization


Textbooks

The primary text is:

·         Mohri, Rostamizadeh, Talwalkar “Foundations of Machine Learning”

Other books relevant to portions of the course include:

·         Kearns, Vazirani “An Introduction to Computational Learning Theory”

·         Cesa-Bianchi, Lugosi “Prediction, Learning, and Games”

·         Schapire, Freund “Boosting: Foundations and Algorithms”

·         Devroye, Györfi, Lugosi “A Probabilistic Theory of Pattern Recognition”

·         Shalev-Shwartz “Online Learning and Online Convex Optimization”


Evaluation

Evaluation will be based on approximately four problem-solving assignments and a final project.