CPSC 532S: Modern Statistical Learning Theory

Instructor: Danica Sutherland - ICICS X563, dsuth@cs.ubc.ca.
Lecture info: Mondays/Wednesdays, 13:30 - 15:00, DMP 101.
Term: 2021-22 W2 (January – April 2022).
Expect a prettier, more informative page before the start of term.
The course is currently full, but feel free to sign up for the waiting list.

The brief idea of the course: when should we expect machine learning algorithms to work? What kinds of assumptions do we need to be able to rigorously prove that they will work?

Definitely covered: PAC learning, VC dimension, Rademacher complexity, concentration inequalities. Probably: PAC-Bayes, analysis of kernel methods, margin bounds, stability. Maybe: limitations of uniform convergence, analyzing deep nets via neural tangent kernels, provable gaps between kernel methods and deep learning, online learning, feasibility of private learning, compression-based bounds.
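
As a rough illustration of the flavour of result we'll prove (a standard VC-type generalization bound, stated informally here only as an example; it is not this course's exact notation): for a hypothesis class $\mathcal{H}$ with VC dimension $d$, with probability at least $1 - \delta$ over an i.i.d. training sample of size $n$, every $h \in \mathcal{H}$ satisfies
\[
  L(h) \;\le\; \hat{L}(h) + O\!\left(\sqrt{\frac{d \log(n/d) + \log(1/\delta)}{n}}\right),
\]
where $L(h)$ is the population risk and $\hat{L}(h)$ the empirical risk on the sample. Much of the course is about when, why, and with what tools bounds of this general shape can (or cannot) be established.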

There will be some overlap with CPSC 531H: Machine Learning Theory (Nick Harvey's course, last taught in 2018), but if you've taken that course, you'll still get something out of this one. We'll spend less time on optimization / online learning / bandits than that course did, and try to cover some more recent ideas used in contemporary deep learning theory.

(This course is completely unrelated to CPSC 532S: Multimodal Learning with Vision, Language, and Sound, from Leon Sigal.)

Prerequisites

There are no formal prerequisites. I will roughly assume:

If you have any specific questions about your background, feel free to ask.

Resources

We will of course cover less than the union of these resources, but will also include a few recent topics not found in any of them.