**Please fill out the pre-course survey if you're interested in this course (even if you're not registered).**

- Canvas and Piazza (easiest registration if you follow the link from Canvas, but you can sign up directly here) are now available.
- The course is fully registered, but please sign up for the waiting list, or email me if you're interested but can't get on the waiting list for whatever reason.

SSBD below refers to the book of Shalev-Shwartz and Ben-David; MRT to that of Mohri, Rostamizadeh, and Talwalkar.

| # | Day | Date | Topic / event | Reading |
|---|---|---|---|---|
| 1 | Mon | Jan 10 | Intro / overview | SSBD chap. 1-2; MRT chap. 2 |
| | Mon | Jan 10 | Assignment 1 posted (and .tex) | |
| 2 | Wed | Jan 12 | PAC | SSBD chap. 2-3; MRT chap. 2 |
| 3 | Mon | Jan 17 | Probability / uniform convergence / more? | SSBD chap. 4; MRT chap. 2 |
| 4 | Wed | Jan 19 | | |
| | Thu | Jan 20 | Assignment 1 due, 11:59pm | |
| | Fri | Jan 21 | Drop deadline | |
| 5 | Mon | Jan 24 | | |
| 6 | Wed | Jan 26 | | |
| 7 | Mon | Jan 31 | | |
| 8 | Wed | Feb 2 | | |
| | Mon | Feb 7 | Planned shift to hybrid mode (rather than online-only) 🤞 | |
| 9 | Mon | Feb 7 | | |
| 10 | Wed | Feb 9 | | |
| 11 | Mon | Feb 14 | | |
| 12 | Wed | Feb 16 | | |
| | Mon | Feb 21 | Midterm break | |
| | Wed | Feb 23 | Midterm break | |
| 13 | Mon | Feb 28 | | |
| 14 | Wed | Mar 2 | | |
| 15 | Mon | Mar 7 | | |
| 16 | Wed | Mar 9 | | |
| 17 | Mon | Mar 14 | | |
| 18 | Wed | Mar 16 | | |
| 19 | Mon | Mar 21 | | |
| 20 | Wed | Mar 23 | | |
| 21 | Mon | Mar 28 | | |
| 22 | Wed | Mar 30 | | |
| 23 | Mon | Apr 4 | | |
| 24 | Wed | Apr 6 | | |

The course will initially meet on Zoom: the meeting link is available on Canvas and Piazza. Starting ~~January 24th~~ February 7th, we will hopefully meet in person in DMP 101. I currently plan to both livestream and record lectures throughout the term, either via Zoom (same link) or Panopto (link will be provided if so). Plans here are subject to change.

Recordings are available from both Canvas and Piazza.

Grading scheme: 70% assignments (including a small project), 30% final.

The lowest assignment grade (not including the project) will be dropped. The exact relative weight of assignments and the project is TBD. Assignments should be done in LaTeX – not handwritten or in a word processor. Hand-in procedure will be announced before the first deadline.

There will be one “big assignment” which serves as a (small) project: something on the scale of doing some experiments to explore a paper, doing a lit review in a particular area, extending / unifying a few papers, etc. A proposal will be due beforehand; details to come.

The final exam may be take-home, synchronous online, or in-person; TBD.

There may also be some paper presentations later in the course, in which case the paper presenters will be able to use that to replace part of an assignment grade. This is dependent on the COVID situation and other factors; TBD.

The brief idea of the course: when should we expect machine learning algorithms to work? What kinds of assumptions do we need to be able to rigorously prove that they will work?

Definitely covered: PAC learning, VC dimension, Rademacher complexity, concentration inequalities. Probably: PAC-Bayes, analysis of kernel methods, margin bounds, stability. Maybe: limitations of uniform convergence, analyzing deep nets via neural tangent kernels, provable gaps between kernel methods and deep learning, online learning, feasibility of private learning, compression-based bounds.
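As a tiny taste of the concentration-inequality material, here's a minimal (unofficial, illustrative) sketch that empirically checks Hoeffding's inequality for Bernoulli(0.5) samples; the parameter choices (`n = 100`, `t = 0.1`, 20,000 trials) are just assumptions for the demo, not anything from the course materials.

```python
# Empirical sanity check of Hoeffding's inequality: for n i.i.d. samples
# in [0, 1] with mean mu,
#     Pr(|sample mean - mu| >= t) <= 2 * exp(-2 * n * t**2).
import math
import random

def empirical_deviation_prob(n, t, trials=20000, seed=0):
    """Estimate Pr(|sample mean - 0.5| >= t) for n Bernoulli(0.5) draws."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean_hat = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(mean_hat - 0.5) >= t:
            hits += 1
    return hits / trials

n, t = 100, 0.1                       # assumed demo parameters
bound = 2 * math.exp(-2 * n * t**2)   # Hoeffding bound, about 0.27 here
emp = empirical_deviation_prob(n, t)  # empirical deviation frequency
assert emp <= bound  # the bound holds (and is quite loose for this case)
```

The bound is far from tight here; part of what the course studies is when such worst-case bounds can or can't be improved.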

There will be some overlap with CPSC 531H: Machine Learning Theory (Nick Harvey's course, last taught in 2018), but if you've taken that course, you'll still get something out of this one. We'll cover less on optimization / online learning / bandits than that course did, and try to cover some more recent ideas used in contemporary deep learning theory.

(This course is unrelated to CPSC 532S: Multimodal Learning with Vision, Language, and Sound, from Leon Sigal.)

There are no formal prerequisites. I will roughly assume:

- Basic "mathematical maturity": familiarity with reading and writing proofs, recognizing a valid proof, etc. If you've taken a third-year math course or similar, you should be fine.
- Comfort with linear algebra, multivariable calculus, basic probability theory, and basic analysis of algorithms.
- Ideally, a basic understanding of machine learning, as in CPSC 340. If you don't have this, you should still be able to get by, but might have to do a little more reading on your own; I'll provide some resources.
- Ideally, familiarity with programming in a machine learning / statistical context, e.g. being comfortable with numpy and PyTorch/TensorFlow/etc. This course **will not require programming**, but there will be some assignment and project options that may be easier / more fruitful / more fun if you're comfortable with it.

Learning theory textbooks and surveys:

- Understanding Machine Learning: From Theory to Algorithms (Shai Shalev-Shwartz, Shai Ben-David; 2014) – a very readable (free) book that covers a significant portion of our material
- Foundations of Machine Learning (Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar; second edition 2018) – free textbook, also quite good
- Introduction to Statistical Learning Theory (Olivier Bousquet, Stéphane Boucheron, Gábor Lugosi; 2003) – 40-page survey of classics
- On the Mathematical Foundations of Learning (Felipe Cucker, Steve Smale; 2001) – 50-page survey of classics
- Deep learning theory lecture notes (Matus Telgarsky; ongoing updates) – overview including some quite modern stuff; we'll pull from here especially later in the course
- An Introduction to Computational Learning Theory (Michael Kearns, Umesh Vazirani; 1994; that link should get you a copy with a UBC login) – a book from a much more CS theory point of view. A nice complement to what we'll cover in this course.

If you need to refresh your linear algebra or other areas of math:

- Mathematics for Machine Learning (Marc Deisenroth, Aldo Faisal, Cheng Soon Ong; 2020) – a nice overview. Unfortunately their probability chapter doesn't take a measure-theoretic point of view (see below), but if you don't already know the material in it, you probably should learn it.

Resources on learning measure-theoretic probability (*not* required to know this stuff in detail, but you might find it helpful):

- A Measure Theory Tutorial (Measure Theory for Dummies) (Maya Gupta) – 5 pages, just the basics
- Measure Theory, 2010 (Greg Hjorth) – 110 pages but comes recommended as both thorough and readable
- A Probability Path (Sidney Resnick) – frequently recommended textbook for non-mathematicians who want to learn the subject in detail, but note it's at a full-semester-textbook scale of detail; available if you log in via UBC
- There are also lots of other places, of course; e.g. the probability textbooks by Billingsley, Klenke, and Williams are (I think) classics.

Similar courses:

- UBC CPSC 531H: Machine Learning Theory (Nick Harvey)
- MIT 9.520: Statistical Learning Theory and Applications (Tomaso Poggio, Lorenzo Rosasco, Alexander Rakhlin, Andrzej Banburski)
- TTIC 31120: Computational and Statistical Learning Theory (Nati Srebro)
- CMU 10-806: Foundations of Machine Learning and Data Science (Nina Balcan, Avrim Blum)
- UIUC ECE 598MR: Statistical Learning Theory (Maxim Raginsky)