Previously offered in 2021W2 (under the name CPSC 532S); this instance will be broadly similar.

Italicized entries are tentative. The books SSBD, MRT, and Tel are described further below.

| Day | Date | Topic | Material | Supplements |
|---|---|---|---|---|
| W | Sep 7 | Course intro; start finite hypothesis classes | intro slides | SSBD 1-2; MRT 2 |
| Th | Sep 8 | Assignment 1 posted: pdf, tex | | |
| M | Sep 12 | Class canceled (sick) | | |
| W | Sep 14 | PAC learning: definitions, finite hypothesis classes | slides | SSBD 2-4; MRT 2 |
| M | Sep 19 | No class: National Day of Mourning | | |
| T | Sep 20 | Assignment 1 due at noon – solutions | | |
| T | Sep 20 | Drop deadline | | |
| W | Sep 21 | Uniform convergence, concentration inequalities | | SSBD 4, B; MRT 2, D; Tel 12; Wainwright 2 |
| Sa | Sep 24 | Assignment 2 posted: pdf, tex | | |
| M | Sep 26 | More uniform convergence: Rademacher complexity | | MRT 3; SSBD 26; Tel 13 |
| W | Sep 28 | More on Rademacher complexity | | |
| M | Oct 3 | VC dimension [guest lecturer: Nick Harvey] | | |
| W | Oct 5 | The fundamental theorem of statistical learning [guest lecturer: Nick Harvey] | | |
| M | Oct 10 | No class: Thanksgiving | | |
| W | Oct 12 | Assignment 2 due at noon | | |
| W | Oct 12 | | | |
| M | Oct 17 | | | |
| W | Oct 19 | | | |
| M | Oct 24 | | | |
| W | Oct 26 | | | |
| F | Oct 28 | Withdrawal deadline | | |
| M | Oct 31 | | | |
| W | Nov 2 | | | |
| M | Nov 7 | | | |
| W | Nov 9 | No class: midterm break | | |
| M | Nov 14 | | | |
| W | Nov 16 | | | |
| M | Nov 21 | | | |
| W | Nov 23 | | | |
| M | Nov 28 | | | |
| W | Nov 30 | | | |
| M | Dec 5 | | | |
| W | Dec 7 | | | |
| W | Dec 14 | Final exam (in person in ICCS 246, handwritten), 14:00-16:30 | | |

The course meets in person in Orchard Commons 3004. There are **no lecture recordings** due to limitations of the room.

Grading scheme: 70% assignments, 30% final.

The lowest assignment grade (not including the project) will be dropped. Assignments should be done in LaTeX – not handwritten or in a word processor – and handed in on Gradescope; this will be described in more detail soon.

One assignment, late in the term, will be a "mini-project" that will involve reading a paper and then running a small exploratory experiment, exploring its assumptions in detail, etc. Suggested papers and details to come later.

The brief idea of the course: when should we expect machine learning algorithms to work? What kinds of assumptions do we need to rigorously prove that they will work?

Definitely covered: PAC learning, VC dimension, Rademacher complexity, concentration inequalities. Probably: PAC-Bayes, analysis of kernel methods, margin bounds, stability. Maybe: limitations of uniform convergence, analyzing deep nets via neural tangent kernels, provable gaps between kernel methods and deep learning, online learning, feasibility of private learning, compression-based bounds.
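As a small taste of the concentration inequalities mentioned above, here is an illustrative (not rigorous) numerical sanity check of Hoeffding's inequality; the specific parameters and the stdlib-only simulation are just for illustration:

```python
import math
import random

# Hoeffding's inequality: for i.i.d. X_1, ..., X_n taking values in [0, 1]
# with mean mu,  P(|mean_n - mu| >= t) <= 2 * exp(-2 * n * t^2).
# We check the bound empirically for Bernoulli(1/2) samples.

random.seed(0)
n, t, trials = 1000, 0.05, 5000

bad = 0
for _ in range(trials):
    # empirical mean of n fair coin flips
    mean = sum(random.random() < 0.5 for _ in range(n)) / n
    if abs(mean - 0.5) >= t:
        bad += 1

empirical = bad / trials
bound = 2 * math.exp(-2 * n * t * t)  # = 2 * exp(-5), about 0.0135
print(f"empirical deviation probability: {empirical:.4f}  Hoeffding bound: {bound:.4f}")
```

The empirical deviation probability comes out well under the bound, as it must; much of the course is about turning one-sample bounds like this into uniform bounds over a whole hypothesis class.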

There are no formal prerequisites. I will roughly assume:

- Basic "mathematical maturity": familiarity with reading and writing proofs, recognizing a valid proof, etc. If you've taken a third-year math course or similar, you should be fine.
- Comfort with linear algebra, multivariable calculus, basic probability theory, and basic analysis of algorithms.
- Ideally, a basic understanding of machine learning, as in CPSC 340. If you don't have this, you should still be able to get by, but might have to do a little more reading on your own.
- **Basic** mathematical programming ability in any language: being able to plot functions, etc.
- Ideally, familiarity with programming in a machine learning / statistical context, e.g. being comfortable with numpy and PyTorch/TensorFlow/etc. This is **not required**, but there will be some assignment and project options that may be easier / more fruitful / more fun if you're comfortable with it.
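For calibration, "basic mathematical programming ability" means roughly this level: pick a function, evaluate it on a grid, and plot it. The function exp(-x^2) and the crude stdlib-only text plot below are just illustrative; in practice you'd use numpy and matplotlib.

```python
import math

# Evaluate exp(-x^2) on a grid and print a rough text "plot".
# (Normally: matplotlib's plt.plot(xs, ys); this version avoids dependencies.)

xs = [i / 4 for i in range(-8, 9)]     # grid of 17 points on [-2, 2]
ys = [math.exp(-x * x) for x in xs]    # the Gaussian bump exp(-x^2)

for x, y in zip(xs, ys):
    print(f"{x:5.2f} | " + "#" * round(40 * y))
```

If writing something like this (in any language, with any plotting library) is comfortable for you, you're at the level assumed.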

Books where we'll use sections in some detail:

- **[SSBD]** Understanding Machine Learning: From Theory to Algorithms (Shai Shalev-Shwartz, Shai Ben-David; 2014) – a very readable (free) book that covers a significant portion of our material
- **[MRT]** Foundations of Machine Learning (Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar; second edition 2018) – free textbook, a little more "graduate-oriented", that covers certain topics (especially Rademacher complexity) in more depth
- **[Tel]** Deep learning theory lecture notes (Matus Telgarsky; ongoing updates) – overview including some quite modern material; we'll pull from here especially later in the course

Some other points of view you might like:

- Introduction to Statistical Learning Theory (Olivier Bousquet, Stéphane Boucheron, Gábor Lugosi; 2003) – 40-page survey of classics
- On the Mathematical Foundations of Learning (Felipe Cucker, Steve Smale; 2001) – 50-page survey of classics
- An Introduction to Computational Learning Theory (Michael Kearns, Umesh Vazirani; 1994; that link should get you a copy with a UBC login) – a book from a much more CS theory point of view. A nice complement to what we'll cover in this course.

If you need to refresh your linear algebra or other areas of math:

- Mathematics for Machine Learning (Marc Deisenroth, Aldo Faisal, Cheng Soon Ong; 2020) – a nice overview of basics.

Measure-theoretic probability is *not* required for this course, but it can be helpful in places, and comes up in related areas:

- A Measure Theory Tutorial (Measure Theory for Dummies) (Maya Gupta) – 5 pages, just the basics
- Measure Theory, 2010 (Greg Hjorth) – 110 pages but comes recommended as both thorough and readable
- A Probability Path (Sidney Resnick) – a frequently recommended textbook for non-mathematicians who want to learn the subject in detail, though it's a full semester's worth of material; available if you log in via UBC
- There are also lots of other places, of course; e.g. the probability textbooks by Billingsley, Klenke, and Williams are (I think) classics.

Similar courses:

- UBC CPSC 531H: Machine Learning Theory (Nick Harvey, not taught for a while)
- MIT 9.520: Statistical Learning Theory and Applications (Tomaso Poggio, Lorenzo Rosasco, Alexander Rakhlin, Andrzej Banburski)
- TTIC 31120: Computational and Statistical Learning Theory (Nati Srebro)
- CMU 10-806: Foundations of Machine Learning and Data Science (Nina Balcan, Avrim Blum)
- UIUC ECE 598MR: Statistical Learning Theory (Maxim Raginsky)