80 Lectures on Machine Learning

This is a collection of course material from various courses that I've taught on machine learning at UBC, including material from over 80 lectures covering a large number of topics related to machine learning. The notation is fairly consistent across the topics, which makes it easier to see relationships, and the topics are meant to be covered in order (with the difficulty slowly increasing and concepts being defined at their first occurrence). I'm putting this in one place in case people find it useful for educational purposes.

Part 1: Computer Science 340

The first set of notes is mainly from the September-December 2017 version of CPSC 340, an undergraduate-level course on machine learning and data mining. Related readings and assignments are available from the course homepage. Where relevant, I've also included some lectures from previous terms in which I covered different topics. Major changes since the 2016 version of the course include many improvements made by Mike Gelbart when he taught the course, and an update to the slides to use a different colour for more-advanced or tangential "bonus material".

Although my own lectures for this course were never recorded, videos of the lectures from the Winter 2018 section taught by Mike Gelbart are available here (the material is largely the same).

1. Supervised Learning

2. Unsupervised Learning

3. Linear Models

4. Latent-Factor Models

5. Neural Networks

Part 2: Data Science 573 and 575

The second set of notes is from courses I've taught in UBC's Master of Data Science (MDS) program in 2017 and 2018, which could naturally follow the topics above.

Part 3: Computer Science 540

The third set of notes is from the January-April 2018 offering of CPSC 540, a graduate-level course on machine learning. Related readings and assignments are available from the course homepage. This course is intended as a continuation of CPSC 340, and the notation in this course is almost the same, except that we switch to using superscripts to refer to training examples (so that subscripts can refer to individual variables).
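For illustration (this example is mine, not taken verbatim from the notes), under this convention the data matrix and its entries might be written as

\[
X = \begin{bmatrix} (x^1)^T \\ (x^2)^T \\ \vdots \\ (x^n)^T \end{bmatrix},
\qquad
x^i_j = \text{value of variable } j \text{ in training example } i,
\]

so that $x^i$ refers to training example $i$ while the subscript $j$ picks out an individual variable.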

Videos covering the first month of material in the 2016 offering are available here. Note that the material has been substantially improved since then.

A. Fundamentals

B. Large-Scale Machine Learning

C. Density Estimation

D. Graphical Models

E. Discriminative Models

F. Bayesian Learning

Part 4: Machine Learning Reading Group

The final set of notes covers topics that I have not taught in a formal course, but for which I've given overviews in our machine learning reading group.
