All of Machine Learning*

*This is not a comprehensive reference on all of machine learning. It's just a bunch of lecture slides from various courses that I've taught at UBC, that I'm putting in one place in case people find it useful for educational purposes. However, this material does cover a huge number of topics related to machine learning. The "All of" name is inspired by Larry Wasserman's excellent book All of Statistics.

The notation is fairly consistent across the topics, which makes it easier to see relationships between them. The topics are meant to be covered in order, with the difficulty slowly increasing and concepts being defined at their first occurrence.

The first set of notes is mainly from CPSC 340, an undergraduate-level course on machine learning and data mining. Related readings and assignments are available from the course homepage.

1. Supervised Learning

2. Unsupervised Learning

3. Linear Models

4. Latent-Factor Models

5. Neural Networks

6. Sequences and Graphs

The second set of notes is from CPSC 540, a graduate-level course on machine learning. Related readings and assignments are available from the course homepage. The notation in this course is almost the same, except that we switch to using superscripts to refer to training examples (so that subscripts can refer to individual variables).

A. Large-Scale Machine Learning

B. Density Estimation

C. Probabilistic Graphical Models

D. More Neural Networks

E. Bayesian Statistics

F. Even-More Neural Networks

Some large/notable topics are missing from both courses. I will likely cover these topics in future years, and some of them have been covered in our machine learning reading group.
