Previous offerings: 24w2, 23w2, 22w2 by me, or 21w2, 20w2 by Mark Schmidt. This time will be broadly similar to last year's.
Italicized entries are tentative; in particular, the timing and even number of assignments might change. Textbook acronyms are explained below.
| Day | Date | Topic/slides | Supplements |
|---|---|---|---|
| M | Jan 5 | Syllabus; Binary density estimation | ML vs. Stats; 3 Cultures of ML; Math for ML; Essence of Linear Algebra; PML1 2.1-2.4 |
| W | Jan 7 | Bernoulli MLE and MAP | PML1 4.5, 4.6.2 |
| W | Jan 7 | Assignment 1 released | |
| M | Jan 12 | Multivariate models; generative classifiers | PML1 9.3 |
| W | Jan 14 | Categorical data; discriminative models | PML1 2.5, 9.4, 10.2, 13.2 |
| F | Jan 16 | Assignment 1 due at 5pm | |
| W-Sa | Jan 14-17 | Quiz 1 | |
| F | Jan 16 | Add/drop deadline | |
| M | Jan 19 | Discriminative models and deep learning | PML1 13, 14 |
| W | Jan 21 | Gaussians and Bayesian learning | PML1 2.6, 4.6.7 |
| M | Jan 26 | Multivariate Gaussians | PML1 3.2 |
| W | Jan 28 | Learning with Gaussians; start empirical Bayes | PML1 3.3, 11.7 |
| W-Sa | Jan 28-31 | Quiz 2 | |
| M | Feb 2 | Finish empirical Bayes; exponential families | PML2 3.7; PML2 2.4 |
| W | Feb 4 | Mixtures and EM | PML1 8.7.2 / PML2 6.5; PML2 16.3 |
| M | Feb 9 | Finish mixtures/EM | |
| W | Feb 11 | Monte Carlo, Laplace approximation | PML2 11 |
| W-Sa | Feb 11-14 | Quiz 3 | |
| F | Feb 13 | Project proposal guidelines released | |
| M | Feb 16 | No class: Family Day + midterm break | |
| W | Feb 18 | No class: midterm break | |
| M | Feb 23 | Variational inference, VAEs | PML2 10.1-10.2, 21.1-21.2; PML1 20.3 |
| W | Feb 25 | Transposed convolutions, representation learning | PML1 14.4; PML2 32 |
| M | Mar 2 | Markov chains | PML2 2.6 |
| W | Mar 4 | Message passing; start MCMC | PML2 9.2; 12.1-12.2 |
| W-Sa | Mar 4-7 | Quiz 4 | |
| F | Mar 6 | Withdrawal deadline | |
| M | Mar 9 | Finish MCMC; directed graphical models | PML2 4.2, bonus material on PML2 9 |
| W | Mar 11 | Finish directed models; start undirected graphical models | PML2 4.3-4.4; bonus on PML2 9, 28.5 |
| M | Mar 16 | Finish undirected graphical models; deep sequence models: RNNs | PML1 15.2 |
| W | Mar 18 | seq2seq and LSTMs | PML1 15.2 |
| W-Sa | Mar 18-21 | Quiz 5 | |
| M | Mar 23 | | |
| W | Mar 25 | Attention and Transformers | PML1 15.4-15.7; PML2 16.2.7, 16.3.5 |
| Su | Mar 29 | Project proposal due at 11:59pm | |
| M | Mar 30 | More representation learning | |
| W | Apr 1 | | |
| W-Sa | Apr 1-4 | Quiz 6 | |
| M | Apr 6 | No class: Easter Monday | |
| W | Apr 8 | LLMs to chatbots; diffusion models; fairness | |
| ?? | Apr ?? | Final exam (in person, handwritten) at time/place TBA | |
| Sa | Apr 25 | Final project due at 11:59pm | |
This course is intended as a second or third university-level course on machine learning, a field that focuses on using automated data analysis for tasks like pattern recognition and prediction. It continues CPSC 340 (also called 540, or previously 532M) and assumes a strong background in math and computer science. Topics will (roughly) include deep learning, generative models, latent-variable models, Markov models, probabilistic graphical models, and Bayesian methods.
The course meets in person in CHEM B250. I plan to release recordings, but encourage you to come to class in person if you can.
Grading scheme: see the syllabus slides for details.
Starting in the second week of classes, we'll have weekly tutorials run by the TAs. These will cover things like walking through provided assignment code, reviewing background material, reviewing big concepts, and/or doing exercises. You can register for a particular tutorial section if you want to save a seat at a particular time, but registration is not required.
CPSC 340/540 vs. CPSC 440/550: CPSC 340 and 440 are roughly structured as one full-year course. CPSC 340 (which is sometimes cross-listed as CPSC 540 for graduate students; formerly 532M) covers more data mining methods and the methods most widely used in applications of machine learning. CPSC 440 (cross-listed as CPSC 550 for graduate students) focuses on probabilistic methods, which appear in more niche applications, as well as various other topics not covered in 340/540. It is strongly recommended that you take CPSC 340/540 first, as it covers the most fundamental ideas as well as the most common and practically useful techniques. In 440/550 it will be assumed that you are basically familiar with all the material in the current offering of CPSC 340/540. Note that online machine learning courses and courses from many other universities may not be an adequate replacement for CPSC 340; they typically have more overlap with our applied machine learning course, CPSC 330. If you're not sure, look at a recent 340 website and see if it all seems familiar.
Undergraduate students will not be able to take the class without these prerequisites. Graduate students may be asked to show how they satisfy prerequisites.
Textbook: There is no textbook for the course, but the textbook with the most extensive coverage of many of the course's topics is Kevin Murphy's Probabilistic Machine Learning series. While the one-volume 2012 version covers most of the material, we'll refer to the very recent two-volume version (2022/2023), PML1 and PML2, both of which have free Creative Commons draft pdfs through those links. I'll try to refer to the relevant sections of both versions as we go, as well as links to various other free online resources.
If you need to refresh your linear algebra or other areas of math, check out Mathematics for Machine Learning (Marc Deisenroth, Aldo Faisal, Cheng Soon Ong; 2020).
Related courses: Besides CPSC 340, there are several 500-level graduate courses in CPSC and STAT that are relevant: check out the graduate courses taught by people on the ML@UBC page and the MILD list. CPSC 422/425/436N, DSCI 430, EECE 360/592, EOSC 510/550, and STAT 305/306/406/460/461 are also all relevant.
Some related courses that have online notes are: