Tutorials: Mondays (5-6, Hugh Dempster Pavilion 110, beginning January 8)
Instructor Office Hours: Tuesdays (3-4, ICICS 146, beginning January 9)
TA Office Hours: Wednesdays (2-3, DLC Table 6)
Instructor: Mark Schmidt.
Teaching Assistants: Reza Babanezhad, Raunak Kumar, Alireza Shafaei
Synopsis: This is a graduate-level course on machine learning, a field that focuses on using automated data analysis for tasks like pattern recognition and prediction. The course will move quickly, and it assumes a strong background in math and computer science as well as previous experience with statistics and/or machine learning. The class is intended as a continuation of CPSC 340, and it is strongly recommended that you take CPSC 340 before enrolling in CPSC 540. Topics will (roughly) include large-scale machine learning, density estimation, probabilistic graphical models, deep learning, and Bayesian statistics.
Registration and Prerequisites: Graduate and undergraduate students from any department are welcome to take the class, provided that they satisfy the prerequisites. However, you can only register automatically if you are enrolled as a graduate student in CPSC, EECE, or STAT. If you are a graduate student from a different department (or are an undergraduate student satisfying these requirements), you can register by following the instructions here and submitting the prerequisites form here. Graduate students in CPSC/EECE/STAT also need to submit the prerequisites form before the add/drop deadline to stay enrolled.
CPSC 340 vs. CPSC 540: CPSC 340 and CPSC 540 are roughly structured as one full-year course. CPSC 340 covers more data mining methods and the methods that are most widely-used in applications of machine learning while CPSC 540 covers more research-level machine learning methods and theory. It is strongly recommended that you take CPSC 340 first, as it covers the most common and practically-useful techniques. If CPSC 340 is full, you should still sign up for the CPSC 340 waiting list (not CPSC 540) as we may expand the class size: taking CPSC 540 because CPSC 340 is full is a terrible idea. In 540 it will be assumed that you are familiar with the material in the current offering of CPSC 340, and note that the Coursera machine learning course is not an adequate replacement for CPSC 340.
CPSC 540 requires a stronger computer science and math background and will require substantially more work (including proofs and implementing methods from scratch). Note that CPSC grad students typically only take 1-3 courses per term, compared to 3-6 for undergraduate students, so you should expect the workload to be up to 3 times higher than in typical courses. If you want an introduction to machine learning, do not have a strong computer science and math background, or are mainly interested in applying machine learning in your research, then CPSC 340 is the right course to take. You can always decide to take (or audit) CPSC 540 later.
Auditing: Rather than registering as a student, an alternate option is to register as an auditor. This is a good option for students who may be missing some of the prerequisites, or who don't have enough time to do all the course requirements but still want exposure to the material. For graduate students, the form for auditing the course is available here. For undergraduates, you need to fill out the form here and indicate on the course information section that you wish to "audit". I will describe the auditing requirements and sign these forms on the first day of class.
Textbook: There is no textbook for the course, but the textbook with the most extensive coverage of many of the course's topics is Kevin Murphy's Machine Learning: A Probabilistic Perspective (MLAPP). This book can be purchased from Amazon, is on reserve in the CS Reading Room (ICCS 262), and can be accessed through the library here. Optional readings will be given out of this textbook, in addition to other free online resources.
Grading: Assignments 40%, Final 30%, Project 30%.
Piazza will be used for course-related questions.
| Date | Lecture Slides | Related Readings and Links | Homework |
|---|---|---|---|
| Wed Jan 3 | Syllabus | MLAPP 1.1-1.2, 1.4, 6.5, 7.1-3, 7.5, 8.1-3, ML vs. Stats (2001, 2015), 3 Cultures of ML, Essence of Linear Algebra | |
| Fri Jan 5 | Fundamentals of Learning | MLAPP 1.4, 6.5, Probability Primer | |
| Mon Jan 8 | Convex Optimization | MLAPP 7.4, 14.5, BV 2.1-2.3, 3.1-3.2, Taylor Polynomial | |
| Wed Jan 10 | Gradient Descent Convergence | BV 9.1-3 | |
| Fri Jan 12 | Rates of Convergence | PL Inequality | Assignment 1 due |
| Mon Jan 15 | Subgradients | MLAPP 13.3 | |
| Wed Jan 17 | Proximal Gradient | MLAPP 13.5, Proximal-Gradient | Assignment 2 |
| Fri Jan 19 | Structured Regularization | Structured Sparsity | |
| Mon Jan 22 | Coordinate Optimization | BV 9.4, Coordinate Descent | |
| Wed Jan 24 | Stochastic Subgradient | MLAPP 8.5, 13.4 | |
| Fri Jan 26 | Stochastic Average Gradient | SAG | |
| Mon Jan 29 | Kernel Methods | MLAPP 14.1-5 | |
| Wed Jan 31 | Structured Prediction | | Assignment 2 due |
| Fri Feb 2 | Density Estimation | MLAPP 2.3-5, Covariance Matrix | Assignment 3 |
| Mon Feb 5 | Mixture Models | MLAPP 4.1-3, 11.1-2, Properties of Gaussians | |
| Wed Feb 7 | More Mixture Models | | |
| Fri Feb 9 | Expectation Maximization | MLAPP 11.3-4, 11.6 | |
| Wed Feb 14 | Kernel Density Estimation | MLAPP 14.7 | |
| Fri Feb 16 | Markov Chains | MLAPP 17.1-2 | |
| Wed Feb 21 | Reading Week | | Assignment 3 due |
| Mon Feb 26 | Monte Carlo Methods | MLAPP 23.1-2 | Assignment 4 |
| Wed Feb 28 | Message Passing | MLAPP 17.4 | |
| Fri Mar 2 | DAG Models | MLAPP 10.1-2, 10.5 | |
| Mon Mar 5 | More DAGs | MLAPP 10.3-4 | |
| Wed Mar 7 | Undirected Graphical Models | MLAPP 19.1-4, 26.1-4 | |
| Fri Mar 9 | Approximate Inference | MLAPP 20.1-4, 24.1-2 | |
| Mon Mar 12 | More Approximate Inference | MLAPP 19.5 | |
| Wed Mar 14 | Guest Lecture: Frank Wood | | Assignment 4 due |
| Fri Mar 16 | Latent Graphical Models | MLAPP 17.3-5, 18.1-4, 27.7, 28.1-2 | |
| Mon Mar 19 | Conditional Random Fields | MLAPP 19.6 | Assignment 5 |
| Wed Mar 21 | Fully-Convolutional Networks | | |
| Fri Mar 23 | Recurrent Neural Networks | | |
Notes: These are additional notes mentioned in the lectures and homework assignments:
Some related courses that have online notes are: