Lectures

After each lecture, you can download the video or watch it on YouTube, where it is listed as Undergraduate Machine Learning.

Wed Sep 05. Introduction.

Fri Sep 07. Introduction.

Mon Sep 10. Probability.

Wed Sep 12. Bayes rule.

Fri Sep 14. Bayes rule and maximum expected utility.

Mon Sep 17. Introduction to graphical models and inference.

Wed Sep 19. Inference.

Fri Sep 21. Inference.

Mon Sep 24. The HMM filtering algorithm.

Wed Sep 26. Bernoulli distributions and expectations.

Fri Sep 28. Information theory and maximum likelihood learning.

Mon Oct 01. Maximum likelihood (continued).

Wed Oct 03. Bayesian learning.

Fri Oct 05. Learning graphical models.

Mon Oct 08. Thanksgiving Day.

Wed Oct 10. Linear algebra review and Google search.

Fri Oct 12. The Singular Value Decomposition and image compression.

Mon Oct 15. Principal component analysis (PCA) and dimensionality reduction.

Wed Oct 17. Linear prediction.

Fri Oct 19. Linear prediction.

Mon Oct 22. Probabilistic approach to linear prediction.

Wed Oct 24. Ridge regression and regularization.

Fri Oct 26. Nonlinear regression with basis functions and cross-validation for model selection.

Mon Oct 29. Lasso and automatic variable selection.

Wed Oct 31. Revision.

Fri Nov 02. Midterm.

Mon Nov 05. Naive Bayes classifier.

Wed Nov 07. Naive Bayes classifier.

Fri Nov 09. Naive Bayes classifier.

Mon Nov 12. Remembrance Day.

Wed Nov 14. Optimization.

Fri Nov 16. Logistic regression.

Mon Nov 19. Neural networks.

Wed Nov 21. Neural networks.

Fri Nov 23. Deep learning.

Mon Nov 26. Decision trees.

Wed Nov 28. Random forests.

Fri Nov 30. Object detection and Kinect examples.

RECOMMENDED READING

  • My favourite book for this course is Artificial Intelligence: A Modern Approach by Stuart Russell and Peter Norvig. Chapter 14 covers probabilistic graphical models. Chapter 15 covers HMMs. Chapter 20 discusses maximum likelihood, the EM algorithm, learning the parameters of graphical models, and naive Bayes. Chapter 18 teaches decision trees, linear regression, regularization, neural networks and ensemble learning.
  • The Elements of Statistical Learning by Hastie, Tibshirani and Friedman is much more advanced, but it is also a great resource and is free online.
  • For graphical models and Beta-Bernoulli models, I recommend A Tutorial on Learning with Bayesian Networks by David Heckerman.
  • Kevin Murphy has compiled a nice page about Bayesian learning.
  • Wikipedia's article on the SVD.
  • The following handout should help you with linear algebra.
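As a small taste of the Oct 12 lecture topic, the rank-k SVD approximation behind image compression can be sketched in a few lines of NumPy. This is only an illustration, not course material: the matrix A below is synthetic random data standing in for an image.

```python
import numpy as np

# Synthetic 64x64 "image" (a real image would be loaded as a grayscale matrix).
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 64))

# Thin SVD: A = U @ diag(s) @ Vt, singular values sorted in decreasing order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the top-k singular values/vectors: the best rank-k
# approximation of A in the least-squares sense (Eckart-Young theorem).
k = 10
A_k = (U[:, :k] * s[:k]) @ Vt[:k, :]

# Compression: store k*(m + n + 1) numbers instead of m*n.
print(A_k.shape)  # (64, 64)
```

Increasing k lowers the reconstruction error monotonically, which is the trade-off between compression ratio and image quality discussed in that lecture.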

MEDIA