2021 UBC [CS532W] Topics in AI: Probabilistic Programming


Probabilistic programming lies at the intersection of machine learning, statistics, programming languages, and deep learning. Historically, probabilistic programming has been about automating Bayesian statistical inference; more recently it has emerged as a candidate next toolchain for AI, particularly for unsupervised, semi-supervised, and reinforcement learning.

Learning Outcomes

By the end of this course you will know how to (if you don’t already):

  • Write a general-purpose inference engine for graphical models specified via a probabilistic programming language
  • Write a general-purpose inference engine for a higher-order probabilistic programming language
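To give a flavour of what "a general-purpose inference engine" means at its simplest, here is a minimal sketch of likelihood weighting: latents are sampled from the prior and each sample is weighted by the likelihood of the observed data. The model shown (a Gaussian mean with a single observation) is a hypothetical example chosen because its posterior is known analytically; it is not taken from the course materials.

```python
import math
import random

random.seed(0)


def normal_logpdf(x, mu, sigma):
    """Log density of a Normal(mu, sigma) distribution at x."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)


def likelihood_weighting(model, num_samples=20000):
    """Trivial inference engine: draw latents from the prior, weight by
    likelihood, and return the self-normalized posterior-mean estimate."""
    samples, weights = [], []
    for _ in range(num_samples):
        latent, log_w = model()
        samples.append(latent)
        weights.append(math.exp(log_w))
    total = sum(weights)
    return sum(s * w for s, w in zip(samples, weights)) / total


def model():
    # Hypothetical model: mu ~ Normal(0, 1); one observation y = 1.5 with
    # known noise sigma = 1. The analytic posterior mean is 1.5 / 2 = 0.75.
    mu = random.gauss(0.0, 1.0)            # sample latent from the prior
    log_w = normal_logpdf(1.5, mu, 1.0)    # weight by likelihood of the data
    return mu, log_w


posterior_mean = likelihood_weighting(model)
```

The point of the sketch is that the engine knows nothing about the particular model: any generative program that returns a latent and a log-weight can be plugged in, which is the separation of model and inference that the course builds on.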

Skills and knowledge that you will reinforce or acquire:

  • What a model is
  • How to use probabilistic programming systems to solve inference problems automatically
  • The relationship between generative models, stochastic simulators, and decoders
  • What inference is, various algorithms for performing inference, and what their characteristics are

You will also be exposed to a variety of “advanced” models and methods, including program synthesis via inference and deep structured variational autoencoders for semi- and unsupervised learning, as well as a raft of advanced inference methods including Hamiltonian Monte Carlo, sequential Monte Carlo, and stochastic variational inference.

Similar Courses Around The World

This course is one of a very small handful in the world on probabilistic programming. Such courses include:

  • http://probcomp.csail.mit.edu/9.S915/
  • https://www.cs.ox.ac.uk/teaching/courses/2020-2021/SPP/
  • http://danroy.org/teaching/2015/STA4516/
  • https://github.com/hongseok-yang/probprog17

There have been courses run at MIT and Stanford in the past built around the truly excellent material in DIPPL – The Design and Implementation of Probabilistic Programming Languages [1] – as well as ProbMods [2].

This is the second offering of a course built around the still-forthcoming Foundations and Trends in Machine Learning book entitled “Introduction to Probabilistic Programming” [3]. In spirit this book is most similar to DIPPL; however, it, and this course, are more technical, broader, and push slightly further toward the state of the art.

  1. N. D. Goodman and A. Stuhlmüller, “The Design and Implementation of Probabilistic Programming Languages.” 2014.
  2. N. D. Goodman and J. B. Tenenbaum, “Probabilistic Models of Cognition.” 2016.
  3. J. W. van de Meent, B. Paige, H. Yang, and F. Wood, “Introduction to Probabilistic Programming,” Foundations and Trends in Machine Learning, in review, 2018.