This is an introductory graduate class on machine learning, covering topics such as supervised learning (classification, regression), unsupervised learning (clustering, dimensionality reduction), and graphical models (Bayes nets and Markov random fields). There will be an emphasis on Bayesian techniques. Examples of applications in the areas of vision, speech/language, and biology will be used throughout. It will be a fast-paced class, so prior exposure to machine learning at the undergraduate level (such as CS340 or Stat 306) is desirable, although not strictly necessary. The only official prerequisites are linear algebra, probability theory, multivariate calculus, and programming skills (preferably MATLAB or R).