Machine learning textbook
Machine Learning: a Probabilistic Perspective
by Kevin Patrick Murphy
Available in hardcopy from amazon.com and other vendors.
A Kindle version is also available.
Sample chapters: Chapter 1 (Introduction) and Chapter 19 (Undirected
graphical models / Markov random fields). Note: the Chapter 19 sample is
from the third printing, which corrects some errors (found by Sebastien
Bratieres) in Sec. 19.7.
All the figures are available, together with the Matlab code to generate them.
Best-selling machine learning book on amazon.com (22 October 2012).
Best-selling book at MIT Press (24 November 2012).
An electronic copy is available for instructors from MIT Press. If you
are an official instructor, you can request an e-copy, which can help you
decide if the book is suitable for your class. You can also request the
solutions manual.
- "An astonishing machine learning book: intuitive, full of examples,
fun to read but still comprehensive, strong and deep! A great starting
point for any university student -- and a must have for anybody in the
-- Prof. Jan Peters, Darmstadt University of Technology/
Max-Planck Institute for Intelligent Systems
"Prof. Murphy excels at unravelling the complexities of machine
learning methods while motivating the reader with a stream of
illustrated examples and real world case studies. The accompanying
software package includes source code for many of the figures, making
it both easy and very tempting to dive in and explore these methods
for yourself. A must-buy for anyone interested in machine learning or
curious about how to extract useful knowledge from big data."
-- Dr John Winn, Microsoft Research.
"This book will be an essential reference for practitioners of modern
machine learning. MLaPP covers the basic concepts needed to
understand the field as a whole and the powerful modern methods that
build on those concepts. In MLaPP, the language of probability and
statistics reveals important connections between seemingly disparate
algorithms and strategies. Thus its readers will become articulate
in a holistic view of the state-of-the-art and poised to build the
next generation of machine learning algorithms."
-- Prof. David Blei, Princeton University
"An amazingly comprehensive survey of the field, covering both the
basic theory as well as cutting edge research. Richly illustrated and
loaded with examples and exercises. I will tell my students (and
myself) to read this cover to cover!"
-- Prof. Max Welling, U.C. Irvine
"This book covers an impressive range of the state-of-the-art in
statistical machine learning. It defines a clear and broadly
accessible path that begins with the fundamentals of probability, and
leads to a rich toolbox of statistical models and learning
algorithms."
-- Prof. Erik Sudderth, Brown University
"This book does a really nice job explaining the basic principles and
methods of machine learning from a Bayesian perspective. It will
prove useful to statisticians interested in the current frontiers of
machine learning as well as machine learners seeking a probabilistic
foundation for their methods. It hits the 4 c's: clear, current,
concise, and comprehensive, and it deserves a place alongside 'All of
Statistics' and 'The Elements of Statistical Learning' on the
practical statistician's bookshelf."
-- Dr Steven Scott, Google Inc
"This is a wonderful book
that starts with basic topics in statistical modeling, culminating in
the most advanced topics. It provides both the theoretical foundations
of probabilistic machine learning as well as practical tools, in the
form of Matlab code. The book should be on the shelf of any student
interested in the topic, and any practitioner working in the field."
-- Dr Yoram Singer, Google Inc
"I believe [this book]
will become an essential reference for students and
researchers in probabilistic machine learning. It covers both
frequentist and Bayesian statistical viewpoints, which is helpful to
expose the similarities and differences between the two. It
has a thorough treatment of the basic material of supervised and
unsupervised learning, but goes beyond the basics to cover interesting
generalizations, e.g. section 17.6 on Generalizations of HMMs, and
recent work, e.g. on deep learning (chapter 28). There is also an
impressive suite of Matlab code to accompany the book, which will greatly
facilitate readers applying the models to their own
data, and building their own refinements."
-- Prof. Chris Williams, Univ. of Edinburgh
"This is an excellent textbook on machine learning, covering a number
of very important topics. The depth and breadth of coverage of
probabilistic approaches to machine learning is impressive. Having
Matlab code for all the figures is excellent. I highly recommend this
book!" -- Prof. Zoubin Ghahramani, U. Cambridge
Comparison to other books on the market
My book (MLaPP) is similar to Bishop's
Pattern Recognition and Machine Learning,
to Hastie et al.'s
The Elements of Statistical Learning,
and to Wasserman's
All of Statistics,
with the following key differences:
- MLaPP is more accessible to undergrads.
It presupposes a background in probability, linear algebra,
calculus, and programming;
however, the mathematical level ramps up slowly, with more difficult
sections clearly denoted as such. This makes the book suitable for
both undergrads and grads.
Summaries of the relevant mathematical background,
on topics such as linear algebra, optimization and classical
statistics make the book self-contained.
- MLaPP is more practically-oriented.
In particular, it comes with Matlab software
to reproduce almost every figure, and to implement
almost every algorithm, discussed in the book.
It includes many worked examples of the methods applied to real
data, with readable source code online.
- MLaPP covers various important topics that are not discussed in
these other books, such as conditional random fields.
- MLaPP is "more Bayesian" than the Hastie or Wasserman books,
but "more frequentist" than the Bishop book. In particular, in MLaPP,
we make extensive use of MAP estimation, which we regard as "poor
man's Bayes". We prefer this to the regularization interpretation of
MAP, because then all the methods in the book (except cross
validation...) can be viewed as probabilistic inference,
or some approximation thereof. The MAP interpretation also allows for
an easy "upgrade path" to more accurate methods of approximate
Bayesian inference, such as empirical Bayes, variational Bayes, and MCMC.
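To see why MAP estimation coincides with regularization, consider linear regression with a Gaussian prior on the weights: maximizing the log posterior is exactly minimizing a ridge-penalized least-squares objective, with penalty strength set by the noise-to-prior variance ratio. A minimal sketch in Python/NumPy (not the book's Matlab code; the data and the sigma/tau values are illustrative):

```python
import numpy as np

# Synthetic regression data (illustrative, not from the book).
rng = np.random.default_rng(0)
n, d = 50, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
sigma, tau = 0.1, 1.0  # assumed noise std dev and prior std dev
y = X @ w_true + sigma * rng.normal(size=n)

# MAP under y ~ N(Xw, sigma^2 I) and prior w ~ N(0, tau^2 I):
#   argmax_w [log p(y|w) + log p(w)]
# = argmin_w ||y - Xw||^2 + lam * ||w||^2, with lam = sigma^2 / tau^2,
# i.e. exactly the ridge-regression objective. Closed-form solution:
lam = sigma**2 / tau**2
w_map = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Sanity check: the ridge normal equations hold at the MAP estimate,
# X^T (y - X w_map) = lam * w_map.
residual_check = X.T @ (y - X @ w_map) - lam * w_map
```

The same lambda that appears here as a ridge penalty is, on the probabilistic reading, a statement about the prior; that is what makes the "upgrade path" easy: keep the model, replace the point estimate with a fuller posterior approximation.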
- The emphasis is on simple parametric models (linear and logistic
regression, discriminant analysis / naive Bayes, mixture models, factor
analysis, graphical models, etc.), which are the ones most often used
in practice. However, we also briefly discuss non-parametric models,
such as Gaussian processes, Dirichlet processes, SVMs, RVMs, etc.