These four lectures build on both the basic probability and statistics material covered in A1 and the estimation concepts introduced in the estimation portion of B14. In particular, these lectures reintroduce frequentist inference, called hypothesis testing in A1, ground it in a slightly more theoretical framework, and explain it in terms of model-based reasoning. We then reintroduce maximum-likelihood parameter estimation and regression, noting that parameter estimation can itself be used to perform inference. Following this we introduce regularization and then Bayesian inference, contrasting the latter with frequentist inference. We finish by introducing classification as a specialization of regression and discuss latent-variable inference in this context.