Emtiyaz Khan: Piecewise Bounds for Estimating Discrete-Data Latent Gaussian Models
This talk concerns modeling discrete data with models such as Bayesian logistic regression, Gaussian process classification, probabilistic principal component analysis, and factor analysis. All of these are special cases of latent Gaussian models, for which parameter estimation is difficult due to an intractable logistic-Gaussian integral in the marginal likelihood. The standard variational framework does not solve the problem, as the bound obtained from Jensen's inequality remains intractable. In this work, we propose fixed piecewise linear and quadratic upper bounds on the logistic-log-partition (LLP) function as a way of circumventing this intractable integral. We describe a framework for approximately computing minimax-optimal piecewise quadratic bounds, as well as a generalized expectation-maximization algorithm that uses piecewise bounds to fit binary, categorical, and ordinal data. Through applications to real-world data, we show that the proposed bounds achieve better estimation accuracy than existing variational bounds with little increase in computation. We also show that, unlike existing sampling methods, our methods guarantee convergence, offer simple convergence diagnostics, and scale well to datasets with thousands of variables and instances. This is joint work with Benjamin Marlin and Kevin Murphy.
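To give a feel for the idea, the sketch below constructs a simple piecewise linear upper bound on the LLP function f(x) = log(1 + e^x) and verifies it numerically. This is only an illustrative construction, not the minimax-optimal bounds of the talk: because f is convex and increasing, a chord upper-bounds it between two knots, a constant upper-bounds it on the left tail, and a unit-slope line upper-bounds it on the right tail (since f(x) − x is decreasing). The appeal of piecewise linear or quadratic bounds is that their expectations under a Gaussian reduce to closed-form truncated-Gaussian moments, which is what makes the otherwise intractable logistic-Gaussian integral tractable.

```python
import numpy as np

def llp(x):
    # Logistic-log-partition f(x) = log(1 + e^x), computed stably.
    return np.logaddexp(0.0, x)

def piecewise_linear_upper_bound(knots):
    """Illustrative piecewise linear upper bound on the LLP function.

    Not the paper's minimax-optimal construction; the knot placement
    here is an arbitrary assumption for demonstration.
    """
    knots = np.asarray(knots, dtype=float)

    def bound(x):
        x = np.asarray(x, dtype=float)
        out = np.empty_like(x)
        # Left tail: f is increasing, so f(knots[0]) bounds f on (-inf, knots[0]].
        out[x <= knots[0]] = llp(knots[0])
        # Middle pieces: chords of a convex function lie above it.
        for a, b in zip(knots[:-1], knots[1:]):
            m = (x > a) & (x <= b)
            slope = (llp(b) - llp(a)) / (b - a)
            out[m] = llp(a) + slope * (x[m] - a)
        # Right tail: f(x) - x = log(1 + e^{-x}) is decreasing, so
        # x + log(1 + e^{-knots[-1]}) bounds f on [knots[-1], inf).
        m = x > knots[-1]
        out[m] = x[m] + np.logaddexp(0.0, -knots[-1])
        return out

    return bound

bound = piecewise_linear_upper_bound(np.linspace(-5.0, 5.0, 9))
xs = np.linspace(-20.0, 20.0, 2001)
gap = bound(xs) - llp(xs)
print("is upper bound:", gap.min() >= -1e-9)
print("max gap:", gap.max())
```

Tightening the knot spacing shrinks the maximum gap; the talk's piecewise quadratic bounds achieve a much smaller gap with far fewer pieces, which is what keeps the added computation small.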