TITLE: Efficient Monte Carlo Inference for Infinite Relational Models
ABSTRACT: The infinite relational model provides a flexible, coherent
probabilistic framework for modeling mixed attributional and
relational data, and has even been combined with causal models to
enable efficient learning of relational schemas and causal
relations from implicit event data. However, large-scale inference in
nonparametric Bayesian latent variable models remains an important
challenge, attracting attention from both the Monte Carlo and
variational inference communities.
In this poster, we describe techniques that support extremely
efficient implementations of Monte Carlo-based inference methods for
models in this class. In particular, we show how proper caching of
density function terms yields a Gibbs sampler which is linear in the
number of objects (in the dense case) and linear in the number of
observed relations (in the sparse case). These per-iteration
complexities match those reported for variational methods, which are
believed to be less accurate but possibly more efficient, and for A*
search.
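As an illustration of the caching idea (a minimal sketch assuming a Beta-Bernoulli stochastic-block likelihood; the function names and model choice are ours, not necessarily the poster's implementation), the code below maintains per-block edge counts as sufficient statistics. Reassigning a single object updates only that object's row and column, costing O(n) rather than an O(n^2) recomputation, and the collapsed log marginal is then evaluated entirely from the cache:

```python
import math
from collections import defaultdict

def block_stats(R, z):
    """Recompute per-block sufficient statistics from scratch (O(n^2)):
    for each ordered cluster pair (a, b), the number of 1-edges and the
    total number of directed pairs from cluster a to cluster b."""
    ones, tot = defaultdict(int), defaultdict(int)
    n = len(R)
    for i in range(n):
        for j in range(n):
            if i != j:
                key = (z[i], z[j])
                tot[key] += 1
                ones[key] += R[i][j]
    return ones, tot

def move_object(R, z, ones, tot, i, new_c):
    """Reassign object i to cluster new_c, updating the cached statistics
    incrementally: only i's row and column are touched, so this costs
    O(n) instead of an O(n^2) recomputation."""
    old_c, n = z[i], len(R)
    for j in range(n):
        if j != i:  # retract i's edges under the old assignment
            tot[(old_c, z[j])] -= 1; ones[(old_c, z[j])] -= R[i][j]
            tot[(z[j], old_c)] -= 1; ones[(z[j], old_c)] -= R[j][i]
    z[i] = new_c
    for j in range(n):
        if j != i:  # re-add them under the new assignment
            tot[(new_c, z[j])] += 1; ones[(new_c, z[j])] += R[i][j]
            tot[(z[j], new_c)] += 1; ones[(z[j], new_c)] += R[j][i]

def block_log_marginal(ones, tot, a=1.0, b=1.0):
    """Collapsed Beta(a, b)-Bernoulli log marginal likelihood, evaluated
    entirely from the cached statistics (no pass over the data)."""
    lp = 0.0
    for key, n in tot.items():
        k = ones.get(key, 0)
        lp += (math.lgamma(a + k) + math.lgamma(b + n - k)
               - math.lgamma(a + b + n)
               + math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b))
    return lp
```

A Gibbs sweep would score each candidate reassignment against these cached counts, which is where the per-iteration savings arise.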
We also discuss particle filtering for infinite relational models,
which is massively parallelizable and essentially linear
time. Particle filtering performs well when observations are dense,
and yields reasonable initializations when observations are very
sparse. We argue that, combined with generic Monte Carlo improvement
methods such as tempering, these techniques offer a conceptually and
programmatically straightforward means of fitting large-scale latent
variable models for relational data.
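A hedged sketch of the particle-filtering scheme (assuming, for illustration only, a Beta-Bernoulli block likelihood under a CRP prior; all names are ours, not the poster's implementation): each particle carries a growing partition plus cached block counts, objects are absorbed one at a time via a locally optimal proposal, and particles are resampled when the effective sample size drops. Since particles interact only at resampling, the outer loop parallelizes trivially:

```python
import math
import random

def edge_prob(ones, tot, key, a=1.0, b=1.0):
    """Posterior predictive probability of a 1-edge in block `key`
    under a Beta(a, b) prior, computed from cached counts."""
    return (a + ones.get(key, 0)) / (a + b + tot.get(key, 0))

def score_option(R, z, ones, tot, i, c):
    """Log predictive probability of object i's relations to the
    previously seen objects, were i to join cluster c."""
    lp = 0.0
    for j in range(i):
        p_out = edge_prob(ones, tot, (c, z[j]))
        p_in = edge_prob(ones, tot, (z[j], c))
        lp += math.log(p_out if R[i][j] else 1.0 - p_out)
        lp += math.log(p_in if R[j][i] else 1.0 - p_in)
    return lp

def add_object(R, z, ones, tot, i, c):
    """Absorb object i into cluster c, updating the cached block counts."""
    for j in range(i):
        out_key, in_key = (c, z[j]), (z[j], c)
        tot[out_key] = tot.get(out_key, 0) + 1
        ones[out_key] = ones.get(out_key, 0) + R[i][j]
        tot[in_key] = tot.get(in_key, 0) + 1
        ones[in_key] = ones.get(in_key, 0) + R[j][i]
    z.append(c)

def particle_filter(R, num_particles=20, alpha=1.0, seed=0):
    """Sequentially assign objects to CRP clusters; per-particle work is
    linear in the observed data, and particles run independently."""
    rng = random.Random(seed)
    particles = [([], {}, {}) for _ in range(num_particles)]
    logw = [0.0] * num_particles
    for i in range(len(R)):
        for p, (z, ones, tot) in enumerate(particles):
            # candidate clusters: the existing ones plus a fresh one
            cands = sorted(set(z)) + [max(z, default=-1) + 1]
            prior = [z.count(c) for c in cands[:-1]] + [alpha]
            logs = [math.log(pr) + score_option(R, z, ones, tot, i, c)
                    for pr, c in zip(prior, cands)]
            m = max(logs)
            ws = [math.exp(l - m) for l in logs]
            # locally optimal proposal: sample proportional to
            # prior x predictive likelihood; normalizer updates the weight
            k = rng.choices(range(len(cands)), weights=ws)[0]
            add_object(R, z, ones, tot, i, cands[k])
            logw[p] += m + math.log(sum(ws)) - math.log(i + alpha)
        # multinomial resampling when the effective sample size drops
        mw = max(logw)
        w = [math.exp(l - mw) for l in logw]
        ess = sum(w) ** 2 / sum(x * x for x in w)
        if ess < num_particles / 2:
            idx = rng.choices(range(num_particles), weights=w, k=num_particles)
            particles = [(list(particles[j][0]), dict(particles[j][1]),
                          dict(particles[j][2])) for j in idx]
            logw = [0.0] * num_particles
    return particles, logw
```

In a dense-observation regime the predictive scores are informative at every step; in a very sparse regime the resulting partitions serve instead as initializations for a subsequent Gibbs or tempering pass, consistent with the usage described above.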