# Stochastic Local Search for the Most Probable Explanation Problem (SLS4MPE)


### News

July 25, 2005: Version 1.0.0 of the SLS4MPE code is now online. The most important new feature is support for more input formats: we can now deal with general factor graphs, and there is also a nice Matlab interface.

### Abstract

Finding most probable explanations (MPEs) in graphical models, such as Bayesian belief networks, is a fundamental problem in reasoning under uncertainty, and much effort has been spent on developing effective algorithms for this NP-hard problem. In different communities, the MPE problem goes by different names; it is also known as the maximum a posteriori (MAP) problem, or simply max-product. The problem is to find the complete instantiation of all variables in the graphical model that has maximal probability.
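To make the problem definition concrete, here is a minimal brute-force sketch on a toy factor graph; the three factors and their values are illustrative, not from any real network:

```python
import itertools

# Toy factor graph over three binary variables A, B, C.
# Each factor maps a tuple of its variables' values to a potential.
factors = [
    (("A",),     {(0,): 0.6, (1,): 0.4}),                  # prior on A
    (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1,
                  (1, 0): 0.2, (1, 1): 0.8}),              # P(B | A)
    (("B", "C"), {(0, 0): 0.7, (0, 1): 0.3,
                  (1, 0): 0.4, (1, 1): 0.6}),              # P(C | B)
]
variables = ["A", "B", "C"]

def score(assignment):
    """Probability of a complete instantiation: product of all potentials."""
    p = 1.0
    for scope, table in factors:
        p *= table[tuple(assignment[v] for v in scope)]
    return p

def brute_force_mpe():
    """Enumerate all complete instantiations and keep the most probable one.
    Only feasible for tiny problems; it is what SLS methods try to
    approximate without exhaustive enumeration."""
    best, best_p = None, -1.0
    for values in itertools.product([0, 1], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        p = score(assignment)
        if p > best_p:
            best, best_p = assignment, p
    return best, best_p
```

For this toy model, the MPE is A=0, B=0, C=0 with probability 0.6 x 0.9 x 0.7 = 0.378.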

Stochastic local search (SLS) approaches to MPE often find optimal solutions surprisingly quickly, but they can never prove optimality. This property makes them especially useful in anytime scenarios, which arise frequently, e.g. in sketch recognition and computer vision. The implementations you will find on this website often optimally solve problems with hundreds of variables in milliseconds, regardless of their induced width!

SLS algorithms for MPE solving have been explored before, but were found not to be competitive with state-of-the-art branch & bound methods. We improved on these previous algorithms by a variety of means (see the IJCAI'05 paper and Frank's MSc thesis for details), yielding two powerful new algorithms that challenge and often substantially improve the state of the art in MPE solving. These two algorithms are ILS (Iterated Local Search) and GLS+ (Guided Local Search+).

Iterated Local Search (ILS) performs a biased random walk over locally optimal solutions as follows: starting from some initial solution, it applies a basic local search that terminates in a local optimum. From there, it iterates the following steps:
1) perturbation: escape the region of the current local optimum by perturbing the current solution (often simply flipping a number of variables at random)
2) basic local search: move to the next locally optimal solution (often simply greedy hill-climbing)
3) acceptance criterion: decide whether to accept the new local optimum or reject it and return to the previous one (a common choice always accepts improved solutions and rejects worsened ones with high probability)
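The three steps above can be sketched in a few lines. This is a simplified illustration, not SLS4MPE's actual implementation: the chain-structured factors, the perturbation strength, and the always-reject-worse acceptance criterion are assumptions made to keep the sketch short (the real acceptance criterion rejects worse optima only with high probability).

```python
import math, random

random.seed(1)
N = 10
# Random pairwise potentials over a chain (illustrative data only).
factors = [((i, i + 1),
            {v: random.uniform(0.05, 1.0)
             for v in ((0, 0), (0, 1), (1, 0), (1, 1))})
           for i in range(N - 1)]

def log_score(x):
    # Work in log space so products of many potentials don't underflow.
    return sum(math.log(t[(x[i], x[j])]) for (i, j), t in factors)

def hill_climb(x):
    # Basic local search: greedily flip any variable whose flip improves
    # the score, until no single flip helps (a local optimum).
    improved = True
    while improved:
        improved = False
        cur = log_score(x)
        for i in range(N):
            x[i] ^= 1
            if log_score(x) > cur:
                improved = True
                break
            x[i] ^= 1  # undo the flip
    return x

def ils(iterations=50, strength=3):
    x = hill_climb([random.randint(0, 1) for _ in range(N)])
    best, best_s = x[:], log_score(x)
    for _ in range(iterations):
        y = x[:]
        for i in random.sample(range(N), strength):  # 1) perturbation
            y[i] ^= 1
        y = hill_climb(y)                            # 2) local search
        if log_score(y) >= log_score(x):             # 3) acceptance
            x = y
        if log_score(x) > best_s:
            best, best_s = x[:], log_score(x)
    return best, best_s
```

The returned solution is always a local optimum of the original objective; ILS's value comes from visiting many such optima instead of stopping at the first one.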

Guided Local Search+ (GLS+) is a particular kind of dynamic local search algorithm. It combines a basic local search (mostly greedy hill-climbing) with a penalty mechanism that is invoked at local optima. Solution features that appear in the current local optimum (but possibly also in other solutions) are penalized to make them less desirable. The subsequent basic local search then uses a modified evaluation function that takes the penalties into account, such that the previous local optimum is no longer a local optimum and the search can escape. For MPE, this mechanism can be thought of as adding new temporary factors to the problem in order to escape from locally optimal solutions.
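The penalty mechanism can be sketched as follows. Several details are assumptions made for brevity, not GLS+'s actual design: the features here are (factor, local configuration) pairs, every feature of the current local optimum is penalized (full GLS-style algorithms typically penalize only maximum-utility features), and the penalty weight `lam` is an arbitrary illustrative value.

```python
import math, random

random.seed(2)
N = 8
# Random pairwise potentials over a cycle (illustrative data only).
factors = [((i, (i + 3) % N),
            {v: random.uniform(0.05, 1.0)
             for v in ((0, 0), (0, 1), (1, 0), (1, 1))})
           for i in range(N)]

# One penalty counter per (factor index, local configuration) feature.
penalty = {(k, v): 0 for k, (_, t) in enumerate(factors) for v in t}
lam = 0.3  # penalty weight (assumed value for this sketch)

def log_score(x):
    return sum(math.log(t[(x[i], x[j])]) for (i, j), t in factors)

def augmented(x):
    # Evaluation function modified by the penalties: a local optimum of
    # log_score need not be a local optimum of this function, which is
    # exactly what lets the search escape.
    s = log_score(x)
    for k, ((i, j), _) in enumerate(factors):
        s -= lam * penalty[(k, (x[i], x[j]))]
    return s

def hill_climb(x):
    # Greedy hill-climbing on the *augmented* evaluation function.
    improved = True
    while improved:
        improved = False
        cur = augmented(x)
        for i in range(N):
            x[i] ^= 1
            if augmented(x) > cur:
                improved = True
                break
            x[i] ^= 1
    return x

def gls(iterations=40):
    x = hill_climb([random.randint(0, 1) for _ in range(N)])
    best, best_s = x[:], log_score(x)
    for _ in range(iterations):
        # Penalize the features of the current local optimum ...
        for k, ((i, j), _) in enumerate(factors):
            penalty[(k, (x[i], x[j]))] += 1
        # ... then re-run local search on the modified evaluation.
        x = hill_climb(x)
        if log_score(x) > best_s:
            best, best_s = x[:], log_score(x)
    return best, best_s
```

Subtracting `lam * penalty` in log space corresponds to multiplying in extra factors smaller than one, which matches the "temporary factors" view above.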

### Software

• SLS4MPE version 1.0 - can solve MPE in general factor graphs (and still supports reading Bayes nets in .simple format)
This package includes GLS+, ILS, and anytime Mini-Buckets (able to prove optimality for small problems, nice for debugging etc.)
Since it added minimal overhead, I also support reading in SAT instances. Since GLS+ builds an N-dimensional potential with 2^N entries for every clause over N variables, it performs very poorly on instances with large clauses (it may not even be able to read them in). But it is not all that bad for random 3-SAT, and it should do a bit better still on weighted 2-SAT (which is still NP-hard).
• Matlab interface to SLS4MPE (using .mex files). Uses general factor graph representation.
• Matlab converters to general factor graph representation - so far this includes converters for
Bayes nets in Kevin Murphy's BNT format,
sketch recognition networks (conditional random fields) from Martin Szummer,
side-chain prediction networks (pairwise Markov random fields) from Chen Yanover,
and a converter that writes a factor graph out in SLS4MPE's simple factor graph format (then you don't need the Matlab interface anymore).
• SLS4MPE Version 0.9 - can only read in Bayes nets in .simple format.
Note: This is the software used for the 2005 IJCAI paper (Hutter, Hoos, and Stützle).
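The clause-to-potential conversion that makes large clauses so expensive (as noted for the SAT support above) can be sketched as follows; the function name, the `eps` parameter, and its value are hypothetical choices for illustration, not SLS4MPE's actual API:

```python
import itertools

def clause_to_potential(clause, eps=0.5):
    """Turn a SAT clause (DIMACS-style non-zero ints: v for a positive
    literal, -v for a negative one) into a potential table with one
    entry per joint assignment of the clause's variables, i.e. 2^N
    entries for a clause over N variables. Satisfying assignments get
    value 1.0 and the single violating assignment gets eps < 1, so
    maximizing the product of clause potentials maximizes the number
    of satisfied clauses."""
    variables = sorted({abs(lit) for lit in clause})
    table = {}
    for vals in itertools.product((0, 1), repeat=len(variables)):
        assignment = dict(zip(variables, vals))
        sat = any((assignment[abs(lit)] == 1) == (lit > 0)
                  for lit in clause)
        table[vals] = 1.0 if sat else eps
    return variables, table
```

The exponential table size is visible directly: a 3-clause yields 8 entries, but a 20-literal clause would already need over a million, which is why large clauses are impractical here.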

### Benchmark Problems for MPE

Here are factor graphs from two different domains. If you use Matlab, you can also call SLS4MPE directly from there (see the MEX interface above).

Here are all the data sets we used for the IJCAI paper (all in our own super-simple input format .simple):

• Random networks, generated with the generator from Rina Dechter's group: random.tar.gz (7.7MB)
• Random grid networks, generated with the generator from Rina Dechter's group: grid.tar.gz (24.4MB)
• Random networks, generated with BNGenerator from Fabio Cozman's group. These networks have bounds on the induced width! iwscaling-rand.tar.gz (5.8MB)
• Random networks like the above, generated with BNGenerator from Fabio Cozman's group, but with CPTs whose lines I sampled from networks in the Bayes net repository in order to simulate real-world structure! iwscaling-struc.tar.gz (1.5MB)
• Bayesian network repository instances: bnrep.tar.gz (0.7MB)
• Converter from Bayesian network interchange format to .simple (this reads networks into Java Bayes and outputs them in .simple format; it works for all formats Java Bayes can read in): converter.zip (0.7MB)
• Ruby script for controlled generation of many networks with BNGenerator (requires BNGenerator): generate_bns.rb (3KB)