Phase Transitions and Typical-case Complexity: Easy (Hard) Aspects of Hard (Easy) Problems

by Yong Gao

In this talk, I discuss our investigation into the typical-case
hardness of randomly generated instances of several problems in AI.
For constraint satisfaction problems (CSPs), we identified a new class
of algorithmically exploitable structures and proved that random instances
contain such structures with high probability. In an effort to eliminate
these structures from random CSP instances, we established an interesting
connection between the notion of constraint consistency and the resolution
complexity of random CSP instances. Building on this connection, we propose
a new scheme for generating random CSP instances with theoretically
guaranteed resolution complexity and empirically confirmed hardness.
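
The abstract does not spell out the consistency-based generation scheme. For
background only, here is a minimal sketch of the classical Model-B-style way
of generating random binary CSP instances; the function name and all
parameters are illustrative assumptions, and this is not the scheme proposed
in the talk.

    import random
    from itertools import combinations, product

    def random_binary_csp(n, d, num_constraints, nogoods_per_constraint, seed=None):
        """Illustrative Model-B-style random binary CSP generator (a sketch).

        n -- number of variables, each with domain {0, ..., d-1}
        num_constraints -- number of distinct constrained variable pairs
        nogoods_per_constraint -- incompatible value pairs per constraint
        """
        rng = random.Random(seed)
        # Choose which variable pairs are constrained.
        scopes = rng.sample(list(combinations(range(n), 2)), num_constraints)
        constraints = {}
        for scope in scopes:
            # Forbid a random set of value pairs for this constraint.
            forbidden = rng.sample(list(product(range(d), repeat=2)),
                                   nogoods_per_constraint)
            constraints[scope] = set(forbidden)
        return constraints

    # Example: 20 variables, domain size 3, 40 constraints, 3 nogoods each.
    csp = random_binary_csp(n=20, d=3, num_constraints=40,
                            nogoods_per_constraint=3, seed=0)
    print(len(csp), "constraints generated")

Generators of this flavour are the natural starting point for the kind of
typical-case analysis discussed above; the scheme proposed in the talk
instead guarantees high resolution complexity.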

For the typical-case behavior of dynamic-programming-based algorithms, we
established an improved lower bound on the threshold for a random graph to
have treewidth linear in the graph size. Similar techniques were then applied
to random CSPs, random Bayesian networks, and fitness landscape models in
computational biology and evolutionary computation. We concluded that
dynamic-programming-based algorithms exhibit exponential behavior even on
problem instances randomly generated under instance distributions that have
been shown, theoretically and/or empirically, to be easy for backtracking
search algorithms.
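
As a hedged illustration of the quantity involved, the sketch below estimates
an upper bound on the treewidth of an Erdős–Rényi random graph G(n, c/n)
using the classical min-degree elimination heuristic. The function names and
parameters are assumptions chosen for illustration, not the proof technique
behind the lower bound mentioned above.

    import random

    def random_graph(n, p, seed=None):
        """Erdos-Renyi G(n, p) as a dict of adjacency sets."""
        rng = random.Random(seed)
        adj = {v: set() for v in range(n)}
        for u in range(n):
            for v in range(u + 1, n):
                if rng.random() < p:
                    adj[u].add(v)
                    adj[v].add(u)
        return adj

    def treewidth_upper_bound(adj):
        """Min-degree elimination heuristic: the width of the elimination
        ordering it produces is an upper bound on the treewidth."""
        adj = {v: set(nb) for v, nb in adj.items()}  # work on a copy
        width = 0
        while adj:
            # Eliminate a vertex of minimum current degree.
            v = min(adj, key=lambda u: len(adj[u]))
            nbrs = adj[v]
            width = max(width, len(nbrs))
            # Turn the neighbourhood of v into a clique, then remove v.
            for a in nbrs:
                for b in nbrs:
                    if a != b:
                        adj[a].add(b)
            for a in nbrs:
                adj[a].discard(v)
            del adj[v]
        return width

    n, c = 200, 2.5          # c is the average degree
    g = random_graph(n, c / n, seed=1)
    print("upper bound on treewidth:", treewidth_upper_bound(g))

Sweeping the average degree c upward gives a rough empirical feel for the
threshold behavior, although an elimination heuristic only certifies upper
bounds, not the lower bounds the talk establishes.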

In the last part of this talk, I discuss our recent work demonstrating the
dramatic impact of heuristics on the performance of search algorithms through
a study of randomly generated instances of the dominating clique problem.
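
A dominating clique is a clique whose vertices, together with their
neighbours, cover every vertex of the graph. The sketch below is a minimal
backtracking search for one, with a simple degree-based ordering heuristic
standing in for the heuristics studied in the talk; the function names and
the heuristic choice are illustrative assumptions.

    def has_dominating_clique(adj):
        """Backtracking search for a clique that dominates every vertex.

        adj maps each vertex to its set of neighbours. Returns a dominating
        clique as a set, or None if none exists.
        """
        # Ordering heuristic: try high-degree vertices first.
        vertices = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
        n = len(vertices)

        def dominated(clique):
            covered = set(clique)
            for v in clique:
                covered |= adj[v]
            return len(covered) == n

        def extend(clique, candidates):
            if dominated(clique):
                return set(clique)
            for i, v in enumerate(candidates):
                # Keep only candidates adjacent to v (and, by recursion,
                # to every vertex already in the clique).
                new_cands = [u for u in candidates[i + 1:] if u in adj[v]]
                found = extend(clique + [v], new_cands)
                if found is not None:
                    return found
            return None

        return extend([], vertices)

    # Triangle 0-1-2 with pendants 3 (on 1) and 4 (on 2): {1, 2} dominates.
    g = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 4}, 3: {1}, 4: {2}}
    print(has_dominating_clique(g))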

This is joint work with Dr. Joseph Culberson and Dr. Calin Anton.
