Learning Bayes Nets using Score Functions and Dependency Constraints

By Oliver Schulte, Simon Fraser University

There are two well-established frameworks for learning Bayes nets: the "search and score" paradigm and constraint-based learning. Score-based learning searches for a graph structure G that maximizes a model selection score S(G,d) for a given sample d. Constraint-based learning searches for a graph structure G that entails the statistically significant dependencies (correlations) and independencies observed in the sample d.
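To make the contrast concrete, here is a minimal sketch of the two selection rules over an explicit set of candidate structures. The names `score`, `entails`, and `tested_constraints` are placeholders introduced for this sketch; they do not refer to any particular library or to the paper's implementation.

```python
# Illustrative contrast between the two learning paradigms, assuming the
# candidate DAGs are given explicitly.  All callables are placeholders.

def score_based_select(candidates, score, data):
    """Search and score: return the candidate G that maximizes S(G, d)."""
    return max(candidates, key=lambda G: score(G, data))


def constraint_based_select(candidates, entails, tested_constraints):
    """Constraint-based: return a candidate G that entails every
    statistically significant (in)dependence found by the tests."""
    for G in candidates:
        if all(entails(G, c) for c in tested_constraints):
            return G
    return None
```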

We propose a hybrid criterion for learning Bayes net structures that combines search based on a scoring function S with information from statistical tests: search for a structure G that maximizes the score S, subject to the constraint that the structure entails the observed dependencies. We rely on the statistical tests only to accept conditional dependencies, not conditional independencies. We show how to adapt local Bayes net search algorithms to accommodate the observed dependencies. Simulation studies with GES (Greedy Equivalence Search) and the BDeu scoring function provide evidence that the additional dependency information leads to improved learned structures at small to medium sample sizes (e.g. $< 10^4$ data points with 10 variables).
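As a hedged illustration of the hybrid criterion, the sketch below selects, among a set of candidate DAGs, the best-scoring structure that entails all observed dependencies. DAGs are represented as dicts mapping each node to its set of parents, an observed dependency is a triple (x, y, z) meaning "x and y are dependent given conditioning set z", and entailment is checked with a standard ancestral moral-graph d-connection test. The names (`hybrid_select`, `score`, etc.) are placeholders; this is not the GES/BDeu adaptation described in the talk.

```python
from itertools import combinations


def _ancestral_set(dag, nodes):
    """Return `nodes` plus all of their ancestors.
    `dag` maps each node to the set of its parents."""
    result, stack = set(), list(nodes)
    while stack:
        n = stack.pop()
        if n not in result:
            result.add(n)
            stack.extend(dag[n])
    return result


def d_connected(dag, x, y, z):
    """True iff x and y are d-connected given the conditioning set z,
    i.e. the DAG does NOT entail the independence x _||_ y | z.
    Uses the ancestral moral-graph construction."""
    keep = _ancestral_set(dag, {x, y} | set(z))
    # Moralize the ancestral subgraph: link each node to its parents and
    # link co-parents of a common child, dropping edge directions.
    adj = {n: set() for n in keep}
    for child in keep:
        parents = list(dag[child])
        for p in parents:
            adj[child].add(p)
            adj[p].add(child)
        for p, q in combinations(parents, 2):
            adj[p].add(q)
            adj[q].add(p)
    # x and y are d-connected given z iff some path in the moral graph
    # links them without passing through z.
    blocked, seen, stack = set(z), {x}, [x]
    while stack:
        n = stack.pop()
        if n == y:
            return True
        for m in adj[n]:
            if m not in seen and m not in blocked:
                seen.add(m)
                stack.append(m)
    return False


def hybrid_select(candidates, score, data, dependencies):
    """Hybrid criterion: among candidates that entail every observed
    dependency (x, y, z), return the one maximizing the score S(G, d)."""
    feasible = [G for G in candidates
                if all(d_connected(G, x, y, z) for x, y, z in dependencies)]
    return max(feasible, key=lambda G: score(G, data)) if feasible else None


# Small usage check: a chain x -> y -> z entails "x dependent on z given {}"
# but not "x dependent on z given {y}".
chain = {"x": set(), "y": {"x"}, "z": {"y"}}
assert d_connected(chain, "x", "z", set())
assert not d_connected(chain, "x", "z", {"y"})
```

In this formulation, a local search such as greedy hill-climbing can be constrained by filtering candidate moves through the same entailment check, which is one simple way to accommodate the observed dependencies.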

Joint work with Wei Luo (Simon Fraser University) and Russ Greiner (University of Alberta)
