UBC CS TR-2009-21 Summary

Tradeoffs in the Empirical Evaluation of Competing Algorithm Designs, October 20, 2009. Frank Hutter, Holger H. Hoos, and Kevin Leyton-Brown, 27 pages.

We propose an empirical analysis approach for characterizing tradeoffs between different methods for comparing a set of competing algorithm designs. Our approach can provide insight into performance variation both across candidate algorithms and across problem instances. It can also identify the best tradeoff between evaluating a larger number of candidate algorithm designs, performing these evaluations on a larger number of problem instances, and allocating more time to each algorithm run. We apply our approach to a study of the rich algorithm design spaces offered by three highly parameterized, state-of-the-art algorithms for satisfiability and mixed integer programming, considering six different distributions of problem instances. We demonstrate that the resulting algorithm design scenarios differ in many ways, with important consequences for both automatic and manual algorithm design. We expect that both our methods and our findings will lead to tangible improvements in algorithm design methods.
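The central tradeoff described above, dividing a fixed evaluation budget among the number of candidate designs, the number of problem instances, and the per-run time cutoff, can be made concrete with a small sketch. The code below is not taken from the report; the budget value, the candidate counts, and the allocations helper are hypothetical, and it only illustrates how these three quantities compete for a fixed amount of CPU time.

    from itertools import product

    # Hypothetical total CPU budget (in seconds) for the whole comparison.
    TOTAL_BUDGET = 3600.0

    def allocations(design_counts, instance_counts, total=TOTAL_BUDGET):
        """Enumerate (designs, instances, per-run cutoff) splits of a fixed budget."""
        for n_designs, n_instances in product(design_counts, instance_counts):
            runs = n_designs * n_instances   # one run per (design, instance) pair
            cutoff = total / runs            # time remaining for each individual run
            yield n_designs, n_instances, cutoff

    for d, i, c in allocations([10, 100, 1000], [5, 50]):
        print(f"{d:5d} designs x {i:3d} instances -> {c:8.2f} s cutoff per run")

Under these assumed numbers, evaluating a thousand designs on fifty instances leaves well under a tenth of a second per run, while ten designs on five instances allows more than a minute each; choosing among such splits is the kind of decision the report's analysis is meant to inform.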