CPSC 536H: Empirical Algorithmics (Spring 2012)
Notes by Holger H. Hoos, University of British Columbia
----------------------------------------------------
Introduction
----------------------------------------------------
0.1 Some motivating examples
Example 1:
You have just developed a new algorithm A that, given historical weather data,
predicts whether it will rain tomorrow.
You believe A is better than any existing algorithm for this problem.
Questions:
- How do you show the superiority of your new algorithm? [analysis]
- If your algorithm is not superior (yet), how can you further improve its performance? [design]
Example 2:
You have implemented several heuristic algorithms for solving an airline crew
scheduling problem as efficiently as possible.
You observe that which algorithm performs best appears to vary considerably between
different problem instances.
Questions:
- How do you determine which algorithm performs best for which type of problem instance
and why? [analysis]
- How could you select (what will likely be) the best algorithm for a given instance
that needs to be solved? [design]
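The per-instance selection idea in Example 2 can be sketched in a few lines of Python. Everything here is hypothetical (the two solvers, the size-based feature, and the threshold are illustrative stand-ins); a real selector would use instance features and a crossover point determined empirically on training instances:

```python
def exact_solver(instance):
    """Hypothetical algorithm A: imagined to perform best on small instances."""
    return min(instance)

def heuristic_solver(instance):
    """Hypothetical algorithm B: imagined to perform best on large instances."""
    return min(instance)

def select_algorithm(instance, threshold=1000):
    # The threshold stands in for a crossover point that would be
    # determined empirically from runs on training instances.
    return exact_solver if len(instance) < threshold else heuristic_solver

solver = select_algorithm(list(range(50)))  # small instance -> exact_solver
```

In practice the selection rule is learned from data (see Module 9, Automated Algorithm Selection), but the structure is the same: cheap features in, choice of algorithm out.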
Example 3:
You have implemented a sophisticated algorithm for recognising various types of cancer
based on biomedical diagnostics for a cell sample.
This algorithm is trained on expert-labelled data, which is difficult to obtain,
and you want to achieve good performance with a minimal amount of training data.
Question:
How do you determine at which point further training is no longer necessary or desirable?
Note:
- In all of these cases, providing the answers to the respective questions will very likely
heavily rely on empirical methods.
- In particular, we need to resort to computational experiments + statistical analysis techniques,
and to automated algorithm design methods.
Exercise (in groups): Think about questions from your area of interest/expertise that may require empirical studies.
[slides: laughlin interview]
---
0.2 CS as an empirical science
The Three Pillars of CS:
- Theory: deals with abstract models and their properties
(“eternal truths”)
- Engineering: deals with principled design of artifacts
(hardware, systems, algorithms, interfaces)
- (Empirical) Science: deals with principled study of phenomena
(behaviour of hardware, systems, algorithms; interactions)
Note:
- CS has strong roots in Math (-> theory) and Engineering (hardware design + implementation).
- Properties of artifacts in computing are usually studied by means of theoretical analysis
and/or (more or less) systematic testing.
- Many hardware and software artifacts are so complex that they cannot be analysed by theoretical means
or systematic testing.
What is science?
[slide:]
Definition of “science” (according to the Merriam-Webster Unabridged Dictionary):
“3a: knowledge or a system of knowledge covering general truths
or the operation of general laws especially as obtained and tested
through scientific method”
[slide:]
The Scientific Method:
make observations
formulate hypothesis/hypotheses (model/theory)
While not satisfied (and deadline not exceeded), iterate:
1. design experiment to challenge model
2. conduct experiment
3. analyse experimental results
4. revise model based on results
Note:
- Hypotheses are often obtained through bold (and often incorrect) generalisation.
- Formulation and revision of hypotheses is a creative task, as is (to some extent) design of experiments.
- Experiments must be capable of producing outputs that invalidate the model.
Exercise (in groups): How does the scientific method apply to computing?
Results:
- study of the behaviour of complex algorithms
- study of properties / behaviour of complex software / hardware components / systems
- study of interactions between systems and their users
- study of hardness of certain computational problems
- development and study of engineering principles (algorithms, software, hardware)
- others??
---
0.3 Empirical algorithmics
Goals of empirical algorithmics: [analysis goals]
- Show that given algorithm A has property P.
- Show that given algorithm A is better than some other algorithm B.
- Show that given algorithm A improves state of the art (in solving a given problem).
[Examples - ask students]
Furthermore: [design goals]
- Design better algorithms (using empirical methods)
- Improve existing algorithms (using empirical techniques, e.g., for parameter optimisation)
Note:
- empirical methods play an important role in algorithm design
- empirical analysis methods provide basis for properly assessing designs,
including intermediate designs [ask students: why is this important?]
- principled design methods are preferable over ad-hoc methods
(see also: algorithm engineering)
- empirical algorithmics also deals with automated algorithm design methods
Types of empirical algorithm studies [based on Johnson, 2001]:
- horse race: main interest is establishing superiority of an algorithm / algorithmic idea
goal: demonstrate superiority on given inputs (benchmarks)
- application study: main interest is in applying algorithm in real application,
goal: demonstrate impact of algorithm in context of application
[note: includes experimental mathematics where application = generation and proof
of conjectures]
- experimental analysis: main interest is in understanding behaviour of an algorithm
goal: better understand strengths, weaknesses, operation of given algorithm
special case: characterising (average-case) behaviour of algorithm where theoretical
analysis is too hard / imprecise
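A minimal, hypothetical example of a horse race: comparing the classic bin-packing heuristics first-fit and first-fit-decreasing by counting per-instance wins on randomly generated benchmark instances (the instance distribution and sample size are arbitrary choices for illustration):

```python
import random

def first_fit(items, cap=1.0):
    """Place each item into the first bin with room; open a new bin if none fits."""
    bins = []
    for x in items:
        for b in bins:
            if sum(b) + x <= cap:
                b.append(x)
                break
        else:
            bins.append([x])
    return bins

def first_fit_decreasing(items, cap=1.0):
    """First-fit applied to items sorted into decreasing order."""
    return first_fit(sorted(items, reverse=True), cap)

# Toy horse race: per-instance wins over 100 random benchmark instances.
random.seed(0)
wins = {"FF": 0, "FFD": 0, "tie": 0}
for _ in range(100):
    items = [random.uniform(0.05, 0.7) for _ in range(50)]
    a, b = len(first_fit(items)), len(first_fit_decreasing(items))
    wins["tie" if a == b else ("FFD" if b < a else "FF")] += 1
print(wins)
```

Note how such a race only demonstrates superiority on the chosen inputs; whether these benchmarks are representative is exactly one of the issues discussed next.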
Issues and problems arising in empirical algorithmics:
[Ask students]
- algorithm implementation (correctness, fairness)
- parameter settings (fairness in tuning)
- selection of problem instances (benchmarks)
- performance criteria (what is measured?)
- experimental protocol
- data analysis & interpretation
...
How is empirical algorithmics different from other empirical sciences?
[Ask students]
1. access to lowest, ultimate level of reality = precise and complete mathematical description of object under study
2. complete and precise control over the object of study and experimental environment
3. experiments are often (but not always) relatively cheap
4. discrete (vs. continuous) behaviour with observable consequences (discretisation effects)
Consequences:
2 -> perfect reproducibility of experiments (no uncertainty, no noise);
typically no problems due to unknown/uncontrollable factors;
instrumentation is typically relatively easy and very flexible
3 -> statistical significance of results is often easier to achieve (by means of large sample sizes)
Note:
- large amounts of data often easy to generate - but this is not always beneficial, can cause problems
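Consequences 2 and 3 can be made concrete with a small sketch. Here a seeded random number generator stands in for a randomized algorithm under study (the function and its score distribution are invented for illustration): fixing the seed makes the "experiment" exactly reproducible, and cheap runs make a large sample, and hence a tight confidence interval, easy to obtain:

```python
import random
import statistics

def randomized_trial(seed):
    """Hypothetical randomized algorithm: returns a noisy performance score."""
    rng = random.Random(seed)          # full control over the only source of noise
    return 10.0 + rng.gauss(0.0, 1.0)  # stand-in for one measured run

# Consequence 2: with the seed fixed, the run is perfectly reproducible.
assert randomized_trial(42) == randomized_trial(42)

# Consequence 3: experiments are cheap, so large samples are easy;
# the standard error of the mean shrinks with the square root of n.
scores = [randomized_trial(s) for s in range(10000)]
mean = statistics.fmean(scores)
stderr = statistics.stdev(scores) / len(scores) ** 0.5
print(f"mean = {mean:.3f} +/- {1.96 * stderr:.3f} (95% CI)")
```

The flip side, as noted above, is that very large samples can make even practically irrelevant differences statistically significant.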
Criteria for empirical studies (shared with other empirical sciences):
- reproducibility
- comparability
- significance (statistical question)
- relevance (meaning, depends on context: application, community, bigger question, ...)
Two types of theory:
1. From first principles:
Unlike most other systems, algorithms can be studied purely based on their precise and complete mathematical description;
in practice, they are often too complex for this to succeed.
(In particular, high-performance heuristic algorithms for hard computational problems.)
2. From empirical observation (as in other empirical sciences)
The theory/practice gap:
For many problems, there is a considerable gap between
- the best provable performance guarantees obtained for any known algorithm and
- the performance observed in practice for the best known algorithms
(consider, e.g., approximation guarantees for optimisation algorithms
or bounds on time-complexity for decision algorithms)
Note:
- the algorithms mentioned above are often very different [why?]
Empirically derived theory can help to explore that gap and to close it by generating new insights.
Theoretical vs empirical analysis:
- theoretical results can inform empirical studies: suggest properties, effects
- empirical studies can guide computational theory: generate some hypotheses that can then be proven theoretically
Note:
- Computational theory and theoretical analysis techniques are important - know them and use them!
---
0.4 Course overview:
[see also course web page]
Module 1: General considerations for empirical analysis
Module 2: Deterministic Decision Procedures
Module 3: Randomised Decision Procedures
Module 4: Decision Procedures with Error
Module 5: Optimisation Procedures
Module 6: General considerations for empirical design methods
Module 7: Automated Parameter Tuning and Algorithm Configuration
Module 8: Restart Strategies and Multiple Independent Runs
Module 9: Automated Algorithm Selection
Module 10: Algorithm Portfolios
(each module ~1 week)
Format:
- ~2/3 lecture style, ~1/3 discussion-based
- paper review: one per student; written report and 15min presentation in class
(after mid-term break)
- course projects: mostly during second half of term;
includes project proposal, progress report, final report, presentation at mini-workshop (end of April)
Student assessment:
- assignments: ~20%
- paper presentation+review: ~25%
- project (proposal, final report, talk): ~45%
- in-class participation: ~10%
Note:
- New structure / material; parts will likely be a bit rough
- Tell me what you like / dislike
- Let me know when I'm too fast / too slow
- Tell me right away when you don't understand something
- Ask questions, contribute your comments and ideas
Questions?
Comments?
---
learning goals (for module 0):
- be able to explain the three pillars of CS
- understand the scientific method and how it applies to computing science in general, and to the analysis of algorithms in particular
- be able to explain similarities and differences between empirical algorithmics and other empirical sciences
- be able to explain goals of the empirical analysis of algorithms and types of empirical studies
- be able to explain issues/problems arising in the empirical analysis of algorithms
- also: understand how the course is structured and how students are evaluated