
Solution to part (b)

Draw a decision tree that the top-down myopic decision-tree learning algorithm could build. For each node (including the leaves), show which examples are used to determine the classification at that node.

Suppose you split on attribute a first. There are 2 examples with attribute a true, namely e1 and e5, and these examples agree on the value of d. There are 3 examples with attribute a false, namely e2, e3, and e4, and these do not agree on the value of d.

Next you have to choose an attribute on which to split the examples {e2, e3, e4}. b is a good choice: the examples with b true all agree on the value of d, and the examples with b false all agree on the value of d, so both branches become leaves.
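The greedy procedure just described can be sketched in code. The dataset below is hypothetical, chosen only to be consistent with the text: e1 and e5 have a true and agree on d, while among {e2, e3, e4} (a false) attribute b separates the examples into pure groups. The particular truth values of b and d are assumptions for illustration.

```python
# Hypothetical examples consistent with the description above
# (the actual values of b and d in the exercise may differ).
examples = {
    "e1": {"a": True,  "b": True,  "d": True},
    "e2": {"a": False, "b": True,  "d": False},
    "e3": {"a": False, "b": True,  "d": False},
    "e4": {"a": False, "b": False, "d": True},
    "e5": {"a": True,  "b": False, "d": True},
}

def build_tree(names, attrs, target="d"):
    """Myopic top-down learner: if all examples at a node agree on the
    target, return a leaf; otherwise greedily split on the attribute
    whose branches have the fewest disagreements, and recurse."""
    vals = {examples[n][target] for n in names}
    if len(vals) == 1:
        # Pure leaf: record the classification and the examples used.
        return (vals.pop(), sorted(names))

    def impurity(a):
        # Count branches whose examples still disagree on the target.
        bad = 0
        for v in (True, False):
            branch = {examples[n][target] for n in names
                      if examples[n][a] == v}
            bad += len(branch) > 1
        return bad

    best = min(attrs, key=impurity)
    rest = [a for a in attrs if a != best]
    true_names = [n for n in names if examples[n][best]]
    false_names = [n for n in names if not examples[n][best]]
    # Internal node: (attribute, examples here, true subtree, false subtree)
    return (best, sorted(names),
            build_tree(true_names, rest),
            build_tree(false_names, rest))

tree = build_tree(list(examples), ["a", "b"])
```

On this data the learner splits on a at the root, makes the a-true branch a leaf from {e1, e5}, and splits the a-false branch {e2, e3, e4} on b, matching the tree described in the text.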

Here is the resultant decision tree showing which examples are used at each node:

[Figure: decision tree with a at the root; the a-true branch is a leaf built from {e1, e5}, and the a-false branch splits on b over {e2, e3, e4}.]

Computational Intelligence online material, ©David Poole, Alan Mackworth and Randy Goebel, 1998
