Compute a two-level decision tree using the greedy approach described in this chapter. Use the classification error rate as the criterion for splitting. What is the overall error rate of the induced tree?

Consider the following set of training examples (200 examples in total, 100 from each class):

X  Y  Z | Class C1 | Class C2
0  0  0 |     5    |    40
0  0  1 |     0    |    15
0  1  0 |    10    |     5
0  1  1 |    45    |     0
1  0  0 |    10    |     5
1  0  1 |    25    |     0
1  1  0 |     5    |    20
1  1  1 |     0    |    15
Splitting Attribute at Level 1.



To determine the test condition at the root node, we need to compute the error rates for attributes X, Y, and Z. Splitting on attribute X, each branch predicts its majority class and therefore misclassifies the minority-class examples, 60 in one branch and 40 in the other. Therefore, the error rate using attribute X is (60 + 40)/200 = 0.5.



For attribute Y, the two branches misclassify 40 and 40 examples, respectively. Therefore, the error rate using attribute Y is (40 + 40)/200 = 0.4.



For attribute Z, the two branches misclassify 30 and 30 examples, respectively. Therefore, the error rate using attribute Z is (30 + 30)/200 = 0.3.



Since Z gives the lowest error rate, it is chosen as the splitting attribute at level 1.
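The level-1 comparison above can be sketched in Python. The helper name `split_error_rate` is ours, and the per-branch class counts are an assumption: the minority (misclassified) counts per branch match the error rates stated above, while the majority counts assume 100 examples of each class.

```python
def split_error_rate(branch_counts):
    """Classification error of a split: each branch predicts its
    majority class, so its errors are the minority-class count."""
    total = sum(sum(branch) for branch in branch_counts)
    errors = sum(sum(branch) - max(branch) for branch in branch_counts)
    return errors / total

# branch_counts[v] = (C1 count, C2 count) in the branch where attribute = v.
# Assumed counts, consistent with the error rates derived above.
counts = {
    "X": [(60, 60), (40, 40)],  # error (60 + 40)/200 = 0.5
    "Y": [(40, 60), (60, 40)],  # error (40 + 40)/200 = 0.4
    "Z": [(30, 70), (70, 30)],  # error (30 + 30)/200 = 0.3
}

for attr, branch_counts in counts.items():
    print(attr, split_error_rate(branch_counts))

# Greedy choice: the attribute with the lowest error rate.
best = min(counts, key=lambda a: split_error_rate(counts[a]))
print("splitting attribute at level 1:", best)  # Z
```

The same function can then be reapplied within each branch of the Z split to pick the level-2 test conditions.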



Computer Science & Information Technology
