
Decision Tree

Introduction

 A decision tree is a powerful tool for prediction and classification.


 A given set of objects, described by their attributes, is used to determine the class of new objects.
 It represents rules that can be understood by humans and used in knowledge systems such as databases.
Key requirements
 Attribute-value description: each object must be expressible in terms of a fixed set of properties or attributes.

 Predefined classes: the target function has discrete output values.

 Sufficient data: enough training objects must be provided to learn the model.
Structure

 A decision tree is a classifier in the form of a tree structure.


 Decision node: specifies a test on a single attribute.
 Leaf node: indicates the value of the target attribute.
 Arc/edge: represents one outcome of the test, i.e. one branch of the split on that attribute.
 Path: a conjunction of attribute tests leading from the root to a leaf, which yields the final decision; the tree as a whole represents a disjunction of such paths (see the sketch below).
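To make this terminology concrete, here is a minimal Python sketch (not from the original notes). A decision node is a dictionary holding the attribute it tests and one branch per attribute value; a leaf is simply a class label; classification follows one path of tests from the root to a leaf. The example tree is the well-known play-tennis tree, consistent with the information-gain calculations later in these notes.

```python
# A decision node is a dict: {"attribute": ..., "branches": {value: subtree}}.
# A leaf node is just a class label (the value of the target attribute).
play_tennis_tree = {
    "attribute": "Outlook",
    "branches": {
        "Sunny": {
            "attribute": "Humidity",
            "branches": {"High": "No", "Normal": "Yes"},
        },
        "Overcast": "Yes",
        "Rain": {
            "attribute": "Wind",
            "branches": {"Strong": "No", "Weak": "Yes"},
        },
    },
}

def classify(tree, example):
    """Follow one root-to-leaf path: each decision node tests a single
    attribute, each arc is one outcome of that test, the leaf gives the answer."""
    while isinstance(tree, dict):              # stop when a leaf (plain label) is reached
        value = example[tree["attribute"]]     # test on a single attribute
        tree = tree["branches"][value]         # follow the matching arc
    return tree

print(classify(play_tennis_tree,
               {"Outlook": "Sunny", "Humidity": "Normal", "Wind": "Weak"}))  # -> "Yes"
```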
Structure (continued)
Decision system
Entropy computation
Information Gain computation
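For reference, the calculations that follow use the standard ID3 definitions of entropy and information gain, where p_i is the proportion of examples in S that belong to class i and S_v is the subset of S for which attribute A has value v:

\mathrm{Entropy}(S) = -\sum_{i} p_i \log_2 p_i

\mathrm{Gain}(S, A) = \mathrm{Entropy}(S) - \sum_{v \in \mathrm{Values}(A)} \frac{|S_v|}{|S|}\,\mathrm{Entropy}(S_v)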
Information Gain (continued)
 Entropy(S_Weak) = -6/8 log2(6/8) - 2/8 log2(2/8)
                 = -0.75 * (-0.41503749927) - 0.25 * (-2)
                 = 0.311278124 + 0.50
                 = 0.811

 Entropy(S_Strong) = -3/6 log2(3/6) - 3/6 log2(3/6)
                   = -0.5 * (-1) - 0.5 * (-1)
                   = 1
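As a quick check, a small Python helper (a sketch, not part of the original notes) reproduces both values directly from the class counts, treating 0 * log2(0) as 0:

```python
from math import log2

def entropy(counts):
    """Entropy of a class distribution given as a list of class counts."""
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c > 0)

print(round(entropy([6, 2]), 3))  # S_Weak:   6 Yes, 2 No -> 0.811
print(round(entropy([3, 3]), 3))  # S_Strong: 3 Yes, 3 No -> 1.0
```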
Information Gain (continued)

Similarly, computing the gain for each attribute:

Gain(S, Outlook) = 0.246


Gain(S, Humidity) = 0.151
Gain(S, Wind) = 0.048
Gain(S, Temperature) = 0.029
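Each gain value combines the entropy of the full set with the weighted entropies of the subsets produced by the split. As a sketch (assuming the standard 14-example play-tennis data with 9 Yes and 5 No, which is where the counts above come from), the Wind figure can be reproduced by extending the entropy helper from the previous snippet:

```python
def information_gain(parent_counts, subset_counts):
    """Gain(S, A) = Entropy(S) - sum over values v of |S_v|/|S| * Entropy(S_v)."""
    total = sum(parent_counts)
    weighted = sum(sum(c) / total * entropy(c) for c in subset_counts)
    return entropy(parent_counts) - weighted

# Wind splits the 14 examples into Weak (6 Yes, 2 No) and Strong (3 Yes, 3 No).
print(round(information_gain([9, 5], [[6, 2], [3, 3]]), 3))  # -> 0.048
```

Outlook has the largest gain, so it becomes the root of the tree; the next step examines the examples with Outlook = Sunny.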
Information Gain (continued)
 Entropy(S_Sunny) = -2/5 log2(2/5) - 3/5 log2(3/5)
                  = -2/5 * (-1.32192) - 3/5 * (-0.73696)
                  = 0.5288 + 0.4422
                  = 0.971

Humidity:
 Entropy(S_Sunny,High) = -0/3 log2(0/3) - 3/3 log2(3/3)
                       = 0
 Entropy(S_Sunny,Normal) = -2/2 log2(2/2) - 0/2 log2(0/2)
                         = 0

Temperature:
 Entropy(S_Sunny,Hot) = 0
 Entropy(S_Sunny,Mild) = -1/2 log2(1/2) - 1/2 log2(1/2)
                       = 1/2 + 1/2 = 1
 Entropy(S_Sunny,Cool) = 0

Wind:
 Entropy(S_Sunny,Weak) = -1/3 log2(1/3) - 2/3 log2(2/3)
                       = -1/3 * (-1.58496) - 2/3 * (-0.58496)
                       = 0.5283 + 0.3900
                       = 0.918
 Entropy(S_Sunny,Strong) = -1/2 log2(1/2) - 1/2 log2(1/2)
                         = 1
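Combining these subset entropies with Entropy(S_Sunny) ≈ 0.971 (2 Yes, 3 No) gives the gain of each remaining attribute inside the Sunny branch. A short continuation of the earlier sketch (reusing the entropy and information_gain helpers defined above) shows that Humidity has the highest gain and is therefore chosen for the next split:

```python
sunny = [2, 3]  # class counts (Yes, No) for the examples with Outlook = Sunny
print(round(information_gain(sunny, [[0, 3], [2, 0]]), 3))          # Humidity:    0.971
print(round(information_gain(sunny, [[0, 2], [1, 1], [1, 0]]), 3))  # Temperature: 0.571
print(round(information_gain(sunny, [[1, 2], [1, 1]]), 3))          # Wind:        0.02
```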
Weakness
 Decision trees perform poorly when there are many classes and relatively few training examples.
 They are computationally expensive to train.
Tutorial
Solution
Tutorial
 Classify these new examples as Oak or Pine using your decision tree above.
 i) [Density=Light, Grain=Small, Hardness=Hard]  Ans: Pine
 ii) [Density=Light, Grain=Small, Hardness=Soft]  Ans: Oak
THANK YOU
