Decision Tree

The document discusses different decision tree algorithms, including ID3, CART, the Chi-Square decision tree, and the Reduction in Variance decision tree. ID3 is used to generate a decision tree from a dataset. CART is commonly used in machine learning and can be learned from training data and then used to make predictions. Chi-square splitting allows two or more splits for categorical targets. Reduction in Variance splits nodes for continuous targets using variance as the measure of homogeneity.

ID3 decision tree

In decision tree learning, ID3 (Iterative Dichotomiser 3) is an algorithm invented by Ross Quinlan used to generate a decision tree from a dataset. ID3 is the precursor to the C4.5 algorithm, and is typically used in the machine learning and natural language processing domains.
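
At each node, ID3 chooses the attribute with the highest information gain, i.e. the entropy of the parent node minus the size-weighted entropy of the child nodes produced by the split. Below is a minimal sketch of that calculation in Python; the toy weather-style rows and the helper names (entropy, information_gain) are illustrative choices, not taken from the original algorithm description.

import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((count / total) * math.log2(count / total)
                for count in Counter(labels).values())

def information_gain(rows, labels, attribute_index):
    """Parent entropy minus the size-weighted entropy of the subsets
    produced by splitting on the given attribute."""
    total = len(labels)
    subsets = {}
    for row, label in zip(rows, labels):
        subsets.setdefault(row[attribute_index], []).append(label)
    weighted = sum(len(subset) / total * entropy(subset)
                   for subset in subsets.values())
    return entropy(labels) - weighted

# Hypothetical data: [outlook, windy] -> play?
rows = [["sunny", "no"], ["sunny", "yes"], ["overcast", "no"],
        ["rain", "no"], ["rain", "yes"]]
labels = ["no", "no", "yes", "yes", "no"]
print(information_gain(rows, labels, 0))  # gain from splitting on outlook
print(information_gain(rows, labels, 1))  # gain from splitting on windy

ID3 would split on whichever attribute yields the larger gain and repeat the process recursively on each child node.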

CART
CART (Classification and Regression Trees) is a decision tree algorithm commonly used in machine learning. The main points to understand are:
● The many names used to describe the CART algorithm for machine learning.
● The representation used by a learned CART model, which is what is actually stored on disk.
● How a CART model can be learned from training data.
● How a learned CART model can be used to make predictions on unseen data (a short sketch of learning and prediction follows this list).
● Additional resources for learning more about CART and related algorithms.
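
As a rough illustration of the learn-then-predict workflow above, the sketch below uses scikit-learn's DecisionTreeClassifier, which implements an optimized CART-style algorithm with Gini impurity as the default split criterion; the Iris dataset and the chosen hyperparameters (max_depth=3) are assumptions made only for demonstration.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Gini impurity is the default split criterion, as in classic CART.
model = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0)
model.fit(X_train, y_train)        # learn the binary tree from training data
print(model.predict(X_test[:5]))   # predictions on unseen data
print(model.score(X_test, y_test)) # accuracy on the held-out set

The learned model is a binary tree: prediction simply walks from the root to a leaf, following the split test at each internal node.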

Chi-Square Decision Tree


Chi-square is another method of splitting nodes in a decision tree for datasets with categorical target values. It can produce two or more splits, and it works on the statistical significance of the differences between the parent node and the child nodes.
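
A minimal sketch of scoring one candidate categorical split with the chi-square test of independence, assuming SciPy is available; the contingency table below (a hypothetical "owns_house" split against a yes/no target) is made up purely for illustration.

import numpy as np
from scipy.stats import chi2_contingency

# Contingency table: rows = candidate child nodes, columns = target classes.
observed = np.array([
    [30, 10],   # child node 1: 30 "yes", 10 "no"
    [15, 45],   # child node 2: 15 "yes", 45 "no"
])

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, p-value = {p_value:.4f}")
# A larger chi-square (smaller p-value) means the child-node class
# distributions differ more from the parent's, i.e. a better split.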

Reduction in Variance decision tree


Reduction in Variance is a method for splitting a node used when the target variable is continuous, i.e., in regression problems. It is so called because it uses variance as the measure for deciding the feature on which a node is split into child nodes. Variance is used for calculating the homogeneity of a node.
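
A minimal sketch of the criterion, assuming NumPy: the split score is the parent's variance minus the size-weighted variance of the children, and the target values and split point below are invented for illustration.

import numpy as np

def variance_reduction(y_parent, y_left, y_right):
    """Parent variance minus the size-weighted variance of the children.
    A larger value means the split yields more homogeneous child nodes."""
    n = len(y_parent)
    weighted_child_var = (len(y_left) / n) * np.var(y_left) + \
                         (len(y_right) / n) * np.var(y_right)
    return np.var(y_parent) - weighted_child_var

# Hypothetical continuous target, split at an arbitrary threshold.
y = np.array([3.0, 3.2, 2.9, 8.1, 7.8, 8.4])
left, right = y[:3], y[3:]   # candidate child nodes after the split
print(variance_reduction(y, left, right))

The split (feature and threshold) with the largest variance reduction is chosen, mirroring how information gain is used for categorical targets.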
