
Random Forest Algorithm

Lec. Assist.: Ahmed Yousry


What is Random Forest?
❑ Random forest is a supervised learning algorithm.
❑ The "forest" it builds is an ensemble of decision trees, usually
trained with the "bagging" method.
❑ The general idea of the bagging method is that a combination
of learning models produces a better overall result than any single model (see the sketch below).
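A minimal sketch of the bagging idea: train several trees on bootstrap samples of the training data and combine their predictions by majority vote. The dataset, the number of trees, and the sample sizes here are illustrative assumptions, not part of the lecture.

```python
# Bagging sketch: each tree sees a different bootstrap sample;
# the ensemble prediction is a majority vote over all trees.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Illustrative toy data (binary classification).
X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

n_trees = 25  # illustrative choice
rng = np.random.default_rng(0)
trees = []
for _ in range(n_trees):
    # Bootstrap: sample training rows with replacement.
    idx = rng.integers(0, len(X_train), len(X_train))
    trees.append(DecisionTreeClassifier().fit(X_train[idx], y_train[idx]))

# Majority vote: labels are 0/1, so a mean above 0.5 is a vote for class 1.
votes = np.array([t.predict(X_test) for t in trees])
y_pred = (votes.mean(axis=0) > 0.5).astype(int)
print("bagged accuracy:", (y_pred == y_test).mean())
```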
How it works
❑ Put simply: random forest builds multiple decision trees and
merges their predictions to obtain a more accurate and stable result.
❑ One big advantage of random forest is that it can be used for both
classification and regression problems, which make up the majority of
current machine learning tasks (see the example below).
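A short example of this versatility, using scikit-learn's ready-made implementations. The toy datasets and `n_estimators=100` are illustrative choices.

```python
# The same algorithm handles both problem types:
# RandomForestClassifier for labels, RandomForestRegressor for real values.
from sklearn.datasets import make_classification, make_regression
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Classification: report mean cross-validated accuracy.
Xc, yc = make_classification(n_samples=300, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("classification accuracy:", cross_val_score(clf, Xc, yc, cv=5).mean())

# Regression: report mean cross-validated R^2.
Xr, yr = make_regression(n_samples=300, noise=10.0, random_state=0)
reg = RandomForestRegressor(n_estimators=100, random_state=0)
print("regression R^2:", cross_val_score(reg, Xr, yr, cv=5).mean())
```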
Example of two decision trees
Difference between DT and RF
❑ The random forest algorithm randomly selects observations and
features to build several decision trees, and then averages their results.
❑ Another difference is that "deep" decision trees might suffer from
overfitting.
❑ Most of the time, random forest prevents this by creating random
subsets of the features and building smaller trees from those subsets (see the sketch after this list).
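A sketch of that difference: a single fully grown tree memorizes the training set, while a forest built on random feature subsets generalizes better. The dataset shape, tree count, and split sizes are illustrative assumptions.

```python
# Compare one deep decision tree against a random forest
# on the same train/test split.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Illustrative data: 20 features, only 5 of them informative.
X, y = make_classification(n_samples=600, n_features=20, n_informative=5,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One fully grown tree: perfect on the training data, weaker on test data.
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("tree   train/test:", tree.score(X_train, y_train),
      tree.score(X_test, y_test))

# A forest: each split considers only a random subset of the features
# (max_features="sqrt" is scikit-learn's default for classification).
forest = RandomForestClassifier(n_estimators=200, max_features="sqrt",
                                random_state=0).fit(X_train, y_train)
print("forest train/test:", forest.score(X_train, y_train),
      forest.score(X_test, y_test))
```

The train/test gap is typically much smaller for the forest, which is the overfitting reduction the slide describes.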
Thanks
