
Class Notes

The document outlines various types of regression in machine learning, including Linear, Multiple Linear, Polynomial, Ridge, Lasso, Elastic Net, Logistic, Support Vector, Decision Tree, Random Forest, and Gradient Boosting Regression. Each type is briefly described, in some cases with its equation and an example application. Additionally, it lists practical applications of regression such as stock price prediction and weather forecasting.

Regression Types in Machine Learning – Exam Notes

1. Linear Regression
• Predicts a continuous value using a straight-line relationship.
• Equation: y = mX + c
• Example: Predicting house prices from area.

2. Multiple Linear Regression
• Extension of linear regression with multiple input features.
• Equation: y = b0 + b1x1 + b2x2 + … + bnxn
• Example: Predicting salary using experience, age, education level.

3. Polynomial Regression
• Models nonlinear relationships by adding polynomial terms.
• Example: Predicting growth curves or trends that bend.
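The first three types can be sketched with scikit-learn; the data below is a toy illustration, not from the notes:

```python
# Sketch: linear and polynomial regression with scikit-learn (toy data).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline

# Linear: y = mX + c (house price from area, toy numbers)
area = np.array([[50], [80], [120], [150]])   # square metres
price = np.array([100, 160, 240, 300])        # price = 2 * area here
lin = LinearRegression().fit(area, price)     # lin.coef_ is m, lin.intercept_ is c

# Polynomial: generate x^2 (and higher) terms, then fit a linear model on them
x = np.array([[1], [2], [3], [4]])
y = np.array([1, 4, 9, 16])                   # y = x^2, a curve a straight line cannot follow
poly = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(x, y)
pred = poly.predict([[5]])                    # should land near 25
```

Multiple linear regression is the same `LinearRegression` call with an `X` that has several columns.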
4. Ridge Regression (L2 Regularization)
• Handles multicollinearity by adding an L2 penalty.
• Reduces model complexity and prevents overfitting.

5. Lasso Regression (L1 Regularization)
• Performs feature selection by shrinking some coefficients to zero.
• Useful when many features are irrelevant.

6. Elastic Net Regression
• Combination of Ridge (L2) and Lasso (L1).
• Useful when data has many correlated features.

7. Logistic Regression
• Used for classification (not regression).
• Outputs probability (0–1).
• Example: Spam or not-spam classification.

8. Support Vector Regression (SVR)
• Uses Support Vector Machine concepts for regression.
• Good for high-dimensional data and nonlinear relationships.

9. Decision Tree Regression
• Splits data into regions and fits simple models.
• Easy to interpret but can overfit.

10. Random Forest Regression
• Ensemble of multiple decision trees.
• Reduces overfitting and increases accuracy.

11. Gradient Boosting Regression
• Builds trees sequentially to reduce errors.
• Very powerful for structured/tabular data.

Applications of Regression
• Stock price prediction
• Weather forecasting
• Sales forecasting
• Load prediction in power systems
• Medical risk prediction
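The three regularized variants (points 4–6) differ only in their penalty term; a minimal scikit-learn sketch on synthetic data (invented here for illustration) shows Lasso's feature selection in action:

```python
# Sketch: Ridge (L2), Lasso (L1), and Elastic Net on synthetic data.
import numpy as np
from sklearn.linear_model import Ridge, Lasso, ElasticNet

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
# Only the first two features matter; the other three are irrelevant noise.
y = 3 * X[:, 0] + 2 * X[:, 1] + rng.normal(scale=0.1, size=100)

ridge = Ridge(alpha=1.0).fit(X, y)    # shrinks every coefficient a little
lasso = Lasso(alpha=0.5).fit(X, y)    # drives irrelevant coefficients to exactly zero
enet = ElasticNet(alpha=0.5, l1_ratio=0.5).fit(X, y)  # blend of both penalties

n_zeroed = int(np.sum(lasso.coef_ == 0.0))  # Lasso's built-in feature selection
```

Ridge keeps all five coefficients nonzero, while Lasso zeroes out the three irrelevant ones, which is why the notes call it useful when many features are irrelevant.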
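The tree-based methods (points 9–11) share one interface in scikit-learn; a rough sketch on an invented nonlinear target illustrates the single-tree vs. ensemble distinction:

```python
# Sketch: single decision tree vs. random forest vs. gradient boosting.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)  # nonlinear target

tree = DecisionTreeRegressor(max_depth=3).fit(X, y)    # one tree: interpretable, overfits if grown deep
forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)  # averages many trees
gbr = GradientBoostingRegressor(n_estimators=100).fit(X, y)  # each new tree corrects the previous errors

pred = forest.predict([[0.0]])  # target is sin(0) = 0 plus noise
```

Averaging (random forest) and sequential error correction (gradient boosting) are the two ensemble strategies the notes contrast with a single, overfit-prone tree.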
