Lecture 3

The document discusses various regression techniques in AIML, including Polynomial Regression, Ridge Regression, Lasso Regression, and Elastic Net. Polynomial Regression models non-linear relationships using polynomial terms, while Ridge and Lasso Regression apply different penalties to prevent overfitting in linear models. It also covers Gradient Descent as an optimization method and introduces key regression performance metrics such as MAE, MSE, RMSE, and R2 Score.


AIML

Nitin Arvind Shelke


Polynomial Regression
• Polynomial Regression is a regression algorithm that models the
relationship between a dependent variable (y) and an independent
variable (x) as an nth-degree polynomial. The Polynomial Regression
equation is given below:
• y = a0 + a1x + a2x^2 + a3x^3 + ... + anx^n
• Polynomial Regression is a type of regression that models a non-linear
dataset using a model that is still linear in its coefficients.
• It is also called a special case of Multiple Linear Regression in ML,
because we add polynomial terms to the Multiple Linear Regression
equation to convert it into Polynomial Regression.
Need for Polynomial Regression:
If we apply a linear model to a linear dataset, it gives a good result, as we have seen in Simple
Linear Regression. But if we apply the same model, without any modification, to a non-linear dataset,
the fit will be poor: the loss function will increase, the error rate will be high, and accuracy will
decrease.
So for such cases, where the data points are arranged in a non-linear fashion, we need the Polynomial
Regression model, as in the sketch below.
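As an illustration, here is a minimal sketch of Polynomial Regression with scikit-learn; the synthetic cubic dataset and the choice of degree=3 are illustrative assumptions, not from the lecture.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Synthetic non-linear (cubic) data with noise, purely for illustration.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 100).reshape(-1, 1)
y = 0.5 * x.ravel() ** 3 - x.ravel() + rng.normal(scale=1.0, size=100)

# PolynomialFeatures expands x into [1, x, x^2, x^3]; LinearRegression then
# fits a model that is linear in these expanded features -- exactly the
# "special case of Multiple Linear Regression" described above.
model = make_pipeline(PolynomialFeatures(degree=3), LinearRegression())
model.fit(x, y)
print(model.predict([[2.0]]))  # prediction at x = 2
```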
Ridge Regression vs Lasso Regression
• Ridge and Lasso Regression are two popular techniques in AIML
used for regularizing linear models to avoid overfitting and
improve predictive performance. Both methods add a penalty
term to the model’s cost function to constrain the coefficients, but
they differ in how they apply this penalty.
• Ridge Regression, also known as L2 regularization, adds the
squared magnitude of the coefficients as a penalty. On the other
hand, Lasso Regression, or L1 regularization, introduces a penalty
based on the absolute value of the coefficients.
Ridge Regression
• Ridge regression, also known as L2 regularization, is a technique
used in linear regression to prevent overfitting by adding a penalty
term to the loss function. This penalty is proportional to the
square of the magnitude of the coefficients (weights).
• Objective Function = MSE + λ * Σ βj^2, where λ (Lambda) controls
the strength of the penalty.
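A minimal sketch of Ridge Regression with scikit-learn; the synthetic data and the alpha value (scikit-learn's name for λ) are illustrative assumptions.

```python
from sklearn.linear_model import Ridge
from sklearn.datasets import make_regression

# Synthetic regression data, purely for illustration.
X, y = make_regression(n_samples=200, n_features=10, noise=10.0, random_state=0)

ridge = Ridge(alpha=1.0)  # alpha plays the role of λ in the objective above
ridge.fit(X, y)
print(ridge.coef_)  # coefficients are shrunk toward zero, but not forced to zero
```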
Lasso Regression
• Lasso regression, also known as L1 regularization, is a linear regression
technique that adds a penalty to the loss function to prevent overfitting. This
penalty is based on the absolute values of the coefficients.

Objective Function = MSE + λ * Σ |βj|

• Lasso regression is a version of linear regression that includes a penalty
equal to the absolute value of the coefficient magnitudes. By encouraging
sparsity, this L1 regularization term reduces overfitting and drives some
coefficients to exactly zero, thereby facilitating feature selection.
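A minimal sketch of Lasso Regression with scikit-learn, showing the sparsity described above; the dataset and the alpha value are illustrative assumptions.

```python
from sklearn.linear_model import Lasso
from sklearn.datasets import make_regression

# Synthetic data where only 3 of the 10 features actually matter.
X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=10.0, random_state=0)

lasso = Lasso(alpha=1.0)  # alpha plays the role of λ in the objective above
lasso.fit(X, y)
print(lasso.coef_)  # coefficients of irrelevant features are driven to exactly 0
```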
Lasso vs Ridge

Characteristic | Ridge Regression | Lasso Regression
Penalty Type | L2 (squared magnitude of coefficients) | L1 (absolute magnitude of coefficients)
Coefficient Shrinkage | Shrinks coefficients but doesn’t force them to zero | Can shrink some coefficients to exactly zero
Feature Selection | Does not perform feature selection | Performs automatic feature selection by eliminating irrelevant or redundant features
Model Complexity | Tends to include all features in the model | Can simplify the model by excluding some features
Impact on Prediction | Tends to handle multicollinearity well | Can simplify the model, which might improve prediction for high-dimensional data
Best for | Use when all features are important and you want to regularize the coefficients without removing any of them | Use when feature selection is important and you expect some features to be irrelevant
Elastic Net
• Elastic Net is a regularized regression technique that combines
both the Lasso (L1) and Ridge (L2) penalties. It is particularly useful
when there are correlated features, or when neither Lasso nor Ridge
alone provides optimal results.
• Objective Function = MSE + λ1 * Σ |βj| + λ2 * Σ βj^2
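A minimal sketch of Elastic Net with scikit-learn. Note that scikit-learn parameterizes the penalty with alpha (overall strength) and l1_ratio (the L1/L2 mix) rather than separate λ1 and λ2; the values below are illustrative.

```python
from sklearn.linear_model import ElasticNet
from sklearn.datasets import make_regression

# Synthetic data with a few informative features, purely for illustration.
X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=10.0, random_state=0)

# l1_ratio=0.5 mixes the penalties equally: 1.0 would be pure Lasso, 0.0 pure Ridge.
enet = ElasticNet(alpha=1.0, l1_ratio=0.5)
enet.fit(X, y)
print(enet.coef_)
```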
Gradient Descent for Regression
• Gradient Descent is an optimization algorithm used to minimize
the cost function of a model. For regression problems, the goal is
to find the values of the coefficients (βj) that minimize the cost
function (e.g., Mean Squared Error).
• Each step moves every coefficient in the direction of the negative
gradient: βj := βj − α * ∂J/∂βj, where α is the learning rate and J is
the cost function.
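A minimal NumPy sketch of batch gradient descent minimizing MSE for linear regression; the learning rate, iteration count, and synthetic data are illustrative assumptions.

```python
import numpy as np

# Synthetic linear data with known coefficients, purely for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_beta = np.array([3.0, -2.0])
y = X @ true_beta + rng.normal(scale=0.1, size=100)

beta = np.zeros(2)           # coefficients βj, initialized at zero
alpha, n_iters = 0.1, 1000   # learning rate α and number of steps
n = len(y)
for _ in range(n_iters):
    gradient = (2 / n) * X.T @ (X @ beta - y)  # gradient of MSE w.r.t. β
    beta -= alpha * gradient                   # βj := βj − α * ∂J/∂βj
print(beta)  # should be close to [3.0, -2.0]
```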
Regression Performance Metrics
• Mean Absolute Error (MAE)
• Mean Squared Error (MSE)
• Root Mean Squared Error (RMSE)
• R2 Score, or the coefficient of determination
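A minimal sketch computing all four metrics with scikit-learn; the y_true and y_pred arrays are illustrative values, not data from the lecture.

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

y_true = np.array([3.0, -0.5, 2.0, 7.0])  # illustrative ground-truth values
y_pred = np.array([2.5, 0.0, 2.0, 8.0])   # illustrative model predictions

mae = mean_absolute_error(y_true, y_pred)  # mean of |y − ŷ|
mse = mean_squared_error(y_true, y_pred)   # mean of (y − ŷ)^2
rmse = np.sqrt(mse)                        # square root of MSE
r2 = r2_score(y_true, y_pred)              # 1 − SS_res / SS_tot
print(mae, mse, rmse, r2)
```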
