Bayesian Linear Regression
• Bayesian Linear Regression is a probabilistic approach to linear regression.
• Unlike classical linear regression, which estimates point values for the model parameters (like the slope and intercept), Bayesian regression treats these parameters as random variables and estimates their probability distributions.
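As a concrete illustration of "parameters as random variables", here is a minimal sketch (Python/NumPy). The toy data set, the prior precision `alpha`, and the noise precision `beta` are all assumptions made for illustration; they do not come from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data (assumed for illustration): t = -0.3 + 0.5*x + Gaussian noise
x = rng.uniform(-1.0, 1.0, size=25)
t = -0.3 + 0.5 * x + rng.normal(scale=0.2, size=x.shape)

Phi = np.column_stack([np.ones_like(x), x])   # design matrix: [1, x]
alpha, beta = 2.0, 25.0                       # assumed prior precision, noise precision

# Conjugate Gaussian posterior over w = (intercept, slope): N(m_N, S_N)
S_N = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)
m_N = beta * S_N @ Phi.T @ t

# The parameters are random variables: draw a few plausible (intercept, slope) pairs
samples = rng.multivariate_normal(m_N, S_N, size=5)

print("posterior mean:", m_N)
print("posterior covariance:\n", S_N)
print("posterior samples:\n", samples)
```

Each sampled pair is a different plausible regression line, which is exactly what "a distribution over parameters" means in practice.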
Bayesian Linear Regression
Concept          | Classical Linear Regression                | Bayesian Linear Regression
Model parameters | Fixed values (estimated via least squares) | Distributions (posterior over parameters)
Uncertainty      | No direct uncertainty on parameters        | Uncertainty quantified using probability
Prior knowledge  | Not included                               | Incorporated via prior distribution
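To make the table rows concrete, the sketch below (same assumed setup as above, not from the slides) prints the "fixed values" of the classical column next to the posterior mean and standard deviations of the Bayesian column:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=25)
t = -0.3 + 0.5 * x + rng.normal(scale=0.2, size=x.shape)
Phi = np.column_stack([np.ones_like(x), x])

# Classical column: fixed values estimated via least squares
w_ls, *_ = np.linalg.lstsq(Phi, t, rcond=None)

# Bayesian column: a posterior distribution over the parameters
alpha, beta = 2.0, 25.0                       # assumed prior / noise precisions
S_N = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)
m_N = beta * S_N @ Phi.T @ t

print("least-squares point estimate:", w_ls)
print("posterior mean:              ", m_N)
print("posterior std (uncertainty): ", np.sqrt(np.diag(S_N)))
```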
Benefits of Bayesian Linear Regression
• Uncertainty Estimation: gives not just predictions, but confidence intervals.
• Regularization: priors act like a form of regularization.
• Online Learning: the posterior from old data can serve as the prior for new data.
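Two of these benefits, uncertainty estimation and online learning, can be shown in one short sketch. This is a minimal illustration under the same assumed conjugate Gaussian setup (assumed precisions `alpha`, `beta`, toy data), not material from the slides; the regularization benefit is made precise on the next slide.

```python
import numpy as np

rng = np.random.default_rng(2)

def posterior(Phi, t, alpha, beta, m0=None, S0=None):
    """Gaussian posterior N(m_N, S_N). If (m0, S0) is supplied it is used as the
    prior, which is how an old posterior becomes the prior for new data."""
    if m0 is None:
        S0_inv = alpha * np.eye(Phi.shape[1])
        m0 = np.zeros(Phi.shape[1])
    else:
        S0_inv = np.linalg.inv(S0)
    S_N = np.linalg.inv(S0_inv + beta * Phi.T @ Phi)
    m_N = S_N @ (S0_inv @ m0 + beta * Phi.T @ t)
    return m_N, S_N

alpha, beta = 2.0, 25.0                       # assumed precisions
x1 = rng.uniform(-1, 1, 10); t1 = -0.3 + 0.5 * x1 + rng.normal(scale=0.2, size=10)
x2 = rng.uniform(-1, 1, 10); t2 = -0.3 + 0.5 * x2 + rng.normal(scale=0.2, size=10)
Phi1 = np.column_stack([np.ones_like(x1), x1])
Phi2 = np.column_stack([np.ones_like(x2), x2])

# Online learning: fit batch 1, then use that posterior as the prior for batch 2
m1, S1 = posterior(Phi1, t1, alpha, beta)
m2, S2 = posterior(Phi2, t2, alpha, beta, m0=m1, S0=S1)

# Uncertainty estimation: predictive mean and variance at a new input x* = 0.8
x_star = np.array([1.0, 0.8])                 # basis vector [1, x*]
pred_mean = x_star @ m2
pred_var = 1.0 / beta + x_star @ S2 @ x_star  # noise + parameter uncertainty
print(f"prediction: {pred_mean:.3f} +/- {1.96 * np.sqrt(pred_var):.3f} (approx. 95% interval)")
```

Because the posterior is Gaussian and conjugate to the likelihood, updating on the second batch with the first posterior as prior gives the same result as fitting both batches at once, which is what makes the online-learning use of the posterior safe.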
Why is a Bayesian treatment required in linear regression?
• Adding a regularization term to the log likelihood function means that the effective model complexity can be controlled by the value of the regularization coefficient (see the equations after this list), although the choice of the number and form of the basis functions is of course still important in determining the overall behaviour of the model.
• This still leaves the issue of deciding the appropriate model complexity for the particular problem, which cannot be decided by simply maximizing the likelihood function, because this always leads to excessively complex models and overfitting.
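A sketch of the standard argument behind both bullets, written in notation the slides do not define (Gaussian noise with precision β, zero-mean Gaussian prior on the weights with precision α, basis functions φ):

```latex
% Log likelihood (Gaussian noise) and log prior (zero-mean Gaussian):
\ln p(\mathbf{t}\mid\mathbf{w})
  = -\frac{\beta}{2}\sum_{n=1}^{N}\bigl\{t_n-\mathbf{w}^{\top}\boldsymbol{\phi}(x_n)\bigr\}^{2} + \text{const},
\qquad
\ln p(\mathbf{w}) = -\frac{\alpha}{2}\,\mathbf{w}^{\top}\mathbf{w} + \text{const}.

% Maximizing the log posterior  ln p(w|t) = ln p(t|w) + ln p(w) + const
% is therefore equivalent to minimizing the regularized sum-of-squares error
E(\mathbf{w})
  = \frac{1}{2}\sum_{n=1}^{N}\bigl\{t_n-\mathbf{w}^{\top}\boldsymbol{\phi}(x_n)\bigr\}^{2}
  + \frac{\lambda}{2}\,\mathbf{w}^{\top}\mathbf{w},
\qquad
\lambda = \frac{\alpha}{\beta}.
```

In this sense the prior supplies the regularization term, with effective coefficient λ = α/β. And because the sum-of-squares term can always be driven down by making the model more flexible, maximizing the training likelihood alone cannot be used to choose the model complexity, which is exactly the overfitting problem the second bullet describes.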