
Least Squares for Engineers

The method of least squares provides a strategy to fit a linear regression model to observational data by minimizing the sum of the squared residuals between measured and modeled values. The normal equations derived from minimizing the residual sum of squares can be solved simultaneously to determine the slope and intercept of the best-fit linear regression line. This line provides the closest fit to the data in a least squares sense. The coefficient of determination, r-squared, quantifies how much variability in the data is explained by the linear regression model.

Method of Least Squares
Least Squares Regression

Linear Regression
• Fitting a straight line to a set of paired observations: (x1, y1), (x2, y2), …, (xn, yn):

y = a0 + a1 x + e

a1 - slope
a0 - intercept
e - error, or residual, between the model and the observations

Engineering Mathematics III


Criteria for a “Best” Fit
• Minimize the sum of the residual errors for all available data:

Σ_{i=1}^{n} e_i = Σ_{i=1}^{n} (y_i − a0 − a1 x_i)

n = total number of points


• However, this is an inadequate criterion, as is the sum of the absolute values:

Σ_{i=1}^{n} |e_i| = Σ_{i=1}^{n} |y_i − a0 − a1 x_i|





• The best strategy is to minimize the sum of the squares of the residuals between the measured y and the y calculated with the linear model:

S_r = Σ_{i=1}^{n} e_i² = Σ_{i=1}^{n} (y_{i,measured} − y_{i,model})² = Σ_{i=1}^{n} (y_i − a0 − a1 x_i)²

• Yields a unique line for a given set of data.



Least-Squares Fit of a Straight Line
Setting the partial derivatives of S_r with respect to each coefficient to zero:

∂S_r/∂a0 = −2 Σ (y_i − a0 − a1 x_i) = 0
∂S_r/∂a1 = −2 Σ (y_i − a0 − a1 x_i) x_i = 0

0 = Σ y_i − Σ a0 − Σ a1 x_i
0 = Σ x_i y_i − Σ a0 x_i − Σ a1 x_i²

Since Σ a0 = n a0, these become the normal equations, which can be solved simultaneously:

n a0 + (Σ x_i) a1 = Σ y_i
(Σ x_i) a0 + (Σ x_i²) a1 = Σ x_i y_i

a1 = (n Σ x_i y_i − Σ x_i Σ y_i) / (n Σ x_i² − (Σ x_i)²)

a0 = ȳ − a1 x̄   (x̄ and ȳ are the mean values of x and y)


“Goodness” of Our Fit
If
• the total sum of the squares around the mean for the dependent variable, y, is S_t, and
• the sum of the squares of the residuals around the regression line is S_r,
then S_t − S_r quantifies the improvement, or error reduction, due to describing the data in terms of a straight line rather than as an average value.

r² = (S_t − S_r) / S_t

r² - coefficient of determination
r = sqrt(r²) - correlation coefficient

• For a perfect fit, S_r = 0 and r = r² = 1, signifying that the line explains 100 percent of the variability of the data.
• For r = r² = 0, S_r = S_t and the fit represents no improvement.
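The definitions of S_t, S_r, and r² can be sketched in a few lines of Python (the helper name `r_squared` is ours):

```python
def r_squared(x, y, a0, a1):
    """Coefficient of determination r^2 = (St - Sr) / St for the fit y = a0 + a1*x."""
    y_mean = sum(y) / len(y)
    # St: total sum of squares around the mean of y
    st = sum((yi - y_mean) ** 2 for yi in y)
    # Sr: sum of squared residuals around the regression line
    sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))
    return (st - sr) / st

# Data lying exactly on y = 1 + 2x: Sr = 0, so r^2 = 1 (a perfect fit)
r2 = r_squared([0, 1, 2], [1, 3, 5], a0=1.0, a1=2.0)
```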



Polynomial Regression
• Some engineering data are poorly represented by a straight line. For these cases a curve is better suited to fit the data, and the least-squares method can readily be extended to fit the data to higher-order polynomials.
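NumPy's `polyfit` performs exactly this polynomial least-squares fit. A short sketch, using data generated from a known quadratic so the recovered coefficients can be checked:

```python
import numpy as np

# Data taken exactly from the quadratic y = 2 + 3x^2
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 + 3.0 * x**2

# polyfit minimizes the residual sum of squares for a degree-2 polynomial;
# coefficients are returned highest order first: [a2, a1, a0]
coeffs = np.polyfit(x, y, deg=2)
```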



General Linear Least Squares

y = a0 z0 + a1 z1 + a2 z2 + … + a_m z_m + e

z0, z1, …, z_m are m + 1 basis functions.

In matrix form: {Y} = [Z]{A} + {E}

[Z] - matrix of the calculated values of the basis functions at the measured values of the independent variable
{Y} - observed values of the dependent variable
{A} - unknown coefficients
{E} - residuals

S_r = Σ_{i=1}^{n} ( y_i − Σ_{j=0}^{m} a_j z_{ji} )²

S_r is minimized by taking its partial derivative with respect to each of the coefficients and setting the resulting equation equal to zero.
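Setting the partial derivatives to zero yields the normal equations in matrix form, [Z]ᵀ[Z]{A} = [Z]ᵀ{Y}. A minimal sketch with NumPy; the basis functions z0 = 1, z1 = x, z2 = sin(x) are our choice for illustration, not from the text:

```python
import numpy as np

# Data generated from known coefficients: y = 1.5 - 0.5*x + 2.0*sin(x)
x = np.linspace(0.0, 2.0, 20)
y = 1.5 - 0.5 * x + 2.0 * np.sin(x)

# [Z]: each column is one basis function evaluated at the measured x values
Z = np.column_stack([np.ones_like(x), x, np.sin(x)])

# Normal equations Z^T Z {A} = Z^T {Y}, solved for the coefficients {A}
A = np.linalg.solve(Z.T @ Z, Z.T @ y)
```

In practice `numpy.linalg.lstsq` solves the same problem with better numerical behavior when [Z]ᵀ[Z] is ill-conditioned; the explicit normal-equation solve above mirrors the derivation in the text.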
