Curve Fitting & Approximation
Instructor: Sadia Tasnim, Lecturer
EEE, PU
Introduction
Curve fitting is the process of constructing a curve, or mathematical function, that best fits a series of data points.
Applications:
• Data modeling
• Forecasting
• Trend analysis
Application: Forecasting
• Purpose: Using past data to predict future outcomes by fitting a
curve or line.
• Use Case: Businesses use curve fitting to predict sales growth,
stock prices, or population growth by extending the fitted curve
into the future.
• Benefits: Reliable predictions when underlying patterns are
consistent and can be modeled mathematically.
Methods of curve fitting
1. Linear Regression (Least Squares method)
2. Polynomial Regression
Linear Regression
• Linear Regression is a method that is used to model the
relationship between a dependent variable and one or more
independent variables by fitting a straight line through the data
points.
• The goal is to find the best-fitting line that minimizes the difference
(error) between the actual data points and the predicted values.
Key Concepts of Linear Regression:
• Equation of a Line:
The general form of a linear regression model is
$\hat{y} = a_0 + a_1 x$
where $a_0$ is the intercept and $a_1$ is the slope of the fitted line.
Objective of Linear regression
• The objective of linear regression is to determine the coefficients
𝑎0 and 𝑎1 that minimize the sum of squared residuals/errors
(differences between observed and predicted values). This is done
using the Least Squares Method.
Least Squares Method
• The method of least squares estimates the parameters ($a_0$ and $a_1$) by minimizing the sum of the squares of the residuals, where a residual is the difference between an observed value and the fitted value predicted by the model.
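To make "minimizing the sum of squared residuals" concrete, here is a small sketch in Python (my own illustration, not from the slides; it assumes NumPy and a made-up toy data set). It compares the SSE of the least-squares line with that of an arbitrary alternative line:

```python
import numpy as np

# Toy data (illustrative only, not from the slides)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 6.2, 8.1, 9.9])

def sse(a0, a1):
    """Sum of squared residuals for the line y_hat = a0 + a1*x."""
    residuals = y - (a0 + a1 * x)
    return np.sum(residuals ** 2)

# Least-squares line (np.polyfit returns the highest-degree coefficient first)
a1_ls, a0_ls = np.polyfit(x, y, 1)

print("Least-squares line:        SSE =", sse(a0_ls, a1_ls))
print("Arbitrary line y = 2.5x:   SSE =", sse(0.0, 2.5))
# The least-squares line gives the smallest possible SSE for this data.
```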
Math 1
Given data table:
x 1 2.5 3.9 4.2 6.1 7.3 8.2
y 205 213 220 226 229 234 238
Questions:
(1) Write a linear equation that best fits the data in the table above.
(2) Find predicted value for x = 4.2
(3) Find sum of squared errors (SSE)
We know,
Slope, $a_1 = \dfrac{n\sum xy - \sum x \sum y}{n\sum x^2 - \left(\sum x\right)^2}$

Intercept, $a_0 = \dfrac{\sum y - a_1 \sum x}{n}$

Predicted value, $\hat{y} = a_0 + a_1 x$

Sum of squared errors, $\text{SSE} = \sum_{i=1}^{n} \left(y_i - \hat{y}_i\right)^2$
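These four formulas translate directly into code. The sketch below is a minimal Python version (the function names are my own; NumPy is assumed) that can be used to check the hand calculation that follows:

```python
import numpy as np

def fit_line(x, y):
    """Least-squares slope a1 and intercept a0 from the closed-form formulas."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = len(x)
    a1 = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
    a0 = (np.sum(y) - a1 * np.sum(x)) / n
    return a0, a1

def predict(a0, a1, x):
    """Predicted value y_hat = a0 + a1*x."""
    return a0 + a1 * np.asarray(x, dtype=float)

def sse(y, y_hat):
    """Sum of squared errors between observed and predicted values."""
    return float(np.sum((np.asarray(y) - np.asarray(y_hat)) ** 2))

# Example usage (hypothetical data): a0, a1 = fit_line([1, 2, 3], [2, 4, 6])
```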
Soln: Here n = 7.

x      y      xy       x²
1      205    205      1
2.5    213    532.5    6.25
3.9    220    858      15.21
4.2    226    949.2    17.64
6.1    229    1396.9   37.21
7.3    234    1708.2   53.29
8.2    238    1951.6   67.24

Σx = 33.2   Σy = 1565   Σxy = 7601.4   Σx² = 197.84
Slope, $a_1 = \dfrac{n\sum x_i y_i - \sum x_i \sum y_i}{n\sum x_i^2 - (\sum x_i)^2} = \dfrac{7 \times 7601.4 - 33.2 \times 1565}{7 \times 197.84 - (33.2)^2} = 4.428955562$

Intercept, $a_0 = \dfrac{\sum y_i - a_1 \sum x_i}{n} = \dfrac{1565 - 4.429 \times 33.2}{7} = 202.5653143$
• Predicted value for x = 4.2 is
$\hat{y} = a_0 + a_1 x = 202.5653143 + 4.428955562 \times 4.2 = 221.1669277$
• Sum of squared errors, $\text{SSE} = \sum_{i=1}^{n} \left(y_i - \hat{y}_i\right)^2$

x          1        2.5      3.9      4.2      6.1      7.3      8.2
y          205      213      220      226      229      234      238
ŷ          206.994  213.638  219.838  221.167  229.582  234.897  238.883
(y − ŷ)²   3.977    0.407    0.026    23.359   0.339    0.804    0.779

SSE = 3.977 + 0.407 + 0.026 + 23.359 + 0.339 + 0.804 + 0.779 ≈ 29.69
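As a sanity check (my own addition, assuming NumPy), the whole of Math 1 can be reproduced in a few lines; it returns the slope, intercept, predicted value, and SSE found above, with any differences in the last decimals coming from rounding $a_1$ to 4.429 in the hand calculation:

```python
import numpy as np

x = np.array([1, 2.5, 3.9, 4.2, 6.1, 7.3, 8.2])
y = np.array([205, 213, 220, 226, 229, 234, 238])

a1, a0 = np.polyfit(x, y, 1)          # slope and intercept of the best-fit line
y_hat = a0 + a1 * x                   # predicted values at the data points
sse = np.sum((y - y_hat) ** 2)        # sum of squared errors

print(f"a1 = {a1:.6f}, a0 = {a0:.6f}")
print(f"y_hat(4.2) = {a0 + a1 * 4.2:.4f}")
print(f"SSE = {sse:.4f}")
```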
Polynomial Regression
• Some engineering data is poorly represented by a straight line. In these cases, a curve is better suited to fit the data. Since polynomial expressions describe curves, we fit polynomials to such data using polynomial regression.
• For $m$ data points $(x_1, y_1), (x_2, y_2), (x_3, y_3), \ldots, (x_m, y_m)$, we construct a system of equations that minimizes the sum of the squared residuals using the least squares method. For a polynomial of degree $n$, $\hat{y} = a_0 + a_1 x + \cdots + a_n x^n$, the resulting matrix equation (the normal equations) has the form:

$$\begin{bmatrix} m & \sum x_i & \sum x_i^2 & \cdots & \sum x_i^n \\ \sum x_i & \sum x_i^2 & \sum x_i^3 & \cdots & \sum x_i^{n+1} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \sum x_i^n & \sum x_i^{n+1} & \sum x_i^{n+2} & \cdots & \sum x_i^{2n} \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ \vdots \\ a_n \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \vdots \\ \sum x_i^n y_i \end{bmatrix}$$
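A minimal sketch of how these normal equations could be assembled and solved in Python for any degree n (the function name poly_least_squares is my own choice; NumPy is assumed):

```python
import numpy as np

def poly_least_squares(x, y, n):
    """Fit y_hat = a0 + a1*x + ... + an*x^n by building and solving
    the (n+1) x (n+1) normal equations shown above."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    # Left-hand matrix: entry (j, k) is the sum of x^(j+k); note x^0 sums to m
    A = np.array([[np.sum(x ** (j + k)) for k in range(n + 1)] for j in range(n + 1)])
    # Right-hand vector: entry j is the sum of x^j * y
    b = np.array([np.sum((x ** j) * y) for j in range(n + 1)])
    return np.linalg.solve(A, b)   # coefficients [a0, a1, ..., an]

# This should agree with np.polyfit(x, y, n)[::-1] for well-conditioned data.
```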
Matrix multiplication using a calculator:
Casio fx-991EX https://youtu.be/uz_SE00zBPE?si=tE2Ffkc7jta5fLMW
Casio fx-991 ES Plus https://youtu.be/_-xW1jLbY2Y?si=o2uejK9y13_ggA3E
Example 1:
Fit a quadratic polynomial ( n = 2 degree polynomial ) to the
given data points:
x 1 2 3 4 5
y 2 3 5 7 11
We aim to fit a quadratic polynomial of the form $\hat{y} = a_0 + a_1 x + a_2 x^2$.
• For n = 2 (with $m = 5$, $\sum x_i = 15$, $\sum x_i^2 = 55$, $\sum x_i^3 = 225$, $\sum x_i^4 = 979$, $\sum y_i = 28$, $\sum x_i y_i = 106$, $\sum x_i^2 y_i = 446$), the matrix becomes:

$$\begin{bmatrix} 5 & 15 & 55 \\ 15 & 55 & 225 \\ 55 & 225 & 979 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} 28 \\ 106 \\ 446 \end{bmatrix}$$
Solving this system, we get $a_0 = 2$, $a_1 = -0.371$, $a_2 = 0.4285$.
∴ The polynomial model is $\hat{y} = 2 - 0.371x + 0.4285x^2$
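As a cross-check (my own addition, assuming NumPy), np.polyfit reproduces these coefficients directly from the data of Example 1:

```python
import numpy as np

x = np.array([1, 2, 3, 4, 5])
y = np.array([2, 3, 5, 7, 11])

# np.polyfit returns coefficients from highest degree to lowest: [a2, a1, a0]
a2, a1, a0 = np.polyfit(x, y, 2)
print(a0, a1, a2)   # approximately 2.0, -0.3714, 0.4286
```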
Example 2:
Fit a cubic polynomial (n = 3 degree polynomial) to the given data points:
x 1 2 3 4 5
y 1.5 1.7 2.1 3 3.8
We want to fit a cubic polynomial: $\hat{y} = a_0 + a_1 x + a_2 x^2 + a_3 x^3$
• For n = 3 (with $m = 5$, $\sum x_i = 15$, $\sum x_i^2 = 55$, $\sum x_i^3 = 225$, $\sum x_i^4 = 979$, $\sum x_i^5 = 4425$, $\sum x_i^6 = 20515$, $\sum y_i = 12.1$, $\sum x_i y_i = 42.2$, $\sum x_i^2 y_i = 170.2$, $\sum x_i^3 y_i = 738.8$), the matrix becomes:

$$\begin{bmatrix} 5 & 15 & 55 & 225 \\ 15 & 55 & 225 & 979 \\ 55 & 225 & 979 & 4425 \\ 225 & 979 & 4425 & 20515 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \\ a_3 \end{bmatrix} = \begin{bmatrix} 12.1 \\ 42.2 \\ 170.2 \\ 738.8 \end{bmatrix}$$
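A sketch of how this 4×4 system could be solved numerically (my own addition, assuming NumPy; the resulting coefficients are printed rather than quoted here):

```python
import numpy as np

# Normal-equation matrix and right-hand side for the cubic fit above
A = np.array([[   5,   15,    55,   225],
              [  15,   55,   225,   979],
              [  55,  225,   979,  4425],
              [ 225,  979,  4425, 20515]], dtype=float)
b = np.array([12.1, 42.2, 170.2, 738.8])

a = np.linalg.solve(A, b)     # coefficients [a0, a1, a2, a3]
print(a)

# Equivalent shortcut: np.polyfit(x, y, 3) gives the same coefficients
# (listed from a3 down to a0), so reversing the order matches [a0, ..., a3].
x = np.array([1, 2, 3, 4, 5])
y = np.array([1.5, 1.7, 2.1, 3, 3.8])
print(np.polyfit(x, y, 3)[::-1])
```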
Practice Problems
1. Given data table:
Height (in cm):  160  162  164  166  168
Weight (in kg):  52   55   57   60   61
a) Find the line of best fit for the given data of heights and weights of students of a school using the Least Squares method.
b) If a student has a height of 165 cm, what is his predicted weight?
2. Suppose you are working in the field of finance, and you are analyzing the relationship between the years of experience (in years) an employee has and their corresponding salary (in thousand taka). You suspect that the relationship might not be linear and that higher degrees of the polynomial might better capture the salary progression over time.
Fit a quadratic polynomial (n = 2 degree polynomial) to the given data points:
Years of experience, x:     1   2   3   4   5    6
Salary, y (thousand taka):  50  55  65  80  110  150