Process Optimization

(CHE 544)

PART II – Lecture 1
One-Dimensional Search Methods:
Fibonacci Method
2024/2025 Omega Semester 1
Brief Overview and Definitions
• Optimization is the act of obtaining the best result under given
circumstances.
• Engineering systems or components are defined by a set of quantities.
These quantities can be:
1. Preassigned parameters: quantities that are fixed from the outset
2. Design/decision variables


• The purpose of optimization is to choose the best of the many acceptable designs available.

• Thus, a criterion has to be chosen for comparing the different alternative acceptable designs and for selecting the best one.

• Objective function: The criterion with respect to which the design is optimized, expressed as a function of the design variables. It is a statement of the mathematical relationship between a dependent variable and one or more independent variables.


• Optimization problems can be constrained or unconstrained
• Design constraints: They are restrictions that must be satisfied to
produce an acceptable design. In many practical problems, the design
variables cannot be chosen arbitrarily; rather, they have to satisfy
certain specified functional and other requirements.
• The following two types of constraints emerge from most
considerations:
1. Equality type constraints.
2. Inequality type constraints.

• Variable bounds: The minimum and maximum bounds on each design variable. Certain optimization algorithms do not require this information.
• In general engineering problems, there will be more than one acceptable design.

• The selection of the objective function can be one of the most important decisions in the whole optimum design process.

• This is because optimization with respect to one criterion may lead to results that are not satisfactory with respect to another criterion.

• In some situations, there may be more than one criterion to be satisfied simultaneously.
• Single-objective optimization problem: Has one objective function that is to be minimized or maximized.

• Multi-objective optimization problem: Involves multiple objective functions that are to be minimized or maximized.

• Unimodal function: Has only one minimum (valley), with the rest of the graph rising away from it; or one maximum (peak), with the rest of the graph falling away from it.

• Multimodal function: Has multiple peaks or valleys.

• Linear optimization problem: When the optimization functions and constraints are linear.
• Non-linear optimization problem: When the optimization functions or constraints are non-linear.

• Discrete optimization problem: When the variables take discrete (integer) values.

• Continuous optimization problem: When the variables can take any real value.

• One-dimensional optimization problem: Optimization involving one independent variable.

• Multi-dimensional optimization problem: Optimization involving more than one independent variable.


• There is no single method available for solving all optimization problems efficiently. Hence, a number of optimization methods have been developed for solving different types of optimization problems.

• Two broad categories of methods for solving one-dimensional optimization problems are:
1. Analytical methods (differential calculus methods)
2. Numerical methods


• The numerical methods adopt either the elimination approach or the interpolation approach.

• Some of the elimination methods are the Fibonacci and golden-section methods.

• Some of the interpolation methods are the Newton and secant methods.


Optimization: Fibonacci Method

• The Fibonacci method is an elimination technique (interval-reduction method) for solving single-dimension (one-variable), non-linear optimization problems.

• The basic requirement for applying the Fibonacci method is that the function to be optimized must be unimodal in the initial interval of uncertainty.

• The method makes use of Fibonacci numbers.


• The sequence of Fibonacci numbers, 𝐹𝑛, is 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, …

• Taking 𝐹𝑜 as the first number in the series, 𝐹1 as the second, and so on, it can be observed that:
▪ 𝐹𝑜 = 𝐹1 = 1, always
▪ Each subsequent number in the series is the sum of the two previous numbers, i.e.
𝐹𝑛 = 𝐹𝑛−1 + 𝐹𝑛−2

• The total number of experiments must be declared before commencement of the approximation.
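The recurrence can be sketched in Python (an illustrative snippet; the function name is mine, not from the slides):

```python
def fibonacci(n):
    """Return the Fibonacci numbers F_0 .. F_n, with F_0 = F_1 = 1."""
    F = [1, 1]
    while len(F) <= n:
        F.append(F[-1] + F[-2])  # F_n = F_(n-1) + F_(n-2)
    return F[:n + 1]

print(fibonacci(9))  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```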


Steps in using the Fibonacci method

Step 1: Given an initial range of uncertainty 𝐿𝑜 = [𝑎, 𝑏], let the number of experiments be 𝑛.

Step 2: Define 𝐿∗2 = (𝐹𝑛−2 / 𝐹𝑛) 𝐿𝑜 and determine 𝑥1 & 𝑥2 such that each of the two experiments (𝑥1 & 𝑥2) is 𝐿∗2 away from one end of the uncertainty interval:

∴ 𝑥1 = 𝑎 + 𝐿∗2 = 𝑎 + (𝐹𝑛−2 / 𝐹𝑛) 𝐿𝑜
𝑥2 = 𝑏 − 𝐿∗2 = 𝑏 − (𝐹𝑛−2 / 𝐹𝑛) 𝐿𝑜 = 𝑏 − (𝐹𝑛−2 / 𝐹𝑛)(𝑏 − 𝑎)

  = (1 − 𝐹𝑛−2 / 𝐹𝑛) 𝑏 + (𝐹𝑛−2 / 𝐹𝑛) 𝑎

  = ((𝐹𝑛 − 𝐹𝑛−2) / 𝐹𝑛) 𝑏 + (𝐹𝑛−2 / 𝐹𝑛) 𝑎

Recall that 𝐹𝑛 = 𝐹𝑛−1 + 𝐹𝑛−2, so 𝐹𝑛 − 𝐹𝑛−2 = 𝐹𝑛−1 and 𝐹𝑛−2 / 𝐹𝑛 = 1 − 𝐹𝑛−1 / 𝐹𝑛:

  = (𝐹𝑛−1 / 𝐹𝑛) 𝑏 + 𝑎 − (𝐹𝑛−1 / 𝐹𝑛) 𝑎

  = (𝐹𝑛−1 / 𝐹𝑛)(𝑏 − 𝑎) + 𝑎 = 𝑎 + (𝐹𝑛−1 / 𝐹𝑛) 𝐿𝑜


Step 3: Determine 𝑓(𝑥1 ) and 𝑓(𝑥2 ). Assuming unimodal property,
discard the portion of the graph not likely to give the optimum value.

Step 4: Reduce the interval 𝐿𝑜 and name the new interval 𝐿2. The new interval of uncertainty, 𝐿2, will be either [𝑥1, 𝑏] or [𝑎, 𝑥2].


𝐿2 = 𝐿𝑜 − 𝐿∗2 = 𝐿𝑜 − (𝐹𝑛−2 / 𝐹𝑛) 𝐿𝑜 = (1 − 𝐹𝑛−2 / 𝐹𝑛) 𝐿𝑜 = (𝐹𝑛−1 / 𝐹𝑛) 𝐿𝑜

Step 5: Evaluate 𝑥3. Locate 𝑥3 in such a way that each of the current two experiments is 𝐿∗3 away from an end of 𝐿2, where

𝐿∗3 = (𝐹𝑛−3 / 𝐹𝑛) 𝐿𝑜


Step 6: Evaluate 𝑓(𝑥2) and 𝑓(𝑥3). Assuming the unimodal property, discard the portion of the interval which is not likely to give the optimum value and obtain 𝐿3:

𝐿3 = 𝐿2 − 𝐿∗3 = (𝐹𝑛−1 / 𝐹𝑛) 𝐿𝑜 − (𝐹𝑛−3 / 𝐹𝑛) 𝐿𝑜

  = ((𝐹𝑛−1 − 𝐹𝑛−3) / 𝐹𝑛) 𝐿𝑜

  = (𝐹𝑛−2 / 𝐹𝑛) 𝐿𝑜
• The iteration process continues until the 𝑛𝑡ℎ experiment has been
done
• In general, for obtaining the 𝑗th experiment:

𝐿∗𝑗 = (𝐹𝑛−𝑗 / 𝐹𝑛−(𝑗−2)) 𝐿𝑗−1

• Length of the interval of uncertainty after the 𝑗th experiment:

𝐿𝑗 = (𝐹𝑛−(𝑗−1) / 𝐹𝑛) 𝐿𝑜

Note:
• Measure of efficiency = (length of final interval of uncertainty) / (length of initial interval of uncertainty) = 𝐿𝑛 / 𝐿𝑜 = 1 / 𝐹𝑛
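The steps above can be collected into one Python sketch (a minimal implementation under the slides' conventions, with 𝐹𝑜 = 𝐹1 = 1; all function and variable names are mine):

```python
def fibonacci_search(f, a, b, n):
    """Minimize a unimodal f on [a, b] with n Fibonacci-method experiments.

    Each new experiment sits L*_j = (F_(n-j)/F_n) * L_o from an end of the
    current interval; the final interval of uncertainty has length L_o / F_n.
    """
    F = [1, 1]
    while len(F) <= n:
        F.append(F[-1] + F[-2])
    L0 = b - a
    # Step 2: the first two experiments, each L*_2 from an end
    x1 = a + F[n - 2] / F[n] * L0
    x2 = b - F[n - 2] / F[n] * L0
    f1, f2 = f(x1), f(x2)
    for j in range(3, n + 1):
        if f1 <= f2:
            # minimum cannot lie in (x2, b]: keep [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = a + F[n - j] / F[n] * L0
            f1 = f(x1)
        else:
            # minimum cannot lie in [a, x1): keep [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = b - F[n - j] / F[n] * L0
            f2 = f(x2)
    # final reduction using the nth experiment
    if f1 <= f2:
        b = x2
    else:
        a = x1
    return a, b  # final interval of uncertainty

a, b = fibonacci_search(lambda x: x**2 + 2*x, -3, 4, 6)
print(a, b, (a + b) / 2)  # ≈ -1.3846 -0.8462 -1.1154
```

Note that at the last experiment 𝐹𝑜 = 𝐹1 makes the two interior points coincide (which is why 𝑓(𝑥5) = 𝑓(𝑥6) in the worked example that follows); practical implementations displace the final point by a small ε to break the tie.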
Worked Example
Find the minimum of 𝑓 𝑥 = 𝑥 2 + 2𝑥 within the interval −3, 4 using
Fibonacci method. Terminate the search when the interval of
uncertainty is reduced to 10% of its initial length.
Solution
Given: The final interval of uncertainty should not exceed 10% of the initial length 𝐿𝑜:

𝐿𝑛 ≤ (10/100) 𝐿𝑜

𝐿𝑛 ≤ (1/10) 𝐿𝑜

𝐿𝑛 / 𝐿𝑜 ≤ 1/10
Recall:

𝐿𝑛 / 𝐿𝑜 = 1 / 𝐹𝑛

∴ 1 / 𝐹𝑛 ≤ 1/10, i.e. 𝐹𝑛 ≥ 10

Recall the Fibonacci number series 𝑭𝒏:
𝐹𝑜 = 1, 𝐹1 = 1, 𝐹2 = 2, 𝐹3 = 3, 𝐹4 = 5, 𝐹5 = 8, 𝐹6 = 13, 𝐹7 = 21, 𝐹8 = 34, 𝐹9 = 55, …

∴ 𝑛 = 6 (the smallest 𝑛 with 𝐹𝑛 ≥ 10, since 𝐹6 = 13)

• Applying the principle of unimodality, the location of the optimal value, 𝑥∗, will be taken as the midpoint of the final interval, i.e. within 𝐿𝑛 / 2 of either end.
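A quick check of this choice of 𝑛 (my snippet, using the slides' indexing 𝐹𝑜 = 𝐹1 = 1):

```python
# Find the smallest n with F_n >= 10, where F_0 = F_1 = 1
F = [1, 1]
while F[-1] < 10:
    F.append(F[-1] + F[-2])
n = len(F) - 1
print(n, F[n])  # 6 13
```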
Given:
𝐿𝑜 = [𝑎, 𝑏] = [−3, 4], so 𝐿𝑜 = 7

• Locating 𝒙𝟏 & 𝒙𝟐
-Evaluate 𝑳∗𝟐

𝐿∗2 = (𝐹𝑛−2 / 𝐹𝑛) 𝐿𝑜 = (𝐹4 / 𝐹6) 𝐿𝑜 = (5/13) × 7 = 35/13 = 2.69230769
• This means 𝑥1 & 𝑥2 must each be 2.69230769 away from either end of the number line.

-Determine the value and hence, the position of 𝒙𝟏 & 𝒙𝟐

𝑥1 = 𝑎 + 𝐿∗2 = −3 + 35/13 = −4/13 = −0.30769231

𝑥2 = 𝑏 − 𝐿∗2 = 4 − 35/13 = 17/13 = 1.30769231
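These values can be verified in exact arithmetic with Python's fractions module (a check I've added; it is not part of the slides):

```python
from fractions import Fraction

a, b = Fraction(-3), Fraction(4)
L0 = b - a                       # 7
L2_star = Fraction(5, 13) * L0   # (F4/F6) * L0 = 35/13
x1 = a + L2_star
x2 = b - L2_star
print(x1, x2)  # -4/13 17/13
```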
• Determine 𝒇(𝒙)

𝑓(𝑎) = (−3)² + 2(−3) = 3

𝑓(𝑏) = (4)² + 2(4) = 24

𝑓(𝑥1) = (−4/13)² + 2(−4/13) = −88/169 = −0.52071006

𝑓(𝑥2) = (17/13)² + 2(17/13) = 731/169 = 4.32544379
• Discard the section of the uncertainty interval not favoring the optimization problem:

𝑓(𝑥1) < 𝑓(𝑥2)

• Since it is a minimization problem, the optimal value cannot lie between 𝑥2 & 𝑏 (you may check with a sketch of 𝒇(𝒙) against 𝒙)

∴ Discard [𝑥2, 𝑏]; the new interval of uncertainty is [𝑎, 𝑥2]

• The new length of uncertainty: 𝐿2 = 𝐿𝑜 − 𝐿∗2 = 7 − 35/13 = 56/13 = 4.30769231
• Locating 𝒙𝟑: This is achieved by repeating the above steps
-Evaluate 𝑳∗𝟑

𝐿∗3 = (𝐹𝑛−3 / 𝐹𝑛) 𝐿𝑜 = (𝐹3 / 𝐹6) 𝐿𝑜 = (3/13) × 7 = 21/13 = 1.61538462

• This means 𝑥1 & 𝑥3 must each be 1.61538462 away from either end of the number line. Observation shows that 𝑥1 is 1.61538462 away from 𝑥2 (check: 𝑥2 − 21/13 = −4/13 = −0.30769231 = 𝑥1)

-Determine the value of 𝒙𝟑

𝑥3 = 𝑎 + 𝐿∗3 = −3 + 21/13 = −18/13 = −1.38461538
• Determine 𝒇(𝒙)

𝑓(𝑎) = 3

𝑓(𝑥3) = (−18/13)² + 2(−18/13) = −144/169 = −0.85207101

𝑓(𝑥1) = −0.52071006

𝑓(𝑥2) = 4.32544379
• Discard the section of the uncertainty interval not favoring the optimization problem:

𝑓(𝑥3) < 𝑓(𝑥1)

• Since it is a minimization problem, the optimal value cannot lie between 𝑥1 & 𝑥2 (you may check with a sketch of 𝒇(𝒙) against 𝒙)

∴ Discard [𝑥1, 𝑥2]; the new interval of uncertainty is [𝑎, 𝑥1]

• The new length of uncertainty: 𝐿3 = 𝐿2 − 𝐿∗3 = 56/13 − 21/13 = 35/13 = 2.69230769
• Continue the iterative process by locating 𝒙𝟒
-Evaluate 𝑳∗𝟒

𝐿∗4 = (𝐹𝑛−4 / 𝐹𝑛) 𝐿𝑜 = (𝐹2 / 𝐹6) 𝐿𝑜 = (2/13) × 7 = 14/13 = 1.07692308

• This means 𝑥3 & 𝑥4 must each be 1.07692308 away from either end of the number line. Observation shows that 𝑥3 is 1.07692308 away from 𝑥1 (check: 𝑥1 − 14/13 = −18/13 = −1.38461538 = 𝑥3)

-Determine the value of 𝒙𝟒

𝑥4 = 𝑎 + 𝐿∗4 = −3 + 14/13 = −25/13 = −1.92307692
• Determine 𝒇(𝒙)

𝑓(𝑎) = 3

𝑓(𝑥4) = (−25/13)² + 2(−25/13) = −25/169 = −0.14792899

𝑓(𝑥3) = −0.85207101

𝑓(𝑥1) = −0.52071006
• Discard the section of the uncertainty interval not favoring the optimization problem:

𝑓(𝑥3) < 𝑓(𝑥4)

• Since it is a minimization problem, the optimal value cannot lie between 𝑎 & 𝑥4 (you may check with a sketch of 𝒇(𝒙) against 𝒙)

∴ Discard [𝑎, 𝑥4]; the new interval of uncertainty is [𝑥4, 𝑥1]

• The new length of uncertainty: 𝐿4 = 𝐿3 − 𝐿∗4 = 35/13 − 14/13 = 21/13 = 1.61538462
• Continue the iterative process by locating 𝒙𝟓
-Evaluate 𝑳∗𝟓

𝐿∗5 = (𝐹𝑛−5 / 𝐹𝑛) 𝐿𝑜 = (𝐹1 / 𝐹6) 𝐿𝑜 = (1/13) × 7 = 7/13 = 0.53846154

• This means 𝑥3 & 𝑥5 must each be 0.53846154 away from either end of the number line. Observation shows that 𝑥3 is 0.53846154 away from 𝑥4 (check: 𝑥4 + 7/13 = −18/13 = −1.38461538 = 𝑥3)

-Determine the value of 𝒙𝟓

𝑥5 = 𝑥1 − 𝐿∗5 = −4/13 − 7/13 = −11/13 = −0.84615385
• Determine 𝒇(𝒙)

𝑓(𝑥4) = −0.14792899

𝑓(𝑥3) = −0.85207101

𝑓(𝑥5) = (−11/13)² + 2(−11/13) = −165/169 = −0.97633136

𝑓(𝑥1) = −0.52071006
• Discard the section of the uncertainty interval not favoring the optimization problem:

𝑓(𝑥5) < 𝑓(𝑥3)

• Since it is a minimization problem, the optimal value cannot lie between 𝑥4 & 𝑥3 (you may check with a sketch of 𝒇(𝒙) against 𝒙)

∴ Discard [𝑥4, 𝑥3]; the new interval of uncertainty is [𝑥3, 𝑥1]

• The new length of uncertainty: 𝐿5 = 𝐿4 − 𝐿∗5 = 21/13 − 7/13 = 14/13 = 1.07692308
• Continue the iterative process by locating 𝒙𝟔
-Evaluate 𝑳∗𝟔

𝐿∗6 = (𝐹𝑛−6 / 𝐹𝑛) 𝐿𝑜 = (𝐹0 / 𝐹6) 𝐿𝑜 = (1/13) × 7 = 7/13 = 0.53846154

• This means 𝑥5 & 𝑥6 must each be 0.53846154 away from either end of the number line. Observation shows that 𝑥5 is 0.53846154 away from 𝑥3 (check: 𝑥3 + 7/13 = −11/13 = −0.84615385 = 𝑥5)

-Determine the value of 𝒙𝟔

𝑥6 = 𝑥1 − 𝐿∗6 = −4/13 − 7/13 = −11/13 = −0.84615385 (note that 𝑥6 coincides with 𝑥5, since 𝐹0 = 𝐹1)
• Determine 𝒇(𝒙)

𝑓(𝑥3) = −0.85207101

𝑓(𝑥5) = −0.97633136

𝑓(𝑥6) = (−11/13)² + 2(−11/13) = −0.97633136

𝑓(𝑥1) = −0.52071006
• Discard the section of the uncertainty interval not favoring the optimization problem:

𝑓(𝑥5) = 𝑓(𝑥6)

• Since it is a minimization problem, the optimal value cannot lie between 𝑥6 & 𝑥1 (you may check with a sketch of 𝒇(𝒙) against 𝒙)

∴ Discard [𝑥6, 𝑥1]

• The final interval of uncertainty: 𝐿6 = [𝑥3, 𝑥6] = [−18/13, −11/13]


• The midpoint of the final interval of uncertainty is declared the optimal solution.

𝐿6 = 𝐿5 − 𝐿∗6 = 14/13 − 7/13 = 7/13 = 0.53846154

The minimizer is therefore 𝑥∗ = 𝑥3 + 𝐿6/2 = −18/13 + 7/26 = −29/26 = −1.11538462
𝑓(𝑥∗) = (−29/26)² + 2(−29/26) = −667/676 = −0.98668639
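As a sanity check (mine, not the slides'): the true minimizer of 𝑓(𝑥) = 𝑥² + 2𝑥 is 𝑥 = −1 with 𝑓(−1) = −1, which indeed lies inside the final interval [−18/13, −11/13] ≈ [−1.385, −0.846]. Exact arithmetic with Python's fractions module confirms 𝑓(𝑥∗):

```python
from fractions import Fraction

x_star = Fraction(-29, 26)          # midpoint of [-18/13, -11/13]
f_star = x_star**2 + 2 * x_star
print(f_star)                       # -667/676
assert Fraction(-18, 13) < Fraction(-1) < Fraction(-11, 13)  # x = -1 is bracketed
```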

Exercise
1. Min 𝑓(𝑥) = 𝑥² over [−5, 15]. Take 𝑛 = 7
2. Max 𝑓(𝑥) = 𝑥 for 𝑥 ≤ 2; −𝑥 + 3 for 𝑥 > 2, in the interval [0, 3] using the Fibonacci method. Given that 𝑛 = 6
3. Min 𝑓(𝑥) = 𝑥(𝑥 − 1.5) in the interval [0, 1], such that the final interval of uncertainty is 0.25 of the initial interval of uncertainty.
