Florida International University
Department of Civil and Environmental Engineering
Optimization in Water Resources Engineering, Spring 2020
LECTURE: CLASSICAL OPTIMIZATION OVERVIEW
Arturo S. Leon, Ph.D., P.E., [Link]
Optimization problem
Design variables: the variables with which the design problem is
parameterized: x = (x1, x2, ..., xn)
Objective: the quantity that is to be minimized (or maximized)
Usually denoted by f(x) (the "cost function")
Constraint: a condition that has to be satisfied
Inequality constraint: g(x) ≤ 0
Equality constraint: h(x) = 0
Solving optimization problems
Optimization problems are typically solved using an iterative algorithm:
[Flow diagram: the Optimizer passes design variables to the Model; the Model, together with its constants, returns responses and derivatives of the responses (design sensitivities) to the Optimizer.]
Defining an optimization problem
1. Choose design variables and their bounds
2. Formulate objective
3. Formulate constraints (restrictions)
4. Choose suitable optimization algorithm
Example – Design of a SODA Can
Design a SODA can to hold a specified
amount of SODA and to meet other requirements.
The cans will be produced in billions, so it
is desirable to minimize the cost of
manufacturing.
Since the cost is related directly to the
surface area of the sheet metal used, it is
reasonable to minimize the sheet metal
required to fabricate the can.
Example – Design of a SODA Can (Cont.)
Requirements:
1. The diameter of the can should be no
more than 8 cm and no less than 3.5
cm.
2. The height of the can should be no more than 18 cm and no less than 8 cm.
3. The can is required to hold at least
400 ml of fluid.
Example – Design of a SODA Can (Cont.)
Design variables
D = diameter of the can (cm)
H = height of the can (cm)
Objective function
The design objective is to minimize the surface area of the sheet metal used,
f(D, H) = πDH + πD²/2 (the cylindrical wall plus the two circular ends).
Example – Design of a SODA Can (Cont.)
The constraints must be formulated in terms of the design variables.
The first constraint is that the can must hold at least 400 ml of fluid: πD²H/4 ≥ 400.
The other constraints on the size of the can are: 3.5 ≤ D ≤ 8 and 8 ≤ H ≤ 18.
The problem has two independent design variables and five explicit
constraints.
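A minimal MATLAB sketch of this formulation with fmincon; the variable ordering x = [D, H] and the starting guess are assumptions chosen only for illustration:

% Minimize sheet-metal area subject to the volume and size requirements
area   = @(x) pi*x(1)*x(2) + pi*x(1)^2/2;          % x(1) = D, x(2) = H
volcon = @(x) deal(400 - pi*x(1)^2*x(2)/4, []);    % c(x) <= 0 enforces volume >= 400 ml
x0 = [6, 12];                                      % assumed starting guess inside the bounds
[xopt, fval] = fmincon(area, x0, [], [], [], [], [3.5 8], [8 18], volcon);

Here the volume requirement is handled as a nonlinear inequality constraint and the diameter/height limits as simple bounds.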
Optimization Problem Characteristics
Linearity
Nonlinear objective functions can have multiple
local optima:
● Challenge: finding the global optimum.
Unimodality
Bracketing and sectioning methods work best for unimodal functions:
“A unimodal function consists of exactly one monotonically increasing part and
one monotonically decreasing part”
Convexity
Convex set:
“A set S is convex if, for every two points x1, x2 in S, the
connecting line segment also lies completely inside S”
(that is, λx1 + (1 − λ)x2 ∈ S for every λ in [0, 1]).
Lagrange Multipliers
The method of Lagrange multipliers gives a set of
necessary conditions to identify optimal points of
equality constrained optimization problems.
This is done by converting a constrained problem to an
equivalent unconstrained problem with the help of
certain unspecified parameters known as Lagrange
multipliers.
Finding an Optimum using Lagrange Multipliers
The classical problem formulation
minimize f(x1, x2, ..., xn)
Subject to h1(x1, x2, ..., xn) = 0
can be converted to
minimize L(x, λ) = f(x) − λ h1(x)
where
L(x, λ) is the Lagrangian function
λ is an unspecified constant (positive or negative) called the
Lagrange multiplier
Lagrange Multipliers Method
1. The original problem is rewritten as:
minimize L(x, λ) = f(x) − λ h1(x)
2. Take the derivatives of L(x, λ) with respect to xi and set them equal to zero.
• If there are n variables (i.e., x1, ..., xn) you will get n equations with n + 1
unknowns (i.e., the n variables xi and one Lagrange multiplier λ)
3. Express all xi in terms of the Lagrange multiplier λ
4. Plug x, expressed in terms of λ, into the constraint h1(x) = 0 and solve for λ.
5. Calculate x using the value just found for λ.
Note that the n derivatives and one constraint equation result in n + 1 equations
for the n + 1 unknowns! (A MATLAB sketch of these steps follows below.)
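A minimal sketch of the recipe in MATLAB (requires the Symbolic Math Toolbox); the small problem "minimize x² + y² subject to x + y − 1 = 0" is an assumption chosen only for illustration:

syms x y lam
L = (x^2 + y^2) - lam*(x + y - 1);               % step 1: form the Lagrangian
eqs = [diff(L, x) == 0, diff(L, y) == 0, ...     % step 2: stationarity conditions
       x + y - 1 == 0];                          % the equality constraint itself
sol = solve(eqs, [x, y, lam]);                   % steps 3-5: solve the n+1 equations
% sol.x = 1/2, sol.y = 1/2, sol.lam = 1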
Multiple constraints
The Lagrangian multiplier method can be used for any number of equality
constraints.
Suppose we have a classical problem formulation with k equality constraints
minimize f(x1, x2, ..., xn)
Subject to h1(x1, x2, ..., xn) = 0
......
hk(x1, x2, ..., xn) = 0
This can be converted into
minimize L(x, λ) = f(x) − λᵀ h(x)
where λ is the vector of Lagrange multipliers of length k and λᵀ is its transpose.
EXAMPLE
A factory manufactures HONDA CITY and HONDA CIVIC cars. Determine the optimal
number of HONDA CITY and HONDA CIVIC cars produced if the factory capacity is
90 cars per day, and the cost of manufacturing is C(x, y) = 6x² + 12y², where x is
the number of HONDA CITY cars and y is the number of HONDA CIVIC cars
produced.
EXAMPLE (Cont.)
VARIABLES
x = No. of HONDA CITY cars produced
y = No. of HONDA CIVIC cars produced
COST of Manufacturing:
C(x, y) = 6x² + 12y²
OBJECTIVE:
MINIMIZE COST
CONSTRAINT: 90 cars per day
x + y = 90
The original problem is rewritten as:
minimize L(x, y, λ) = f(x, y) − λ h1(x, y) = 6x² + 12y² − λ(x + y − 90)
EXAMPLE (Cont.)
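Applying the Lagrange multiplier steps to this problem:
L(x, y, λ) = 6x² + 12y² − λ(x + y − 90)
∂L/∂x = 12x − λ = 0  →  λ = 12x
∂L/∂y = 24y − λ = 0  →  λ = 24y
Hence 12x = 24y, i.e. x = 2y. Substituting into the constraint x + y = 90 gives 3y = 90,
so y = 30, x = 60, and λ = 720.
The optimal production mix is x = 60 HONDA CITY cars and y = 30 HONDA CIVIC cars per day,
with minimum cost C(60, 30) = 6(60)² + 12(30)² = 21,600 + 10,800 = 32,400.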
Unconstrained optimization algorithms
Single-variable methods
0th order (involving only f )
1st order (involving f and f ’ )
2nd order (involving f, f ’ and f ” )
Multiple variable methods
Gradient Descent Methods
Simplex Method
Sequential Linear Programming
Sequential Quadratic Programming
Etc.
Single-variable methods
Bisection method
Optimality conditions: minimum at stationary point
Root finding of f’
● Similar to the sectioning methods, but uses derivative information.
● The interval is halved in each iteration; note that this is better than any of the
direct (0th-order) methods.
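A minimal MATLAB sketch of the bisection idea applied to the derivative of the demo function f(x) = x·cos(2x); the bracket [1, 2] is an assumption (f ' changes sign there):

fp = @(x) cos(2*x) - 2*x.*sin(2*x);     % f'(x) for f(x) = x*cos(2*x)
a = 1;  b = 2;                          % assumed bracket with fp(a)*fp(b) < 0
for k = 1:50
    m = (a + b)/2;                      % halve the interval each iteration
    if fp(a)*fp(m) <= 0, b = m; else, a = m; end
end
xmin = (a + b)/2;                       % stationary point (a local minimum of f)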
Newton’s method
Again, root finding of f’
Basis: a linear (first-order Taylor) approximation of f ':
f '(x + h) ≈ f '(x) + f ''(x) h
Setting this approximation to zero gives the step
h = − f '(x) / f ''(x)
● New guess: xk+1 = xk + hk = xk − f '(xk) / f ''(xk)
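A minimal MATLAB sketch of this iteration on the same demo function f(x) = x·cos(2x); the starting point x = 1.5 is an assumption:

fp  = @(x) cos(2*x) - 2*x.*sin(2*x);    % f'(x)
fpp = @(x) -4*sin(2*x) - 4*x.*cos(2*x); % f''(x)
x = 1.5;                                % assumed starting guess
for k = 1:10
    x = x - fp(x)/fpp(x);               % xk+1 = xk - f'(xk)/f''(xk)
end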
Newton’s method (cont.)
Best convergence of all methods:
[Figure: successive Newton iterates xk, xk+1, xk+2 on f ', shown for a converging case and a diverging case]
● Unless it diverges
Summary single variable methods
0th order: bracketing, dichotomous sectioning, Fibonacci sectioning, golden ratio sectioning, quadratic interpolation, cubic interpolation
1st order: bisection method, secant method
2nd order: Newton method
● And many, many more!
MATLAB DEMO: Single variable Minimization
This demo will show a number of ways to minimize f(x) starting at multiple initial points.
Demo Folder: Single_variable_Classical_Optimization (Download file from Canvas)
Demo File: Main_File_Single_Variable.m
Function: f(x) = x·cos(2x)
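A minimal sketch of how such a minimization can be called (not the Canvas script itself; the interval and starting point below are assumptions):

f = @(x) x.*cos(2*x);
[x1, f1] = fminbnd(f, 0, 2*pi);         % bounded single-variable minimization
[x2, f2] = fminsearch(f, 0.5);          % derivative-free search from a starting point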
Single variable Minimization (cont.)
(1) Change starting points
(2) Discuss and show sensitivity of solutions
Multiple variable methods
GRADIENT DESCENT METHODS
Consider a function J(x), x = (x1, x2, ..., xn).
The gradient of J(x) at a point x0 is a vector of length n:
∇J(x0) = [ ∂J/∂x1(x0), ∂J/∂x2(x0), ..., ∂J/∂xn(x0) ]ᵀ
Each element in the vector is evaluated at the point x0.
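A minimal steepest-descent sketch in MATLAB; the quadratic test function J(x) = x1² + 10·x2², the starting point, and the fixed step size are assumptions for illustration:

gradJ = @(x) [2*x(1); 20*x(2)];         % gradient of J(x) = x1^2 + 10*x2^2
x = [5; 3];  alpha = 0.05;              % assumed starting point and step size
for k = 1:200
    x = x - alpha*gradJ(x);             % step in the direction of steepest descent
end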
GRADIENT DESCENT METHODS (cont.)
Linear Programming
Simplex Method
Newton-Raphson Method
Secant Method
Bisection Method
Line Search Methods
Sequential Linear Programming
Sequential Quadratic Programming
Karush-Kuhn-Tucker Conditions (KKT)
SIMPLEX METHOD
Solutions at the “vertices” of the design space are called basic feasible
solutions (BFS).
The Simplex algorithm moves from BFS to BFS so that the objective always
improves.
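A minimal sketch with linprog (MATLAB's LP solver); the LP below and its data are made up for illustration: maximize 3x + 5y subject to x + 2y ≤ 14, 3x − y ≥ 0, x − y ≤ 2, x, y ≥ 0.

f  = [-3; -5];                           % negate to pose the maximization as a minimization
A  = [1 2; -3 1; 1 -1];  b = [14; 0; 2]; % inequality constraints written as A*x <= b
lb = [0; 0];                             % nonnegativity bounds
xopt = linprog(f, A, b, [], [], lb);     % the optimum lies at a vertex of the feasible region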
SEQUENTIAL LINEAR PROGRAMMING
Consider a general nonlinear problem linearized via a first-order Taylor series about the current point x0:
minimize f(x0) + ∇f(x0)ᵀ δx
subject to g(x0) + ∇g(x0)ᵀ δx ≤ 0 and h(x0) + ∇h(x0)ᵀ δx = 0
This is an LP problem with the design variables contained in δx. The functions and
gradients evaluated at x0 are constant coefficients.
SEQUENTIAL LINEAR PROGRAMMING (Cont.)
1. Initial guess x0
2. Linearize about x0 using first order Taylor series
3. Solve resulting LP to find δx
4. Update x1=x0 + δx
5. Linearize about x1 and repeat:
xq = xq−1 + δx
where δx is the solution of the LP (the model linearized about xq−1).
SEQUENTIAL QUADRATIC PROGRAMMING
Create a quadratic approximation to the Lagrangian
Create linear approximations to the constraints
Solve the quadratic problem to find the search direction, S
Perform the 1-D search
Update the approximation to the Lagrangian
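In MATLAB, an SQP algorithm is available through fmincon's 'sqp' option; a minimal sketch, with the Rosenbrock test function and bounds chosen only for illustration:

opts  = optimoptions('fmincon', 'Algorithm', 'sqp');
rosen = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
[xopt, fval] = fmincon(rosen, [-1.9, 2], [], [], [], [], [-5 -5], [5 5], [], opts);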
Newton method
Expand f(x) in a Taylor series about the point xk:
f(xk + δx) ≈ f(xk) + gkᵀ δx + ½ δxᵀ Hk δx
where the gradient is the vector gk = ∇f(xk)
and the Hessian is the symmetric matrix Hk with elements ∂²f/∂xi∂xj, evaluated at xk.
Newton method (Cont.)
For a minimum we require that ∇f(x) = 0, and so gk + Hk δx = 0,
with solution δx = −Hk⁻¹ gk. This gives the iterative update xk+1 = xk − Hk⁻¹ gk.
If f(x) is quadratic, then the solution is found in one step.
The method has quadratic convergence (as in the 1D case).
The step is a downhill direction provided Hk is positive definite.
Rather than jumping straight to the predicted minimum, it is better to perform a line
minimization along the Newton direction, which helps ensure global convergence.
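A minimal MATLAB sketch of the update xk+1 = xk − Hk⁻¹ gk on the Rosenbrock function; the hand-coded gradient, Hessian, and starting point are assumptions for illustration:

g = @(x) [-400*x(1)*(x(2)-x(1)^2) - 2*(1-x(1));  200*(x(2)-x(1)^2)];   % gradient
H = @(x) [1200*x(1)^2 - 400*x(2) + 2, -400*x(1);  -400*x(1), 200];     % Hessian
x = [-1.9; 2];                          % assumed starting point
for k = 1:20
    x = x - H(x) \ g(x);                % Newton step; in practice a line search would damp it
end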
Summary of MATLAB Multiple
Variable Methods
• fminsearch: finds the minimum of an unconstrained multivariable function
using a derivative-free (Nelder-Mead simplex) method.
• fminunc: nonlinear programming solver; finds the minimum of an
unconstrained multivariable function. The gradient and Hessian may
be supplied.
• lsqnonlin: solves nonlinear least-squares curve-fitting problems
of the form min ||f(x)||² = min ( f1(x)² + f2(x)² + ... + fn(x)² ).
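One-line usage sketches of these three solvers; the objectives, data, and starting points below are made up for illustration:

x1  = fminsearch(@(x) (x(1)-3)^2 + (x(2)+1)^2, [0, 0]);   % derivative-free
x2  = fminunc(@(x) (x(1)-3)^2 + (x(2)+1)^2, [0, 0]);      % quasi-Newton by default
res = @(p) p(1)*exp(p(2)*(0:4)) - [2 4 8 16 32];          % residual vector f_i(p)
p   = lsqnonlin(res, [1, 0.5]);                           % minimizes the sum of f_i(p)^2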
MATLAB DEMO: Banana Function Minimization
Minimize Rosenbrock's "banana function"
f(x) = 100 (x2 − x1²)² + (1 − x1)²
f(x) is called the banana function because
of its curvature around the origin.
It is notorious in optimization examples
because of the slow convergence most
methods exhibit when trying to solve this
problem.
f(x) has a unique minimum at the point
x = [1, 1], where f(x) = 0.
Banana Function Minimization (cont.)
This demo will show a number of ways to minimize f(x) starting at
multiple initial points.
Demo Folder: BananaFunction_Classical_Optimization (Download file
from Canvas)
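A minimal sketch of one such call (not the Canvas script itself; the starting point is an assumption):

banana = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
[xopt, fval] = fminsearch(banana, [-1.9, 2])              % Nelder-Mead search toward [1, 1]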