
Sensitivity Analysis
PARTICIPANTS
16. SIDDHI GAWADE
17. SHRUTI GHADGE
18. HETVEE GOHIL
19. DIKSHA GUNDAP
20. HARSH GUPTA
SENSITIVITY ANALYSIS
• Sensitivity Analysis studies how changes in problem parameters (profit, cost, or resource levels) affect the optimal solution in Linear and Nonlinear Programming. It helps determine the stability and flexibility of the current solution.

• In Linear Programming (LP), it analyzes changes in:
 Objective coefficients (c) → Range of Optimality
 RHS values (b) → Range of Feasibility

• In Nonlinear Programming (NLP), it uses Lagrange multipliers (λ) and the KKT conditions; λ gives the rate of change of the optimal value w.r.t. a constraint parameter.

• Key Idea
Small changes in input → predictable changes in outcome → confident and optimal decisions
FORMULAS

1️⃣ Linear Programming (LP) Sensitivity

• Primal LP (Max): Maximize Z = cᵀx subject to Ax ≤ b, x ≥ 0

• Dual LP (Min): Minimize W = bᵀy subject to Aᵀy ≥ c, y ≥ 0

• Shadow Prices / Dual Variables: yᵀ = c_Bᵀ B⁻¹, with ∂Z*/∂bᵢ = yᵢ*

• Allowable Change in RHS (b): keep x_B = B⁻¹(b + Δb) ≥ 0

• Allowable Change in Objective Coefficient (c): keep all reduced costs optimal (zⱼ − cⱼ ≥ 0 for a max problem)

2️⃣ Nonlinear Programming (NLP) Sensitivity

• General NLP: Minimize f(x) subject to constraints hᵢ(x) = bᵢ (and/or gᵢ(x) ≤ bᵢ)

• Lagrangian: L(x, λ) = f(x) − Σᵢ λᵢ (hᵢ(x) − bᵢ)

• Conditions: KKT — stationarity ∇ₓL = 0, feasibility, λᵢ ≥ 0 for inequality constraints, complementary slackness

• Sensitivity w.r.t. a parameter bᵢ in the constraints: ∂f*/∂bᵢ = λᵢ*
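These LP quantities can be computed numerically. A minimal sketch with scipy, using made-up stand-in coefficients (maximize 3x₁ + 5x₂ subject to x₁ + 2x₂ ≤ 10 and 3x₁ + x₂ ≤ 15; these are illustrative numbers, not from the original slide): `linprog` solves the primal, and with the HiGHS solver the constraint marginals recover the dual values (shadow prices).

```python
from scipy.optimize import linprog

# Stand-in LP (illustrative numbers, not the slide's original data):
# maximize 3*x1 + 5*x2  s.t.  x1 + 2*x2 <= 10,  3*x1 + x2 <= 15,  x >= 0
res = linprog(c=[-3, -5],                 # linprog minimizes, so negate c
              A_ub=[[1, 2], [3, 1]],
              b_ub=[10, 15],
              bounds=[(0, None), (0, None)],
              method="highs")

z_max = -res.fun                          # optimum of the max problem
shadow = -res.ineqlin.marginals           # shadow prices y*, sign-flipped back
print(z_max, res.x, shadow)               # 27.0, [4. 3.], [2.4 0.2]
```

The printed duals satisfy strong duality: bᵀy* = 10·2.4 + 15·0.2 = 27 = Z*.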


Example 1 — Linear Program

Maximize Z = c₁x₁ + c₂x₂
subject to two resource constraints a₁₁x₁ + a₁₂x₂ ≤ b₁ and a₂₁x₁ + a₂₂x₂ ≤ b₂, with x₁, x₂ ≥ 0.
Step 1 — Find corner (vertex) points

Check the usual vertices of the feasible region:
 The origin (x₁, x₂) = (0, 0).
 The intercept of constraint 1 on the x₂-axis (set x₁ = 0).
 The intercept of constraint 2 on the x₁-axis (set x₂ = 0).
 The intersection of the two constraint lines: solve the equality system (express one variable from the second equation, substitute into the first, then back-substitute).
Evaluate Z at each vertex.

Step 2 — Choose optimum
Comparing the vertex values, the maximum of Z occurs at the intersection point, where both constraints are binding.
Step 3 — Dual (shadow prices)
The primal is a max with ≤ constraints, so the dual is a minimization in variables y₁, y₂:
Minimize W = b₁y₁ + b₂y₂
subject to a₁₁y₁ + a₂₁y₂ ≥ c₁, a₁₂y₁ + a₂₂y₂ ≥ c₂, y₁, y₂ ≥ 0.

Because both primal constraints are active at the optimum and both primal variables are positive, by complementary slackness both dual inequalities hold as equalities. Solving this 2×2 system gives the shadow prices:
 y₁* = value of one extra unit of the resource in constraint 1 (likewise y₂* for constraint 2).

Dual objective W* = b₁y₁* + b₂y₂* — matches the primal optimum Z* (strong duality).

Interpretation: If the RHS of constraint 1 increases by 1 (a small change) and the basis stays the same, the objective increases by y₁*. If b₂ increases by 1, Z increases by y₂*.
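This "one extra unit" interpretation can be checked by re-solving with a perturbed RHS. A minimal sketch with scipy, again using hypothetical stand-in coefficients (maximize 3x₁ + 5x₂ subject to x₁ + 2x₂ ≤ b₁ and 3x₁ + x₂ ≤ 15): while the optimal basis is unchanged, raising b₁ by one unit raises Z* by exactly the first shadow price.

```python
from scipy.optimize import linprog

def z_opt(b1):
    """Optimal objective of the stand-in max LP as a function of the first RHS."""
    res = linprog([-3, -5], A_ub=[[1, 2], [3, 1]], b_ub=[b1, 15],
                  bounds=[(0, None)] * 2, method="highs")
    return -res.fun

# While the basis stays optimal, the increase equals the shadow price y1*
print(z_opt(11) - z_opt(10))   # 2.4 (up to floating-point rounding)
```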
Step 4 — Allowable change in RHS (so the current basis stays feasible)
Let the basis be the two active constraints with basic variables x₁, x₂. The basis matrix B is the left-hand-side matrix of the basic columns:
B = [ a₁₁ a₁₂ ; a₂₁ a₂₂ ]

Compute B⁻¹ = (1 / det B) · adj B.
The current basic solution is x_B = B⁻¹b, which reproduces the optimum found above.

For a change Δ in a single RHS entry, the new basic solution is
x_B(Δ) = B⁻¹(b + Δeᵢ) = x_B + Δ · (column i of B⁻¹).
Require x_B(Δ) ≥ 0 (componentwise) to keep the basis feasible.

(a) Vary only b₁: take Δb = (Δ, 0). Each component of x_B + Δ · B⁻¹e₁ ≥ 0 gives one bound on Δ; together they yield the allowable range for b₁.
(b) Vary only b₂: take Δb = (0, Δ). The same componentwise conditions yield the allowable range for b₂.

Conclusion (LP):
Optimal solution: the intersection vertex x*, with objective value Z*.
Shadow prices: y₁*, y₂* from the dual.
Allowable ranges for each RHS, keeping the same basis, follow from x_B + Δ · B⁻¹eᵢ ≥ 0.
Within these ranges you can estimate the objective change via ΔZ ≈ y₁*Δb₁ + y₂*Δb₂.
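Step 4's componentwise conditions are easy to automate with numpy. A minimal sketch, assuming a hypothetical basis matrix B = [[1, 2], [3, 1]] and RHS b = (10, 15) (illustrative values, not the slide's original data): for each RHS entry, the signs of the corresponding column of B⁻¹ determine the lower and upper limits on Δ.

```python
import numpy as np

# Hypothetical stand-in data: both constraints active at the optimum
B = np.array([[1.0, 2.0],
              [3.0, 1.0]])    # basis matrix (rows = active constraints)
b = np.array([10.0, 15.0])

B_inv = np.linalg.inv(B)
x_B = B_inv @ b               # current basic solution

def rhs_range(i):
    """Allowable delta in b[i] keeping B_inv @ (b + delta*e_i) >= 0."""
    lo, hi = -np.inf, np.inf
    for xb, c in zip(x_B, B_inv[:, i]):
        if c > 0:             # xb + delta*c >= 0  ->  delta >= -xb/c
            lo = max(lo, -xb / c)
        elif c < 0:           #                    ->  delta <= -xb/c
            hi = min(hi, -xb / c)
    return lo, hi

for i in range(2):
    lo, hi = rhs_range(i)
    print(f"b{i+1}: delta in [{lo:.1f}, {hi:.1f}] "
          f"-> b{i+1} in [{b[i]+lo:.1f}, {b[i]+hi:.1f}]")
```

With these stand-in numbers the loop reports the interval of each bᵢ over which the current basis, and hence the shadow-price formula ΔZ ≈ yᵢ*Δbᵢ, remains valid.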
Example 2 — Nonlinear Programming (NLP)

Minimize f(x₁, x₂)
subject to the equality constraint h(x₁, x₂) = b,
where b is a parameter.

We will:
1. Find the optimum x*(b);
2. Get the sensitivity dx*/db;
3. Compute how the optimal objective f*(b) changes with b;
4. Find the Lagrange multiplier λ* and relate it to the sensitivity.

Step A — Solve for the optimum using the Lagrangian

Lagrangian: L(x₁, x₂, λ) = f(x₁, x₂) − λ (h(x₁, x₂) − b)

Stationarity conditions: ∂L/∂x₁ = 0 and ∂L/∂x₂ = 0 express x₁ and x₂ in terms of λ.
Constraint: h(x₁, x₂) = b. Substituting those expressions into the constraint gives λ*(b); back-substituting gives the optimal point x*(b).

Step B — Sensitivities w.r.t. the parameter

Differentiate the optimal expressions x₁*(b) and x₂*(b) with respect to b:
in this example both decision variables change linearly with b, at a constant slope dx*/db.

Step C — Optimal objective and its derivative

Compute the optimal objective f*(b) = f(x*(b)) and its derivative df*/db.
Using the Lagrange multiplier found above, we obtain the envelope relation
df*/db = λ*(b),
and the direct derivative of f*(b) indeed matches λ*, confirming the relation.

Interpretation: a small increase db in the parameter increases the optimal objective by approximately λ*·db (exact here, since λ* captures the full first-order effect).
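The chain of Steps A–C can be reproduced symbolically. A minimal sketch with sympy, assuming a stand-in problem (minimize x² + y² subject to x + y = b; the slide's own f and h are not reproduced here): solving the stationarity conditions together with the constraint, then differentiating f*(b), confirms df*/db = λ*.

```python
import sympy as sp

x, y, lam, b = sp.symbols('x y lam b', real=True)

# Hypothetical stand-in NLP: minimize f = x^2 + y^2  s.t.  x + y = b
f = x**2 + y**2
g = x + y - b
L = f - lam * g                          # Lagrangian (Step A)

# Stationarity dL/dx = dL/dy = 0 plus the constraint g = 0
sol = sp.solve([sp.diff(L, x), sp.diff(L, y), g], [x, y, lam], dict=True)[0]

f_opt = sp.simplify(f.subs(sol))         # optimal objective f*(b) (Step C)
print(sol[x], sol[lam], f_opt)           # x* = b/2, lambda* = b, f* = b^2/2

# Envelope relation: d f*/db equals the multiplier lambda*
assert sp.simplify(sp.diff(f_opt, b) - sol[lam]) == 0
```

Here both decision variables vary linearly in b (slope 1/2), mirroring Step B, and the final assertion is exactly the df*/db = λ* relation of Step C.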

Short summary of the two solved examples

LP example: the optimum sits at the vertex where both constraints bind; the dual variables y₁*, y₂* are the shadow prices; the allowable RHS ranges follow from x_B = B⁻¹b ≥ 0.
NLP example: the optimum x*(b) varies linearly with b, and the optimal objective satisfies df*/db = λ*, so the Lagrange multiplier is exactly the sensitivity of the optimal value.
Applications
✅ Production Planning: Adjusting resources, labor, or materials.
✅ Financial Modeling: Studying profit/cost variations.
✅ Transportation & Logistics: Evaluating changes in supply or route capacities.
✅ Project Management: Assessing effect of delays or budget changes.
✅ Economics & Operations Research: Policy optimization under varying
constraints.

Conclusion
• Sensitivity Analysis helps us understand the stability of an optimal solution.
• It shows how far objective coefficients or resources can change without
affecting the best decision.
• In LP, it uses Simplex and Dual relationships (B⁻¹, Shadow Prices).
• In NLP, it uses Lagrange multipliers (λ) to measure sensitivity.
• Thus, it is a powerful decision-support tool in uncertain or variable
environments.
Thank you
