GROUP D ASSIGNMENT

KANO STATE POLYTECHNIC

SCHOOL OF TECHNOLOGY 15/02/2024

DEPARTMENT OF COMPUTER SCIENCE

[SOFTWARE AND WEB DEVELOPMENT]

OPERATIONS RESEARCH [SWD 314]

S/NO. NAMES REG NO. REMARK

1. ABDULRAHMAN ADAMU

2. BELLO SANI BELLO

3. FARUQ MAHRAZ

4. HASSAN TIJJANI HASSAN

5. HASIYA MUSTAPHA MUHD

6. HUSSAIN SALEH ADAMU

7. NASIR ILYASU MOHD

8. SANI KHALID ISHAQ

9. SAMINU ISMAIL KADEMI

10. USAMA HAMZA

QUESTIONS:

Q1. DEVELOP A SIMPLEX ALGORITHM

Q2. IDENTIFY BASIC AND NON-BASIC VARIABLES, SHADOW PRICES (COST EVALUATIONS, ETC.).

Q3. DEVELOP THE SIMPLEX METHOD WITH EQUALITIES AS CONSTRAINTS.

Q4. APPLY THE SIMPLEX METHOD TO PROBLEMS INVOLVING A FEW VARIABLES.

Q5. MAKE USE OF A COMPUTER PACKAGE FOR THE SIMPLEX METHOD.


Question 1

The simplex algorithm is a widely used method for solving linear programming problems. It
iteratively moves from one basic feasible solution (a vertex of the feasible region) to another along
the edges of the feasible region until it reaches an optimal solution. Here's a simplified overview of the algorithm:

Initialize: Start with an initial feasible solution.


Optimality Test: Check if the current solution is optimal. If yes, stop; otherwise, proceed to the
next step.

Pivot Step: Select the entering variable from the column with the most negative coefficient in the objective row, then select the pivot row using the minimum ratio test.

Pivot: Update the basis by pivoting on the selected element to improve the objective function value.

Iterate: Repeat steps 2-4 until an optimal solution is found.


Each iteration improves the objective function value until no further improvement is possible, and
the current solution is optimal.

The algorithm requires a feasible starting solution and guarantees convergence to an optimal
solution for linear programming problems. However, it may take exponential time in the worst
case, making it less efficient for large-scale problems. Various enhancements and modifications
have been developed to improve its performance in practice.
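To make these steps concrete, here is a minimal tableau-based simplex sketch in Python (an illustration only, not a production implementation). It assumes a maximization problem already in the form: maximize cᵀx subject to Ax ≤ b with b ≥ 0 and x ≥ 0, so that the slack variables give an obvious starting basis; the function name simplex_max and the numerical tolerances are our own choices.

import numpy as np

def simplex_max(c, A, b):
    """Tableau simplex for: maximize c.x  s.t.  A x <= b,  x >= 0,  b >= 0."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    # Tableau layout: objective row on top, then [A | I | b] for the constraints.
    tableau = np.zeros((m + 1, n + m + 1))
    tableau[0, :n] = -np.asarray(c, dtype=float)   # objective row: Z - c.x = 0
    tableau[1:, :n] = A
    tableau[1:, n:n + m] = np.eye(m)
    tableau[1:, -1] = b
    basis = list(range(n, n + m))                  # slack variables start in the basis

    while True:
        # Optimality test: stop when no coefficient in the objective row is negative.
        col = int(np.argmin(tableau[0, :-1]))
        if tableau[0, col] >= -1e-9:
            break
        # Minimum ratio test chooses the leaving row (and detects unboundedness).
        ratios = [tableau[i, -1] / tableau[i, col] if tableau[i, col] > 1e-9 else np.inf
                  for i in range(1, m + 1)]
        row = 1 + int(np.argmin(ratios))
        if ratios[row - 1] == np.inf:
            raise ValueError("problem is unbounded")
        # Pivot: scale the pivot row, then eliminate the pivot column elsewhere.
        tableau[row] /= tableau[row, col]
        for i in range(m + 1):
            if i != row:
                tableau[i] -= tableau[i, col] * tableau[row]
        basis[row - 1] = col

    x = np.zeros(n + m)
    x[basis] = tableau[1:, -1]
    return x[:n], tableau[0, -1]                   # decision variables and optimal Z

# Example: maximize 2x1 + 3x2  s.t.  x1 + x2 <= 4,  2x1 + x2 <= 7,  x1, x2 >= 0
x_opt, z_opt = simplex_max([2, 3], [[1, 1], [2, 1]], [4, 7])
print(x_opt, z_opt)   # expected: [0. 4.] and 12.0

The last two lines run the sketch on the example worked through below and return x1 = 0, x2 = 4 with Z = 12.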

Let's consider a simple linear programming problem and walk through the steps of the simplex
algorithm to solve it:

Problem:
Maximize Z = 2x1 + 3x2
Subject to:

x1 + x2 ≤ 4
2x1 + x2 ≤ 7

x1, x2 ≥ 0
Step 1: Initialization:
Start from the initial basic feasible solution x1 = 0, x2 = 0, with slack variables s1 = 4 and s2 = 7 in the basis.

Step 2: Optimality Test:

Calculate the objective function value for the current solution:
Z = 2(0) + 3(0) = 0
Written as an objective row, Z - 2x1 - 3x2 = 0 still has negative coefficients, so the current solution is not optimal; proceed to the next step.
Step 3: Pivot Step:
Identify the most negative coefficient in the objective row, which is -3 for x2, so x2 is the entering variable.
Step 4: Pivot:
Choose the constraint that becomes binding first using the minimum ratio test: the first constraint allows x2 to grow to 4/1 = 4 and the second to 7/1 = 7, so the first constraint x1 + x2 ≤ 4 is binding and its slack s1 leaves the basis.

Update the current solution by increasing x2 as much as possible while respecting the constraints.
Since x1 + x2 ≤ 4, x2 can increase to 4, so set x2 = 4.
The second constraint is still satisfied: 2(0) + 4 = 4 ≤ 7.
Now the current solution is x1 = 0, x2 = 4.
Repeat steps 2-4 until an optimal solution is found.
We recalculate the objective function value for the current solution:
Z = 2(0) + 3(4) = 12
After this pivot the objective row becomes Z + x1 + 3s1 = 12. There are no negative coefficients left, so no entering variable can improve Z further, and the current solution x1 = 0, x2 = 4 with Z = 12 is optimal.

In general, the algorithm continues to iterate, pivoting between variables and constraints, until an
optimal solution is found or the problem is determined to be infeasible or unbounded.

Question 3
The simplex method can also handle linear programming problems with equality constraints. One
common way to deal with them is to convert each equality into two inequality constraints: one in
the original direction and one in the opposite direction. (Alternatively, equalities can be handled
directly by adding artificial variables and using the two-phase or Big-M method.)

Here's how the simplex method works with equality constraints:


Problem:
Maximize Z = cᵀx
Subject to:

Ax = b
x ≥ 0
Where:
• c is the coefficient vector of the objective function.
• A is the coefficient matrix of the constraints.
• b is the right-hand side vector of the constraints.
• x is the vector of decision variables.
Step 1: Convert Equalities to Inequalities:
Replace the system of equality constraints Ax = b with two
systems of inequality constraints:

Ax ≤ b
-Ax ≤ -b
Step 2: Initial Feasible Solution:
Find an initial feasible solution by setting some variables to zero and solving for the remaining
variables. This may involve solving a system of equations.

Step 3: Set up the Initial Simplex Tableau:


Construct the initial simplex tableau with the objective function, slack variables (for the inequality
constraints), and any surplus or artificial variables (if needed to obtain a starting basis).

Step 4: Pivot:
Select a pivot column based on the most negative coefficient in the objective function row
(excluding the objective function coefficient itself). Then, select the pivot row based on the
minimum ratio test (the smallest non-negative ratio of the right-hand side to the coefficient of the
pivot column).

Step 5: Update the Tableau:


Perform row operations to make the pivot element equal to 1 and all other elements in the pivot
column equal to zero.
Step 6: Iterate:
Repeat steps 4 and 5 until there are no negative coefficients in the objective function row, indicating
optimality.

Step 7: Solution:
Once the simplex algorithm terminates, read the solution from the final tableau. The optimal
solution corresponds to the values of the decision variables that make the objective function as
large as possible while satisfying all constraints.

Throughout the iterations, the simplex method moves from one vertex of the feasible region to
another along the edges until it reaches the optimal solution.
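To illustrate this conversion, here is a small sketch using SciPy's linprog with made-up data (the single equality x1 + x2 = 4 is our own example; linprog can also take equalities directly through its A_eq and b_eq arguments, so the explicit conversion is shown purely for illustration):

import numpy as np
from scipy.optimize import linprog

# Maximize 2x1 + 3x2 subject to the equality x1 + x2 = 4 and x1, x2 >= 0.
# linprog minimizes, so the objective coefficients are negated.
c = [-2, -3]
A_eq = np.array([[1.0, 1.0]])
b_eq = np.array([4.0])

# Step 1 conversion: Ax = b  becomes  Ax <= b  and  -Ax <= -b.
A_ub = np.vstack([A_eq, -A_eq])
b_ub = np.concatenate([b_eq, -b_eq])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method='highs')
print(res.x, -res.fun)   # expected: [0. 4.] and 12.0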
Question 4
Let's apply the simplex method to a simple linear programming problem with a few variables.
Consider the following problem:

Problem:

Maximize Z = 3x1+2x2

Subject to:

2x1 + x2 ≤ 10

x1 + 3x2 ≤ 12

x1, x2 ≥ 0

Step 1: Convert to Standard Form:

We need to convert the problem into standard form by introducing slack variables for the inequalities:

2x1 + x2 + s1 = 10

x1 + 3x2 + s2 = 12

Step 2: Initialize the Tableau:

Construct the initial simplex tableau using the coefficients of the objective function and the slack
variables:

      x1    x2    s1    s2    RHS
Z     -3    -2     0     0      0
s1     2     1     1     0     10
s2     1     3     0     1     12

Step 3: Identify the Entering and Exiting Variables:

The entering variable is determined by the most negative coefficient in the objective function row, which
is -3, so x1 enters the basis. The exiting variable is determined by the minimum ratio test: 10/2 = 5 for
the s1 row and 12/1 = 12 for the s2 row, so s1 exits.

Step 4: Pivot:

Perform the pivot operation on the x1 column and the s1 row (pivot element 2) to make x1 a basic variable:

      x1    x2    s1    s2    RHS
Z      0  -0.5   1.5     0     15
x1     1   0.5   0.5     0      5
s2     0   2.5  -0.5     1      7

Step 5: Iterate:

Repeat steps 3 and 4 until all coefficients in the objective function row are non-negative. The x2 column
still has a negative coefficient (-0.5), so x2 enters; the ratio test gives 5/0.5 = 10 and 7/2.5 = 2.8, so
s2 exits. Pivoting on that element (2.5) gives:

      x1    x2    s1    s2    RHS
Z      0     0   1.4   0.2   16.4
x1     1     0   0.6  -0.2    3.6
x2     0     1  -0.2   0.4    2.8

All objective-row coefficients are now non-negative, so the solution is optimal.

Solution:

The optimal solution is x1 = 3.6, x2 = 2.8, with an optimal objective function value of Z = 16.4. In the
final tableau, x1 and x2 are the basic variables, s1 and s2 are the non-basic variables, and the
objective-row entries under s1 and s2 (1.4 and 0.2) are the shadow prices of the two constraints (see Q2).


The simplex method efficiently finds the optimal solution by iteratively moving from one feasible solution
to another along the edges of the feasible region until the optimal solution is reached.
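As a quick numerical check of the tableau walk above, the two pivots can be reproduced with NumPy; the small pivot helper below is our own utility, not a library function:

import numpy as np

# Initial tableau for: maximize 3x1 + 2x2  s.t.  2x1 + x2 <= 10,  x1 + 3x2 <= 12.
#              x1    x2    s1   s2    RHS
T = np.array([[-3.0, -2.0, 0.0, 0.0,  0.0],   # Z row
              [ 2.0,  1.0, 1.0, 0.0, 10.0],   # s1 row
              [ 1.0,  3.0, 0.0, 1.0, 12.0]])  # s2 row

def pivot(T, row, col):
    T = T.copy()
    T[row] /= T[row, col]                  # scale the pivot row to make the pivot 1
    for i in range(T.shape[0]):
        if i != row:
            T[i] -= T[i, col] * T[row]     # zero out the rest of the pivot column
    return T

T = pivot(T, 1, 0)   # first pivot: x1 enters, s1 leaves
T = pivot(T, 2, 1)   # second pivot: x2 enters, s2 leaves
print(T)             # final Z row ends with 1.4, 0.2, 16.4 (shadow prices and optimum)

The last two rows of the printed final tableau give x1 = 3.6 and x2 = 2.8, matching the solution above.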

QUESTION 5
There are various computer packages and libraries available for solving linear programming problems
with the simplex method. One popular option is Python with the scipy.optimize.linprog function from
the SciPy library, which, through its 'highs' methods, uses the HiGHS solver (including a dual-simplex
implementation). Here is an example of how to use it for the problem from Question 4:

from scipy.optimize import linprog

# Coefficients of the objective function (negated, because linprog minimizes)
c = [-3, -2]

# Coefficients of the inequality constraints
A = [[2, 1],
     [1, 3]]

# Right-hand side of the inequality constraints
b = [10, 12]

# Bounds for the variables (x1, x2 >= 0)
x0_bounds = (0, None)
x1_bounds = (0, None)

# Solve the linear programming problem
res = linprog(c, A_ub=A, b_ub=b,
              bounds=[x0_bounds, x1_bounds], method='highs')

# Print the optimal solution and value of the objective function
print("Optimal solution:", res.x)
print("Optimal value of the objective function:", -res.fun)

This code will output the optimal solution and the optimal value of the objective function; for this
problem the result is x1 = 3.6, x2 = 2.8 with Z = 16.4. You need to have SciPy installed to use this
code. You can install it via pip if you haven't already:


pip install scipy

Using a computer package like SciPy allows for quick and efficient solving of linear programming
problems, even with larger numbers of variables and constraints.

References:

Nocedal, Jorge, and Stephen J. Wright. "Numerical Optimization." Springer Science & Business Media,
2006.

Bertsimas, Dimitris, and Robert Weismantel. "Optimization over Integers." Dynamic Ideas, 2005.

Vanderbei, Robert J. "Linear Programming: Foundations and Extensions." Springer, 2008.

Winston, Wayne L. "Operations Research: Applications and Algorithms." Cengage Learning, 2014.

SciPy documentation:
https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.linprog.html


HiGHS optimization solver (used by SciPy's 'highs' methods): https://github.com/ERGO-Code/HiGHS

Python Mathematical Optimization (Pyomo): https://pyomo.readthedocs.io/en/stable/

Python Optimization Library (CVXPY): https://www.cvxpy.org/

Boyd, Stephen, and Lieven Vandenberghe. "Convex Optimization." Cambridge University Press, 2004.

Bazaraa, Mokhtar S., John J. Jarvis, and Hanif D. Sherali. "Linear Programming and Network Flows."
John Wiley & Sons, 2011.
