Ingeniería e Investigación
Linear programming (LP) is one of the most widely applied techniques in operations research. Many methods have been developed, and several others are being proposed, for solving LP problems, including the famous simplex method and interior point algorithms. This study was aimed at introducing a new method for solving LP problems. The proposed algorithm starts from an interior point and then carries out orthogonal projections using parametric straight lines to move between the interior and the frontier of the polyhedron defining the feasible region until reaching the optimal extreme point.
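The abstract does not give the projection step in detail; as a small illustration of the kind of orthogonal projection it relies on, the sketch below (with illustrative names and data, not taken from the paper) projects a point onto a single constraint hyperplane {y : aᵀy = b}.

```python
import numpy as np

def project_onto_hyperplane(x, a, b):
    """Orthogonally project the point x onto the hyperplane {y : a.T @ y == b}.

    Illustrative helper only; the paper's full algorithm alternates such
    projections with moves along parametric straight lines.
    """
    a = np.asarray(a, dtype=float)
    x = np.asarray(x, dtype=float)
    # Signed distance of x from the hyperplane, scaled by ||a||^2.
    t = (a @ x - b) / (a @ a)
    return x - t * a

# Example: project the interior point (2, 2) onto the constraint x1 + x2 = 3.
print(project_onto_hyperplane([2.0, 2.0], [1.0, 1.0], 3.0))  # -> [1.5 1.5]
```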
2015
The simplex method is the most popular method for solving linear programs. The simplex algorithm of linear programming has had a great influence on the development and practice of science, engineering and technology. In this paper, an alternative to the simplex method is introduced. The method makes it easy to solve a linear programming problem (LPP) and is a powerful way to reduce the number of iterations and save valuable time.
BALKAN JOURNAL OF APPLIED MATHEMATICS AND INFORMATICS, 2018
In this paper we consider the application of linear programming to solving optimization problems with constraints. We used the simplex method for finding a maximum of an objective function. This method is applied to a real example. We used the "linprog" function in MatLab for problem solving. We have shown how to apply the simplex method to a real-world problem and solve it using linear programming. Finally, we investigate the complexity of the method via the variation of the computer time with the number of control variables.
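For readers without MATLAB, SciPy's linprog exposes a comparable interface to the "linprog" function mentioned above; the sketch below solves a small made-up maximization (the coefficients are illustrative and are not the example from the paper).

```python
from scipy.optimize import linprog

# Maximize 3*x1 + 5*x2 subject to x1 <= 4, 2*x2 <= 12, 3*x1 + 2*x2 <= 18, x >= 0.
# linprog minimizes, so the objective coefficients are negated.
c = [-3.0, -5.0]
A_ub = [[1.0, 0.0],
        [0.0, 2.0],
        [3.0, 2.0]]
b_ub = [4.0, 12.0, 18.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
print(res.x, -res.fun)  # optimal point (2, 6) and maximal objective value 36
```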
In this chapter, we present a systematic procedure for solving linear programs. This procedure, called the simplex method, proceeds by moving from one feasible solution to another, at each step improving the value of the objective function. Moreover, the method terminates after a finite number of such transitions.
Computers & Operations Research, 2002
In this paper, we present an auxiliary algorithm that is effective, in terms of the speed of obtaining the optimal solution, in helping the simplex method commence from a better initial basic feasible solution. The idea of choosing a direction towards an optimal point presented in this paper is new and easily implemented. From our experiments, the algorithm reaches a corner point of the feasible region within a few iterative steps, independent of the starting point. The computational results show that when the auxiliary algorithm is adopted as the phase I process, the simplex method consistently reduces the number of required iterations by about 40%.
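The paper's rule for choosing the direction is not reproduced here; the sketch below only illustrates the generic ingredient such a phase-I scheme builds on, namely walking from an interior point of {x : Ax <= b} along a given direction until a constraint becomes tight (a ratio test). Function name and data are illustrative.

```python
import numpy as np

def step_to_boundary(A, b, x, d, eps=1e-12):
    """Largest step t >= 0 with A @ (x + t*d) <= b, for an interior point x.

    Generic ratio test; the paper's auxiliary algorithm couples such moves
    with its own rule for picking the direction d.
    """
    A, b, x, d = map(np.asarray, (A, b, x, d))
    slack = b - A @ x          # positive for an interior point
    rate = A @ d               # how fast each constraint is approached
    limiting = rate > eps      # only constraints we move towards can bind
    if not np.any(limiting):
        return np.inf          # direction is unbounded within the polyhedron
    return np.min(slack[limiting] / rate[limiting])

# Example: square 0 <= x1, x2 <= 1, start at its centre, move along (1, 2).
A = np.array([[1, 0], [0, 1], [-1, 0], [0, -1]], dtype=float)
b = np.array([1, 1, 0, 0], dtype=float)
t = step_to_boundary(A, b, [0.5, 0.5], [1.0, 2.0])
print(t)  # 0.25: the point (0.75, 1.0) lies on the boundary x2 <= 1
```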
Analele Stiintifice ale Universitatii Ovidius Constanta, Seria Matematica
In this paper we treat numerical computation methods for linear programming. Starting from an analysis of the efficiency and deficiency of the simplex procedure, we present new possibilities offered by interior-point methods, which arose from practical necessity, from the need for efficient means of solving large-scale problems. We implement Karmarkar's method in Java.
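Karmarkar's projective algorithm itself is fairly involved; as a simplified relative of it, the sketch below implements textbook primal affine scaling for min cᵀx subject to Ax = b, x ≥ 0. This is not the Java implementation the paper describes, and the toy data are made up.

```python
import numpy as np

def affine_scaling(A, b, c, x, alpha=0.9, iters=25):
    """Primal affine-scaling iterations for  min c @ x  s.t.  A @ x = b, x > 0.

    A textbook simplification related to Karmarkar's projective method,
    not the Java implementation described in the paper.  `x` must be a
    strictly positive feasible starting point.
    """
    assert np.allclose(A @ x, b), "starting point must satisfy A @ x = b"
    for _ in range(iters):
        D2 = np.diag(x * x)                          # scale by the current iterate
        w = np.linalg.solve(A @ D2 @ A.T, A @ D2 @ c)
        s = c - A.T @ w                              # estimated dual slacks
        dx = -D2 @ s                                 # descent direction (keeps A @ x = b)
        neg = dx < 0
        if not np.any(neg):
            break                                    # objective unbounded below
        t = np.min(-x[neg] / dx[neg])                # largest step keeping x > 0
        x = x + alpha * t * dx
    return x

# Toy problem: min x1 + 2*x2 on the simplex x1 + x2 + x3 = 1, x >= 0.
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0, 0.0])
print(affine_scaling(A, b, c, np.array([1/3, 1/3, 1/3])))  # approaches the vertex (0, 0, 1)
```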
Journal of Mathematics and System Science, 2015
In this paper we present a new method combining interior and exterior approaches to solve linear programming problems. Assuming that a feasible interior solution to the input system is known, the algorithm uses it, together with appropriate constraints of the system, to construct a sequence of so-called station cones whose vertices converge very quickly to the solution sought. The computational experiments show that the number of iterations of the new algorithm is significantly smaller than that of the second phase of the simplex method. Additionally, as the number of variables and constraints of the problem increases, the number of iterations of the new algorithm grows more slowly than that of the simplex method.
According to the Operations Research Society of America, "Operations research [OR] is concerned with scientifically deciding how to best design and operate man-machine systems, usually under conditions requiring the allocation of scarce resources."
SIGMAP bulletin, 1973
This is a preliminary report on a new method which is able to solve the standard LP problem, min Z = ⟨C, X⟩ subject to AX = b, X ≥ 0, where A is an M-by-N matrix, in a number of operations bounded by a polynomial in M and N. The fundamental idea of this algorithm is to select one column vector of A after another in such a way that we can be sure that they belong to the final basis. After M steps the final solution is obtained. Hence, we do not have to follow a path through a sequence of vertices as in the simplex method. V. Klee has shown that there are problems for which the height of the underlying polytope is fantastically large and which are therefore, from a practical viewpoint, unsolvable using the simplex algorithm in any of its forms. Zadeh also presented real problems of this class. All these inconveniences are avoided with the new algorithm.
iaeme
Since the late 1940s, linear programming models have been used for many different purposes. Airline companies apply these models to optimise their use of planes and staff. NASA has been using them for many years to optimize their use of limited resources. Oil companies use them to optimise their refinery operations. Small and medium-sized businesses use linear programming to solve a huge variety of problems, often involving resource allocation. In this paper, a typical product-mix problem in a manufacturing system producing two products (each product consists of two sub-assemblies) is solved for its optimal solution through the use of the latest versions of MATLAB having the command simlp, which is very much like linprog. As analysts, we try to find a good enough solution for the decision maker to make a final decision. Our attempt is to give the mathematical description of the product-mix optimization problem and bring the problem into a form ready to call MATLAB's simlp command. The objective of this paper is to find the best product mix that maximizes profit. The graph obtained using MATLAB commands shows the shaded area enclosed by the constraints, called the feasible region, which is the set of points satisfying all the constraints. To find the optimal solution, we look at the lines of equal profit to find the corner of the feasible region which yields the highest profit. This corner lies on the farthest line of equal profit that still touches the feasible region.
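The corner-point search the abstract describes can be illustrated without MATLAB; the sketch below enumerates the corners of a small hypothetical two-product feasible region and picks the one with the highest profit. All numbers are illustrative, not the paper's product-mix data.

```python
import itertools
import numpy as np

# Hypothetical two-product mix (numbers are illustrative, not from the paper):
# maximize profit 40*x1 + 35*x2
# subject to   x1 + 2*x2 <= 40    (hours on sub-assembly line A)
#            4*x1 + 3*x2 <= 120   (hours on sub-assembly line B)
#            x1, x2 >= 0
profit = np.array([40.0, 35.0])
A = np.array([[ 1.0,  2.0],
              [ 4.0,  3.0],
              [-1.0,  0.0],    # -x1 <= 0 encodes x1 >= 0
              [ 0.0, -1.0]])   # -x2 <= 0 encodes x2 >= 0
b = np.array([40.0, 120.0, 0.0, 0.0])

# Candidate corner points are intersections of pairs of constraint boundaries
# that also satisfy every constraint; the optimum lies at one of them.
best = None
for i, j in itertools.combinations(range(len(b)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-9:
        continue                        # parallel boundaries, no vertex
    x = np.linalg.solve(M, b[[i, j]])
    if np.all(A @ x <= b + 1e-9):       # feasible corner
        value = profit @ x
        if best is None or value > best[1]:
            best = (x, value)

print(best)  # corner (24, 8) with profit 1240 for these illustrative data
```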
2014
In this paper, a new approach is suggested for solving linear programming problems with the simplex method. The method sometimes requires fewer iterations than the simplex method, or at most an equal number, because it attempts to replace more than one basic variable simultaneously.
In this chapter, we present a systematic procedure for solving linear programs. This procedure, called the simplex method, proceeds by moving from one feasible solution to another, at each step improving the value of the objective function. Moreover, the method terminates after a finite number of such transitions.

Two characteristics of the simplex method have led to its widespread acceptance as a computational tool. First, the method is robust. It solves any linear program; it detects redundant constraints in the problem formulation; it identifies instances when the objective value is unbounded over the feasible region; and it solves problems with one or more optimal solutions. The method is also self-initiating: it uses itself either to generate an appropriate feasible solution, as required, to start the method, or to show that the problem has no feasible solution. Each of these features will be discussed in this chapter. Second, the simplex method provides much more than just optimal solutions. As by-products, it indicates how the optimal solution varies as a function of the problem data (cost coefficients, constraint coefficients, and right-hand-side data). This information is intimately related to a linear program called the dual to the given problem, and the simplex method automatically solves this dual problem along with the given problem. These characteristics of the method are of primary importance for applications, since data rarely are known with certainty and usually are approximated when formulating a problem. These features will be discussed in detail in the chapters to follow.

Before presenting a formal description of the algorithm, we consider some examples. Though elementary, these examples illustrate the essential algebraic and geometric features of the method and motivate the general procedure.

2.1 SIMPLEX METHOD—A PREVIEW

Optimal Solutions

Consider the following linear program:

Maximize z = 0x1 + 0x2 − 3x3 − x4 + 20, (Objective 1)

subject to:

x1 − 3x3 + 3x4 = 6, (1)
x2 − 8x3 + 4x4 = 4, (2)
xj ≥ 0 (j = 1, 2, 3, 4).

Note that, as stated, the problem has a very special form. It satisfies the following:

1. All decision variables are constrained to be nonnegative.
2. All constraints, except for the nonnegativity of decision variables, are stated as equalities.
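As a quick numerical check of this preview problem (not part of the original chapter), one can hand the same program to an off-the-shelf solver; here is a minimal sketch using SciPy's linprog, which minimizes, so the objective is negated and the constant 20 added back.

```python
from scipy.optimize import linprog

# The preview problem: maximize z = -3*x3 - x4 + 20
# subject to x1 - 3*x3 + 3*x4 = 6,  x2 - 8*x3 + 4*x4 = 4,  x >= 0.
# linprog minimizes, so minimize 3*x3 + x4 and add the constant 20 back.
c = [0.0, 0.0, 3.0, 1.0]
A_eq = [[1.0, 0.0, -3.0, 3.0],
        [0.0, 1.0, -8.0, 4.0]]
b_eq = [6.0, 4.0]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 4, method="highs")
print(res.x, 20 - res.fun)  # x = (6, 4, 0, 0) with optimal value z = 20
```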
1994
Abstract We present a simplex-type algorithm for linear programming that works with primal-feasible and dual-feasible points associated with bases that differ by only one column. The algorithm is almost unaffected by degeneracy, and a preliminary implementation compares favorably with the primal simplex method.
The Journal of the Australian Mathematical Society. Series B. Applied Mathematics, 1990
The design of an interior point method for linear programming is discussed, and the results of a simulation study are reported. Emphasis is put on guessing the optimal vertex at as early a stage as possible.
Journal of Physics: Conference Series, 2014
The simplex algorithm is a popular algorithm for solving linear programming problems. If the origin satisfies all constraints, the simplex algorithm can be started directly. Otherwise, artificial variables must be introduced to start it. If we can start the simplex algorithm without artificial variables, the simplex iterations require less time. In this paper, we present an artificial-free technique for the simplex algorithm that maps the problem into the objective plane and splits the constraints into three groups. In the objective plane, one of the variables with a nonzero objective coefficient is expressed in terms of another variable. The constraints can then be split into three groups: the positive-coefficient group, the negative-coefficient group and the zero-coefficient group. Along the objective direction, some constraints from the positive-coefficient group will form the optimal solution. If the positive-coefficient group is nonempty, the algorithm starts by relaxing the constraints from the negative-coefficient group and the zero-coefficient group. We guarantee that the feasible region obtained from the positive-coefficient group is nonempty. The transformed problem is solved using the simplex algorithm. The constraints from the negative-coefficient group and the zero-coefficient group are then added back to the solved problem, and the dual simplex method is used to determine the new optimal solution. An example shows the effectiveness of our algorithm.
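The grouping step described above is simple bookkeeping; the sketch below shows only that step, splitting inequality constraints by the sign of the coefficient of one chosen variable. Names are illustrative, and the relaxation and dual-simplex repair phases of the authors' method are not reproduced.

```python
import numpy as np

def split_by_coefficient(A, col, tol=1e-12):
    """Indices of constraints whose coefficient on variable `col` is
    positive, negative or (numerically) zero.

    Mirrors only the grouping step described in the abstract.
    """
    coeffs = np.asarray(A, dtype=float)[:, col]
    positive = np.flatnonzero(coeffs > tol)
    negative = np.flatnonzero(coeffs < -tol)
    zero = np.flatnonzero(np.abs(coeffs) <= tol)
    return positive, negative, zero

A = np.array([[ 2.0, 1.0],
              [-1.0, 3.0],
              [ 0.0, 4.0]])
print(split_by_coefficient(A, col=0))  # (array([0]), array([1]), array([2]))
```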
2009
In this paper, we describe a new method for finding search directions for interior point methods (IPMs) in linear optimization (LO). The theoretical complexity of the new algorithms is calculated, and we prove that the iteration bound is O(log(n/ε)) in this case too.

In this paper we discuss interior point methods (IPMs) for solving linear optimization (LO) problems. Linear optimization is an area of mathematical programming which deals with the minimization or maximization of a linear function subject to linear constraints. These constraints can be expressed by equalities or inequalities. Dantzig proposed the well-known simplex method for solving LO problems in 1947. The simplex method has been continuously improved over the past fifty years. For different variants of the simplex method, examples have been constructed illustrating that in the worst case the number of iterations required by the algorithm can be exponential. The first polynomial algorithm for solving LO problems is the...
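For context, primal-dual interior point search directions are usually derived as Newton directions for the perturbed optimality conditions of the primal-dual pair; in standard notation (this is background, not the paper's specific new direction) these conditions are

```latex
\begin{aligned}
A x &= b, & x &\ge 0, \\
A^{\mathsf{T}} y + s &= c, & s &\ge 0, \\
x_i s_i &= \mu, & i &= 1, \dots, n,
\end{aligned}
```

and the algorithm follows approximate solutions of this system while driving μ toward 0; the paper proposes a new way of deriving such directions.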
International Journal of Decision Support System Technology, 2014
Linear programming algorithms have been widely used in Decision Support Systems (DSSs), which incorporate such algorithms to solve the problems at hand. Yet, the special structure of each linear problem may favour different linear programming algorithms or different techniques within these algorithms. This paper proposes a web-based DSS that assists decision makers in the solution of linear programming problems with a variety of linear programming algorithms and techniques. Two linear programming algorithms have been included in the DSS: (i) the revised simplex algorithm and (ii) the exterior primal simplex algorithm. Furthermore, ten scaling techniques, five basis update methods and eight pivoting rules have been incorporated in the DSS. All linear programming algorithms and methods have been implemented using MATLAB and converted to Java classes using MATLAB Builder JA, while the web interface of the DSS has been designed using Java Server Pages.
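As one concrete instance of the pivoting rules such a DSS lets users select, Dantzig's classical rule chooses the entering variable with the most negative reduced cost; a minimal sketch (illustrative only, not the DSS's MATLAB/Java code) is:

```python
import numpy as np

def dantzig_entering_variable(reduced_costs, tol=1e-9):
    """Index of the entering variable under Dantzig's rule, or None at optimality.

    For a minimization problem the rule picks the most negative reduced cost;
    real implementations pair this with a ratio test to pick the leaving variable.
    """
    reduced_costs = np.asarray(reduced_costs, dtype=float)
    j = int(np.argmin(reduced_costs))
    return j if reduced_costs[j] < -tol else None

print(dantzig_entering_variable([0.5, -2.0, -0.1, 0.0]))  # -> 1
print(dantzig_entering_variable([0.2, 0.0, 1.3]))         # -> None (current basis is optimal)
```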