2006
Accurate modelling of real-world problems often requires nonconvex terms to be introduced in the model, either in the objective function or in the constraints. Nonconvex programming is one of the hardest fields of optimization, presenting many challenges in both practical and theoretical aspects. The presence of multiple local minima calls for the application of global optimization techniques. This paper is a mini-course about global optimization techniques in nonconvex programming; it deals with some theoretical aspects of nonlinear programming as well as with some of the current state-of-the-art algorithms in global optimization. The syllabus is as follows. Some examples of Nonlinear Programming Problems (NLPs). General description of two-phase algorithms. Local optimization of NLPs: derivation of KKT conditions. Short notes about stochastic global multistart algorithms with a concrete example (SobolOpt). In-depth study of a deterministic spatial Branch-and-Bound algorithm, and con...
Journal of Global Optimization, 1995
A branch and bound global optimization method, αBB, for general continuous optimization problems involving nonconvexities in the objective function and/or constraints is presented. The nonconvexities are categorized as being either of special structure or generic. A convex relaxation of the original nonconvex problem is obtained by (i) replacing all nonconvex terms of special structure (i.e. bilinear, fractional, signomial) with customized tight convex lower bounding functions and (ii) by utilizing the α parameter as defined in [17] to underestimate nonconvex terms of generic structure. The proposed branch and bound type algorithm attains finite ε-convergence to the global minimum through the successive subdivision of the original region and the subsequent solution of a series of nonlinear convex minimization problems. The global optimization method, αBB, is implemented in C and tested on a variety of example problems.
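The α-underestimation idea in step (ii) is easy to sketch: on a box [x^L, x^U], the function L(x) = f(x) + alpha * sum_i (x_i^L - x_i)(x_i^U - x_i) lies below f (the added quadratic is non-positive on the box) and is convex once alpha is large enough to dominate the negative curvature of f. A minimal illustrative sketch, not the authors' implementation; the example function and the alpha value are arbitrary assumptions:

```python
import numpy as np

def alpha_underestimator(f, alpha, lower, upper):
    """Return the alpha-BB style convex underestimator of f on [lower, upper].

    L(x) = f(x) + alpha * sum_i (lower_i - x_i) * (upper_i - x_i)
    The added term is non-positive inside the box, so L(x) <= f(x) there,
    with equality at the box corners; for alpha large enough (at least
    half the most negative Hessian eigenvalue of f in magnitude), L is convex.
    """
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)

    def L(x):
        x = np.asarray(x, dtype=float)
        return f(x) + alpha * np.sum((lower - x) * (upper - x))

    return L

# Example: a nonconvex function on [-1, 1]^2 (illustrative choice)
f = lambda x: np.sin(3 * x[0]) * np.cos(3 * x[1])
L = alpha_underestimator(f, alpha=4.5, lower=[-1, -1], upper=[1, 1])

x = np.array([0.3, -0.2])
assert L(x) <= f(x)  # underestimation holds inside the box
```

Minimizing L over the box gives the lower bound used at each branch-and-bound node; subdividing the box shrinks the (lower − x)(upper − x) gap, which is what drives the ε-convergence.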
This paper presents an overview of the research progress in deterministic global optimization during the last decade (1998–2008). It covers the areas of twice continuously differentiable nonlinear optimization, mixed-integer nonlinear optimization, optimization with differential-algebraic models, semi-infinite programming, optimization with grey box/nonfactorable models, and bilevel nonlinear optimization.
Journal of Global Optimization, 2000
During the last decade, the field of deterministic global optimization has witnessed explosive growth in theoretical advances, algorithmic developments, and applications across all disciplines of science and engineering. The book by C. A. Floudas, a leading authority in the field, provides a unified and insightful treatment of deterministic global optimization. It combines novel theoretical and algorithmic advances with the now ubiquitous domain of applications ranging from process design to process synthesis to ...
Optimization deals with problems of minimizing or maximizing an objective function of several variables, usually subject to equality and/or inequality constraints. This paper is concerned with the construction of an effective algorithm for finding a global optimal solution (or global optimal solutions) of the constrained nonlinear programming problem. The objective function is assumed to satisfy a Lipschitz condition, and its Lipschitz constant is utilized to bound the search space effectively and to terminate the search iteration. The Lipschitz algorithm can be regarded as a Branch-and-Bound method, which is a kind of divide-and-conquer method.
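The Lipschitz bound described above is concrete: on an interval of width w with midpoint m, the Lipschitz condition gives f(x) >= f(m) - L*w/2, so any interval whose bound cannot beat the incumbent can be discarded. A minimal one-dimensional sketch of this idea (illustrative only, not the paper's algorithm; the test function and tolerance are arbitrary assumptions):

```python
import math

def lipschitz_bb(f, a, b, L, tol=1e-4):
    """Branch-and-bound minimization of an L-Lipschitz f on [a, b].

    On a subinterval [lo, hi] with midpoint mid, the Lipschitz condition
    gives f(x) >= f(mid) - L * (hi - lo) / 2 for all x in [lo, hi].
    Intervals whose bound cannot improve the incumbent by more than tol
    are pruned; the returned value is within tol of the global minimum.
    """
    best_x, best_f = a, f(a)
    intervals = [(a, b)]
    while intervals:
        lo, hi = intervals.pop()
        mid = 0.5 * (lo + hi)
        fm = f(mid)
        if fm < best_f:
            best_x, best_f = mid, fm
        bound = fm - L * (hi - lo) / 2   # valid lower bound over [lo, hi]
        if bound < best_f - tol:         # interval may still hold a better point
            intervals.append((lo, mid))
            intervals.append((mid, hi))
    return best_x, best_f

# Example: a multiextremal function with Lipschitz constant at most 3.5
x, fx = lipschitz_bb(lambda t: math.sin(3 * t) + 0.5 * t, -3.0, 3.0, L=3.5)
```

Any overestimate of the true Lipschitz constant keeps the bound valid; a tighter constant simply prunes more aggressively.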
2000
Many optimization problems in engineering and science require solutions that are globally optimal. These optimization problems are characterized by the nonconvexity of the feasible domain or the objective function and may involve continuous and/or discrete variables. In this paper we highlight some recent results and discuss current research trends on deterministic and stochastic global optimization and global continuous approaches to discrete optimization.
Special Issue “Some Novel Algorithms for Global Optimization and Relevant Subjects”, Applied and Computational Mathematics (ACM), 2017
Global optimization is necessary when we want to achieve the best solution, or when we require a new solution that is better than the old one. However, global optimization is a hard problem. The gradient descent method is a well-known technique for finding a local optimizer, whereas the approximation solution approach aims to simplify how the global optimization problem is solved. In order to find the global optimizer in a practical way, I propose a so-called descending region (DR) algorithm, which is a combination of the gradient descent method and the approximation solution approach. The idea of the DR algorithm is that, given a known local minimizer, a better minimizer is searched for only in a so-called descending region below that local minimizer. A descending region begins at a so-called descending point, which is the main subject of the DR algorithm. The descending point, in turn, is the solution of an intersection equation (A). Finally, I prove and provide a simpler linear equation system (B) derived from (A). Thus (B) is the most important result of this research, because (A) is solved by solving (B) sufficiently many times. In other words, the DR algorithm is refined many times so as to produce such a (B) for searching for the global optimizer. I propose a so-called simulated Newton–Raphson (SNR) algorithm, a simulation of the Newton–Raphson method, to solve (B). The starting point is very important for the SNR algorithm to converge. Therefore, I also propose a so-called RTP algorithm, a refined and probabilistic process, to partition the solution space and generate random testing points, which aims to estimate the starting point of the SNR algorithm. In general, I combine the three algorithms DR, SNR, and RTP to solve the hard problem of global optimization.
Although the approach is a divide-and-conquer methodology, in which global optimization is split into local optimization, equation solving, and partitioning, the solution is a synthesis in which DR is the backbone connecting itself with SNR and RTP.
Proceedings of the VIth International Workshop 'Critical Infrastructures: Contingency Management, Intelligent, Agent-Based, Cloud Computing and Cyber Security' (IWCI 2019), 2019
The paper presents an approach to the numerical study of the problem of finding a global extremum of multiextremal functions, based on the use of a parabolas algorithm. As local methods of one-dimensional search, parabolic interpolation and the golden section method are used. Numerical testing of modifications of the implemented approach on known non-convex functions has been carried out. The proposed technique has been applied to a more complex optimization problem for a controlled dynamical power system. The numerical results obtained demonstrate the efficiency of the proposed computational technology.
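Golden-section search, one of the two local one-dimensional methods mentioned, shrinks a bracketing interval by the constant factor 1/φ ≈ 0.618 per step while reusing one interior evaluation, so only one new function evaluation is needed per iteration. A minimal sketch (illustrative, not the paper's implementation):

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Locate a minimum of a unimodal f on [a, b] by golden-section search.

    Two interior points c < d divide the bracket in the golden ratio; at
    each step the subinterval that cannot contain the minimum is dropped,
    and one of the two interior evaluations is reused.
    """
    inv_phi = (math.sqrt(5) - 1) / 2      # 1/phi ~ 0.6180
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:                        # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                              # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return 0.5 * (a + b)

# Example: unimodal quadratic with minimum at t = 2
x = golden_section(lambda t: (t - 2.0) ** 2 + 1.0, 0.0, 5.0)
```

On a multiextremal function the method only converges to a minimum inside the initial bracket, which is why such local searches are embedded in a global strategy like the parabolas algorithm.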
2001
Table captions from the dissertation's list of tables. In all of the following, CSA is based on (Cauchy 1, S-uniform, M) with α = 0.8, and all times are in seconds on a Pentium-III 500-MHz computer running Solaris 7.
Performance comparison of Epperly's method [64] (an interval method), DONLP2 (SQP), and CSA in solving Floudas and Pardalos' continuous constrained NLPs [68]. '−' stands for no feasible solution found by Epperly's method. Both SQP and CSA use the same sequence of starting points.
5.7 Comparison results of LANCELOT, DONLP2, and CSA in solving discrete constrained NLPs derived from selected continuous problems from CUTE, using the starting point specified in each problem. '−' means that no feasible solution could be found either by the public version (01/05/2000) or the commercial version of LANCELOT (by submitting problems through http://www-neos.mcs.anl.gov/neos/solvers/NCO:LANCELOT/), or by DONLP2. Numbers in bold represent the best solutions among the three methods when they differ.
5.8 Comparison results of LANCELOT, DONLP2, and CSA in solving mixed-integer constrained NLPs derived from selected continuous problems from CUTE, using the starting point specified in each problem; same legend as table 5.7.
5.9 Comparison results of LANCELOT, DONLP2, and CSA in solving selected continuous problems from CUTE using the starting point specified in each problem; CSA does not use derivative information in any run. '*' means that solutions were obtained by the commercial version of LANCELOT (no CPU time available) but could not be obtained by the public version; otherwise same legend as table 5.7.
5.11 Experimental results of applying LANCELOT on selected CUTE problems that cannot be solved by CSA at this time. '−' means that no feasible solution could be found by either the public version (01/05/2000) or the commercial version of LANCELOT.
Quarterly Journal of the Belgian, French and Italian Operations Research Societies, 2004
We survey the main results obtained by the author in his PhD dissertation, supervised by Prof. Costas Pantelides and defended at Imperial College London. The thesis is written in English and is available from http://or.dhs.org/people/Liberti/phdtesis.ps.gz. The most widely employed deterministic method for the global solution of nonconvex NLPs and MINLPs is the spatial Branch-and-Bound (sBB) algorithm, and one of its most crucial steps is the computation of the lower bound at each sBB node. We investigate different reformulations of the problem so that the resulting convex relaxation is tight. In particular, we suggest a novel technique for reformulating a wide class of bilinear problems so that some of the bilinear terms are replaced by linear constraints. Moreover, an in-depth analysis of a convex envelope for piecewise-convex and concave terms is performed. All the proposed algorithms were implemented in ooOPS, an object-oriented callable library for constructing MINLPs in structured form and solving them with a variety of local and global solvers.
2003
In this work, the problem of overcoming local minima in the solution of nonlinear optimisation problems is addressed. As a first step, the existing nonlinear local and global optimisation methods are reviewed so as to identify their advantages and disadvantages. Then, the major capabilities of a number of successful methods, such as genetic algorithms, deterministic global optimisation methods, and simulated annealing, are combined to develop an alternative global optimisation approach based on a stochastic-probabilistic heuristic. The capabilities of this new approach, in terms of robustness and efficiency, are validated through the solution of a number of nonlinear optimisation problems. A well-known evolutionary technique (Differential Evolution) is also considered for the solution of these case studies, offering better insight into the possibilities of the method proposed here.
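Differential Evolution, the evolutionary technique used for comparison, is simple to sketch: each population member competes against a trial vector built by adding the scaled difference of two other members to a third, with binomial crossover. A minimal DE/rand/1/bin sketch (illustrative, not the authors' setup; the population size, F, CR, and the test function are arbitrary assumptions):

```python
import math
import random

def differential_evolution(f, bounds, pop_size=30, F=0.7, CR=0.9, gens=200, seed=0):
    """Minimal DE/rand/1/bin sketch for box-constrained minimization.

    For each target vector, a mutant a + F*(b - c) is built from three
    distinct other members, crossed over gene-by-gene with the target
    (at least one gene is always taken from the mutant), and kept only
    if it does not worsen the objective (greedy selection).
    """
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([x for j, x in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)      # forces at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = a[j] + F * (b[j] - c[j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clip mutant to the box
                else:
                    v = pop[i][j]
                trial.append(v)
            f_trial = f(trial)
            if f_trial <= cost[i]:           # greedy selection
                pop[i], cost[i] = trial, f_trial
    k = min(range(pop_size), key=cost.__getitem__)
    return pop[k], cost[k]

# Example: a Rastrigin-type function with many local minima, global minimum 0 at the origin
f = lambda x: sum(xi * xi - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)
x, fx = differential_evolution(f, bounds=[(-5.12, 5.12)] * 2)
```

The greedy selection step is what distinguishes DE from a plain genetic algorithm: a trial vector replaces its target only on improvement, so the population cost never increases.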
Journal of Mathematical Sciences, 2022
International Transactions in Operational Research, 2005
In this paper, we compare two different approaches to nonconvex global optimization. The first is a deterministic spatial Branch-and-Bound algorithm, whereas the second is a Quasi Monte Carlo (QMC) variant of a stochastic multi-level single-linkage (MLSL) algorithm. Both algorithms apply to problems in a very general form and do not depend on problem structure. The test suite we chose is fairly extensive in scope, in that it includes constrained and unconstrained problems, and continuous and mixed-integer problems. The conclusion of the tests is that the QMC variant of the MLSL algorithm is generally faster, although in some instances the Branch-and-Bound algorithm outperforms it.
2013
The function modification method is one of the deterministic approaches in global optimization. It encompasses all methods that use a suitable modification of the objective function to move from a determined local optimal solution to another, better local optimal solution. In the literature, several classes of such methods have been introduced, including the tunneling function method, the bridging method, and the filled function method. In this paper, a new nonparametric function modification method and its algorithm are introduced. The numerical results show that the proposed algorithm can solve the unconstrained global optimization problem while saving the operational cost of parameter selection.
Applied Mathematics and Computation, 2004
Many practical problems lead to large nonconvex nonlinear programming problems with many equality constraints. Global optimization algorithms for these problems have received much attention over the last few years. Generally, stochastic algorithms are suitable for these problems, but they are not efficient when there are too many equality constraints. Therefore, a global optimization algorithm for solving these problems is proposed in this paper. The new algorithm, based on a feasible set strategy, combines a stochastic algorithm with a deterministic local algorithm. The convergence of the algorithm is analyzed. The algorithm is applied to a practical problem, and the numerical results illustrate its accuracy and efficiency.
Top, 2012
We study a simple, yet unconventional approach to the global optimization of unconstrained nonlinear least-squares problems. Non-convexity of the sum of least-squares objective in parameter estimation problems may often lead to the presence of multiple local minima. Here, we focus on the spatial branch-and-bound algorithm for global optimization and experiment with one of its implementations, BARON (Sahinidis in J. Glob. Optim. 8(2):201–205, 1996), to solve parameter estimation problems. Through the explicit use of first-order optimality conditions, we are able to significantly expedite convergence to global optimality by strengthening the relaxation of the lower-bounding problem that forms a crucial part of the spatial branch-and-bound technique. We analyze the results obtained from 69 test cases taken from the statistics literature and discuss the successes and limitations of the proposed idea. In addition, we discuss software implementation for the automation of our strategy.
Computational Optimization and Applications, 2006
This paper proposes a new algorithm for solving constrained global optimization problems where both the objective function and the constraints are one-dimensional, non-differentiable, multiextremal Lipschitz functions. Multiextremal constraints can lead to complex feasible regions consisting of isolated points and intervals of positive length. The case is considered where the order in which the constraints are evaluated is fixed by the nature of the problem, and constraint i is defined only over the set where constraint i − 1 is satisfied. The objective function is defined only over the set where all the constraints are satisfied. In contrast to traditional approaches, the new algorithm does not use any additional parameters or variables. Not all the constraints are evaluated during every iteration of the algorithm, providing a significant acceleration of the search. The new algorithm either finds lower and upper bounds for the global optimum or establishes that the problem is infeasible. Convergence properties and numerical experiments showing good performance of the new method in comparison with the penalty approach are given.
Journal of Numerical Analysis and Approximation Theory
In this paper, we propose an algorithm based on the branch-and-bound method that underestimates the objective function, together with a reductive transformation that converts all multivariable functions into univariable functions. We also propose several quadratic lower-bounding functions and demonstrate that they are preferable to other well-known ones in the literature. Our experimental results show that the approach is effective on a range of different nonconvex functions.
Annals of Operations Research, 1984
Several different approaches have been suggested for the numerical solution of the global optimization problem: space covering methods, trajectory methods, random sampling, random search and methods based on a stochastic model of the objective function are considered in this paper and their relative computational effectiveness is discussed. A closer analysis is performed of random sampling methods along with cluster analysis of sampled data and of Bayesian nonparametric stopping rules.
Journal of Global Optimization, 1999
There is a lack of a representative set of test problems for comparing global optimization methods. To remedy this, a classification of essentially unconstrained global optimization problems into unimodal, easy, moderately difficult, and difficult problems is proposed. The problem features underlying this classification are the chance of missing the basin of the global minimum, the dispersion of the minima, and the number of minima. The classification of some frequently used test problems is given, and it turns out that most of them are easy and some even unimodal. The working global optimization solution techniques treated are global, local, and adaptive search, and their use for tackling different classes of problems is discussed. The problem of fair comparison of methods is then addressed. Finally, possible components of a general global optimization tool based on the problem classes and working solution techniques are presented.
Many deterministic minimization algorithms can be seen as discrete dynamical systems coming from the discretization of first or second order Cauchy problems. On the other hand, global optimization can be considered as an over-determined Boundary Value Problem (BVP) for these problems. For instance, the steepest descent method leads to the following BVP:
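The correspondence in the first sentence can be made concrete: applying the explicit Euler scheme with step h to the gradient flow x'(t) = -∇f(x(t)) recovers exactly the steepest descent iteration x_{k+1} = x_k - h ∇f(x_k). A minimal sketch of this discretization (illustrative, not the paper's method; the quadratic example is an arbitrary assumption):

```python
import numpy as np

def gradient_flow_euler(grad_f, x0, h=0.1, steps=500):
    """Explicit Euler discretization of the gradient flow x'(t) = -grad f(x(t)).

    Each Euler step x <- x + h * (-grad_f(x)) is exactly one steepest
    descent iteration with step size h, so the discrete trajectory
    follows the continuous descent trajectory for small h.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - h * grad_f(x)
    return x

# Quadratic example: f(x) = ||x - c||^2 / 2 has grad f(x) = x - c,
# so the flow contracts toward c and the iterates converge to it.
c = np.array([1.0, -2.0])
x = gradient_flow_euler(lambda x: x - c, x0=[5.0, 5.0], h=0.1, steps=500)
```

A Cauchy problem fixes only the initial condition x(0) = x0; global optimization adds the terminal requirement that the trajectory end at a global minimizer, which is what makes the resulting boundary value problem over-determined.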