2004, Journal of Global Optimization
In this paper, a hybrid descent method, consisting of a simulated annealing algorithm and a gradient-based method, is proposed. The simulated annealing algorithm is used to locate descent points for previously converged local minima. The combined method has the descent property, and its convergence is monotonic. To demonstrate the effectiveness of the proposed hybrid descent method, several multi-dimensional non-convex optimization problems are solved. Numerical examples show that the global minimum can be reached via this hybrid descent method.
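The loop described above — gradient descent to a local minimum, then a simulated-annealing phase that searches for a descent point to restart from — can be sketched as follows. This is a minimal one-dimensional illustration; the function, parameter names, and constants are illustrative assumptions, not taken from the paper:

```python
import math
import random

def hybrid_descent(f, grad, x0, rounds=20, lr=0.01, steps=500,
                   temp0=1.0, cooling=0.95, seed=0):
    """Sketch of a hybrid descent loop: gradient descent converges to a
    local minimum, then a short simulated-annealing phase looks for a
    point with a strictly lower function value (a descent point) to
    restart from.  Restarting only on strict improvement preserves the
    monotonic-descent property mentioned in the abstract."""
    rng = random.Random(seed)
    x, best = x0, x0
    for _ in range(rounds):
        # Gradient-descent phase: run to (approximate) local convergence.
        for _ in range(steps):
            x = x - lr * grad(x)
        if f(x) < f(best):
            best = x
        # SA phase: Metropolis walk that may climb barriers; we restart
        # descent only if it actually found a better point.
        temp, y = temp0, best
        for _ in range(steps):
            cand = y + rng.gauss(0.0, temp)
            if f(cand) < f(y) or rng.random() < math.exp(-(f(cand) - f(y)) / temp):
                y = cand
            temp *= cooling
        x = y if f(y) < f(best) else best
    return best
```

On a multimodal test such as f(x) = x⁴ − 3x² + x, started at x = 1.0, descent alone stalls in the shallow right-hand basin; the SA phase gives the method a chance to cross into the deeper basin near x ≈ −1.3.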
Soft Computing (SOCO), 2020
A new hybrid gradient simulated annealing algorithm is introduced. The algorithm is designed to find the global minimizer of a nonlinear function of many variables; the function is assumed to be smooth. The algorithm uses the gradient method together with a line search to ensure convergence from a remote starting point, and is hybridized with a simulated annealing algorithm to ensure convergence to the global minimizer. The performance of the algorithm is demonstrated through extensive numerical experiments on some well-known test problems. Comparisons with other metaheuristic methods validate the effectiveness of our approach and show that the suggested algorithm is promising and merits implementation in practice.
Applied Mathematics and Computation, 2011
In this paper we present a new hybrid method, called the SASP method. The purpose of this method is the hybridization of simulated annealing (SA) with the descent method, where we estimate the gradient using simultaneous perturbation. First, the new hybrid method finds a local minimum using the descent method; then SA is executed to escape from the currently discovered local minimum to a better one, from which the descent method restarts a new local search, and so on until convergence. The new hybrid method can be widely applied to a class of global optimization problems for continuous functions with constraints. Experiments on 30 benchmark functions, including high-dimensional functions, show that the new method is able to find near-optimal solutions efficiently. In addition, its performance as a viable optimization method is demonstrated by comparing it with other existing algorithms. Numerical results confirm the robustness and efficiency of the presented method.
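The gradient estimate the SASP method relies on — simultaneous perturbation — can be sketched in a few lines. All coordinates are perturbed at once by a random ±1 vector, so only two function evaluations are needed regardless of dimension; the estimate is unbiased in expectation rather than exact per sample. Names and defaults below are illustrative assumptions, not the paper's notation:

```python
import random

def sp_gradient(f, x, c=1e-3, rng=None):
    """Simultaneous-perturbation gradient estimate (SPSA-style):
    perturb every coordinate simultaneously by +/- c and divide the
    two-sided difference by each perturbation component.  Exact only
    in expectation over the random +/-1 (Bernoulli) directions."""
    rng = rng or random.Random(0)
    delta = [rng.choice((-1.0, 1.0)) for _ in x]
    xp = [xi + c * di for xi, di in zip(x, delta)]
    xm = [xi - c * di for xi, di in zip(x, delta)]
    scale = (f(xp) - f(xm)) / (2.0 * c)
    return [scale / di for di in delta]  # di is +/-1, so this is exact division
```

Averaged over many draws, the estimate converges to the true gradient, which is what makes it usable inside a descent loop at two evaluations per step.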
Optimization Methods and Software, 2002
In this paper, we give a new approach that hybridizes direct search methods with the simulated annealing metaheuristic for finding a global minimum of a nonlinear function with continuous variables. First, we suggest a Simple Direct Search (SDS) method, which draws on ideas from other well-known direct search methods. Since our goal is to find global minima and the SDS method is still a local search method, we hybridize it with standard simulated annealing to design a new method, called the Simplex Simulated Annealing (SSA) method, which is expected to have some ability to look for a global minimum. To obtain faster convergence, we first accelerate the cooling schedule in SSA, and in the final stage we apply Kelley's modification of the Nelder-Mead method to the best solutions found by the accelerated SSA method to improve the final results. We refer to this last method as the Direct Search Simulated Annealing (DSSA) method. The performance of SSA and DSSA is reported through extensive numerical experiments on some well-known functions. Comparing their performance with that of other metaheuristics shows that SSA and DSSA are promising in practice; DSSA in particular is shown to be very efficient and robust.
International Journal of Manufacturing, Materials, and Mechanical Engineering, 2015
The present paper focuses on improving the efficiency of structural optimization. In typical structural optimization problems there may be many locally minimal configurations; for that reason, the application of a global method, which can escape from locally minimal points, remains essential. In this paper, a new hybrid simulated annealing algorithm for large-scale global optimization problems with constraints is proposed. The authors have developed a stochastic algorithm called SAPSPSA that uses the Simulated Annealing (SA) algorithm. In addition, the Simultaneous Perturbation Stochastic Approximation (SPSA) method is used to refine the solution. Commonly, structural analysis problems are constrained. Because the SPSA method involves penalizing constraints, a penalty method is used to design a new method, called the Penalty SPSA (PSPSA) method. The combination of both methods (the Simulated Annealing algorithm and the Penalty Simultaneous Perturbation Stochastic Approximation algo...
1993
A study of global optimization schemes is presented. Simulated annealing, a general and proven technique for minimizing functions with many coexisting states, is used as a foundation for the development of a new, more automatic approach, called simulated tempering. This novel method upholds the eminent attribute of simulated annealing: the probabilistic guarantee of convergence upon a global minimum. It is unique, however, in that system equilibrium is never disturbed. Although simulated tempering is in its infancy, the preliminary results suggest it is promising indeed. The two-dimensional Ising Spin Glass Model, an NP-complete optimization problem, is used as the test case. Its theoretical formulation is briefly addressed.
Optimization Methods and Software, 2004
In this paper, we present a new hybrid simulated annealing method for minimizing multimodal functions, called the simulated annealing heuristic pattern search (SAHPS) method. Two subsidiary methods are proposed to achieve the final form of the global search method SAHPS. First, we introduce the approximate descent direction (ADD) method, which is a derivative-free procedure with a high ability to produce a descent direction. Then, the ADD method is combined with a pattern search method with direction pruning to construct the heuristic pattern search (HPS) method. This last method is hybridized with simulated annealing to obtain the SAHPS method. Experimental results on well-known test functions demonstrate the efficiency of the proposed SAHPS method.
IEEE Transactions on Systems, Man, and Cybernetics, 1991
A numerical method for finding the global minimum of nonconvex functions is presented. The method is based on the principles of simulated annealing, but handles continuously valued variables in a natural way. The method is completely general, and optimizes functions of up to 30 variables. Several examples are presented. A general-purpose program, INTEROPT, is described, which finds the minimum of arbitrary functions, with user-friendly, quasi-natural-language input.
Journal of Mathematics and Computer Science
Many real-world problems in systems analysis lead to continuous-domain optimization. The existence of sophisticated, many-variable problems in this field creates a need for efficient optimization methods. One optimization algorithm for multi-dimensional functions is simulated annealing (SA). In this paper, a modified simulated annealing called Dynamic Simulated Annealing (DSA) is proposed, which dynamically switches between two types of generating function along the traversed path of a continuous Markov chain. Our experiments indicate that this approach can improve convergence and stability and avoid deceptive areas of benchmark functions better than SA, without any mentionable extra computational cost.
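The idea of switching between two generating functions can be sketched with a Gaussian proposal for fine local moves and a heavy-tailed Cauchy proposal for occasional long jumps out of deceptive regions. The switching rule below (toggle to Cauchy after a run of rejections) is an illustrative stand-in for the paper's dynamic criterion, and all names and constants are assumptions:

```python
import math
import random

def propose(x, temp, use_cauchy, rng):
    """Two candidate generators: a Gaussian for fine local search and a
    heavy-tailed Cauchy (inverse-CDF draw) for long escape jumps."""
    if use_cauchy:
        return x + temp * math.tan(math.pi * (rng.random() - 0.5))
    return x + rng.gauss(0.0, temp)

def dynamic_sa(f, x0, steps=3000, temp0=2.0, cooling=0.999, patience=25, seed=0):
    """SA that switches generating functions on the fly: Gaussian moves
    by default, Cauchy jumps once `patience` proposals in a row are
    rejected (a simple proxy for being stuck in a deceptive area)."""
    rng = random.Random(seed)
    x, best, temp = x0, x0, temp0
    rejected, use_cauchy = 0, False
    for _ in range(steps):
        cand = propose(x, temp, use_cauchy, rng)
        delta = f(cand) - f(x)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x, rejected = cand, 0
            use_cauchy = False          # back to fine Gaussian moves
        else:
            rejected += 1
            if rejected >= patience:    # stuck: switch to long Cauchy jumps
                use_cauchy, rejected = True, 0
        if f(x) < f(best):
            best = x
        temp *= cooling
    return best
```

The Cauchy generator's fat tails are what let the chain leave a deceptive basin late in the schedule, when Gaussian steps have become too short to cross barriers.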
1987
A new global optimization algorithm for functions of continuous variables is presented, derived from the "Simulated Annealing" algorithm recently introduced in combinatorial optimization. The algorithm is essentially an iterative random search procedure with adaptive moves along the coordinate directions. It permits uphill moves under the control of a probabilistic criterion, thus tending to avoid the first local minima encountered.
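An iterative random search with adaptive moves along the coordinate directions can be sketched as below: each sweep perturbs one coordinate at a time, uphill moves are accepted with the Metropolis probability exp(−Δ/T), and each coordinate's step size is widened after acceptances and narrowed after rejections. The adaptation constants and parameter names are illustrative assumptions, not the published algorithm's values:

```python
import math
import random

def anneal_coordinates(f, x0, step0=1.0, temp0=1.0, cooling=0.9,
                       sweeps=200, seed=0):
    """SA over continuous variables with per-coordinate adaptive steps:
    a rough sketch of coordinate-wise moves plus the probabilistic
    uphill-acceptance criterion described in the abstract."""
    rng = random.Random(seed)
    n = len(x0)
    x, fx = list(x0), f(x0)
    best, fbest = list(x0), f(x0)
    steps = [step0] * n
    temp = temp0
    for _ in range(sweeps):
        for i in range(n):
            cand = list(x)
            cand[i] += steps[i] * rng.uniform(-1.0, 1.0)
            fc = f(cand)
            delta = fc - fx
            if delta < 0 or rng.random() < math.exp(-delta / temp):
                x, fx = cand, fc
                steps[i] *= 1.05            # accepted: explore a bit wider
                if fc < fbest:
                    best, fbest = cand, fc
            else:
                steps[i] *= 0.95            # rejected: tighten the move
        temp *= cooling
    return best, fbest
```

Nudging each step size toward a balance of acceptances and rejections keeps the move scale matched to the local landscape as the temperature falls.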
A memory-based simulated annealing algorithm is proposed which fundamentally differs from the previously developed simulated annealing algorithms for continuous variables by the fact that a set of points rather than a single working point is used. The implementation of the new method does not need any properties of the function being optimized. The method is well tested on a range of problems classified as easy, moderately difficult and difficult. The new algorithm is compared with other simulated annealing methods on both test problems and practical problems. Results showing an improved performance especially for difficult problems are given.
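The set-of-points idea can be sketched as follows: a small population of working points is maintained instead of a single one, each point evolves by the Metropolis rule, and the best point ever visited is remembered. This is a minimal illustration under assumed names and parameters, not the paper's algorithm:

```python
import math
import random

def population_sa(f, pop, temp0=1.0, cooling=0.95, sweeps=100, seed=0):
    """Sketch of a memory-based SA: every point in `pop` performs its own
    Metropolis walk under a shared cooling schedule, and the best point
    seen by any walker is kept as the answer.  Uses only function
    values, so no properties of f (gradients etc.) are needed."""
    rng = random.Random(seed)
    pts = [list(p) for p in pop]
    best = list(min(pts, key=f))        # copy: pts mutate in place below
    temp = temp0
    for _ in range(sweeps):
        for p in pts:
            cand = [xi + rng.gauss(0.0, temp) for xi in p]
            delta = f(cand) - f(p)
            if delta < 0 or rng.random() < math.exp(-delta / temp):
                p[:] = cand
                if f(p) < f(best):
                    best = list(p)
        temp *= cooling
    return best
```

Running several walkers from scattered starting points is what gives the method its robustness on difficult landscapes: one walker stuck in a poor basin does not doom the search.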