Proceedings of the Third Metaheuristics International …
Mathematical programming methods have long been used to solve optimization problems. However, in the presence of local optima these methods usually fail, due to the nature of their search process. Genetic algorithms, on the other hand, adopt a probabilistic treatment of the variables that qualifies them for the solution of global optimization problems. This work develops a hybrid genetic algorithm adapted to optimize multimodal continuous functions, combining the global search and versatility of genetic algorithms with the efficiency and precision of the local search of mathematical programming algorithms. The results reveal that the proposed hybrid genetic approach is efficient in determining the global optimum.
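The two-phase structure this abstract describes — a GA for global exploration, then a mathematical-programming-style local step for precision — can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual method: the operators (truncation selection, arithmetic crossover, Gaussian mutation), the shrinking-step coordinate search standing in for the local phase, and all parameter values are assumptions.

```python
import math
import random

def rastrigin(x):
    # Classic multimodal test function; global minimum 0 at the origin.
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def hybrid_ga(f, dim=2, pop_size=30, generations=100, seed=0):
    """Hypothetical hybrid: GA for global search, then a local polish."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5.12, 5.12) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)
        parents = pop[: pop_size // 2]                        # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]   # arithmetic crossover
            child = [ci + rng.gauss(0, 0.1) for ci in child]  # Gaussian mutation
            children.append(child)
        pop = parents + children
    best = min(pop, key=f)
    # Local phase: shrinking-step coordinate search, standing in for a
    # mathematical-programming step such as conjugate gradient.
    step = 0.1
    while step > 1e-6:
        improved = False
        for i in range(dim):
            for delta in (step, -step):
                cand = best[:]
                cand[i] += delta
                if f(cand) < f(best):
                    best, improved = cand, True
        if not improved:
            step /= 2
    return best, f(best)
```

The local phase refines whichever basin the GA lands in, which is the division of labour the abstract describes.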
9th AIAA/ISSMO Symposium on Multidisciplinary Analysis and Optimization, 2002
The optimization of many realistic large-scale engineering systems can be computationally expensive.
Nonlinear Analysis: Theory, Methods & Applications, 1997
Journal of Heuristics, 2000
Genetic algorithms are stochastic search approaches based on randomized operators, such as selection, crossover and mutation, inspired by the natural reproduction and evolution of living creatures. However, few published works deal with their application to the global ...
2009
Differential Evolution (DE) is a novel evolutionary approach capable of handling non-differentiable, non-linear and multimodal objective functions. DE has been consistently ranked as one of the best search algorithms for solving global optimization problems in several case studies. This paper presents a simple, modified, hybridized Differential Evolution algorithm for solving global optimization problems. The proposed algorithm is a hybrid of Differential Evolution (DE) and Evolutionary Programming (EP).
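For reference, the textbook DE/rand/1/bin scheme that such hybrids build on can be sketched as below. This is standard DE, not the paper's DE–EP hybrid; the parameter values (F, CR, population size) are conventional defaults, not the authors' settings.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=1):
    """Minimal DE/rand/1/bin sketch; `bounds` is a (low, high) pair per dimension."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Three distinct individuals, all different from the target i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)   # at least one gene always comes from the mutant
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])  # differential mutation
                else:
                    v = pop[i][j]                                # inherit from target
                lo, hi = bounds[j]
                trial.append(min(max(v, lo), hi))                # clip to bounds
            ft = f(trial)
            if ft <= fit[i]:                                     # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]
```

The difference vector `pop[b] - pop[c]` is what lets DE self-scale its steps to the population's spread, which is why it copes with multimodal landscapes.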
Applied Mathematics and Computation, 2005
Genetic algorithms (GAs) have good global search characteristics, while local optimizing algorithms (LOAs) have good local search characteristics. In the present work, the best characteristics of GAs and LOAs are combined to develop a hybrid genetic algorithm (HGA). A bank of GAs is used to obtain a good starting solution for a conjugate gradient algorithm. The number of GA banks is selected using an automated procedure based on Fibonacci numbers. This automated hybrid genetic algorithm (AHGA) is used for solving general multimodal optimization problems while assuring global optimality to a significant degree. The algorithm is also tested against a variety of standard test functions. Besides assuring global optimality to a significant extent, the AHGA is found to be an efficient algorithm requiring only one tuning error parameter, saving considerable time on the part of the user. The method also addresses the problem of selecting a good starting design for a gradient-based algorithm. Further, in the few cases where the algorithm does not converge to a global minimum, a local minimum is assured because of the use of the gradient-based local search in the final stage of the algorithm. Finally, the algorithm assures one final solution to the optimization problem, addressing the problem of providing a deterministic output, the lack of which inhibits the use of GAs in engineering optimization software and engineering applications.
Applied Mathematics and Computation, 2008
Genetic algorithms (GAs) can be used as a global optimization tool for continuous and discrete function problems. However, a simple GA may suffer from slow convergence and instability of results. The problem-solving power of GAs can be increased by local search. In this study, a new local random search algorithm based on GAs is suggested in order to reach a result closer to the optimum solution more quickly.
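One common form of the local random search idea this abstract refers to is sampling around the GA's incumbent and shrinking the sampling radius when no sample improves it. The sketch below is an illustrative assumption, not the authors' algorithm; all parameter names and values are hypothetical.

```python
import random

def local_random_search(f, x0, radius=1.0, shrink=0.7, samples=20,
                        rounds=40, seed=2):
    """Polish a GA result: sample uniformly in a box around the incumbent,
    shrink the radius whenever a whole round fails to improve it."""
    rng = random.Random(seed)
    best, fbest = list(x0), f(x0)
    for _ in range(rounds):
        improved = False
        for _ in range(samples):
            cand = [xi + rng.uniform(-radius, radius) for xi in best]
            fc = f(cand)
            if fc < fbest:
                best, fbest, improved = cand, fc, True
        if not improved:
            radius *= shrink   # contract the search neighbourhood
    return best, fbest
```

Because it needs only function values (no gradients), such a polish step composes with a GA on exactly the problem classes where GAs are used.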
Computer Physics Communications, 2008
A new method that employs grammatical evolution and a stopping rule for finding the global minimum of a continuous, multidimensional, multimodal function is considered. The genetic algorithm used is a hybrid genetic algorithm in conjunction with a local search procedure. We report results from numerical experiments on a series of test functions and compare with other established global optimization methods. The accompanying software accepts objective functions coded in either Fortran 77 or C++.
2015 ASABE International Meeting
A new hybrid genetic algorithm was developed which combines a stochastic evolutionary algorithm with a deterministic adaptive-step steepest descent hill-climbing algorithm in order to optimize complex multivariate problems. By combining both algorithms, computational resources are conserved and the solution converges rapidly compared to either algorithm alone. In genetic algorithms, natural selection is mimicked by random events such as breeding and mutation. In the adaptive-step steepest descent algorithm, the solution moves toward the lowest surrounding point; step sizes start large and get progressively smaller, increasing computational efficiency. The genetic algorithm ensures the solution samples the entire global search space, so a global minimum is found, while the steepest descent method fine-tunes the solution by moving it to the nearest local minimum. The code, including a graphical user interface, was developed in MATLAB. Additional features such as bounding the input, weighting the objective functions individually, and constraining the output are also built into the interface. The algorithm was used to optimize response surface models that use process variables (feedstock moisture content, die speed, and preheating temperature) to predict pellet properties (pellet moisture content; unit, bulk, and tapped density; durability; and specific energy consumption). The solution found by the hybrid algorithm was validated experimentally. Execution times were decreased by approximately 40%, based on 1,0000 trials with each method, using the new hybrid algorithm as compared to a genetic algorithm alone with the same parameters, both developed at INL. Performance of the hybrid algorithm versus the commercial MATLAB genetic algorithm is also investigated. Results show that the hybrid genetic algorithm converged to the global maximum for bulk density in one iteration, whereas the commercial genetic algorithm took twenty-nine iterations to converge.
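The adaptive-step descent described above — move to the lowest surrounding point, with steps that start large and shrink — can be sketched as follows. This is a generic derivative-free reading of that idea (in Python rather than MATLAB), with illustrative parameter values, not the INL implementation.

```python
def adaptive_steepest_descent(f, x0, step=1.0, shrink=0.5, tol=1e-8):
    """Probe each coordinate at the current step size, move to the lowest
    neighbouring point, and halve the step when no neighbour improves."""
    x, fx = list(x0), f(x0)
    while step > tol:
        neighbours = []
        for i in range(len(x)):
            for d in (step, -step):
                cand = x[:]
                cand[i] += d
                neighbours.append((f(cand), cand))
        fbest, best = min(neighbours)
        if fbest < fx:
            x, fx = best, fbest   # move downhill
        else:
            step *= shrink        # no improvement: refine the step
    return x, fx
```

Large early steps traverse the surface cheaply; the shrinking schedule then delivers the fine-tuning role the abstract assigns to this phase.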
Expert Systems with Applications, 2009
In this study, a new crossover approach for the real-coded genetic algorithm is proposed. The approach is based on an efficiently tuned real-coded crossover operation that uses the probability density function of the Gaussian distribution to generate rather dissimilar strings, which may be candidates for possible solutions. Linear and quadratic mapping algorithms are also used comparatively, both to constrain individuals to the given search spaces and to produce different individuals in order to increase the average fitness of the same population. Moreover, to refine the genetically found optimum points, a local search technique based on Newton's method was performed. The designed software was first applied to 11 well-known test functions, and the results were compared with previous findings, as shown in tables. In a few test functions, the elitism operator was put into effect to maintain fitness stability, helping to increase the search performance of the proposed algorithm. The results indicate that the solutions to the test functions were almost the same as the theoretical ones, and the number of function evaluations for each test function was less than that obtained with previous approaches.
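A Gaussian-based real-coded crossover of the general kind described can be illustrated as below: each child gene is drawn from a normal distribution centred on the parents' midpoint, with a spread tied to how far the parents are apart. The centring, the spread rule, and `sigma` are assumptions for illustration, not the paper's tuned operator.

```python
import random

def gaussian_crossover(p1, p2, sigma=0.2, rng=None):
    """Child genes ~ N(midpoint of parents, sigma * |gene gap|), so
    offspring can be usefully dissimilar from both parents."""
    rng = rng or random.Random()
    return [rng.gauss((a + b) / 2.0, sigma * abs(a - b) + 1e-12)
            for a, b in zip(p1, p2)]
```

Scaling the spread by the parents' gene gap makes the operator self-adaptive: early, diverse populations produce exploratory children, while converged populations produce fine-grained ones.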
Structural and Multidisciplinary Optimization, 2001
In this paper, a new Genetic Algorithm (GA) to optimize multimodal continuous functions is proposed. It is based on splitting the traditional GA into a sequence of three processes. The first process creates several appropriate sub-populations using information entropy theory. The second process applies the genetic operators (selection, crossover, and mutation) to every sub-population, which is thus gradually enriched with better individuals. We then determine the best point s* among the best solutions issued from each of the preceding sub-populations. In the neighborhood of this point s*, a population is generated to initialize a traditional GA in the third process. In this last process, the population is entirely renewed after each generation, the new population being generated in the neighborhood of the best point found; the neighborhood size is decreased after each generation. A detailed comparison of performance with several stochastic global search methods is presented, using test functions whose local and global minima are known.
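The third process — regenerate the whole population around the current best point each generation, with a shrinking neighborhood — can be sketched as below. Uniform sampling, the decay factor, and all parameter values are assumptions; the paper's process additionally applies genetic operators within each generation.

```python
import random

def neighbourhood_restart(f, centre, size=1.0, decay=0.8, pop_size=30,
                          generations=50, seed=3):
    """Each generation, draw a fresh population in a box around the best
    point found so far; shrink the box as the search proceeds."""
    rng = random.Random(seed)
    best, fbest = list(centre), f(centre)
    for _ in range(generations):
        for _ in range(pop_size):
            cand = [xi + rng.uniform(-size, size) for xi in best]
            fc = f(cand)
            if fc < fbest:
                best, fbest = cand, fc
        size *= decay   # neighbourhood shrinks every generation
    return best, fbest
```

The shrinking neighborhood plays the role of an annealing schedule: wide sampling early for exploration, tight sampling late for refinement around s*.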
IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics), 1996
Journal of Applied Mathematics, 2013
Applied Mathematics and Computation, 2011
Proceedings of the 6th International Conference on Informatics in Control, Automation and Robotics, 2009
Computational Optimization and Applications, 2014
European Journal of Operational Research, 2003
Bio-Inspired Computing for Information Retrieval Applications, 2000
Journal of Computational and Applied Mathematics, 2008
Proceedings of the 2017 Federated Conference on Computer Science and Information Systems, 2017
Journal of Global Optimization, 1997
International Journal of Organizational and Collective Intelligence, 2019