2012
Traditional Genetic Programming (GP) searches the space of functions/programs by using search operators that manipulate their syntactic representation, regardless of their actual semantics/behaviour. Recently, semantically aware search operators have been shown to outperform purely syntactic operators.
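As an aside on the distinction drawn above, the short Python sketch below (a minimal illustration of our own; the data and function names are not from the paper) treats a program's semantics as its vector of outputs on the training cases, so that two syntactically different programs can be recognized as behaviourally identical, something a purely syntactic operator cannot detect.

```python
import numpy as np

# Hypothetical training inputs for a one-variable symbolic regression task.
X = np.linspace(-1.0, 1.0, 20)

def semantics(program, inputs):
    """Semantics of a program = its vector of outputs on the training cases."""
    return np.array([program(x) for x in inputs])

# Two syntactically different programs...
p1 = lambda x: (x + 1.0) * (x - 1.0)
p2 = lambda x: x * x - 1.0

# ...are semantically identical: a purely syntactic operator cannot see this,
# while a semantically aware operator can (e.g. by comparing output vectors).
print(np.allclose(semantics(p1, X), semantics(p2, X)))   # True
```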
Progress in Artificial Intelligence, 2019
In this paper we continue the investigation of the effect of local search in geometric semantic genetic programming (GSGP), with the introduction of a general new local search operator that can be easily customized. We show that it is able to obtain results on par with the current best-performing GSGP with local search and, in most cases, better than standard GSGP.
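The paper's operator is not reproduced here; as one plausible way a customizable local search could sit inside GSGP, the hypothetical sketch below runs a small one-dimensional search over the step of geometric semantic mutation, applied directly in semantic space, and keeps the step with the lowest training error.

```python
import numpy as np

def gsm_with_local_search(parent_sem, r1_sem, r2_sem, target,
                          candidate_steps=(0.01, 0.1, 0.5, 1.0)):
    """Geometric semantic mutation whose step is chosen by a small local search.

    parent_sem, r1_sem, r2_sem, target: output vectors on the training cases,
    where r1_sem and r2_sem come from two random trees.
    candidate_steps: the grid explored by the local search (an illustrative
    choice, not the operator proposed in the paper).
    """
    best_sem, best_err = None, np.inf
    for ms in candidate_steps:
        child_sem = parent_sem + ms * (r1_sem - r2_sem)    # GSM in semantic space
        err = np.sqrt(np.mean((child_sem - target) ** 2))  # RMSE on training data
        if err < best_err:
            best_sem, best_err = child_sem, err
    return best_sem, best_err
```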
Since its introduction, Geometric Semantic Genetic Programming (GSGP) has attracted the interest of numerous researchers, and several studies have shown that GSGP is able to fit training data effectively by means of small variation steps, which also have the effect of limiting overfitting. In order to speed up the search process, in this paper we propose a system that integrates a local search strategy into GSGP (called GSGP-LS). Furthermore, we present a hybrid approach that combines GSGP and GSGP-LS, aimed at exploiting both the optimization speed of GSGP-LS and the ability of GSGP to limit overfitting. The experimental results we present, obtained on a set of complex real-life applications, show that GSGP-LS achieves the best training fitness and converges very quickly, but severely overfits. GSGP, on the other hand, converges slowly relative to the other methods, but is essentially unaffected by overfitting. The best overall results were achieved with the hybrid approach, which converges quickly while also exhibiting a noteworthy ability to limit overfitting. These results are encouraging, and suggest that future GSGP algorithms should focus on finding the right balance between the greedy optimization of a local search strategy and the more robust geometric semantic operators.
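As a rough illustration of the hybrid idea, the sketch below rests on two assumptions of our own that are not taken from the paper: the local search is a simple least-squares rescaling of the individual's outputs, and `step_gsgp` stands in for one generation of geometric semantic search. It applies the LS-enhanced step only during the first few generations and plain GSGP afterwards.

```python
import numpy as np

def linear_local_search(sem, target):
    """One possible local search step: least-squares fit of a0 + a1 * sem to
    the target outputs (an illustrative choice, not necessarily the paper's)."""
    A = np.column_stack([np.ones_like(sem), sem])
    (a0, a1), *_ = np.linalg.lstsq(A, target, rcond=None)
    return a0 + a1 * sem

def hybrid_run(step_gsgp, sem, target, generations=100, ls_generations=20):
    """Hybrid schedule: LS-enhanced steps early (fast convergence), plain GSGP
    afterwards (overfitting control). `step_gsgp` is any function applying one
    generation of geometric semantic search to the current semantics."""
    for g in range(generations):
        sem = step_gsgp(sem, target)
        if g < ls_generations:          # GSGP-LS phase
            sem = linear_local_search(sem, target)
    return sem
```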
SoftwareX, 2019
Geometric semantic operators (GSOs) for Genetic Programming have been widely investigated in recent years, producing results that are competitive with standard syntax-based operators as well as with other well-known machine learning techniques. The adoption of GSOs has been facilitated by a C++ framework that implements these operators very efficiently. This work describes the system and focuses on a recently implemented feature that allows the user to store the information related to the best individual and to evaluate new data in time that is linear in the number of generations used to find that individual. The paper presents the main features of the system and provides a step-by-step guide for interested users and developers.
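The linear-time evaluation rests on the fact that, under geometric semantic mutation, the best individual's output can be reconstructed by replaying, generation by generation, the stored random trees and mutation steps that produced it. The Python sketch below is a hypothetical illustration of that replay for a mutation-only trace, not the framework's actual C++ interface.

```python
def evaluate_best_on_new_data(initial_output, trace, x):
    """Replay a stored GSGP trace on an unseen input x.

    initial_output: callable giving the output of the ancestor individual on x.
    trace: list of (ms, r1, r2) tuples, one per generation, where r1 and r2 are
           the random trees used by geometric semantic mutation and ms the step.
    Cost is linear in the number of generations, as described in the abstract.
    """
    y = initial_output(x)
    for ms, r1, r2 in trace:
        y = y + ms * (r1(x) - r2(x))   # geometric semantic mutation, replayed
    return y
```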
There are two important limitations of standard tree-based genetic programming (GP). First, GP tends to evolve unnecessarily large programs, a phenomenon referred to as bloat. Second, GP uses inefficient search operators that focus on modifying program syntax. The first problem has been studied extensively, with many works proposing bloat control methods. Regarding the second problem, one approach is to use alternative search operators, for instance geometric semantic operators, to improve convergence. In this work, our goal is to show experimentally that both problems can be effectively addressed by incorporating a local search optimizer as an additional search operator. Using real-world problems, we show that this rather simple strategy can improve the convergence and performance of tree-based GP while also reducing program size. Given these results, a question arises: why are local search strategies so uncommon in GP? A small survey of popular GP libraries suggests that local search is indeed underused in GP systems. We conclude by outlining plausible answers to this question and highlighting future work.
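One simple way to realize such a local search operator, sketched below under assumptions of our own (a fixed toy tree with one multiplicative weight per node, tuned with Powell's method; the paper may use a different parametrization and optimizer), is to leave the program's syntax untouched and numerically adjust its numeric parameters against the training error.

```python
import numpy as np
from scipy.optimize import minimize

# Toy training data (hypothetical, for illustration only).
X = np.linspace(-1.0, 1.0, 50)
y = 2.0 * X ** 2 + 0.5

def weighted_tree(theta, x):
    """A fixed GP tree, (x * x) + x, with one multiplicative weight per node:
    theta[0] * (theta[1]*x * theta[2]*x) + theta[3] * x."""
    return theta[0] * (theta[1] * x * theta[2] * x) + theta[3] * x

def training_error(theta):
    return np.mean((weighted_tree(theta, X) - y) ** 2)

# Local search operator: numerically tune the node weights of an individual
# without touching its syntax.
result = minimize(training_error, x0=np.ones(4), method="Powell")
print(result.x, training_error(result.x))
```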
2017 IEEE Congress on Evolutionary Computation (CEC), 2017
Geometric semantic genetic programming is a hot topic in evolutionary computation, and it has recently been used with success on several problems from biology and medicine. Given the young age of geometric semantic genetic programming, in the last few years theoretical research aimed at improving the method and applied research have proceeded rapidly and in parallel. As a result, the current state of the art is confused and presents some "holes". For instance, some recent improvements of geometric semantic genetic programming have never been applied to some popular biomedical applications. The objective of this paper is to fill this gap. We consider the biomedical applications that have been used most frequently by genetic programming researchers in the last few years, and we systematically test, in a consistent way and using the same parameter settings and configurations, all the most popular existing variants of geometric semantic genetic programming on all those applications. Analysing these results, we obtain a much more homogeneous and clearer picture of the state of the art, which allows us to draw stronger conclusions.
Genetic Programming and Evolvable Machines, 2014
Several methods to incorporate semantic awareness into genetic programming have been proposed in the last few years. These methods cover fundamental parts of the evolutionary process: from population initialization, through different ways of modifying or extending the existing genetic operators, including formal methods, to the definition of completely new genetic operators. The objectives are also distinct: from the maintenance of semantic diversity to the study of semantic locality; from the use of semantics for constructing solutions that obey certain constraints to the exploitation of the geometry of the semantic topological space, aimed at defining easy-to-search fitness landscapes. All these approaches have shown, in different ways and to different extents, that incorporating semantic awareness can help improve the power of genetic programming. This survey analyzes and discusses the state of the art in the field, organizing the existing methods into different categories. It restricts itself to studies where semantics is intended as the set of output values of a program on the training data, a definition that is common to a rather large set of recent contributions. It does not discuss methods for incorporating semantic information into grammar-based GP.
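One of the geometric ideas the survey covers can be stated very compactly: geometric semantic crossover produces an offspring whose semantics is a convex combination of the parents' semantics, which is what makes the induced fitness landscape easy to search. A minimal sketch, with illustrative names and data of our own choosing:

```python
import numpy as np

def geometric_semantic_crossover(p1_sem, p2_sem, r):
    """Offspring semantics = convex combination of the parents' semantics,
    with r in [0, 1]."""
    return r * p1_sem + (1.0 - r) * p2_sem

p1_sem = np.array([0.0, 2.0, 4.0])
p2_sem = np.array([2.0, 0.0, 2.0])
child = geometric_semantic_crossover(p1_sem, p2_sem, r=0.5)
# By convexity of the norm, the child is never farther (in Euclidean distance)
# from the target semantics than the worse of its two parents.
print(child)   # [1. 1. 3.]
```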
Advances in Intelligent Systems and Computing, 2014
Genetic programming (GP) is an evolutionary computation paradigm for the automatic induction of syntactic expressions. In general, GP performs an evolutionary search within the space of possible program syntaxes for the expression that best solves a given problem. The most common application domain for GP is symbolic regression, where the goal is to find the syntactic expression that best fits a given set of training data. However, canonical GP employs only a syntactic search, and is thus intrinsically unable to efficiently adjust the (implicit) parameters of a particular expression. This work studies a Lamarckian memetic GP that incorporates a local search (LS) strategy to refine GP individuals expressed as syntax trees. In particular, a simple parametrization for GP trees is proposed, and different heuristic methods for determining which individuals should be subject to LS are evaluated over several benchmark and real-world problems. The experimental results provide needed insights into this insufficiently studied aspect of GP, suggesting promising directions for future work aimed at developing new memetic GP systems.
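The heuristics compared in the paper are not reproduced here; the sketch below shows, under naming of our own (`ind.fitness` and `local_search` are hypothetical), the kind of selection rule such a memetic GP needs, deciding which individuals receive the costly LS step, together with the Lamarckian write-back of the tuned parameters.

```python
import random

def select_for_local_search(population, strategy="best_fraction", fraction=0.1):
    """Decide which individuals receive the (costly) local search step.
    The strategies below are illustrative stand-ins for the heuristics the
    paper compares, not a reproduction of them. Lower fitness = better (error)."""
    if strategy == "all":
        return list(population)
    k = max(1, int(fraction * len(population)))
    if strategy == "best_fraction":          # refine only the fittest few
        return sorted(population, key=lambda ind: ind.fitness)[:k]
    if strategy == "random":                 # refine a random subset
        return random.sample(list(population), k)
    raise ValueError(strategy)

# Lamarckian step: the tuned parameters are written back into the individual,
# so the improvement is inherited by its offspring.
# for ind in select_for_local_search(pop):
#     ind.params = local_search(ind)        # hypothetical local_search()
```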
Proceedings of the 11th International Joint Conference on Computational Intelligence, 2019
Geometric Semantic Genetic Programming (GSGP) is a recent variant of Genetic Programming that is gaining popularity thanks to its ability to induce a unimodal error surface for any supervised learning problem. Nevertheless, so far GSGP has been applied to real-world problems almost exclusively in regression settings. This paper represents an attempt to apply GSGP to real-world classification problems. Taking inspiration from Perceptron neural networks, we represent class labels as numbers and use an activation function to constrain the output of the solutions to a given range of possible values. In this way, the classification problem is turned into a regression one, and traditional GSGP can be used. In this work, we focus on binary classification; the logistic function, which constrains outputs to [0,1], is used as the activation function, and the class labels are transformed into 0 and 1. The use of the logistic activation function helps to improve the generalization ability of the system. The presented results are encouraging: our regression-based classification system was able to obtain results that are better than, or comparable to, those of a set of competitor machine learning methods on a rather rich set of real-life test problems.
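A minimal sketch of the output side of such a system, assuming labels have already been encoded as 0 and 1 and that a 0.5 decision threshold (our assumption, not stated above) is used for the final classification:

```python
import numpy as np

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def classify(raw_semantics, threshold=0.5):
    """Turn the (unbounded) regression outputs of a GSGP individual into
    binary class labels: squash with the logistic function, then threshold.
    Class labels are assumed to have been encoded as 0 and 1 for training."""
    probs = logistic(np.asarray(raw_semantics, dtype=float))
    return (probs >= threshold).astype(int)

print(classify([-3.2, 0.1, 4.7]))   # [0 1 1]
```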
Genetic Programming and Evolvable Machines
Cartesian genetic programming (CGP), a well-established method of genetic programming, is approximately 20 years old. It represents solutions to computational problems as graphs. Its genetic encoding includes explicitly redundant genes, which are well-known to assist in effective evolutionary search. In this article, we review and compare many of the important aspects of the method and the findings discussed since its inception. In the process, we make many suggestions for further work which could improve the efficiency of CGP for solving computational problems.
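To make the graph encoding and the redundant genes concrete, here is a deliberately tiny, hypothetical CGP decoder (single row, arity 2); it illustrates the representation only and does not follow the encoding details of any particular CGP implementation.

```python
# A minimal, hypothetical CGP decoder (single row, arity 2).
FUNCS = [lambda a, b: a + b, lambda a, b: a - b, lambda a, b: a * b]

def evaluate_cgp(genotype, outputs, inputs):
    """genotype: list of (func_index, in1, in2) node genes; in1/in2 address
    program inputs (0 .. n_inputs-1) or earlier nodes (n_inputs ..).
    outputs: indices of the nodes/inputs used as program outputs.
    Nodes never referenced on a path to an output are redundant: they are
    carried in the genotype but do not affect behaviour."""
    values = list(inputs)
    for f, a, b in genotype:
        values.append(FUNCS[f](values[a], values[b]))
    return [values[o] for o in outputs]

# Two inputs; node 2 = x0 + x1, node 3 = x0 * x1 (redundant: not an output).
print(evaluate_cgp([(0, 0, 1), (2, 0, 1)], outputs=[2], inputs=[3.0, 4.0]))  # [7.0]
```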
2008
Crossover forms one of the core operations in genetic programming and has been the subject of many different investigations. We present a novel technique, based on semantic analysis of programs, which forces each crossover to make candidate programs take a new step in the behavioural search space. We demonstrate how this technique results in better performance and smaller solutions in two separate genetic programming experiments.
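A sketch of how such a behavioural acceptance test can wrap an ordinary crossover (the `crossover` and `semantics` callables are placeholders for a GP system's own routines; the retry limit is an assumption of ours):

```python
import numpy as np

def semantic_crossover(p1, p2, crossover, semantics, max_tries=10):
    """Keep applying a (syntactic) crossover until the offspring's behaviour
    differs from both parents on the training cases. `crossover` and
    `semantics` stand in for a GP system's own operators; this sketch only
    shows the acceptance test."""
    s1, s2 = semantics(p1), semantics(p2)
    for _ in range(max_tries):
        child = crossover(p1, p2)
        sc = semantics(child)
        if not np.allclose(sc, s1) and not np.allclose(sc, s2):
            return child          # behaviourally new: accept
    return child                  # give up after max_tries attempts
```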