2014, Advances in Intelligent Systems and Computing
Genetic programming (GP) is an evolutionary computation paradigm for the automatic induction of syntactic expressions. In general, GP performs an evolutionary search within the space of possible program syntaxes for the expression that best solves a given problem. The most common application domain for GP is symbolic regression, where the goal is to find the syntactic expression that best fits a given set of training data. However, canonical GP employs only a syntactic search, so it is intrinsically unable to efficiently adjust the (implicit) parameters of a particular expression. This work studies a Lamarckian memetic GP that incorporates a local search (LS) strategy to refine GP individuals expressed as syntax trees. In particular, a simple parametrization for GP trees is proposed, and different heuristic methods for deciding which individuals should be subjected to LS are tested over several benchmark and real-world problems. The experimental results provide needed insights into this insufficiently studied aspect of GP, suggesting promising directions for future work aimed at developing new memetic GP systems.
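As a rough illustration of the kind of tree parametrization and Lamarckian local search described above: each node's output is scaled by its own weight, and a hill climber tunes the weight vector while the syntax stays fixed. The Node class, the per-node theta, and the hill-climbing routine are hypothetical names for a sketch, not the paper's code.

```python
import math, random

class Node:
    """Expression-tree node; theta is the per-node 'implicit' parameter made explicit."""
    def __init__(self, op, children=(), value=None, theta=1.0):
        self.op, self.children, self.value, self.theta = op, list(children), value, theta

    def eval(self, x):
        if self.op == "x":
            out = x
        elif self.op == "const":
            out = self.value
        elif self.op == "+":
            out = self.children[0].eval(x) + self.children[1].eval(x)
        elif self.op == "*":
            out = self.children[0].eval(x) * self.children[1].eval(x)
        elif self.op == "sin":
            out = math.sin(self.children[0].eval(x))
        else:
            raise ValueError(self.op)
        return self.theta * out

def nodes(tree):
    yield tree
    for child in tree.children:
        yield from nodes(child)

def mse(tree, data):
    return sum((tree.eval(x) - y) ** 2 for x, y in data) / len(data)

def local_search(tree, data, iters=200, step=0.1):
    """Greedy hill climbing over the per-node weights (Lamarckian: changes stay in the tree)."""
    params = list(nodes(tree))
    best = mse(tree, data)
    for _ in range(iters):
        node = random.choice(params)
        old = node.theta
        node.theta += random.uniform(-step, step)
        trial = mse(tree, data)
        if trial < best:
            best = trial              # keep the improvement
        else:
            node.theta = old          # revert
    return best

# toy usage: tune the weights of the fixed syntax theta0*(theta1*x * theta2*x) towards y = 2x^2
data = [(x / 10.0, 2 * (x / 10.0) ** 2) for x in range(-20, 21)]
tree = Node("*", [Node("x"), Node("x")])
print("MSE before LS:", mse(tree, data), " after LS:", local_search(tree, data))
```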
Information Sciences, 2015
KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS, 2019
Genetic Programming (GP) is an intelligence technique whereby computer programs are encoded as a set of genes that are evolved using a Genetic Algorithm (GA). In other words, GP employs novel optimization techniques to modify computer programs, imitating the way humans develop programs by progressively rewriting them, in order to solve problems automatically. Because the underlying engine is a GA, trial programs are repeatedly altered in the search for superior solutions. GAs are evolutionary search techniques inspired by biological evolution: mutation, reproduction, natural selection, recombination, and survival of the fittest. Their power is demonstrated by a growing range of applications, such as vector processing, quantum computing, and VLSI circuit layout, but one of their most significant uses is the automatic generation of programs. Technically, GP solves problems automatically without the computer being told explicitly how to do so. To this end, GP applies a GA to a "population" of trial programs, traditionally encoded in memory as tree structures. Trial programs are evaluated using a "fitness function", and the fittest solutions are selected for modification and re-evaluation; this cycle is repeated until a "correct" program is found. GP has demonstrated its power in tasks such as categorizing news stories, optical character recognition, medical signal filtering, and target identification. This paper reviews the existing literature on GP and its applications in different scientific fields, and aims to provide beginners with an accessible overview of the various types of GP.
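A toy, mutation-only GP loop in the spirit of this description, assuming a tiny symbolic-regression task; the tuple-based tree encoding and all names are illustrative, and a real system would add crossover, depth limits, and so on.

```python
import random, operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def random_tree(depth=3):
    """Grow a random expression tree over {x, constants, +, -, *}."""
    if depth == 0 or random.random() < 0.3:
        return "x" if random.random() < 0.5 else random.uniform(-1, 1)
    return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    op, a, b = tree
    return OPS[op](evaluate(a, x), evaluate(b, x))

def fitness(tree, data):
    """Lower is better: sum of squared errors on the training cases."""
    return sum((evaluate(tree, x) - y) ** 2 for x, y in data)

def mutate(tree, depth=2):
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(depth)      # replace this subtree with a fresh one
    op, a, b = tree
    return (op, mutate(a), b) if random.random() < 0.5 else (op, a, mutate(b))

def tournament(pop, data, k=3):
    return min(random.sample(pop, k), key=lambda t: fitness(t, data))

data = [(x, x * x + x) for x in range(-5, 6)]      # target function: x^2 + x
pop = [random_tree() for _ in range(200)]
for gen in range(30):
    pop = [mutate(tournament(pop, data)) for _ in range(len(pop))]
best = min(pop, key=lambda t: fitness(t, data))
print("generation", gen, "best error", fitness(best, data), best)
```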
Proceedings of the 2015 Annual Conference on Genetic and Evolutionary Computation, 2015
Semantic Backpropagation (SB) was introduced in GP so as to take into account the semantics of a GP tree at all intermediate states of the program execution, i.e., at each node of the tree. The idea is to compute the optimal "should-be" values each subtree should return, whilst assuming that the rest of the tree is unchanged, so as to minimize the fitness of the tree. To this end, the Random Desired Output (RDO) mutation operator, proposed in [17], uses SB in choosing, from a given library, a tree whose semantics are preferred to the semantics of a randomly selected subtree from the parent tree. Pushing this idea one step further, this paper introduces the Local Tree Improvement (LTI) operator, which selects from the parent tree the overall best subtree for applying RDO, using a small randomly drawn static library. Used within a simple Iterated Local Search framework, LTI can find the exact solution of many popular Boolean benchmarks in reasonable time whilst keeping solution trees small, thus paving the road for truly memetic GP algorithms.
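A sketch of the library-selection step such operators rely on, plus a one-level example of backpropagating a desired output through an addition node. The function names, and the assumption that trees can be run by some interpreter passed in as `evaluate`, are illustrative rather than the paper's implementation.

```python
def semantics(tree, inputs, evaluate):
    """Output vector of a (sub)tree over all training inputs."""
    return [evaluate(tree, x) for x in inputs]

def closest_in_library(desired, library, inputs, evaluate):
    """Return the library tree whose output vector best matches `desired`."""
    def dist(tree):
        out = semantics(tree, inputs, evaluate)
        return sum((o - d) ** 2 for o, d in zip(out, desired))
    return min(library, key=dist)

def desired_through_add(target, left_out):
    """One level of semantic backpropagation: if the tree is (left + subtree)
    and should output `target`, the subtree should output target - left."""
    return [t - l for t, l in zip(target, left_out)]

# toy usage: "trees" are plain callables here, just to keep the sketch self-contained
inputs = [0.0, 1.0, 2.0, 3.0]
evaluate = lambda tree, x: tree(x)
library = [lambda x: x, lambda x: x * x, lambda x: 2 * x + 1]
desired = [1.0, 3.0, 5.0, 7.0]                     # i.e. 2x + 1
best = closest_in_library(desired, library, inputs, evaluate)
print(semantics(best, inputs, evaluate))           # [1.0, 3.0, 5.0, 7.0]
```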
Proceedings of the 2004 Congress on Evolutionary Computation (IEEE Cat. No.04TH8753)
In Evolutionary Computation, genetic operators such as mutation and crossover are employed to perturb individuals to generate the next population. However, these fixed, problem-independent genetic operators may destroy subsolutions, usually called building blocks, instead of discovering and preserving them. One way to overcome this problem is to build a model based on the good individuals and sample this model to obtain the next population. There is a wide range of such work in Genetic Algorithms, but because of the complexity of the Genetic Programming (GP) tree representation, little work of this kind has been done in GP. In this paper, we propose a new method, Grammar Model-based Program Evolution (GMPE), to evolve GP programs. We replace common GP genetic operators with a stochastic context-free grammar (SCFG). In each generation, an SCFG is learnt, and a new population is generated by sampling this SCFG model. On the two benchmark problems we have studied, GMPE significantly outperforms conventional GP, learning faster and more reliably.
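A minimal sketch of the model-sampling half of such an approach, assuming a hand-written SCFG with fixed production probabilities; learning those probabilities from the selected individuals is the other half of the method and is not shown here.

```python
import random

# Each nonterminal maps to a list of (probability, right-hand side).
# Symbols that do not appear as keys in GRAMMAR are terminals.
GRAMMAR = {
    "Expr": [(0.4, ["(", "Expr", "Op", "Expr", ")"]),
             (0.6, ["Term"])],
    "Op":   [(0.5, ["+"]), (0.5, ["*"])],
    "Term": [(0.5, ["x"]), (0.5, ["1"])],
}

def sample(symbol="Expr"):
    """Draw one derivation from the SCFG, returning a flat token list."""
    if symbol not in GRAMMAR:
        return [symbol]                      # terminal
    r, acc = random.random(), 0.0
    for prob, rhs in GRAMMAR[symbol]:
        acc += prob
        if r <= acc:
            return [tok for s in rhs for tok in sample(s)]
    return [tok for s in GRAMMAR[symbol][-1][1] for tok in sample(s)]

print(" ".join(sample()))   # e.g. "( x + ( 1 * x ) )"
```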
Journal of Intelligent and Robotic Systems, 2006
2009
Encyclopedia of Artificial Intelligence / Juan Ramón Rabuñal Dopico, Julián Dorado de la Calle, and Alejandro Pazos Sierra, editors. Includes bibliographical references and index. Summary: "This book is a comprehensive and in-depth reference to the most recent developments in the field, covering theoretical developments, techniques, and technologies, among others" -- Provided by publisher.
There are two important limitations of standard tree-based genetic programming (GP). First, GP tends to evolve unnecessarily large programs, which is referred to as bloat. Second, GP uses inefficient search operators that focus on modifying program syntax. The first problem has been studied extensively, with many works proposing bloat control methods. Regarding the second problem, one approach is to use alternative search operators, for instance geometric semantic operators, to improve convergence. In this work, our goal is to experimentally show that both problems can be effectively addressed by incorporating a local search optimizer as an additional search operator. Using real-world problems, we show that this rather simple strategy can improve the convergence and performance of tree-based GP, while also reducing program size. Given these results, a question arises: why are local search strategies so uncommon in GP? A small survey of popular GP libraries suggests to us that local search is underused in GP systems. We conclude by outlining plausible answers for this question and highlighting future work.
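A complementary sketch of the same idea from the operator's point of view: here the numeric constants of a tuple-encoded tree play the role of the parameters, and the local search call is something a GP loop would apply to some offspring after mutation or crossover. Names and representation are illustrative, not the authors' code.

```python
import random

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, (int, float)):
        return tree
    op, a, b = tree
    a, b = evaluate(a, x), evaluate(b, x)
    return a + b if op == "+" else a * b      # only + and * are needed for this sketch

def error(tree, data):
    return sum((evaluate(tree, x) - y) ** 2 for x, y in data)

def perturb_one_constant(tree, step):
    """Return a copy of the tree with exactly one randomly chosen constant nudged."""
    consts = []
    def walk(t, path):
        if isinstance(t, (int, float)):
            consts.append(path)
        elif isinstance(t, tuple):
            walk(t[1], path + (1,)); walk(t[2], path + (2,))
    walk(tree, ())
    if not consts:
        return tree
    target = random.choice(consts)
    def rebuild(t, path):
        if path == target:
            return t + random.uniform(-step, step)
        if not isinstance(t, tuple):
            return t
        return (t[0], rebuild(t[1], path + (1,)), rebuild(t[2], path + (2,)))
    return rebuild(tree, ())

def local_search(tree, data, iters=300, step=0.2):
    """Hill-climb the constants; the improved tree replaces the original (Lamarckian)."""
    best, best_err = tree, error(tree, data)
    for _ in range(iters):
        cand = perturb_one_constant(best, step)
        cand_err = error(cand, data)
        if cand_err < best_err:
            best, best_err = cand, cand_err
    return best

# In a GP loop this would be called on offspring with some probability, e.g.
#   child = local_search(child, data)
data = [(x, 3.0 * x + 1.0) for x in range(-5, 6)]
tree = ("+", ("*", 1.0, "x"), 0.0)             # syntax stays fixed, constants are tuned
print(error(tree, data), "->", error(local_search(tree, data), data))
```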
Lecture Notes in Computer Science, 2002
Tree-adjunct grammar guided genetic programming (TAG3P) [5] is a grammar guided genetic programming system that uses context-free grammars along with tree-adjunct grammars as a means of setting the language bias for the genetic programming system. In this paper, we show the experimental results of TAG3P on two problems: symbolic regression and trigonometric identity discovery. The results show that TAG3P works well on those problems.
Lecture Notes in Computer Science, 2004
For problems where the evaluation of an individual is the dominant factor in the total computation time of the evolutionary process, minimizing the number of evaluations becomes critical. This paper introduces a new crossover operator for genetic programming, memetic crossover, that reduces the number of evaluations required to find an ideal solution. Memetic crossover selects individuals and crossover points by evaluating the observed strengths and weaknesses within areas of the problem. An individual that has done poorly in some parts of the problem may then imitate an individual that did well on those same parts. This results in an intelligent search of the feature-space and, therefore, fewer evaluations.
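A rough sketch of the donor-selection idea behind such a crossover, assuming per-test-case pass/fail results are available for every individual; `run`, `case_results`, and `pick_donor` are hypothetical helpers, not the paper's API.

```python
import random

def case_results(individual, cases, run):
    """Boolean vector: did this individual solve each test case?"""
    return [run(individual, c) for c in cases]

def pick_donor(recipient_results, population, cases, run):
    """Prefer donors that succeed exactly where the recipient fails."""
    def complement_score(candidate):
        donor_results = case_results(candidate, cases, run)
        return sum(1 for r, d in zip(recipient_results, donor_results) if (not r) and d)
    return max(population, key=complement_score)

# toy usage: dummy "individuals" are just the sets of cases they happen to solve
cases = list(range(6))
run = lambda individual, case: case in individual
pop = [set(random.sample(cases, 3)) for _ in range(10)]
recipient = pop[0]
donor = pick_donor(case_results(recipient, cases, run), pop, cases, run)
print("recipient solves", recipient, "- donor solves", donor)
```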
Genetic Programming (GP) is an automated method for creating computer programs starting from a high-level description of the problem to be solved. Many variants of GP have been proposed in recent years. In this paper we review the main GP variants with linear representation, namely Linear Genetic Programming, Gene Expression Programming, Multi Expression Programming, Grammatical Evolution, Cartesian Genetic Programming, and Stack-Based Genetic Programming. A complete description is provided for each method. The set of applications where the methods have been applied and several Internet sites with more information about them are also given.
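For contrast with the tree encodings discussed above, here is a tiny register-based interpreter illustrating what a "linear" program representation looks like (in the spirit of Linear Genetic Programming); the instruction format is made up for illustration.

```python
def run_linear(program, x, n_regs=4):
    """A program is a flat list of (op, dst, src_a, src_b) register instructions."""
    regs = [0.0] * n_regs
    regs[0] = x                                   # input register
    for op, dst, a, b in program:
        if op == "add":
            regs[dst] = regs[a] + regs[b]
        elif op == "sub":
            regs[dst] = regs[a] - regs[b]
        elif op == "mul":
            regs[dst] = regs[a] * regs[b]
    return regs[0]                                # output register

# r1 = r0 * r0 ; r0 = r1 + r0   computes  x*x + x
program = [("mul", 1, 0, 0), ("add", 0, 1, 0)]
print(run_linear(program, 3.0))                   # 12.0
```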
Genetic Programming, 2020
Soft Computing, 2007
Artificial Intelligence Review, 2002
Evolutionary Intelligence, 2020
Advances in Complex Systems, 2003
Genetic Programming and Evolvable Machines, 2010