- Multi Expression Programming
- M. Oltean, 2006-06-04, [Online]
Multi Expression Programming (MEP) is a Genetic Programming variant that uses a linear representation of chromosomes. MEP individuals are strings of genes encoding complex computer programs. When MEP individuals encode expressions, their representation is similar to the way in which compilers translate C or Pascal expressions into machine code. A unique MEP feature is the ability to store multiple solutions of a problem in a single chromosome. Usually, the best solution is chosen for fitness assignment. When solving symbolic regression or classification problems (or any other problems for which the training set is known before the problem is solved) MEP has the same complexity as other techniques storing a single solution in a chromosome (such as GP, CGP, GEP or GE). Evaluation of the expressions encoded in an MEP individual can be performed by a single parsing of the chromosome. Offspring obtained by crossover and mutation are always syntactically correct MEP individuals (computer programs). Thus, no extra processing for repairing newly obtained individuals is needed.
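The single-pass evaluation described above can be sketched as follows. This is an illustrative toy (gene encoding, function set, and names are assumptions, not the paper's exact scheme): each gene is either a terminal or an operator referring to earlier genes, so one left-to-right pass yields the value of every expression the chromosome encodes.

```python
# Toy sketch of MEP chromosome evaluation (encoding and names are
# illustrative assumptions). Each gene is either the terminal 'x' or an
# operator ('+' or '*') applied to two earlier gene indices, so a single
# left-to-right pass evaluates every expression encoded in the chromosome.
def evaluate_mep(chromosome, x, target):
    values = []
    for gene in chromosome:
        if gene[0] == 'x':            # terminal gene
            values.append(x)
        else:                         # operator gene: (op, i, j) with i, j < current index
            op, i, j = gene
            a, b = values[i], values[j]
            values.append(a + b if op == '+' else a * b)
    # every gene is the root of one expression; fitness uses the best of them
    errors = [abs(v - target) for v in values]
    best = min(errors)
    return best, errors.index(best)

chrom = [('x',), ('x',), ('+', 0, 1), ('*', 2, 2)]  # encodes x, x, x+x, (x+x)*(x+x)
best_err, best_gene = evaluate_mep(chrom, 3.0, 36.0)
# gene 3 evaluates to 36.0, matching the target exactly
```

Because operator genes may only reference earlier positions, any crossover or mutation that preserves this property yields a syntactically valid program, which is the repair-free behavior the abstract describes.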
- Steady-State ALPS for Real-Valued Problems
- G.S. Hornby, 2009, Proceedings of the 11th Annual conference on Genetic and evolutionary computation, pp. 795-802
- DOI: https://doi.org/10.1145/1569901.1570011
The objectives of this paper are to describe a steady-state version of the Age-Layered Population Structure (ALPS) Evolutionary Algorithm (EA) and to compare it against other GAs on real-valued problems. Motivation for this work comes from our previous success in demonstrating that a generational version of ALPS greatly improves search performance on a Genetic Programming problem. In making steady-state ALPS, some modifications were made to the method for calculating age and the method for moving individuals up age layers. To demonstrate that ALPS works well on real-valued problems we compare it against CMA-ES and Differential Evolution (DE) on five challenging, real-valued functions and on one real-world problem. While CMA-ES and DE outperform ALPS on the two unimodal test functions, ALPS is much better on the three multimodal test problems and on the real-world problem. Further examination shows that, unlike the other GAs, ALPS maintains a genotypically diverse population throughout the entire search process. These findings strongly suggest that the ALPS paradigm is better able to avoid premature convergence than the other GAs.
- Genetic Programming with Adaptive Representations
- J.P. Rosca and D.H. Ballard, 1994, Technical Report University of Rochester, Rochester, NY, USA
Machine learning aims towards the acquisition of knowledge based either on experience from the interaction with the external environment or on analyzing the internal problem-solving traces. Both approaches can be implemented in the Genetic Programming (GP) paradigm. Hillis proves in an ingenious way how the first approach can work. There have not been any significant tests to prove that GP can take advantage of its own search traces. This paper presents an approach to automatic discovery of functions in GP based on the ideas of discovering useful building blocks by analyzing the evolution trace, generalizing blocks to define new functions, and finally adapting the problem representation on the fly. Adaptation of the representation determines a hierarchical organization of the extended function set which enables a restructuring of the search space so that solutions can be found more easily. Complexity measures of solution trees are defined for an adaptive representation framework and empirical results are presented.
- Greedy Recombination and Genetic Search on the Space of Computer Programs
- W.A. Tackett, 1994, Proceedings of the Third Workshop on Foundations of Genetic Algorithms. Estes Park, Colorado, USA, July 31 - August 2 1994
- DOI: https://doi.org/10.1016/B978-1-55860-356-1.50017-0
Many natural organisms overproduce zygotes and subsequently decimate the ranks of offspring at some later stage of development. The basic purpose of this behavior is the reduction of parental resource investment in offspring which are less fit than others according to some metabolically cheap fitness measure. An important insight into this process is that for single-pair matings all offspring are products of the same parental genotypes: the selection taking place therefore seeks the most fit recombination of parental traits. This paper presents the Greedy Recombination operator RB(n) for genetic programming, which performs greedy selection among potential crossover sites of a mating pair. The properties of RB(n) are described both from a statistical standpoint and in terms of their effect upon search; comparisons are drawn to existing methods. We formulate the class of constructional problems, which allow precise control over the fitness structure in the space of expressions being searched. The constructional approach is used to create simple GP analogies of the “Royal Road” problems used to study classical GA. The effects of RB(n) upon search properties, fitness distributions, and genotypic variations are examined and contrasted with effects of selection and recombination methods.
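The greedy selection among candidate crossover sites can be sketched roughly as below. This is a simplified illustration under stated assumptions (flat lists stand in for GP expression trees, and `cheap_fitness` is a placeholder for the paper's metabolically cheap fitness measure): `n` candidate recombinations of the same mating pair are generated, and only the best-scoring offspring survives.

```python
import random

# Illustrative sketch of greedy recombination RB(n): among n candidate
# crossover sites for one mating pair, keep only the offspring that scores
# best on a cheap fitness measure. Flat lists stand in for GP trees here;
# the function and parameter names are assumptions for illustration.
def rb_crossover(parent_a, parent_b, n, cheap_fitness, rng=random):
    best_child, best_score = None, None
    for _ in range(n):
        i = rng.randrange(1, len(parent_a))       # crossover site in parent A
        j = rng.randrange(1, len(parent_b))       # crossover site in parent B
        child = parent_a[:i] + parent_b[j:]       # one candidate recombination
        score = cheap_fitness(child)
        if best_score is None or score > best_score:
            best_child, best_score = child, score
    return best_child

# toy cheap fitness: count of 1-genes in the offspring
child = rb_crossover([1, 1, 0, 0], [0, 0, 1, 1], n=8,
                     cheap_fitness=lambda c: sum(c))
```

The key point the abstract makes survives the simplification: all `n` candidates are recombinations of the same two parental genotypes, so the greedy step selects among recombinations of parental traits rather than among unrelated individuals.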
- Trivial Geography in Genetic Programming
- L. Spector and J. Klein, 2006, Genetic Programming Theory and Practice III. Genetic Programming, vol 9. Springer, Boston, MA
- DOI: https://doi.org/10.1007/0-387-28111-8_8
Geographical distribution is widely held to be a major determinant of evolutionary dynamics. Correspondingly, genetic programming theorists and practitioners have long developed, used, and studied systems in which populations are structured in quasi-geographical ways. Here we show that a remarkably simple version of this idea produces surprisingly dramatic improvements in problem-solving performance on a suite of test problems. The scheme is trivial to implement, in some cases involving little more than the addition of a modulus operation in the population access function, and yet it provides significant benefits on all of our test problems (ten symbolic regression problems and a quantum computing problem). We recommend the broader adoption of this form of “trivial geography” in genetic programming systems.
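The "little more than a modulus operation" can be sketched as follows. This is a minimal illustration, assuming the common formulation in which the population is treated as a one-dimensional ring and the parents filling slot `i` are drawn only from a small window starting at `i`; the window radius and names are assumptions.

```python
import random

# Minimal sketch of "trivial geography": the population is a 1-D ring, and
# the parents for the individual that will fill slot `slot` are drawn only
# from a small window beginning at that slot, wrapping via a modulus.
RADIUS = 10  # window size; a tunable assumption

def select_parent_index(slot, pop_size, radius=RADIUS):
    return (slot + random.randrange(radius)) % pop_size

pop_size = 100
idx = select_parent_index(95, pop_size)   # candidates are slots 95..99 and 0..4
```

Restricting mating to such local neighborhoods slows the spread of any single genotype across the ring, which is the diversity-preserving effect the chapter attributes to the scheme.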
- Adapting Crossover in Evolutionary Algorithms
- W.M. Spears, 1995, Proceedings of the Fourth Annual Conference on Evolutionary Programming, MIT Press, pp. 367-384
One of the issues in evolutionary algorithms (EAs) is the relative importance of two search operators: mutation and crossover. Genetic algorithms (GAs) and genetic programming (GP) stress the role of crossover, while evolutionary programming (EP) and evolution strategies (ESs) stress the role of mutation. The existence of many different forms of crossover further complicates the issue. Despite theoretical analysis, it appears difficult to decide a priori which form of crossover to use, or even if crossover should be used at all. One possible solution to this difficulty is to have the EA be self-adaptive, i.e., to have the EA dynamically modify which forms of crossover to use and how often to use them, as it solves a problem. This paper describes an adaptive mechanism for controlling the use of crossover in an EA and explores the behavior of this mechanism in a number of different situations. An improvement to the adaptive mechanism is then presented. Surprisingly this improvement can also be used to enhance performance in a non-adaptive EA.
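One common way to realize such self-adaptation, sketched below under stated assumptions (this is a generic illustration, not necessarily Spears' exact mechanism), is to append a control gene to each individual that selects its crossover form; the gene is inherited and mutated like any other, so the population itself adapts how often each operator is used.

```python
import random

# Generic sketch of self-adaptive crossover choice (an illustration, not a
# faithful reproduction of the paper's mechanism): each individual carries
# one extra control bit choosing between uniform and two-point crossover,
# inherited from a parent and occasionally mutated.
def adaptive_crossover(p1, p2, rng=random):
    genes1, bit1 = p1                             # (gene list, control bit)
    genes2, bit2 = p2
    use_uniform = rng.choice([bit1, bit2])        # child inherits a parent's choice
    n = len(genes1)
    if use_uniform:
        child = [rng.choice(pair) for pair in zip(genes1, genes2)]
    else:                                         # two-point crossover
        i, j = sorted(rng.sample(range(n + 1), 2))
        child = genes1[:i] + genes2[i:j] + genes1[j:]
    if rng.random() < 0.01:                       # rarely mutate the control bit
        use_uniform = 1 - use_uniform
    return (child, use_uniform)

child, bit = adaptive_crossover(([0] * 8, 0), ([1] * 8, 1))
```

Operators that produce fitter offspring tend to propagate their control bit, so the mix of crossover forms tracks what is currently working during the run.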
- Dynamic training subset selection for supervised learning in Genetic Programming
- C. Gathercole and P. Ross, 1994, Proceedings of the International Conference on Evolutionary Computation. The Third Conference on Parallel Problem Solving from Nature: Parallel Problem Solving from Nature (PPSN III), Yuval Davidor, Hans-Paul Schwefel and Reinhard Männer (Eds.). Springer-Verlag, London, UK, pp. 312-321
When using the Genetic Programming (GP) Algorithm on a difficult problem with a large set of training cases, a large population size is needed and a very large number of function-tree evaluations must be carried out. This paper describes how to reduce the number of such evaluations by selecting a small subset of the training data set on which to actually carry out the GP algorithm. Three subset selection methods described in the paper are:
- Dynamic Subset Selection (DSS), using the current GP run to select difficult and/or disused cases,
- Historical Subset Selection (HSS), using previous GP runs,
- Random Subset Selection (RSS).
Various runs have shown that GP+DSS can produce better results in less than 20% of the time taken by GP. GP+HSS can nearly match the results of GP, and, perhaps surprisingly, GP+RSS can occasionally approach the results of GP. GP+DSS also produced better, more general results than those previously reported for a variety of Neural Networks on a substantial problem, known as the Thyroid problem.
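The DSS idea of favoring difficult and/or disused cases can be sketched as below. This is an illustrative approximation (the weighting formula, exponents, and names are assumptions): each training case gets a weight combining how often it was recently misclassified with how long since it was last selected, and the subset is sampled in proportion to those weights.

```python
import random

# Illustrative sketch of Dynamic Subset Selection (DSS): training cases are
# weighted by recent misclassifications ("difficulty") plus generations since
# last selection ("disuse"), then sampled proportionally. The weighting
# formula and exponents are assumptions for illustration.
def dss_pick(difficulty, age, subset_size, d_exp=1.0, a_exp=1.0):
    weights = [d ** d_exp + a ** a_exp for d, a in zip(difficulty, age)]
    total = sum(weights)
    picked = set()
    while len(picked) < subset_size:              # roulette-wheel sampling
        r = random.uniform(0, total)
        acc = 0.0
        for i, w in enumerate(weights):
            acc += w
            if r <= acc:
                picked.add(i)
                break
    return sorted(picked)

difficulty = [5, 0, 1, 9, 2, 0]   # recent misclassification counts per case
age        = [1, 8, 2, 1, 3, 9]   # generations since each case was selected
subset = dss_pick(difficulty, age, subset_size=3)
```

Each generation the GP population is evaluated only on `subset`, which is where the reported large time savings come from.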
- An Efficient Constraint Handling Method for Genetic Algorithms
- Kalyanmoy Deb, 2000, Computer Methods in Applied Mechanics and Engineering, vol. 186, issue 2-4, pp. 311-338
- DOI: https://doi.org/10.1016/S0045-7825(99)00389-8
Many real-world search and optimization problems involve inequality and/or equality constraints and are thus posed as constrained optimization problems. In trying to solve constrained optimization problems using genetic algorithms (GAs) or classical optimization methods, penalty function methods have been the most popular approach, because of their simplicity and ease of implementation. However, since the penalty function approach is generic and applicable to any type of constraint (linear or nonlinear), its performance is not always satisfactory. Thus, researchers have developed sophisticated penalty functions specific to the problem at hand and the search algorithm used for optimization. However, the most difficult aspect of the penalty function approach is to find appropriate penalty parameters needed to guide the search towards the constrained optimum. In this paper, the GA's population-based approach and ability to make pair-wise comparisons in the tournament selection operator are exploited to devise a penalty function approach that does not require any penalty parameter. Careful comparisons among feasible and infeasible solutions are made so as to provide a search direction towards the feasible region. Once sufficient feasible solutions are found, a niching method (along with a controlled mutation operator) is used to maintain diversity among feasible solutions. This allows a real-parameter GA's crossover operator to continuously find better feasible solutions, gradually leading the search near the true optimum solution. GAs with this constraint handling approach have been tested on nine problems commonly used in the literature, including an engineering design problem. In all cases, the proposed approach has been able to repeatedly find solutions closer to the true optimum solution than those reported earlier.
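The parameter-free pairwise comparison can be sketched as the feasibility rules below (function names are illustrative; constraints are assumed in the standard `g_i(x) <= 0` form): a feasible solution always beats an infeasible one, two feasible solutions compare by objective value, and two infeasible solutions compare by total constraint violation, so no penalty coefficient is ever needed.

```python
# Sketch of parameter-free feasibility rules in a binary tournament, for
# minimization. Candidates are (objective_value, [g_1(x), ..., g_m(x)])
# with g_i(x) <= 0 meaning constraint i is satisfied. Names are illustrative.
def violation(constraints):
    """Total constraint violation; zero iff the solution is feasible."""
    return sum(max(0.0, g) for g in constraints)

def better(a, b):
    """Return the tournament winner under the three feasibility rules."""
    va, vb = violation(a[1]), violation(b[1])
    if va == 0 and vb == 0:         # both feasible: compare objectives
        return a if a[0] <= b[0] else b
    if va == 0:                     # feasible always beats infeasible
        return a
    if vb == 0:
        return b
    return a if va <= vb else b     # both infeasible: smaller violation wins

x = (3.0, [-1.0, -0.5])   # feasible, objective 3.0
y = (1.0, [0.2, -0.5])    # infeasible (violates g_1), objective 1.0
winner = better(x, y)      # x wins: feasibility outranks the better objective
```

Because comparisons never mix objective values with violation magnitudes, there is no penalty parameter to tune, which is the central claim of the paper.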
- Genetic Programming - An Introduction
- Wolfgang Banzhaf, Peter Nordin, Robert E. Keller and Frank D. Francone, 1998, Morgan Kaufmann Publishers Inc. San Francisco, CA, USA
Since the early 1990s, genetic programming (GP)-a discipline whose goal is to enable the automatic generation of computer programs-has emerged as one of the most promising paradigms for fast, productive software development. GP combines biological metaphors gleaned from Darwin's theory of evolution with computer-science approaches drawn from the field of machine learning to create programs that are capable of adapting or recreating themselves for open-ended tasks. This unique introduction to GP provides a detailed overview of the subject and its antecedents, with extensive references to the published and online literature. In addition to explaining the fundamental theory and important algorithms, the text includes practical discussions covering a wealth of potential applications and real-world implementation techniques. Software professionals needing to understand and apply GP concepts will find this book an invaluable practical and theoretical guide.
- Differential Evolution: A simple evolution strategy for fast optimization
- Kenneth Price and Rainer Storn, 1997, Dr. Dobb's Journal, vol. 22, no. 4 (#264), April 1997
- ISSN: 1044-789X
A new heuristic approach for minimizing possibly nonlinear and nondifferentiable continuous-space functions is presented. By means of an extensive testbed, which includes the De Jong functions, it is demonstrated that the new method converges faster and with more certainty than both Adaptive Simulated Annealing and the Annealed Nelder-Mead approach. The new method requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
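The method's core loop can be sketched as the classic DE/rand/1/bin variant below (the sphere objective and control settings are illustrative assumptions): each target vector is challenged by a trial built from the scaled difference of two other population members added to a third, with greedy one-to-one replacement.

```python
import random

# Minimal sketch of classic DE/rand/1/bin differential evolution. The
# objective, bounds, and control settings (np_, F, CR, gens) are
# illustrative assumptions, not prescriptions from the article.
def de_minimize(f, dim, bounds, np_=20, F=0.8, CR=0.9, gens=200, seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(np_)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(np_):
            a, b, c = rng.sample([j for j in range(np_) if j != i], 3)
            jrand = rng.randrange(dim)            # force at least one mutated gene
            trial = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                     if (rng.random() < CR or k == jrand) else pop[i][k]
                     for k in range(dim)]
            ft = f(trial)
            if ft <= fit[i]:                      # greedy one-to-one replacement
                pop[i], fit[i] = trial, ft
    best = min(range(np_), key=fit.__getitem__)
    return pop[best], fit[best]

sphere = lambda x: sum(v * v for v in x)          # a simple unimodal test function
x_best, f_best = de_minimize(sphere, dim=3, bounds=(-5.0, 5.0))
```

The "few control variables" the abstract highlights are exactly the population size `np_`, the differential weight `F`, and the crossover rate `CR`.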
- Artificial Intelligence: A Modern Approach
- Stuart J. Russell and Peter Norvig, 2009, Prentice Hall (3rd edition)
- ISBN: 0-13-604259-7
Artificial Intelligence: A Modern Approach (AIMA) is a university textbook on artificial intelligence, written by Stuart J. Russell and Peter Norvig. It was first published in 1995, and the third edition of the book was released on 11 December 2009. It is used in over 1350 universities worldwide and has been called the most popular artificial intelligence textbook in the world. It is considered the standard text in the field of artificial intelligence.
The book is intended for an undergraduate audience but can also be used for graduate-level studies with the suggestion of adding some of the primary sources listed in the extensive bibliography.