
bibliography


1

  • Straight Line Programs: A New Linear Genetic Programming Approach
  • C. L. Alonso, J. Puente, J. L. Montaña; 20th IEEE International Conference on Tools with Artificial Intelligence (ICTAI 2008), Dayton, OH, USA, 2008, pp. 517-524
  • DOI: https://doi.org/10.1109/ICTAI.2008.14

Tree encodings of programs are well known for their representative power and are used very often in Genetic Programming. In this paper we experiment with a new data structure, named straight line program (slp), to represent computer programs. The main features of this structure are described and new recombination operators for GP related to slp's are introduced. Experiments have been performed on symbolic regression problems. Results are encouraging and suggest that the GP approach based on slp's consistently outperforms conventional GP based on tree structured representations.
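A minimal sketch of the data structure may help: a straight line program is a sequence of instructions, each referring only to inputs, constants or the results of earlier instructions, with the last result taken as the program output. The operator set, struct layout and C++ encoding below are illustrative assumptions, not the authors' implementation.

```cpp
// Minimal sketch of a straight line program (SLP) for symbolic regression.
// Not the paper's implementation; names and operator set are illustrative.
#include <iostream>
#include <vector>

enum class Op { Add, Sub, Mul, Var, Const };

struct Instruction
{
  Op op;
  int a1 = 0;          // index of first argument (an earlier instruction)
  int a2 = 0;          // index of second argument (an earlier instruction)
  double value = 0.0;  // variable index for Var, constant value for Const
};

// Evaluates the SLP on a single input vector. Each instruction may only
// reference results produced by previous instructions (straight line code).
double eval(const std::vector<Instruction> &slp, const std::vector<double> &x)
{
  std::vector<double> r(slp.size());

  for (std::size_t i = 0; i < slp.size(); ++i)
    switch (slp[i].op)
    {
    case Op::Add:   r[i] = r[slp[i].a1] + r[slp[i].a2]; break;
    case Op::Sub:   r[i] = r[slp[i].a1] - r[slp[i].a2]; break;
    case Op::Mul:   r[i] = r[slp[i].a1] * r[slp[i].a2]; break;
    case Op::Var:   r[i] = x[static_cast<int>(slp[i].value)]; break;
    case Op::Const: r[i] = slp[i].value; break;
    }

  return r.back();  // the last instruction holds the program output
}

int main()
{
  // f(x) = (x + 2) * x
  const std::vector<Instruction> slp =
  {
    {Op::Var,   0, 0, 0.0},  // r0 = x[0]
    {Op::Const, 0, 0, 2.0},  // r1 = 2
    {Op::Add,   0, 1, 0.0},  // r2 = r0 + r1
    {Op::Mul,   2, 0, 0.0}   // r3 = r2 * r0
  };

  std::cout << eval(slp, {3.0}) << '\n';  // prints 15
}
```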

2

  • A new linear genetic programming approach based on straight line programs: Some theoretical and experimental aspects
  • César Luis Alonso González, José Luis Montaña Arnaiz, Jorge Puente Peinador, Cruz Enrique Borges Hernández; International Journal on Artificial Intelligence Tools, 18, pp. 757-781 (2009)
  • DOI: https://doi.org/10.1142/S0218213009000391

Tree encodings of programs are well known for their representative power and are used very often in Genetic Programming. In this paper we experiment with a new data structure, named straight line program (slp), to represent computer programs. The main features of this structure are described, new recombination operators for GP related to slp's are introduced and a study of the Vapnik-Chervonenkis dimension of families of slp's is done. Experiments have been performed on symbolic regression problems. Results are encouraging and suggest that the GP approach based on slp's consistently outperforms conventional GP based on tree structured representations.

3

  • Trivial Geography in Genetic Programming
  • L. Spector, J. Klein; 2006, Genetic Programming Theory and Practice III. Genetic Programming, vol 9. Springer, Boston, MA
  • DOI: https://doi.org/10.1007/0-387-28111-8_8

Geographical distribution is widely held to be a major determinant of evolutionary dynamics. Correspondingly, genetic programming theorists and practitioners have long developed, used, and studied systems in which populations are structured in quasi-geographical ways. Here we show that a remarkably simple version of this idea produces surprisingly dramatic improvements in problem-solving performance on a suite of test problems. The scheme is trivial to implement, in some cases involving little more than the addition of a modulus operation in the population access function, and yet it provides significant benefits on all of our test problems (ten symbolic regression problems and a quantum computing problem). We recommend the broader adoption of this form of “trivial geography” in genetic programming systems.
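The "modulus operation in the population access function" can be made concrete with a short sketch: the population is viewed as a one-dimensional ring and parents are drawn, by tournament, only from a small neighbourhood of the slot the offspring will occupy. The radius, tournament size and fitness layout below are assumptions for illustration, not Spector and Klein's code.

```cpp
// Sketch of trivial geography: the population is laid out on a 1-D ring
// and the parents of the individual that will fill slot `child_index`
// are chosen only from a small neighbourhood around that slot.
// Radius and tournament size are illustrative assumptions.
#include <cstddef>
#include <iostream>
#include <random>
#include <vector>

// Picks a random index near `child_index` on a ring of `pop_size` slots.
std::size_t local_index(std::size_t child_index, std::size_t pop_size,
                        std::size_t radius, std::mt19937 &rng)
{
  std::uniform_int_distribution<std::size_t> offset(0, 2 * radius);
  // The only change w.r.t. panmictic selection is this modulus.
  return (child_index + pop_size - radius + offset(rng)) % pop_size;
}

// Tournament selection restricted to the neighbourhood of `child_index`.
std::size_t local_tournament(const std::vector<double> &fitness,
                             std::size_t child_index, std::size_t radius,
                             unsigned rounds, std::mt19937 &rng)
{
  std::size_t best = local_index(child_index, fitness.size(), radius, rng);

  for (unsigned i = 1; i < rounds; ++i)
  {
    const auto challenger =
      local_index(child_index, fitness.size(), radius, rng);
    if (fitness[challenger] > fitness[best])
      best = challenger;
  }

  return best;
}

int main()
{
  std::mt19937 rng(1);
  const std::vector<double> fitness = {0.1, 0.9, 0.3, 0.7, 0.5, 0.2};

  // Parent for the individual that will replace slot 2, drawn from
  // slots {0, 1, 2, 3, 4} (radius 2) by a 3-round local tournament.
  std::cout << "selected slot "
            << local_tournament(fitness, 2, 2, 3, rng) << '\n';
}
```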

4

  • Differential Evolution: A simple evolution strategy for fast optimization
  • Kenneth Price, Rainer Storn; Dr. Dobb's Journal #264 Volume 22, Number 4, April 1997
  • ISSN: 1044-789X

A new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented. By means of an extensive testbed, which includes the De Jong functions, it is demonstrated that the new method converges faster and with more certainty than both Adaptive Simulated Annealing and the Annealed Nelder & Mead approach. The new method requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
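For reference, here is a compact sketch of the classic DE/rand/1/bin step described in the DE literature (mutation with a scaled difference vector, binomial crossover, greedy one-to-one replacement), applied to a toy sphere function. The control parameters are common defaults, not values taken from the article.

```cpp
// Sketch of DE/rand/1/bin on the sphere function. Parameter values
// (F, CR, population size) are typical defaults, not from the article.
#include <algorithm>
#include <iostream>
#include <random>
#include <vector>

using Vec = std::vector<double>;

double sphere(const Vec &x)
{
  double s = 0.0;
  for (double xi : x)  s += xi * xi;
  return s;
}

int main()
{
  const std::size_t np = 20, dim = 5;   // population size, dimensionality
  const double f = 0.8, cr = 0.9;       // differential weight, crossover rate

  std::mt19937 rng(1);
  std::uniform_real_distribution<double> unit(0.0, 1.0), init(-5.0, 5.0);
  std::uniform_int_distribution<std::size_t> pick(0, np - 1),
                                             pick_dim(0, dim - 1);

  std::vector<Vec> pop(np, Vec(dim));
  for (auto &x : pop)
    for (auto &xi : x)  xi = init(rng);

  for (unsigned gen = 0; gen < 100; ++gen)
    for (std::size_t i = 0; i < np; ++i)
    {
      // Three mutually distinct indices, all different from i.
      std::size_t a, b, c;
      do a = pick(rng); while (a == i);
      do b = pick(rng); while (b == i || b == a);
      do c = pick(rng); while (c == i || c == a || c == b);

      // Mutation + binomial crossover: build the trial vector u.
      Vec u = pop[i];
      const std::size_t jrand = pick_dim(rng);  // guarantees one mutated gene
      for (std::size_t j = 0; j < dim; ++j)
        if (unit(rng) < cr || j == jrand)
          u[j] = pop[a][j] + f * (pop[b][j] - pop[c][j]);

      // Greedy selection: the trial replaces the target only if not worse.
      if (sphere(u) <= sphere(pop[i]))
        pop[i] = u;
    }

  const auto best = *std::min_element(
    pop.begin(), pop.end(),
    [](const Vec &l, const Vec &r) { return sphere(l) < sphere(r); });
  std::cout << "best fitness: " << sphere(best) << '\n';
}
```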

5

  • Differential Evolution. A Practical Approach to Global Optimization
  • Kenneth V. Price, Rainer M. Storn, Jouni A. Lampinen; Springer-Verlag Berlin Heidelberg 2005
  • ISBN: 978-3-540-20950-8
  • DOI: https://doi.org/10.1007/3-540-31306-0

Problems demanding globally optimal solutions are ubiquitous, yet many are intractable when they involve constrained functions having many local optima and interacting, mixed-type variables. The differential evolution (DE) algorithm is a practical approach to global numerical optimization which is easy to understand, simple to implement, reliable, and fast. Packed with illustrations, computer code, new insights, and practical advice, this volume explores DE in both principle and practice. It is a valuable resource for professionals needing a proven optimizer and for students wanting an evolutionary perspective on global numerical optimization.

6

  • GP for Object Classification: Brood Size in Brood Recombination Crossover
  • Mengjie Zhang, Xiaoying Gao, Weijun Lou; AI 2006: Advances in Artificial Intelligence, Lecture Notes in Computer Science, vol 4304, Springer, Berlin, Heidelberg
  • ISBN: 978-3-540-49787-5
  • DOI: https://doi.org/10.1007/11941439_31

The brood size plays an important role in the brood recombination crossover method in genetic programming. However, there has not been any thorough investigation on the brood size and the methods for setting this size have not been effectively examined. This paper investigates a number of new developments of brood size in the brood recombination crossover method in GP. We first investigate the effect of different fixed brood sizes, then construct three dynamic models for setting the brood size. These developments are examined and compared with the standard crossover operator on three object classification problems of increasing difficulty. The results suggest that the brood recombination methods with all the new developments outperform the standard crossover operator for all the problems. As the brood size increases, the system's effective performance can be improved. When it exceeds a certain point, however, the effective performance will not improve further and the system will become less efficient. Investigation of the three dynamic models for the brood size reveals that a good variable brood size, dynamically set with the number of generations, can further improve the system performance over the fixed brood sizes.
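A rough sketch of the underlying brood recombination operator may be useful: one crossover event generates a brood of candidate children, the brood is evaluated, and only the best child is kept. The real-vector genome, one-point crossover and sphere fitness below are placeholders, not the GP setup of the paper.

```cpp
// Sketch of brood recombination crossover: from one pair of parents,
// produce `brood_size` offspring pairs, evaluate them and keep only the
// best child. Genome, crossover and fitness are placeholders (a real
// vector minimising the sphere function), not the paper's GP setup.
#include <iostream>
#include <limits>
#include <random>
#include <utility>
#include <vector>

using Genome = std::vector<double>;

double fitness(const Genome &g)           // lower is better here
{
  double s = 0.0;
  for (double v : g)  s += v * v;
  return s;
}

// One-point crossover producing two children.
std::pair<Genome, Genome> crossover(const Genome &p1, const Genome &p2,
                                    std::mt19937 &rng)
{
  std::uniform_int_distribution<std::size_t> cut(1, p1.size() - 1);
  const auto k = cut(rng);

  Genome c1(p1.begin(), p1.begin() + k), c2(p2.begin(), p2.begin() + k);
  c1.insert(c1.end(), p2.begin() + k, p2.end());
  c2.insert(c2.end(), p1.begin() + k, p1.end());
  return {c1, c2};
}

// Brood recombination: generate many candidate children, return the best.
Genome brood_crossover(const Genome &p1, const Genome &p2,
                       unsigned brood_size, std::mt19937 &rng)
{
  Genome best;
  double best_fit = std::numeric_limits<double>::max();

  for (unsigned i = 0; i < brood_size; ++i)
  {
    const auto [c1, c2] = crossover(p1, p2, rng);
    for (const auto &c : {c1, c2})
      if (fitness(c) < best_fit)
      {
        best_fit = fitness(c);
        best = c;
      }
  }

  return best;
}

int main()
{
  std::mt19937 rng(1);
  const Genome p1 = {1.0, -2.0, 3.0, 0.5}, p2 = {-1.0, 0.2, -0.3, 2.0};

  const auto child = brood_crossover(p1, p2, 4, rng);
  std::cout << "best child fitness: " << fitness(child) << '\n';
}
```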

7

  • Investigation of Brood Size in GP with Brood Recombination Crossover for Object Recognition
  • Mengjie Zhang, Xiaoying Gao, Weijun Lou, Dongping Qian; PRICAI 2006, Lecture Notes in Computer Science, vol 4099, Springer, Berlin, Heidelberg
  • ISBN: 978-3-540-36667-6
  • DOI: https://doi.org/10.1007/978-3-540-36668-3_107

This paper describes an approach to the investigation of brood size in the brood recombination crossover method in genetic programming for object recognition problems. The approach is examined and compared with the standard crossover operator on three object classification problems of increasing difficulty. The results suggest that the brood recombination method outperforms the standard crossover operator for all the problems in terms of the classification accuracy. As the brood size increases, the system's effective performance can be improved. When it exceeds a certain point, however, the effective performance will not improve further and the system will become less efficient.

8

  • Differential Evolution in Discrete Optimization
  • Daniel Lichtblau; June 2012 International Journal of Swarm Intelligence and Evolutionary Computation 1(Z110301):1-10
  • DOI: https://doi.org/10.4303/ijsiec/Z110301

We show ways in which differential evolution, a member of the genetic/evolutionary family of global optimization methods, can be used for the purpose of discrete optimization. We consider several nontrivial problems arising from actual practice, using differential evolution as our primary tool to obtain good results. We indicate why methods more commonly seen in discrete optimization, such as integer linear programming, might be less effective for problems of the type we will consider.
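One common way to let DE's real-valued vectors drive a discrete (here, ordering) search is "random key" decoding, sketched below: indices are sorted by their component values to obtain a permutation. This is a generic illustration, not necessarily the encoding used in the paper.

```cpp
// Sketch of "random key" decoding: a real-valued DE vector is mapped to a
// permutation by sorting indices by component value. This is one common
// way to apply DE to discrete/ordering problems; the paper's own
// encodings may differ.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <vector>

std::vector<std::size_t> decode_permutation(const std::vector<double> &keys)
{
  std::vector<std::size_t> perm(keys.size());
  std::iota(perm.begin(), perm.end(), 0);

  // Element i of the permutation is the index of the i-th smallest key.
  std::sort(perm.begin(), perm.end(),
            [&keys](std::size_t a, std::size_t b) { return keys[a] < keys[b]; });
  return perm;
}

int main()
{
  // A candidate DE vector (continuous) ...
  const std::vector<double> keys = {0.7, 0.1, 0.9, 0.4};

  // ... and the discrete ordering it represents: 1 3 0 2
  for (auto item : decode_permutation(keys))
    std::cout << item << ' ';
  std::cout << '\n';
}
```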

9

The objectives of this paper are to describe a steady-state version of the Age-Layered Population Structure (ALPS) Evolutionary Algorithm (EA) and to compare it against other GAs on real-valued problems. Motivation for this work comes from our previous success in demonstrating that a generational version of ALPS greatly improves search performance on a Genetic Programming problem. In making steady-state ALPS, some modifications were made to the method for calculating age and the method for moving individuals up age layers. To demonstrate that ALPS works well on real-valued problems we compare it against CMA-ES and Differential Evolution (DE) on five challenging, real-valued functions and on one real-world problem. While CMA-ES and DE outperform ALPS on the two unimodal test functions, ALPS is much better on the three multimodal test problems and on the real-world problem. Further examination shows that, unlike the other GAs, ALPS maintains a genotypically diverse population throughout the entire search process. These findings strongly suggest that the ALPS paradigm is better able to avoid premature convergence than the other GAs.
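A small sketch of the age-layer bookkeeping at the heart of ALPS: every layer admits only individuals up to a maximum age, and the top layer is unbounded. The quadratically growing limits and the age-gap value below are illustrative choices; the steady-state age-calculation and replacement rules discussed in the paper are omitted.

```cpp
// Sketch of ALPS age-layer bookkeeping: each layer accepts only
// individuals whose age is below that layer's limit; the top layer is
// unbounded. The growth scheme and age gap here are illustrative; the
// paper's steady-state details are not reproduced.
#include <iostream>
#include <limits>
#include <vector>

// Maximum age admitted in each layer (quadratically growing limits).
std::vector<unsigned> age_limits(unsigned layers, unsigned age_gap)
{
  std::vector<unsigned> limits;
  for (unsigned i = 0; i < layers; ++i)
    limits.push_back(i + 1 == layers
                     ? std::numeric_limits<unsigned>::max()   // top layer
                     : age_gap * (i + 1) * (i + 1));
  return limits;
}

// Lowest layer an individual of the given age may live in.
unsigned layer_of(unsigned age, const std::vector<unsigned> &limits)
{
  unsigned l = 0;
  while (age > limits[l])
    ++l;
  return l;
}

int main()
{
  // Limits: 10, 40, 90, 160 and an unbounded top layer.
  const auto limits = age_limits(5, 10);

  for (unsigned age : {1u, 15u, 50u, 500u})
    std::cout << "age " << age << " -> layer "
              << layer_of(age, limits) << '\n';
}
```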

10

  • Strongly Typed Genetic Programming
  • David J. Montana; 1995, Evolutionary Computation, 3(2), pp. 199-230

Genetic programming is a powerful method for automatically generating computer programs via the process of natural selection (Koza, 1992). However, in its standard form, there is no way to restrict the programs it generates to those where the functions operate on appropriate data types. In the case when the programs manipulate multiple data types and contain functions designed to operate on particular data types, this can lead to unnecessarily large search times and/or unnecessarily poor generalisation performance. Strongly typed genetic programming (STGP) is an enhanced version of genetic programming that enforces data-type constraints and whose use of generic functions and generic data types makes it more powerful than other approaches to type-constraint enforcement. After describing its operation, we illustrate its use on problems in two domains, matrix/vector manipulation and list manipulation, which require its generality. The examples are (1) the multidimensional least-squares regression problem, (2) the multidimensional Kalman filter, (3) the list manipulation function NTH, and (4) the list manipulation function MAPCAR.
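The core constraint of STGP can be shown in a few lines: every primitive declares a return type and argument types, and random tree construction only inserts primitives whose return type matches what the parent slot requires. The toy type system and primitive set below are assumptions, not Montana's.

```cpp
// Sketch of the core STGP constraint: every primitive declares a return
// type and argument types, and tree construction only plugs in nodes
// whose return type matches the type required by the parent slot.
// The primitive set and types here are illustrative, not Montana's.
#include <iostream>
#include <random>
#include <string>
#include <vector>

enum class Type { Number, Boolean };

struct Primitive
{
  std::string name;
  Type return_type;
  std::vector<Type> arg_types;   // empty for terminals
};

const std::vector<Primitive> primitives =
{
  {"add",  Type::Number,  {Type::Number, Type::Number}},
  {"if",   Type::Number,  {Type::Boolean, Type::Number, Type::Number}},
  {"lt",   Type::Boolean, {Type::Number, Type::Number}},
  {"x",    Type::Number,  {}},
  {"true", Type::Boolean, {}}
};

// Grows a random expression of the requested type, as a prefix string.
std::string grow(Type required, unsigned depth, std::mt19937 &rng)
{
  // Collect the primitives compatible with the required type (terminals
  // only when the depth budget is exhausted).
  std::vector<const Primitive *> candidates;
  for (const auto &p : primitives)
    if (p.return_type == required && (depth > 0 || p.arg_types.empty()))
      candidates.push_back(&p);

  std::uniform_int_distribution<std::size_t> pick(0, candidates.size() - 1);
  const auto *node = candidates[pick(rng)];

  std::string expr = "(" + node->name;
  for (auto arg : node->arg_types)
    expr += " " + grow(arg, depth - 1, rng);
  return expr + ")";
}

int main()
{
  std::mt19937 rng(2);
  std::cout << grow(Type::Number, 3, rng) << '\n';
}
```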

11

  • Genetic Programming - An Introduction
  • Wolfgang Banzhaf, Peter Nordin, Robert E. Keller and Frank D. Francone; 1998, Morgan Kaufmann Publishers Inc. San Francisco, CA, USA
  • ISBN: 1-55860-510-X

Since the early 1990s, genetic programming (GP)-a discipline whose goal is to enable the automatic generation of computer programs-has emerged as one of the most promising paradigms for fast, productive software development. GP combines biological metaphors gleaned from Darwin's theory of evolution with computer-science approaches drawn from the field of machine learning to create programs that are capable of adapting or recreating themselves for open-ended tasks. This unique introduction to GP provides a detailed overview of the subject and its antecedents, with extensive references to the published and online literature. In addition to explaining the fundamental theory and important algorithms, the text includes practical discussions covering a wealth of potential applications and real-world implementation techniques. Software professionals needing to understand and apply GP concepts will find this book an invaluable practical and theoretical guide.

12

  • Parameter Setting in Evolutionary Algorithms
  • Fernando G. Lobo, Cláudio F. Lima, Zbigniew Michalewicz; 2007 Springer Berlin, Heidelberg
  • ISBN: 978-3-540-69431-1
  • DOI: https://doi.org/10.1007/978-3-540-69432-8

One of the main difficulties of applying an evolutionary algorithm (or, as a matter of fact, any heuristic method) to a given problem is to decide on an appropriate set of parameter values. Typically these are specified before the algorithm is run and include population size, selection rate, operator probabilities, not to mention the representation and the operators themselves. This book gives the reader a solid perspective on the different approaches that have been proposed to automate control of these parameters as well as understanding their interactions. The book covers a broad area of evolutionary computation, including genetic algorithms, evolution strategies, genetic programming, estimation of distribution algorithms, and also discusses the issues of specific parameters used in parallel implementations, multi-objective evolutionary algorithms, and practical consideration for real-world applications. It is a recommended read for researchers and practitioners of evolutionary computation and heuristic methods.

13

  • Using Gaussian distribution to construct fitness functions in genetic programming for multiclass object classification
  • Mengjie Zhang, Will Smart; 2006, Victoria University of Wellington, Technical Report CS-TR-05/5
  • DOI: https://doi.org/10.1016/j.patrec.2005.07.024

This paper describes a new approach to the use of Gaussian distribution in genetic programming (GP) for multiclass object classification problems. Instead of using predefined multiple thresholds to form different regions in the program output space for different classes, this approach uses probabilities of different classes, derived from Gaussian distributions, to construct the fitness function for classification. Two fitness measures, overlap area and weighted distribution distance, have been developed. Rather than using the best evolved program in a population, this approach uses multiple programs and a voting strategy to perform classification. The approach is examined on three multiclass object classification problems of increasing difficulty and compared with a basic GP approach. The results suggest that the new approach is more effective and more efficient than the basic GP approach. Although developed for object classification, this approach is expected to be able to be applied to other classification problems.
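The classification side of the idea can be sketched as follows: for each class, a normal distribution is fitted to the program outputs observed on that class's training examples, and a new output is assigned to the class whose Gaussian gives it the highest density. The numbers are made up, and the paper's two fitness measures (overlap area and weighted distribution distance) are not reproduced here.

```cpp
// Sketch of classifying by per-class Gaussians fitted on program outputs.
// The fitness measures of the paper are not reproduced.
#include <cmath>
#include <iostream>
#include <vector>

struct Gaussian
{
  double mean, stddev;

  double density(double v) const
  {
    const double pi = 3.14159265358979323846;
    const double z = (v - mean) / stddev;
    return std::exp(-0.5 * z * z) / (stddev * std::sqrt(2.0 * pi));
  }
};

// Fits mean / standard deviation of the outputs observed for one class.
Gaussian fit(const std::vector<double> &outputs)
{
  double mean = 0.0;
  for (double v : outputs)  mean += v;
  mean /= outputs.size();

  double var = 0.0;
  for (double v : outputs)  var += (v - mean) * (v - mean);
  var /= outputs.size();

  return {mean, std::sqrt(var)};
}

// Class whose Gaussian gives the program output the highest density.
std::size_t classify(double output, const std::vector<Gaussian> &classes)
{
  std::size_t best = 0;
  for (std::size_t c = 1; c < classes.size(); ++c)
    if (classes[c].density(output) > classes[best].density(output))
      best = c;
  return best;
}

int main()
{
  // Program outputs recorded on training examples of three classes.
  const std::vector<Gaussian> classes =
  {
    fit({-2.1, -1.8, -2.4}),   // class 0
    fit({0.1, -0.2, 0.3}),     // class 1
    fit({1.9, 2.2, 2.0})       // class 2
  };

  std::cout << "predicted class: " << classify(0.4, classes) << '\n';  // 1
}
```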

14

One of the issues in evolutionary algorithms (EAs) is the relative importance of two search operators: mutation and crossover. Genetic algorithms (GAs) and genetic programming (GP) stress the role of crossover, while evolutionary programming (EP) and evolution strategies (ESs) stress the role of mutation. The existence of many different forms of crossover further complicates the issue. Despite theoretical analysis, it appears difficult to decide a priori which form of crossover to use, or even if crossover should be used at all. One possible solution to this difficulty is to have the EA be self-adaptive, i.e., to have the EA dynamically modify which forms of crossover to use and how often to use them, as it solves a problem. This paper describes an adaptive mechanism for controlling the use of crossover in an EA and explores the behavior of this mechanism in a number of different situations. An improvement to the adaptive mechanism is then presented. Surprisingly this improvement can also be used to enhance performance in a non-adaptive EA.
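A generic sketch of the self-adaptive idea (not the specific mechanism of this paper): each crossover form keeps a success counter, and its selection probability is periodically recomputed from the observed success rate, with a floor so no operator vanishes.

```cpp
// Generic sketch of adapting operator probabilities from observed
// success (offspring better than parents). The update rule is an
// illustrative assumption, not the paper's mechanism.
#include <iostream>
#include <numeric>
#include <vector>

struct OperatorStats
{
  unsigned uses = 0, successes = 0;
};

// Recomputes selection probabilities from per-operator success rates,
// keeping a floor so no operator disappears completely.
std::vector<double> adapt(const std::vector<OperatorStats> &stats,
                          double floor = 0.05)
{
  std::vector<double> rate;
  for (const auto &s : stats)
    rate.push_back(s.uses ? static_cast<double>(s.successes) / s.uses : floor);

  const double total = std::accumulate(rate.begin(), rate.end(), 0.0);

  std::vector<double> prob;
  for (double r : rate)
  {
    const double share = total > 0.0 ? r / total : 1.0 / rate.size();
    prob.push_back(floor + (1.0 - floor * rate.size()) * share);
  }
  return prob;
}

int main()
{
  // Two crossover forms: 40 successes / 100 uses and 10 successes / 100 uses.
  const std::vector<OperatorStats> stats = {{100, 40}, {100, 10}};

  for (double p : adapt(stats))
    std::cout << p << ' ';           // roughly 0.77 and 0.23
  std::cout << '\n';
}
```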

16

  • Evolving Teams of Predictors with Linear Genetic Programming
  • Markus Brameier, Wolfgang Banzhaf; 2001, Genetic Programming and Evolvable Machines, volume 2, pages 381–407
  • DOI: https://doi.org/10.1023/A:1012978805372

This paper applies the evolution of GP teams to different classification and regression problems and compares different methods for combining the outputs of the team programs. These include hybrid approaches where (1) a neural network is used to optimize the weights of programs in a team for a common decision and (2) a real-valued vector (the representation of evolution strategies) of weights is evolved with each team in parallel. The cooperative team approach results in an improved training and generalization performance compared to the standard GP method. The higher computational overhead of team evolution is counteracted by using a fast variant of linear GP.
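The weighted-combination step can be illustrated in a few lines: each team member produces an output and the team decision is a weighted sum, with the weights left to be optimized elsewhere (by a neural network or an evolved real-valued vector, as the paper describes). The members below are stand-in lambdas, not evolved programs.

```cpp
// Sketch of the weighted-combination idea for a team of predictors.
// Members are placeholder lambdas; weights would be optimised elsewhere.
#include <functional>
#include <iostream>
#include <vector>

using Predictor = std::function<double(double)>;

double team_output(const std::vector<Predictor> &members,
                   const std::vector<double> &weights, double x)
{
  double out = 0.0;
  for (std::size_t i = 0; i < members.size(); ++i)
    out += weights[i] * members[i](x);
  return out;
}

int main()
{
  const std::vector<Predictor> team =
  {
    [](double x) { return 2.0 * x; },
    [](double x) { return x * x; },
    [](double)   { return 1.0; }
  };

  // Weights that would be tuned (e.g. evolved) in a real system.
  const std::vector<double> weights = {0.5, 0.3, 0.2};

  // 0.5*6 + 0.3*9 + 0.2*1 = 5.9
  std::cout << team_output(team, weights, 3.0) << '\n';
}
```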

17

  • A Course in Machine Learning
  • Hal Daumé III (http://ciml.info)

Machine learning is the study of algorithms that learn from data and experience. It is applied in a vast variety of application areas, from medicine to advertising, from military to pedestrian. Any area in which you need to make sense of data is a potential consumer of machine learning.

CIML is a set of introductory materials that covers most major aspects of modern machine learning (supervised learning, unsupervised learning, large margin methods, probabilistic modeling, learning theory, etc.). Its focus is on broad applications with a rigorous backbone. A subset can be used for an undergraduate course; a graduate course could probably cover the entire material and then some.

18

  • Adaptation in Natural and Artificial Systems - An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence
  • John H. Holland; 1992, MIT Press
  • ISBN: 978-0-262-08213-6

Genetic algorithms are playing an increasingly important role in studies of complex adaptive systems, ranging from adaptive agents in economic theory to the use of machine learning techniques in the design of complex devices such as aircraft turbines and integrated circuits. Adaptation in Natural and Artificial Systems is the book that initiated this field of study, presenting the theoretical foundations and exploring applications.

In its most familiar form, adaptation is a biological process, whereby organisms evolve by rearranging genetic material to survive in environments confronting them. In this now classic work, Holland presents a mathematical model that allows for the nonlinearity of such complex interactions. He demonstrates the model's universality by applying it to economics, physiological psychology, game theory, and artificial intelligence and then outlines the way in which this approach modifies the traditional views of mathematical genetics.

Initially applying his concepts to simply defined artificial systems with limited numbers of parameters, Holland goes on to explore their use in the study of a wide range of complex, naturally occurring processes, concentrating on systems having multiple factors that interact in nonlinear ways. Along the way he accounts for major effects of coadaptation and coevolution: the emergence of building blocks, or schemata, that are recombined and passed on to succeeding generations to provide innovations and improvements.

John H. Holland is Professor of Psychology and Professor of Electrical Engineering and Computer Science at the University of Michigan. He is also Maxwell Professor at the Santa Fe Institute and is Director of the University of Michigan/Santa Fe Institute Advanced Research Program.

19

  • Essentials of Metaheuristics
  • Sean Luke; 2013

This is an open set of lecture notes on metaheuristics algorithms, intended for undergraduate students, practitioners, programmers, and other non-experts. It was developed as a series of lecture notes for an undergraduate course I taught at GMU. The chapters are designed to be printable separately if necessary. As these are lecture notes, the topics are short and light on examples and theory. It's best when complementing other texts. With time, I might remedy this.

What is a Metaheuristic? A common but unfortunate name for any stochastic optimization algorithm intended to be the last resort before giving up and using random or brute-force search. Such algorithms are used for problems where you don't know how to find a good solution, but if shown a candidate solution, you can give it a grade. The algorithmic family includes genetic algorithms, hill-climbing, simulated annealing, ant colony optimization, particle swarm optimization, and so on.
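The "grade a candidate" loop shared by this whole family can be illustrated by its simplest member, a hill climber on a toy one-dimensional problem; everything below is purely illustrative.

```cpp
// Minimal example of the "grade a candidate" loop common to the whole
// metaheuristic family: a plain hill climber on a toy 1-D problem.
#include <iostream>
#include <random>

// The "grade": higher is better. We only need to evaluate candidates,
// not to know how to construct a good one.
double grade(double x)
{
  return -(x - 3.0) * (x - 3.0);
}

int main()
{
  std::mt19937 rng(1);
  std::normal_distribution<double> tweak(0.0, 0.1);

  double current = 0.0;
  for (unsigned i = 0; i < 10000; ++i)
  {
    const double candidate = current + tweak(rng);
    if (grade(candidate) > grade(current))   // keep the better of the two
      current = candidate;
  }

  std::cout << "best found: " << current << '\n';   // close to 3
}
```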
