A Simple Approach to Lifetime Learning in Genetic Programming based Symbolic Regression

@Article{Azad:2014:EC,
  author =       "Raja Muhammad Atif Azad and Conor Ryan",
  title =        "A Simple Approach to Lifetime Learning in Genetic
                 Programming based Symbolic Regression",
  journal =      "Evolutionary Computation",
  year =         "2014",
  volume =       "22",
  number =       "2",
  pages =        "287--317",
  month =        "Summer",
  keywords =     "genetic algorithms, genetic programming, hill
                 climbing, Lamarckian, genetic repair, Memetic
                 Algorithms, lifetime learning, local search, hybrid
                 genetic algorithms",
  ISSN =         "1063-6560",
  URL =          "http://www.mitpressjournals.org/doi/abs/10.1162/EVCO_a_00111",
  DOI =          "10.1162/EVCO_a_00111",
  size =         "31 pages",
  abstract =     "Genetic Programming (GP) coarsely models natural
                 evolution to evolve computer programs. Unlike in
                 nature, where individuals can often improve their
                 fitness through lifetime experience, the fitness of GP
                 individuals generally does not change during their
                 lifetime, and there is usually no opportunity to pass
                 on acquired knowledge.

                  This paper introduces the Chameleon system to address
                  this discrepancy and augment GP with lifetime learning
                  by adding a simple local search that operates by tuning
                  the internal nodes of individuals. Although Chameleon
                  is not the first attempt to combine local search with
                  GP, its simplicity makes it easy to understand and
                  cheap to implement.

                  A simple cache that leverages the local search is added
                  to reduce the tuning cost to a small fraction of the
                  expected cost. We provide a theoretical upper limit on
                  the tuning expense given the average tree size of the
                  population, and show that this limit grows very
                  conservatively as the average tree size increases.

                  We show that Chameleon uses available genetic material
                  more efficiently by exploring more actively than
                  standard GP does, and demonstrate that not only does
                  Chameleon outperform standard GP (on both training and
                  test data) on a number of symbolic regression problems,
                  it does so while producing smaller individuals, and
                  that it works harmoniously with two other well-known
                  extensions to GP, namely linear scaling and a
                  diversity-promoting tournament selection method.",
  notes =        "Chameleon, cache, UCI, depth first hill climbing,
                 internal nodes (functions) only.

                 Disappeared Sep 2014
                 http://casnew.iti.upv.es/index.php/evocompetitions/105-symregcompetition",
}
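
The notes above summarise the local search as depth-first hill climbing over internal
(function) nodes only, supported by a cache. The following Python sketch illustrates
that general idea on a toy symbolic regression task. It is a minimal sketch under
stated assumptions: the Node class, the four-function set, the whole-expression
fitness cache and the tune_internal_nodes routine are illustrative choices, not the
authors' Chameleon implementation, whose cache design the abstract does not detail.

# Illustrative sketch only: names, function set, and cache keying are assumptions,
# not the Chameleon implementation described in the paper.

FUNCTIONS = {
    "add": lambda a, b: a + b,
    "sub": lambda a, b: a - b,
    "mul": lambda a, b: a * b,
    "div": lambda a, b: a / b if abs(b) > 1e-9 else 1.0,  # protected division
}

class Node:
    """Binary expression-tree node: internal nodes hold a function name,
    leaves hold a variable index (int) or a constant (float)."""
    def __init__(self, op=None, children=None, value=None):
        self.op, self.children, self.value = op, children or [], value

    def eval(self, x):
        if self.op is None:
            return x[self.value] if isinstance(self.value, int) else self.value
        a, b = (child.eval(x) for child in self.children)
        return FUNCTIONS[self.op](a, b)

    def key(self):
        if self.op is None:
            return str(self.value)
        return "(%s %s)" % (self.op, " ".join(child.key() for child in self.children))

def mse(tree, cases):
    return sum((tree.eval(x) - y) ** 2 for x, y in cases) / len(cases)

def tune_internal_nodes(tree, cases, cache):
    """Depth-first hill climbing over internal (function) nodes only:
    at each internal node, try the alternative functions and keep the
    best improving one. The cache maps expression keys to fitness so
    previously scored expressions are not re-evaluated."""
    def score(t):
        k = t.key()
        if k not in cache:
            cache[k] = mse(t, cases)
        return cache[k]

    best = score(tree)
    stack = [tree]
    while stack:
        node = stack.pop()
        if node.op is None:          # leaves (terminals) are left untouched
            continue
        stack.extend(node.children)  # depth-first descent
        kept = node.op
        for alt in FUNCTIONS:
            if alt == kept:
                continue
            node.op = alt            # trial change at this internal node
            trial = score(tree)
            if trial < best:         # keep only improvements
                best, kept = trial, alt
        node.op = kept               # restore the best function found here
    return best

# Toy usage (assumed data): tune (x0 - x0) + 1 toward the target y = x^2 + 1.
cases = [([x], x * x + 1.0) for x in (-2.0, -1.0, 0.0, 1.0, 2.0)]
tree = Node("add", [Node("sub", [Node(value=0), Node(value=0)]), Node(value=1.0)])
cache = {}
print(tune_internal_nodes(tree, cases, cache))  # the climber flips "sub" to "mul" -> 0.0

In this toy run the climber flips the "sub" node to "mul", recovering x^2 + 1, and the
cache simply skips re-scoring any expression whose string form has already been evaluated.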
