Distance measures for HyperGP with fitness sharing

Created by W. Langdon from gp-bibliography.bib Revision: 1.3973

@InProceedings{Drchal:2012:GECCO,
  author =       "Jan Drchal and Miroslav Snorek",
  title =        "Distance measures for HyperGP with fitness sharing",
  booktitle =    "GECCO '12: Proceedings of the fourteenth international
                 conference on Genetic and evolutionary computation
                 conference",
  year =         "2012",
  editor =       "Terry Soule and Anne Auger and Jason Moore and 
                 David Pelta and Christine Solnon and Mike Preuss and 
                 Alan Dorin and Yew-Soon Ong and Christian Blum and 
                 Dario Landa Silva and Frank Neumann and Tina Yu and 
                 Aniko Ekart and Will Browne and Tim Kovacs and 
                 Man-Leung Wong and Clara Pizzuti and Jon Rowe and Tobias Friedrich and 
                 Giovanni Squillero and Nicolas Bredeche and 
                 Stephen L. Smith and Alison Motsinger-Reif and Jose Lozano and 
                 Martin Pelikan and Silja Meyer-Nieberg and 
                 Christian Igel and Greg Hornby and Rene Doursat and 
                 Steve Gustafson and Gustavo Olague and Shin Yoo and 
                 John Clark and Gabriela Ochoa and Gisele Pappa and 
                 Fernando Lobo and Daniel Tauritz and Jurgen Branke and 
                 Kalyanmoy Deb",
  isbn13 =       "978-1-4503-1177-9",
  pages =        "545--552",
  keywords =     "genetic algorithms, genetic programming, generative
                 and developmental systems",
  month =        "7-11 " # jul,
  organisation = "SIGEVO",
  address =      "Philadelphia, Pennsylvania, USA",
  DOI =          "10.1145/2330163.2330241",
  publisher =    "ACM",
  publisher_address = "New York, NY, USA",
  abstract =     "In this paper we propose a new algorithm called
                 HyperGPEFS (HyperGP with Explicit Fitness Sharing). It
                 is based on a HyperNEAT, which is a well-established
                 evolutionary method employing indirect encoding of
                 artificial neural networks. Indirect encoding in
                 HyperNEAT is realized via special function called
                 Compositional and Pattern Producing Network (CPPN),
                 able to describe a neural network of arbitrary size.
                 CPPNs are represented by network structures, which are
                 evolved by means of a slightly modified version of
                 another, well-known algorithm NEAT (NeuroEvolution of
                 Augmenting Topologies). HyperGP is a variant of
                 HyperNEAT, where the CPPNs are optimized by Genetic
                 Programming (GP). Published results reported promising
                 improvement in the speed of convergence.

                 Our approach further extends HyperGP by using fitness
                 sharing to promote a diversity of a population. Here,
                 we thoroughly compare all three algorithms on six
                 different tasks. Fitness sharing demands a definition
                 of a tree distance measure. Among other five, we
                 propose a generalized distance measure which, in
                 conjunction with HyperGPEFS, significantly outperforms
                 HyperNEAT and HyperGP on all, but one testing problems.
                 Although this paper focuses on indirect encoding, the
                 proposed distance measures are generally applicable.",
  notes =        "Also known as \cite{2330241} GECCO-2012 A joint
                 meeting of the twenty first international conference on
                 genetic algorithms (ICGA-2012) and the seventeenth
                 annual genetic programming conference (GP-2012)",
}
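
The abstract outlines explicit fitness sharing over GP-evolved CPPN trees, which requires a tree distance measure. The Python sketch below illustrates only the general mechanism; the Node class, the naive tree_distance, and the sigma/alpha sharing parameters are illustrative assumptions, not the distance measures proposed in the paper.

# Minimal sketch of explicit fitness sharing over tree genomes (e.g. GP-evolved
# CPPNs). The tree distance here is a generic structural comparison used only
# for illustration, not one of the measures compared in the paper.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Node:
    """A GP tree node: a primitive symbol plus child subtrees."""
    symbol: str
    children: List["Node"] = field(default_factory=list)


def size(t: Node) -> int:
    """Number of nodes in the subtree rooted at t."""
    return 1 + sum(size(c) for c in t.children)


def tree_distance(a: Node, b: Node) -> float:
    """Naive recursive distance: 1 per mismatched symbol, plus the distance of
    position-aligned children; unmatched children cost their full size."""
    d = 0.0 if a.symbol == b.symbol else 1.0
    for ca, cb in zip(a.children, b.children):
        d += tree_distance(ca, cb)
    for extra in a.children[len(b.children):] + b.children[len(a.children):]:
        d += size(extra)
    return d


def shared_fitness(raw_fitness, population, sigma=3.0, alpha=1.0):
    """Classic explicit fitness sharing: divide each raw fitness by the niche
    count accumulated from all individuals within the sharing radius sigma."""
    shared = []
    for i, ind_i in enumerate(population):
        niche = 0.0
        for ind_j in population:
            d = tree_distance(ind_i, ind_j)
            if d < sigma:                      # inside the sharing radius
                niche += 1.0 - (d / sigma) ** alpha
        shared.append(raw_fitness[i] / niche)  # niche >= 1, since d(i, i) = 0
    return shared


if __name__ == "__main__":
    # Three toy CPPN-like trees; the two sin-based trees share a niche,
    # so their shared fitness drops relative to the lone gauss tree.
    t1 = Node("sin", [Node("+", [Node("x"), Node("y")])])
    t2 = Node("sin", [Node("*", [Node("x"), Node("y")])])
    t3 = Node("gauss", [Node("x")])
    print(shared_fitness([1.0, 1.0, 1.0], [t1, t2, t3]))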

Genetic Programming entries for Jan Drchal, Miroslav Snorek
