Scalable Genetic Programming by Gene-pool Optimal Mixing and Input-space Entropy-based Building-block Learning

@InProceedings{Virgolin:2017:GECCO,
  author =       "Marco Virgolin and Tanja Alderliesten and 
                 Cees Witteveen and Peter A. N. Bosman",
  title =        "Scalable Genetic Programming by Gene-pool Optimal
                 Mixing and Input-space Entropy-based Building-block
                 Learning",
  booktitle =    "Proceedings of the Genetic and Evolutionary
                 Computation Conference",
  series =       "GECCO '17",
  year =         "2017",
  isbn13 =       "978-1-4503-4920-8",
  address =      "Berlin, Germany",
  pages =        "1041--1048",
  size =         "8 pages",
  URL =          "http://doi.acm.org/10.1145/3071178.3071287",
  DOI =          "10.1145/3071178.3071287",
  acmid =        "3071287",
  publisher =    "ACM",
  publisher_address = "New York, NY, USA",
  keywords =     "genetic algorithms, genetic programming, building
                 blocks, linkage learning, optimal mixing, program
                 synthesis",
  month =        "15-19 " # jul,
  abstract =     "The Gene-pool Optimal Mixing Evolutionary Algorithm
                 (GOMEA) is a recently introduced model-based EA that
                 has been shown to be capable of outperforming
                 state-of-the-art alternative EAs in terms of
                 scalability when solving discrete optimization
                 problems. One of the key aspects of GOMEA's success is
                 a variation operator that is designed to extensively
                 exploit linkage models by effectively combining partial
                 solutions. Here, we bring the strengths of GOMEA to
                 Genetic Programming (GP), introducing GP-GOMEA. Under
                 the hypothesis of having little problem-specific
                 knowledge, and in an effort to design easy-to-use EAs,
                 GP-GOMEA requires no parameter specification. On a set
                 of well-known benchmark problems we find that GP-GOMEA
                 outperforms standard GP while being on par with more
                 recently introduced, state-of-the-art EAs. We
                 furthermore introduce Input-space Entropy-based
                 Building-block Learning (IEBL), a novel approach to
                 identifying and encapsulating relevant building blocks
                 (subroutines) into new terminals and functions. On
                 problems with an inherent degree of modularity, IEBL
                 can contribute to compact solution representations,
                 providing a large potential for knock-on effects in
                 performance. On the difficult, but highly modular Even
                 Parity problem, GP-GOMEA+IEBL obtains excellent
                 scalability, solving the 14-bit instance in less than 1
                 hour.",
  notes =        "Also known as \cite{Virgolin:2017:SGP:3071178.3071287}.
                  GECCO-2017: A Recombination of the 26th International
                  Conference on Genetic Algorithms (ICGA-2017) and the
                  22nd Annual Genetic Programming Conference (GP-2017)",
}
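
The abstract's central mechanism, gene-pool optimal mixing over a linkage model, can be illustrated with a minimal Python sketch. This is not the paper's implementation: it assumes fixed-length genotypes (GP-GOMEA encodes programs as nodes of a fixed-height tree template), a minimisation objective, and invented names (gene_pool_optimal_mixing, fos, evaluate); in the actual algorithm the linkage model (the "family of subsets", FOS) is also re-learned from the population each generation.

    import random

    def gene_pool_optimal_mixing(solution, fitness, population, fos, evaluate):
        """One GOM pass over a single solution (illustrative sketch only).

        solution   -- list of genes (e.g. node symbols of a fixed tree template)
        fitness    -- current fitness of `solution` (lower is better here)
        population -- list of donor genotypes to mix partial solutions from
        fos        -- linkage model: a list of index subsets (hypothetical FOS)
        evaluate   -- fitness function mapping a genotype to a number
        """
        for subset in fos:
            donor = random.choice(population)
            backup = [solution[i] for i in subset]
            # Copy the donor's genes for this linkage subset (a partial solution).
            for i in subset:
                solution[i] = donor[i]
            new_fitness = evaluate(solution)
            if new_fitness <= fitness:
                # Accept: the mixed-in partial solution does not worsen fitness.
                fitness = new_fitness
            else:
                # Reject: restore the previous genes for this subset.
                for i, gene in zip(subset, backup):
                    solution[i] = gene
        return solution, fitness

Accepting changes of equal fitness (<=) permits neutral moves, one common design choice in optimal-mixing EAs; whether the paper uses strict or non-strict acceptance is not stated in the abstract.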
