Balancing Accuracy and Parsimony in Genetic Programming

@Article{Zhang-Muehlenbein-95-ECJ,
  author =       "Byoung-Tak Zhang and Heinz M{\"u}hlenbein",
  title =        "Balancing Accuracy and Parsimony in Genetic
                 Programming",
  journal =      "Evolutionary Computation",
  volume =       "3",
  number =       "1",
  pages =        "17--38",
  year =         "1995",
  keywords =     "genetic algorithms, genetic programming, machine
                  learning, tree induction, minimum description length
                  principle, Bayesian model comparison, evolving neural
                  networks",
  URL =          "http://www.ais.fraunhofer.de/~muehlen/publications/gmd_as_ga-94_09.ps",
  DOI =          "doi:10.1162/evco.1995.3.1.17",
  abstract =     "Genetic programming is distinguished from other
                 evolutionary algorithms in that it uses tree
                 representations of variable size instead of linear
                 strings of fixed length. The flexible representation
                 scheme is very important because it allows the
                 underlying structure of the data to be discovered
                 automatically. One primary difficulty, however, is that
                 the solutions may grow too big without any improvement
                 of their generalization ability. In this paper we
                 investigate the fundamental relationship between the
                 performance and complexity of the evolved structures.
                 The essence of the parsimony problem is demonstrated
                 empirically by analyzing error landscapes of programs
                 evolved for neural network synthesis. We consider
                 genetic programming as a statistical inference problem
                 and apply the Bayesian model-comparison framework to
                 introduce a class of fitness functions with error and
                 complexity terms. An adaptive learning method is then
                 presented that automatically balances the
                 model-complexity factor to evolve parsimonious programs
                 without losing the diversity of the population needed
                 for achieving the desired training accuracy. The
                 effectiveness of this approach is empirically shown on
                 the induction of sigma-pi neural networks for solving a
                 real-world medical diagnosis problem as well as
                 benchmark tasks.",
}
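
The abstract above describes fitness functions that combine a training-error term with a complexity term, where the weight on the complexity term is adapted during the run. A minimal sketch of that idea in Python follows; the node-count complexity measure, the threshold epsilon, the multiplicative update of alpha, and the helper names (training_error, breed, select_best) are illustrative assumptions, not the paper's exact Bayesian/MDL formulation.

# Illustrative sketch of a fitness function with error and complexity terms,
# and a per-generation rule that adapts the balance between them.
# The complexity measure (node count), the threshold epsilon, and the
# multiplicative update of alpha are assumptions for illustration only,
# not the exact formulation given in the paper.

def tree_size(program):
    """Complexity term: count the nodes of a program tree.
    Assumes each node exposes its subtrees via a 'children' attribute."""
    return 1 + sum(tree_size(child) for child in getattr(program, "children", ()))


def fitness(program, training_error, alpha):
    """Combined score: training error plus alpha times structural complexity."""
    return training_error(program) + alpha * tree_size(program)


def update_alpha(alpha, best_error, epsilon=0.01, factor=1.1):
    """Adapt the complexity weight once per generation (assumed rule):
    relax parsimony pressure while the best training error is still above
    epsilon, and tighten it once the data are fitted well enough."""
    if best_error > epsilon:
        return alpha / factor   # accuracy first: shrink the penalty
    return alpha * factor       # accuracy reached: grow the penalty


def select_best(population, training_error, alpha):
    """Return the individual with the lowest combined fitness."""
    return min(population, key=lambda p: fitness(p, training_error, alpha))


def evolve(population, training_error, breed, generations=50, alpha=1e-3):
    """Toy generational loop showing where the adaptive balancing sits.
    'breed' is a user-supplied selection/variation operator (assumed)."""
    for _ in range(generations):
        best = select_best(population, training_error, alpha)
        alpha = update_alpha(alpha, training_error(best))
        population = breed(population, lambda p: fitness(p, training_error, alpha))
    return select_best(population, training_error, alpha)

The point the abstract emphasises is that the balance coefficient is not fixed in advance but re-adjusted each generation against the population's current training accuracy, so the parsimony pressure does not destroy the diversity needed to reach the desired training accuracy.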
