Evolving Optimal Neural Networks Using Genetic Algorithms with Occam's Razor

Created by W.Langdon from gp-bibliography.bib Revision:1.3973

@Article{Zhang-Muehlenbein-94-JCS,
  author =       "Byoung-Tak Zhang and Heinz M{\"u}hlenbein",
  title =        "Evolving Optimal Neural Networks Using Genetic
                 Algorithms with {O}ccam's Razor",
  journal =      "Complex Systems",
  volume =       "7",
  keywords =     "genetic algorithms, genetic programming",
  number =       "3",
  pages =        "199--220",
  year =         "1993",
  URL =          "http://www.complex-systems.com/pdf/07-3-2.pdf
                 http://www.complex-systems.com/abstracts/v07_i03_a02.html
                 http://citeseer.ist.psu.edu/zhang93evolving.html
                 http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.309.234
                 http://www.ais.fraunhofer.de/~muehlen/publications/gmd_as_ga-93_05.ps",
  abstract =     "Genetic algorithms have had two primary applications
                 for neural networks: optimization of network
                 architecture, and training weights of a fixed
                 architecture. While most previous work focuses on one
                 or the other of these options, this paper investigates
                 an alternative evolutionary approach --- breeder
                 genetic programming (BGP) --- in which the architecture
                 and the weights are optimized simultaneously. In this
                 method, the genotype of each network is represented as
                 a tree whose depth and width are dynamically adapted to
                 the particular application by specifically defined
                 genetic operators. The weights are trained by a
                 next-ascent hillclimbing search. A new fitness function
                 is proposed that quantifies the principle of Occam's
                 razor; it makes an optimal trade-off between the
                 error-fitting ability and the parsimony of the network.
                 Simulation results on two benchmark problems of
                 differing complexity suggest that the method finds
                 minimal networks on clean data. The experiments on
                 noisy data show that using Occam's razor not only
                 improves the generalization performance but also
                 accelerates convergence.",
}
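The abstract describes two ingredients that can be sketched concretely: a fitness function that trades data misfit against network size (Occam's razor), and next-ascent hillclimbing for weight training. The sketch below is illustrative only and assumes simple forms; the paper's exact fitness function, complexity measure, and `alpha` weighting are not reproduced here.

```python
import random

def occam_fitness(errors, n_params, alpha=0.01):
    """Lower is better: squared error plus a parsimony penalty.

    alpha is a hypothetical complexity weight, not the paper's
    adaptive scheme."""
    return sum(e * e for e in errors) + alpha * n_params

def next_ascent_hillclimb(weights, loss, steps=200, sigma=0.1, seed=0):
    """Perturb one weight at a time with Gaussian noise and keep the
    first move that improves the loss (next-ascent), reverting any
    move that does not help."""
    rng = random.Random(seed)
    w = list(weights)
    best = loss(w)
    for _ in range(steps):
        i = rng.randrange(len(w))       # pick one weight to perturb
        old = w[i]
        w[i] = old + rng.gauss(0.0, sigma)
        new = loss(w)
        if new < best:
            best = new                  # accept the improving move immediately
        else:
            w[i] = old                  # revert the unhelpful move
    return w, best
```

For example, minimizing `(w0 - 2)^2 + (w1 + 1)^2` from `[0, 0]` drives the loss well below its starting value of 5 within a few hundred steps.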

Genetic Programming entries for Byoung-Tak Zhang, Heinz Mühlenbein
