Evolutionary Induction of Sparse Neural Trees

Created by W.Langdon from gp-bibliography.bib Revision:1.4549

@Article{zhang:1997:EC,
  author =       "Byoung-Tak Zhang and Peter Ohm and 
                 Heinz M{\"u}hlenbein",
  title =        "Evolutionary Induction of Sparse Neural Trees",
  journal =      "Evolutionary Computation",
  volume =       "5",
  number =       "2",
  pages =        "213--236",
  year =         "1997",
  keywords =     "genetic algorithms, genetic programming, program
                 induction, higher-order neural networks, neural tree
                 representation, minimum description length principle,
                 time series prediction, breeder genetic algorithm",
  URL =          "http://bi.snu.ac.kr/Publications/Journals/International/EC5-2.ps",
  URL =          "http://www.mitpressjournals.org/doi/pdfplus/10.1162/evco.1997.5.2.213",
  DOI =          "doi:10.1162/evco.1997.5.2.213",
  abstract =     "This paper is concerned with the automatic induction
                 of parsimonious neural networks. In contrast to other
                 program induction situations, network induction entails
                 parametric learning as well as structural adaptation.
                 We present a novel representation scheme called neural
                 trees that allows efficient learning of both network
                 architectures and parameters by genetic search. A
                 hybrid evolutionary method is developed for neural tree
                 induction that combines genetic programming and the
                 breeder genetic algorithm under the unified framework
                 of the minimum description length principle. The method
                 is successfully applied to the induction of higher
                 order neural trees while still keeping the resulting
                 structures sparse to ensure good generalization
                 performance. Empirical results are provided on two
                 chaotic time series prediction problems of practical
                 importance.",
  notes =        "Evolutionary Computation (Journal)

                 Special Issue: Trends in Evolutionary Methods for
                 Program Induction

                 Referenced in \cite{zhang:1997:WSC2}

                 Demonstrated on Mackey-Glass and chaotic fluctuations
                 in a far-infrared ammonia (NH3) laser.

                 Libraries of building blocks (selected by their local
                 fitness), local fitness-based crossover, injection and
                 pruning of submodules (a subtree is replaced by one of
                 its descendant subtrees if the descendant is fitter
                 than the subtree itself), scheduling of genetic
                 operators. Parsimony fitness bias. Local search to
                 optimise sigma-pi neural net weights using exponential
                 noise.",

  size =         "31 pages",
}
