Using Imaginary Ensembles to Select GP Classifiers

@InProceedings{Johansson:2010:EuroGP,
  author =       "Ulf Johansson and Rikard K{\"o}nig and 
                  Tuve L{\"o}fstr{\"o}m and Lars Niklasson",
  title =        "Using Imaginary Ensembles to Select GP Classifiers",
  booktitle =    "Proceedings of the 13th European Conference on Genetic
                 Programming, EuroGP 2010",
  year =         "2010",
  editor =       "Anna Isabel Esparcia-Alc{\'a}zar and Anik{\'o} Ek{\'a}rt and 
                  Sara Silva and Stephen Dignum and A. {\c{S}}ima Uyar",
  volume =       "6021",
  series =       "LNCS",
  pages =        "278--288",
  address =      "Istanbul",
  month =        "7-9 " # apr,
  organisation = "EvoStar",
  publisher =    "Springer",
  keywords =     "genetic algorithms, genetic programming",
  isbn13 =       "978-3-642-12147-0",
  DOI =          "10.1007/978-3-642-12148-7_24",
  abstract =     "When predictive modeling requires comprehensible
                 models, most data miners will use specialized
                 techniques producing rule sets or decision trees. This
                 study, however, shows that genetically evolved decision
                 trees may very well outperform the more specialized
                 techniques. The proposed approach evolves a number of
                 decision trees and then uses one of several suggested
                 selection strategies to pick one specific tree from
                 that pool. The inherent inconsistency of evolution
                 makes it possible to evolve each tree using all data,
                 and still obtain somewhat different models. The main
                 idea is to use these quite accurate and slightly
                 diverse trees to form an imaginary ensemble, which is
                 then used as a guide when selecting one specific tree.
                 Simply put, the tree classifying the largest number of
                 instances identically to the ensemble is chosen. In the
                 experimentation, using 25 UCI data sets, two selection
                 strategies obtained significantly higher accuracy than
                 the standard rule inducer J48.",
  notes =        "BNF grammar, parsimony pressure to lessen bloat,
                  persistence, roulette wheel selection. p287 suggests
                  opaque techniques (ANN, SVM, ensembles) will 'almost
                  always' do better than rule sets or decision trees.
                  Part of \cite{Esparcia-Alcazar:2010:GP}. EuroGP'2010
                  held in conjunction with EvoCOP2010, EvoBIO2010 and
                  EvoApplications2010",
}
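
The selection strategy summarised in the abstract is simple enough to sketch:
evolve a pool of trees, take the majority vote of the whole pool as an
"imaginary ensemble", and keep the single tree that agrees with that vote on
the most instances. The following is a minimal Python/NumPy illustration of
that idea only, not the authors' implementation; the function name and the
labels-per-tree data layout are assumptions.

import numpy as np

def select_by_ensemble_agreement(tree_predictions):
    """Pick the tree that classifies the largest number of instances
    identically to the majority vote of the whole pool (the paper's
    'imaginary ensemble').

    tree_predictions: (n_trees, n_instances) array of class labels,
    one row per evolved tree. Returns the index of the chosen tree.
    """
    preds = np.asarray(tree_predictions)
    n_trees, n_instances = preds.shape
    # Majority vote of the imaginary ensemble for each instance.
    ensemble_vote = np.empty(n_instances, dtype=preds.dtype)
    for j in range(n_instances):
        labels, counts = np.unique(preds[:, j], return_counts=True)
        ensemble_vote[j] = labels[np.argmax(counts)]
    # Fraction of instances on which each tree matches the vote.
    agreement = (preds == ensemble_vote).mean(axis=1)
    return int(np.argmax(agreement))

# Hypothetical usage: three evolved trees, four instances.
pool = [[0, 1, 1, 0],   # tree 0
        [0, 1, 0, 0],   # tree 1
        [1, 1, 1, 0]]   # tree 2
best = select_by_ensemble_agreement(pool)
# Ensemble vote is [0, 1, 1, 0], so tree 0 (a perfect match) is chosen.

The paper evaluates several such selection strategies; the sketch above shows
only their common core of picking the tree closest to the ensemble vote.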
