Evolving the Topology of Large Scale Deep Neural Networks

Created by W.Langdon from gp-bibliography.bib Revision:1.4549

@InProceedings{Assuncao:2018:EuroGP,
  author =       "Filipe Assuncao and Nuno Lourenco and 
                 Penousal Machado and Bernardete Ribeiro",
  title =        "Evolving the Topology of Large Scale Deep Neural
                  Networks",
  booktitle =    "EuroGP 2018: Proceedings of the 21st European
                 Conference on Genetic Programming",
  year =         "2018",
  month =        "4-6 " # apr,
  editor =       "Mauro Castelli and Lukas Sekanina and 
                 Mengjie Zhang and Stefano Cagnoni and Pablo Garcia-Sanchez",
  series =       "LNCS",
  volume =       "10781",
  publisher =    "Springer Verlag",
  address =      "Parma, Italy",
  pages =        "19--34",
  organisation = "EvoStar, Species",
  keywords =     "genetic algorithms, genetic programming, Grammatical
                 Evolution, Convolutional Neural Networks, Deep Neural
                 Networks, Genetic Algorithm, Dynamic Structured
                 Grammatical Evolution",
  isbn13 =       "978-3-319-77552-4",
  URL =          "http://www.human-competitive.org/sites/default/files/assuncao-paper-a.pdf",
  DOI =          "doi:10.1007/978-3-319-77553-1_2",
  size =         "16 pages",
  abstract =     "In recent years Deep Learning has attracted a lot
                  of attention due to its success in difficult tasks
                  such as image recognition and computer vision. Much
                  of the success in these tasks is due to Convolutional
                  Neural Networks (CNNs), which allow the automatic
                  construction of features. However, designing such
                  networks is not an easy task: it requires expertise
                  and insight. In this paper we introduce DENSER, a
                  novel representation for the evolution of deep neural
                  networks. Concretely, we adapt ideas from Genetic
                  Algorithms (GAs) and Grammatical Evolution (GE) to
                  enable the evolution of sequences of layers and their
                  parameters. We test our approach on the well-known
                  CIFAR-10 image classification dataset. The results
                  show that our method: (i) outperforms previous
                  evolutionary approaches to the generation of CNNs;
                  (ii) is able to create CNNs that have
                  state-of-the-art performance while using less prior
                  knowledge; and (iii) evolves CNNs with novel
                  topologies, unlikely to be designed by hand. For
                  instance, the best-performing CNN obtained during
                  evolution has an unexpected structure using six
                  consecutive dense layers. On CIFAR-10 the best model
                  reports an average error of 5.87\% on test.",
  notes =        "2018 HUMIES finalist

                 Part of \cite{Castelli:2018:GP} EuroGP'2018 held in
                  conjunction with EvoCOP2018, EvoMusArt2018 and
                  EvoApplications2018",
}