ABSTRACT
This paper proposes a NeuroEvolution algorithm, Modular Grammatical Evolution (MGE), that enables the evolution of both the topology and the weights of neural networks for more challenging classification benchmarks such as MNIST and Letter, with 10 and 26 classes, respectively. The success of MGE is mainly due to (1) restricting the solution space to regular network topologies with a special form of modularity, and (2) improving the search properties of state-of-the-art GE methods by enhancing the mapping locality and the representation scalability. We have defined and evaluated five forms of structural constraints and observe that restricting the solution space to single-layer modular topologies helps find smaller and more efficient neural networks faster. Our experimental evaluations on ten well-known classification benchmarks demonstrate that MGE-generated neural networks provide better classification accuracy than other NeuroEvolution methods. Finally, our experimental results indicate that MGE outperforms other GE methods in terms of locality and scalability.
This Hot-off-the-Press paper summarizes "Modular Grammatical Evolution for the Generation of Artificial Neural Networks" by K. Soltanian, A. Ebnenasir, and M. Afsharchi [9], accepted for publication in the Evolutionary Computation journal (MIT Press).
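To give a rough feel for the grammar-based genotype-to-phenotype mapping that GE-style NeuroEvolution relies on, the sketch below decodes an integer genome into a single-hidden-layer network: the first codon fixes the hidden-layer size (the repeated "module"), and the remaining codons, reused with a wrap-around rule, fill in the connection weights. This is only an illustrative sketch; the function names (decode, forward), the codon range, the sigmoid activation, and parameters such as max_hidden and weight_range are assumptions for exposition, not the actual MGE grammar or encoding described in the paper.

```python
import numpy as np

# Illustrative GE-style genotype-to-phenotype mapping (not the MGE encoding).
def decode(genome, n_inputs, n_outputs, max_hidden=16, weight_range=2.0):
    """Map a list of integer codons to (W1, W2) weight matrices.

    The first codon picks the hidden-layer size (the repeated "module");
    the remaining codons are reused modulo the genome length to fill in
    connection weights, mirroring GE's wrap-around mapping.
    """
    n_hidden = 1 + genome[0] % max_hidden
    n_weights = n_inputs * n_hidden + n_hidden * n_outputs
    codons = [genome[1 + i % (len(genome) - 1)] for i in range(n_weights)]
    # Scale codons from [0, 255] into [-weight_range, weight_range].
    w = (np.array(codons, dtype=float) % 256) / 255.0 * 2 * weight_range - weight_range
    W1 = w[: n_inputs * n_hidden].reshape(n_inputs, n_hidden)
    W2 = w[n_inputs * n_hidden:].reshape(n_hidden, n_outputs)
    return W1, W2


def forward(x, W1, W2):
    """Evaluate the decoded network on input x (sigmoid hidden layer)."""
    h = 1.0 / (1.0 + np.exp(-x @ W1))
    return h @ W2


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    genome = rng.integers(0, 256, size=40).tolist()
    W1, W2 = decode(genome, n_inputs=4, n_outputs=3)
    print(forward(rng.normal(size=(2, 4)), W1, W2).shape)  # (2, 3)
```

In this toy setup, a fitness function would train or evaluate the decoded network on a classification task and the evolutionary loop would operate only on the integer genome; MGE's contribution lies in how the grammar and modular constraints shape that mapping, which this sketch does not attempt to reproduce.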
Supplemental material is available for download.
REFERENCES
[1] Fardin Ahmadizar, Khabat Soltanian, Fardin AkhlaghianTab, and Ioannis Tsoulos. 2015. Artificial neural network development by means of a novel combination of grammatical evolution and genetic algorithm. Engineering Applications of Artificial Intelligence 39 (2015), 1--13.
[2] Filipe Assunção, Nuno Lourenço, Penousal Machado, and Bernardete Ribeiro. 2017. Automatic generation of neural networks with structured Grammatical Evolution. In 2017 IEEE Congress on Evolutionary Computation (CEC). 1557--1564.
[3] Filipe Assunção, Nuno Lourenço, Penousal Machado, and Bernardete Ribeiro. 2017. Towards the Evolution of Multi-Layered Neural Networks: A Dynamic Structured Grammatical Evolution Approach. In Proceedings of the Genetic and Evolutionary Computation Conference (Berlin, Germany) (GECCO '17). Association for Computing Machinery, New York, NY, USA, 393--400.
[4] E. Cantu-Paz and C. Kamath. 2005. An empirical comparison of combinations of evolutionary algorithms and neural networks for classification problems. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 35, 5 (Oct 2005), 915--927.
[5] Nuno Lourenço, Francisco B. Pereira, and Ernesto Costa. 2016. Unveiling the Properties of Structured Grammatical Evolution. Genetic Programming and Evolvable Machines 17, 3 (Sept. 2016), 251--289.
[6] Nuno Lourenço, Filipe Assunção, Francisco B. Pereira, Ernesto Costa, and Penousal Machado. 2018. Structured Grammatical Evolution: A Dynamic Approach. Springer International Publishing, Cham, 137--161.
[7] Tyler McDonnell, Sari Andoni, Elmira Bonab, Sheila Cheng, Jun-Hwan Choi, Jimmie Goode, Keith Moore, Gavin Sellers, and Jacob Schrum. 2018. Divide and Conquer: Neuroevolution for Multiclass Classification. In Proceedings of the Genetic and Evolutionary Computation Conference (Kyoto, Japan) (GECCO '18). Association for Computing Machinery, New York, NY, USA, 474--481.
[8] Risto Miikkulainen, Jason Liang, Elliot Meyerson, Aditya Rawal, Daniel Fink, Olivier Francon, Bala Raju, Hormoz Shahrzad, Arshak Navruzyan, Nigel Duffy, and Babak Hodjat. 2019. Chapter 15 - Evolving Deep Neural Networks. In Artificial Intelligence in the Age of Neural Networks and Brain Computing, Robert Kozma, Cesare Alippi, Yoonsuck Choe, and Francesco Carlo Morabito (Eds.). Academic Press, 293--312.
[9] Khabat Soltanian, Ali Ebnenasir, and Mohsen Afsharchi. 2021. Modular Grammatical Evolution for the Generation of Artificial Neural Networks. Evolutionary Computation (December 2021), 1--36.
[10] Kenneth O. Stanley and Risto Miikkulainen. 2002. Evolving Neural Networks Through Augmenting Topologies. Evolutionary Computation 10, 2 (June 2002), 99--127.
[11] Ioannis Tsoulos, Dimitris Gavrilis, and Euripidis Glavas. 2008. Neural network construction and training using grammatical evolution. Neurocomputing 72, 1 (2008), 269--277.