Abstract
In a neural network, the weights are the parameters that determine the outputs for a given set of inputs: the activation values of each layer's nodes are computed from those of the previous layer using these weights. Finding the set of weights that minimizes classification error when training a multi-layer perceptron is a well-known optimization problem. This article proposes the Hybrid Wolf-Bat algorithm, a novel optimization algorithm, as a solution to this problem. The proposed algorithm is a hybrid of two existing nature-inspired algorithms: the Grey Wolf Optimization algorithm and the Bat algorithm. The new approach is tested on ten medical datasets obtained from the UCI machine learning repository. Its performance is compared with that of recently developed nature-inspired algorithms (the Grey Wolf Optimization algorithm, Cuckoo Search, the Bat algorithm, and the Whale Optimization Algorithm), as well as with the standard back-propagation training method from the literature. The results demonstrate that the proposed method outperforms the other bio-inspired algorithms in both convergence speed and accuracy.
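The training scheme the abstract describes, flattening an MLP's weights into a single vector and searching that space with a swarm metaheuristic instead of back-propagation, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses only the grey-wolf half of the hybrid (no bat-style echolocation step), a toy XOR dataset in place of the UCI medical data, and illustrative hyper-parameters (`H`, `pack`, `iters`).

```python
import math
import random

random.seed(0)

# Toy dataset: XOR, a classic non-linearly-separable classification task
# (stands in for the UCI medical datasets used in the paper).
DATA = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0), ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

H = 4                    # hidden units (illustrative choice)
DIM = 2 * H + H + H + 1  # input->hidden weights, hidden biases, hidden->output weights, output bias

def sigmoid(s):
    s = max(-60.0, min(60.0, s))  # clamp to avoid math range errors
    return 1.0 / (1.0 + math.exp(-s))

def forward(w, x):
    """One-hidden-layer MLP; the flat vector w encodes all weights and biases."""
    h = [sigmoid(w[2 * j] * x[0] + w[2 * j + 1] * x[1] + w[2 * H + j]) for j in range(H)]
    return sigmoid(sum(w[3 * H + j] * h[j] for j in range(H)) + w[4 * H])

def loss(w):
    """Mean squared classification error over the dataset: the fitness to minimize."""
    return sum((forward(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

def gwo_train(pack=20, iters=300):
    """Simplified Grey Wolf Optimization over the flattened weight vector."""
    wolves = [[random.uniform(-2.0, 2.0) for _ in range(DIM)] for _ in range(pack)]
    for t in range(iters):
        wolves.sort(key=loss)                # alpha, beta, delta lead the pack
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2.0 * (1.0 - t / iters)          # exploration coefficient decays linearly
        for k in range(3, pack):             # the rest of the pack follows the leaders
            for d in range(DIM):
                step = 0.0
                for leader in (alpha, beta, delta):
                    A = a * (2.0 * random.random() - 1.0)
                    C = 2.0 * random.random()
                    step += leader[d] - A * abs(C * leader[d] - wolves[k][d])
                wolves[k][d] = step / 3.0    # average of the three leader-guided moves
    return min(wolves, key=loss)

best = gwo_train()
```

The hybrid in the paper additionally incorporates Bat-algorithm dynamics (frequency-tuned velocities, loudness, and pulse rate); the sketch above shows only the encoding of the training problem as a fitness landscape and the basic wolf-pack position update.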
Hybrid Wolf-Bat Algorithm for Optimization of Connection Weights in Multi-layer Perceptron