research-article

Hybrid Wolf-Bat Algorithm for Optimization of Connection Weights in Multi-layer Perceptron

Published: 17 April 2020

Abstract

In a neural network, the weights are the parameters that determine the output(s) from a set of inputs: the activation values of each layer's nodes are computed from the values of the previous layer through these weights. Finding the set of weights that minimizes classification error when training a Multi-layer Perceptron is a well-known optimization problem. This article proposes the Hybrid Wolf-Bat algorithm, a novel optimization algorithm, to solve this problem. The proposed algorithm is a hybrid of two existing nature-inspired algorithms, the Grey Wolf Optimization algorithm and the Bat algorithm. The new approach is tested on ten medical datasets obtained from the UCI machine learning repository. Its performance is compared with that of recently developed nature-inspired algorithms (the Grey Wolf Optimization algorithm, Cuckoo Search, the Bat Algorithm, and the Whale Optimization Algorithm) as well as the standard Back-propagation training method from the literature. The results show that the proposed method outperforms the other bio-inspired algorithms in both convergence speed and accuracy.
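As an illustration only (not the authors' implementation), the sketch below shows the optimization problem the abstract describes: an MLP's connection weights are flattened into a single parameter vector, and a candidate vector is scored by classification error, the fitness that a population-based optimizer such as the proposed Hybrid Wolf-Bat algorithm would minimize. The one-hidden-layer architecture, all function names, and the use of plain random search as a stand-in for the metaheuristic are assumptions made for the example.

```python
import numpy as np

def unpack(theta, n_in, n_hidden, n_out):
    """Split a flat parameter vector into the weight matrices and
    bias vectors of a one-hidden-layer MLP (illustrative layout)."""
    i = 0
    W1 = theta[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = theta[i:i + n_hidden]; i += n_hidden
    W2 = theta[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = theta[i:i + n_out]
    return W1, b1, W2, b2

def forward(theta, X, n_in, n_hidden, n_out):
    """Activations of each layer are computed from the previous layer."""
    W1, b1, W2, b2 = unpack(theta, n_in, n_hidden, n_out)
    h = np.tanh(X @ W1 + b1)   # hidden-layer activations
    return h @ W2 + b2         # output scores, one column per class

def classification_error(theta, X, y, n_in, n_hidden, n_out):
    """Fitness to minimize: fraction of misclassified samples."""
    scores = forward(theta, X, n_in, n_hidden, n_out)
    return np.mean(np.argmax(scores, axis=1) != y)

# Toy usage: random search stands in for the metaheuristic search loop.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 5, 3
dim = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out  # search-space size
X = rng.normal(size=(30, n_in))
y = rng.integers(0, n_out, size=30)
best = min((rng.uniform(-1.0, 1.0, dim) for _ in range(50)),
           key=lambda t: classification_error(t, X, y, n_in, n_hidden, n_out))
print(classification_error(best, X, y, n_in, n_hidden, n_out))
```

Any of the compared optimizers (Grey Wolf, Bat, Cuckoo Search, Whale) would plug into this setup the same way: each candidate solution is one such weight vector, and the fitness function above ranks the population.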



Published in

ACM Transactions on Multimedia Computing, Communications, and Applications, Volume 16, Issue 1s
Special Issue on Multimodal Machine Learning for Human Behavior Analysis and Special Issue on Computational Intelligence for Biomedical Data and Imaging
January 2020, 376 pages
ISSN: 1551-6857 | EISSN: 1551-6865
DOI: 10.1145/3388236
Copyright © 2020 ACM
Publisher: Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 1 May 2019
• Revised: 1 July 2019
• Accepted: 1 July 2019
• Published: 17 April 2020

          Qualifiers

          • research-article
          • Research
          • Refereed
