Review Optimized Artificial Neural Network by Meta-Heuristic Algorithm and its Applications

Authors

  • Noor Hassan Kadhim, Computer Science Department, College of Computer Science and Information Technology, University of Al-Qadisiyah, Iraq
  • Dr. Qusay Mosa, Computer Science Department, College of Computer Science and Information Technology, University of Al-Qadisiyah, Iraq

DOI:

https://doi.org/10.29304/jqcm.2021.13.3.825

Keywords:

Artificial neural network; metaheuristic optimization algorithms; prediction; classification

Abstract

Meta-heuristic algorithm optimization (MHAO) is inspired by nature. The artificial neural network (ANN) has proven successful in a variety of applications, including machine learning. ANNs have been optimized with meta-heuristic methods to enhance classification and prediction performance. The fundamental objective of combining a meta-heuristic algorithm (MHAO) with an artificial neural network (ANN) is to train the network by updating its weights. Training is faster than with a standard ANN because the meta-heuristic method's global search capability helps it avoid local minima and also handles difficult optimization problems. This review discusses several of these meta-heuristic algorithms combined with ANNs, as applied both to common data sets and to real-time classification and prediction tasks, in order to give researchers motivating insights for their own fields of application.
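The core idea described above (replacing gradient-based weight updates with a population-based global search) can be illustrated with a minimal sketch: a tiny 2-4-1 feed-forward network for the XOR problem whose flattened weight vector is optimized by plain particle swarm optimization [45]. The network shape, PSO coefficients, and dataset here are illustrative assumptions, not taken from any specific paper in this review.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR toy dataset (illustrative assumption, not from the reviewed papers)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

# 2-4-1 MLP; all weights and biases flattened into one vector for the optimizer
SHAPES = [(2, 4), (4,), (4, 1), (1,)]
DIM = sum(int(np.prod(s)) for s in SHAPES)

def unpack(vec):
    """Split a flat weight vector back into W1, b1, W2, b2."""
    parts, i = [], 0
    for s in SHAPES:
        n = int(np.prod(s))
        parts.append(vec[i:i + n].reshape(s))
        i += n
    return parts

def forward(vec, X):
    W1, b1, W2, b2 = unpack(vec)
    h = np.tanh(X @ W1 + b1)               # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2))).ravel()  # sigmoid output

def mse(vec):
    """Fitness: mean squared error of the network with these weights."""
    return float(np.mean((forward(vec, X) - y) ** 2))

# Plain PSO: each particle is one candidate weight vector
n_particles, iters = 30, 200
pos = rng.normal(0.0, 1.0, (n_particles, DIM))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([mse(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()
gbest_f = pbest_f.min()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, DIM))
    # inertia + cognitive pull (personal best) + social pull (global best)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([mse(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    if pbest_f.min() < gbest_f:
        gbest, gbest_f = pbest[pbest_f.argmin()].copy(), pbest_f.min()

print(f"best MSE after PSO training: {gbest_f:.4f}")
```

No gradients are computed anywhere: the swarm only evaluates the fitness function, which is what lets such methods escape local minima that trap back-propagation [11].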


References

[1] H. Alkabbani, A. Ahmadian, Q. Zhu, and A. Elkamel, “Machine Learning and Metaheuristic Methods for Renewable Power Forecasting: A Recent Review,” Front. Chem. Eng., vol. 3, pp. 1–21, 2021, doi: 10.3389/fceng.2021.665415.
[2] M. Eshtay, H. Faris, and N. Obeid, “Metaheuristic-based extreme learning machines: a review of design formulations and applications,” Int. J. Mach. Learn. Cybern., vol. 10, no. 6, pp. 1543–1561, 2019, doi: 10.1007/s13042-018-0833-6.
[3] K. Kaur and Y. Kumar, "Swarm Intelligence and its applications towards Various Computing: A Systematic Review," 2020 INTERNATIONAL CONFERENCE ON INTELLIGENT ENGINEERING AND MANAGEMENT (ICIEM), 2020, pp. 57-62, doi: 10.1109/ICIEM48762.2020.9160177.
[4] H. Stegherr, M. Heider, and J. Hähner, “Classifying Metaheuristics: Towards a unified multi-level classification system,” Nat. Comput., vol. 0, 2020, doi: 10.1007/s11047-020-09824-0.
[5] K.-L. Du, M. N. S. Swamy, and others, “Search and optimization by metaheuristics,” Tech. Algorithms Inspired by Nat., 2016.
[6] S. Koziel and A. Bekasiewicz, Multi-objective design of antennas using surrogate models. World Scientific, 2016.
[7] N. T. Dat, C. V. Kien, H. P. H. Anh and N. N. Son, "Parallel Multi-Population Technique for Meta-Heuristic Algorithms on Multi Core Processor," 2020 5TH INTERNATIONAL CONFERENCE ON GREEN TECHNOLOGY AND SUSTAINABLE DEVELOPMENT (GTSD), 2020, pp. 489-494, doi: 10.1109/GTSD50082.2020.9303114.
[8] T. L. Fine, Feedforward neural network methodology. Springer Science & Business Media, 2006.
[9] D. Graupe, Principles of artificial neural networks, vol. 7. World Scientific, 2013.
[10] A. H. Alsaeedi, A. H. Aljanabi, M. E. Manna, and A. L. Albukhnefis, “A proactive metaheuristic model for optimizing weights of artificial neural network,” Indones. J. Electr. Eng. Comput. Sci., vol. 20, no. 2, pp. 976–984, 2020, doi: 10.11591/ijeecs.v20.i2.pp976-984.
[11] M. Gori and A. Tesi, “On the problem of local minima in backpropagation,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 14, no. 1, pp. 76–86, 1992.
[12] C. Burges et al., “Learning to rank using gradient descent,” in Proceedings of the 22nd international conference on Machine learning, 2005, pp. 89–96.
[13] M. Dorigo, V. Maniezzo, and A. Colorni, “Ant system: optimization by a colony of cooperating agents,” IEEE Trans. Syst. Man, Cybern. Part B, vol. 26, no. 1, pp. 29–41, 1996.
[14] K. Socha and C. Blum, “An ant colony optimization algorithm for continuous optimization: application to feed-forward neural network training,” Neural Comput. Appl., vol. 16, no. 3, pp. 235–247, 2007.
[15] R. K. Sivagaminathan and S. Ramakrishnan, “A hybrid approach for feature subset selection using neural networks and ant colony optimization,” Expert Syst. Appl., vol. 33, no. 1, pp. 49–60, 2007.
[16] H. Shi and W. Li, “Artificial neural networks with ant colony optimization for assessing performance of residential buildings,” in 2009 International Conference on Future BioMedical Information Engineering (FBIE), 2009, pp. 379–382.
[17] D. Karaboga and B. Basturk, “A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm,” J. Glob. Optim., vol. 39, no. 3, pp. 459–471, 2007.
[18] A. Taher and S. Kadhim, “Hybrid between Genetic Algorithm and Artificial Bee Colony for Key Generation Purpose,” Journal of Al-Qadisiyah for Computer Science and Mathematics, vol. 11, no. 4, pp. 37–46, 2019, doi: 10.29304/jqcm.2019.11.4.627.
[19] D. Karaboga and C. Ozturk, “Neural networks training by artificial bee colony algorithm on pattern classification,” Neural Netw. World, vol. 19, no. 3, p. 279, 2009.
[20] B. A. Garro, H. Sossa, and R. A. Vázquez, “Artificial neural network synthesis by means of artificial bee colony (abc) algorithm,” in 2011 IEEE Congress of Evolutionary Computation (CEC), 2011, pp. 331–338.
[21] S. Nandy, P. P. Sarkar, and A. Das, “Training a feed-forward neural network with artificial bee colony based backpropagation method,” arXiv Prepr. arXiv1209.2548, 2012.
[22] W.-C. Yeh and T.-J. Hsieh, “Artificial bee colony algorithm-neural networks for S-system models of biochemical networks approximation,” Neural Comput. Appl., vol. 21, no. 2, pp. 365–375, 2012.
[23] I. A. A. Al-Hadi, S. Z. M. Hashim, and S. M. H. Shamsuddin, “Bacterial Foraging Optimization Algorithm for neural network learning enhancement,” in 2011 11th international conference on hybrid intelligent systems (HIS), 2011, pp. 200–205.
[24] N. M. Nawi, M. Z. Rehman, and A. Khan, “A new bat based back-propagation (BAT-BP) algorithm,” in Advances in Systems Science, Springer, 2014, pp. 395–404.
[25] N. S. Jaddi, S. Abdullah, and A. R. Hamdan, “Multi-population cooperative bat algorithm-based optimization of artificial neural network model,” Inf. Sci. (Ny)., vol. 294, pp. 628–644, 2015.
[26] S. Mirjalili, S. M. Mirjalili, and A. Lewis, “Let a biogeography-based optimizer train your multi-layer perceptron,” Inf. Sci. (Ny)., vol. 269, pp. 188–209, 2014.
[27] Y. Zhang, P. Phillips, S. Wang, G. Ji, J. Yang, and J. Wu, “Fruit classification by biogeography-based optimization and feedforward neural network,” Expert Syst., vol. 33, no. 3, pp. 239–253, 2016.
[28] A. Rodan, H. Faris, J. Alqatawna, and others, “Optimizing feedforward neural networks using biogeography based optimization for e-mail spam identification,” Int. J. Commun. Netw. Syst. Sci., vol. 9, no. 01, p. 19, 2016.
[29] A. Askarzadeh and A. Rezazadeh, “A new heuristic optimization algorithm for modeling of proton exchange membrane fuel cell: bird mating optimizer,” Int. J. Energy Res., vol. 37, no. 10, pp. 1196–1204, 2013.
[30] A. Askarzadeh and A. Rezazadeh, “Artificial neural network training using a new efficient optimization algorithm,” Appl. Soft Comput., vol. 13, no. 2, pp. 1206–1213, 2013.
[31] X.-S. Yang, “Firefly algorithms for multimodal optimization,” in International symposium on stochastic algorithms, 2009, pp. 169–178.
[32] S. Nandy, P. P. Sarkar, and A. Das, “Analysis of a nature inspired firefly algorithm based back-propagation neural network training,” arXiv Prepr. arXiv1206.5360, 2012.
[33] I. Brajevic and M. Tuba, “Training feed-forward neural networks using firefly algorithm,” Recent Adv. Knowl. Eng. Syst. Sci., 2013.
[34] M. Alweshah, “Firefly algorithm with artificial neural network for time series problems,” Res. J. Appl. Sci. Eng. Technol., vol. 7, no. 19, pp. 3978–3982, 2014.
[35] D. E. Golberg, “Genetic algorithms in search, optimization, and machine learning,” Addion wesley, vol. 1989, no. 102, p. 36, 1989.
[36] A. J. F. van Rooij, R. P. Johnson, and L. C. Jain, Neural network training using genetic algorithms. World Scientific Publishing Co., Inc., 1996.
[37] U. Seiffert, “Multiple layer perceptron training using genetic algorithms,” in ESANN, 2001, pp. 159–164.
[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, “Grey wolf optimizer,” Adv. Eng. Softw., vol. 69, pp. 46–61, 2014.
[39] S. Mirjalili, “How effective is the Grey Wolf optimizer in training multi-layer perceptrons,” Appl. Intell., vol. 43, no. 1, pp. 150–161, 2015.
[40] A. H. Gandomi and A. H. Alavi, “Krill herd: a new bio-inspired optimization algorithm,” Commun. nonlinear Sci. Numer. Simul., vol. 17, no. 12, pp. 4831–4845, 2012.
[41] N. S. Lari and M. S. Abadeh, “Training artificial neural network by krill-herd algorithm,” in 2014 IEEE 7th Joint International Information Technology and Artificial Intelligence Conference, 2014, pp. 63–67.
[42] P. A. Kowalski and S. Łukasik, “Training neural networks with krill herd algorithm,” Neural Process. Lett., vol. 44, no. 1, pp. 5–17, 2016.
[43] S. Mirjalili, “Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm,” Knowledge-based Syst., vol. 89, pp. 228–249, 2015.
[44] W. Yamany, M. Fawzy, A. Tharwat, and A. E. Hassanien, “Moth-flame optimization for training multi-layer perceptrons,” in 2015 11th International computer engineering Conference (ICENCO), 2015, pp. 267–272.
[45] J. Kennedy and R. Eberhart, “Particle swarm optimization,” in Proceedings of ICNN’95-international conference on neural networks, 1995, vol. 4, pp. 1942–1948.
[46] M. Yaghini, M. M. Khoshraftar, and M. Fallahi, “A hybrid algorithm for artificial neural network training,” Eng. Appl. Artif. Intell., vol. 26, no. 1, pp. 293–301, 2013.
[47] Z. Beheshti and S. M. H. Shamsuddin, “CAPSO: centripetal accelerated particle swarm optimization,” Inf. Sci. (Ny)., vol. 258, pp. 54–79, 2014.
[48] R. Tang, S. Fong, X.-S. Yang, and S. Deb, “Wolf search algorithm with ephemeral memory,” in Seventh international conference on digital information management (ICDIM 2012), 2012, pp. 165–172.
[49] N. M. Nawi, M. Z. Rehman, and A. Khan, “WS-BP: An efficient wolf search based back-propagation algorithm,” in AIP Conference Proceedings, 2015, vol. 1660, no. 1, p. 50027.
[50] S. Mirjalili, “Dragonfly algorithm: a new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems,” Neural Comput. Appl., vol. 27, no. 4, pp. 1053–1073, 2016.
[51] G.-G. Wang, S. Deb, and Z. Cui, “Monarch butterfly optimization,” Neural Comput. Appl., vol. 31, no. 7, pp. 1995–2014, 2019.
[52] D. Devikanniga and R. J. S. Raj, “Classification of osteoporosis by artificial neural network based on monarch butterfly optimisation algorithm,” Healthc. Technol. Lett., vol. 5, no. 2, pp. 70–75, 2018.
[53] N. S. Jaddi, S. Abdullah, and A. R. Hamdan, “Optimization of neural network model using modified bat-inspired algorithm,” Appl. Soft Comput., vol. 37, pp. 71–86, 2015.
[54] L. L. Kamal and H. Kodaz, “Training artificial neural network by bat optimization algorithms,” Int. J. Adv. Comput. Eng. Netw., vol. 5, no. 8, pp. 53–56, 2017.
[55] M. Mavrovouniotis and S. Yang, “Training neural networks with ant colony optimization algorithms for pattern classification,” Soft Comput., vol. 19, no. 6, pp. 1511–1522, 2015.
[56] A. Asuncion and D. Newman, “UCI machine learning repository.” Irvine, CA, USA, 2007.
[57] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, “Bat algorithm for constrained optimization tasks,” Neural Comput. Appl., vol. 22, no. 6, pp. 1239–1255, 2013.
[58] V. N. Ghate and S. V Dudul, “Optimal MLP neural network classifier for fault detection of three phase induction motor,” Expert Syst. Appl., vol. 37, no. 4, pp. 3468–3481, 2010.
[59] M. Mavrovouniotis and S. Yang, “Evolving neural networks using ant colony optimization with pheromone trail limits,” in 2013 13th UK Workshop on Computational Intelligence (UKCI), 2013, pp. 16–23.


Published

2021-08-07

How to Cite

Kadhim, N. H., & Mosa, D. Q. (2021). Review Optimized Artificial Neural Network by Meta-Heuristic Algorithm and its Applications. Journal of Al-Qadisiyah for Computer Science and Mathematics, 13(3), Comp Page 34 – 46. https://doi.org/10.29304/jqcm.2021.13.3.825

Issue

Section

Computer Articles