Improving Conjugate Gradient method For Training Feed Forward Neural Networks

Authors

  • Luma N. M. Tawfiq, Department of Mathematics, College of Education - Ibn Al-Haitham, University of Baghdad.
  • Alaa K. J. AL-Mosawi, Department of Mathematics, College of Education - Ibn Al-Haitham, University of Baghdad.

Abstract

      In this paper, several modified and new algorithms are proposed for training feed-forward neural networks, many of which have a very fast convergence rate for reasonably sized networks.

All of these algorithms use the gradient of the performance function (also called the energy or error function) to determine how to adjust the weights so that the performance function is minimized, and the back-propagation algorithm is used to speed up training. The algorithms differ in their computational cost, in the form of their search direction, and in their storage requirements, and all of them are applied to an approximation problem.
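The abstract does not spell out the individual algorithms, but the idea of combining back-propagated gradients with a conjugate-gradient search direction can be illustrated with a minimal sketch. The example below assumes a Fletcher-Reeves update with a backtracking line search and a small 1-5-1 network approximating sin(x); the network size, the data, and all names such as loss_and_grad and line_search are illustrative assumptions, not details taken from the paper.

import numpy as np

# Hypothetical example: approximate y = sin(x) with a 1-5-1 feed-forward network.
rng = np.random.default_rng(0)
X = np.linspace(-np.pi, np.pi, 40).reshape(-1, 1)
Y = np.sin(X)

n_in, n_hid, n_out = 1, 5, 1   # illustrative network sizes, not from the paper

def init_params():
    return {
        "W1": rng.normal(0.0, 0.5, (n_in, n_hid)),
        "b1": np.zeros(n_hid),
        "W2": rng.normal(0.0, 0.5, (n_hid, n_out)),
        "b2": np.zeros(n_out),
    }

def flatten(p):
    return np.concatenate([p[k].ravel() for k in ("W1", "b1", "W2", "b2")])

def unflatten(v):
    shapes = [("W1", (n_in, n_hid)), ("b1", (n_hid,)),
              ("W2", (n_hid, n_out)), ("b2", (n_out,))]
    p, i = {}, 0
    for k, s in shapes:
        n = int(np.prod(s))
        p[k] = v[i:i + n].reshape(s)
        i += n
    return p

def loss_and_grad(v):
    """Performance function (mean squared error) and its gradient via back-propagation."""
    p = unflatten(v)
    a1 = np.tanh(X @ p["W1"] + p["b1"])      # hidden layer activations
    out = a1 @ p["W2"] + p["b2"]             # linear output layer
    err = out - Y
    loss = 0.5 * np.mean(err ** 2)
    # Back-propagate the error through the network.
    d_out = err / len(X)
    gW2 = a1.T @ d_out
    gb2 = d_out.sum(axis=0)
    d_hid = (d_out @ p["W2"].T) * (1.0 - a1 ** 2)
    gW1 = X.T @ d_hid
    gb1 = d_hid.sum(axis=0)
    grad = np.concatenate([gW1.ravel(), gb1.ravel(), gW2.ravel(), gb2.ravel()])
    return loss, grad

def line_search(v, d, g, loss, alpha=1.0, shrink=0.5, c=1e-4):
    """Backtracking line search along direction d (Armijo condition)."""
    while alpha > 1e-10:
        new_loss, _ = loss_and_grad(v + alpha * d)
        if new_loss <= loss + c * alpha * (g @ d):
            return alpha
        alpha *= shrink
    return alpha

v = flatten(init_params())
loss, g = loss_and_grad(v)
d = -g                                        # first direction: steepest descent
for k in range(200):
    alpha = line_search(v, d, g, loss)
    v = v + alpha * d
    new_loss, new_g = loss_and_grad(v)
    beta = (new_g @ new_g) / (g @ g)          # Fletcher-Reeves coefficient
    d = -new_g + beta * d
    if (k + 1) % 20 == 0:                     # periodic restart to steepest descent
        d = -new_g
    loss, g = new_loss, new_g

print(f"final performance (0.5 * MSE): {loss:.6f}")

Other conjugate gradient variants, such as Polak-Ribiere, differ mainly in how the coefficient beta and the restart schedule are chosen, which is the kind of difference in search direction and storage that the abstract refers to.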

Published

2017-09-29

How to Cite

N. M. Tawfiq, L., & K. J. AL-Mosawi, A. (2017). Improving Conjugate Gradient method For Training Feed Forward Neural Networks. Journal of Al-Qadisiyah for Computer Science and Mathematics, 3(2), 10–24. Retrieved from https://jqcsm.qu.edu.iq/index.php/journalcm/article/view/270

Section

Math Articles