A Fast Algorithm for Finding Global Minima of Error Functions in Layered Neural Networks
Proceedings of 1990 IEEE International Joint Conference on Neural Networks
  • Junping Sun, Nova Southeastern University
  • William I. Grosky
  • Mohamad H. Hassoun
Event Date/Location
San Diego, CA / 1990
Document Type
Article
Date
6-1-1990
Description

A fast algorithm is proposed for optimal supervised learning in multiple-layer neural networks. The proposed algorithm is based on random optimization methods with dynamic annealing; it requires no computation of error-function gradients and guarantees convergence to global minima. When applied to multiple-layer neural networks, the algorithm updates all neuron weights in batch mode by Gaussian-distributed increments in a direction that reduces the total decision error. The variance of the Gaussian distribution is automatically controlled so that the random search steps concentrate in potential minimum-energy/error regions. Also demonstrated is a hybrid method that combines an initial gradient-descent phase with a subsequent phase of dynamically annealed random search, suitable for difficult learning tasks such as parity. Extensive simulations show substantial convergence speedup of the proposed learning method compared to gradient-search methods such as backpropagation. The proposed algorithm is also shown to be simple to implement, computationally efficient, and able to reach global minima over wide ranges of parameter settings.
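
To make the described search concrete, below is a minimal Python sketch of a dynamically annealed random search of this general kind, applied to a small network on XOR (a parity-type task). The network size, the constants, and the variance schedule (widen on an accepted step, shrink otherwise) are illustrative assumptions, not the paper's exact rules; the reversal step is the classic random-optimization heuristic.

import numpy as np

rng = np.random.default_rng(0)

# XOR training set (a parity-like task of the kind the abstract mentions).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0, 1, 1, 0], dtype=float)

def forward(w, x):
    """2-2-1 sigmoid network; w is a flat vector of all 9 parameters."""
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8];              b2 = w[8]
    h = 1.0 / (1.0 + np.exp(-(x @ W1.T + b1)))
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def total_error(w):
    """Batch-mode total squared decision error over the training set."""
    return float(np.sum((forward(w, X) - T) ** 2))

w = rng.normal(0.0, 0.5, size=9)   # all weights updated together (batch mode)
sigma = 1.0                        # std. dev. of the Gaussian search step
best = total_error(w)

for step in range(20000):
    # Gaussian-distributed increment applied to every weight at once.
    dw = rng.normal(0.0, sigma, size=w.shape)
    # Try the step and its reversal (classic random-optimization heuristic).
    for cand in (w + dw, w - dw):
        e = total_error(cand)
        if e < best:               # accept only error-reducing moves
            w, best = cand, e
            sigma = min(sigma * 1.1, 2.0)     # widen search after success (assumed rule)
            break
    else:
        sigma = max(sigma * 0.999, 1e-3)      # anneal variance toward promising regions
    if best < 1e-3:
        break

print(f"steps={step}, total error={best:.5f}, outputs={forward(w, X).round(3)}")

Because only error-reducing moves are accepted and the variance is annealed rather than fixed, the search needs no gradients, which is the property the abstract emphasizes; the specific growth/decay factors above are tuning choices, not values from the paper.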

DOI
10.1109/IJCNN.1990.137653
Citation Information
Junping Sun, William I. Grosky, and Mohamad H. Hassoun. "A Fast Algorithm for Finding Global Minima of Error Functions in Layered Neural Networks." Proceedings of 1990 IEEE International Joint Conference on Neural Networks (1990), pp. I-715 to I-720.
Available at: http://works.bepress.com/junping-sun/3/