In Chapter 4, learning in neural networks was viewed as a search for a minimum of a multi-dimensional criterion (error) function. There, and in subsequent chapters, gradient-based search methods were used to discover locally optimal weight configurations for single- and multiple-unit nets. Also, in Section 7.5, gradient search was employed to descend a computational energy function and reach locally minimal points/states that may represent solutions to combinatorial optimization problems.
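Gradient-based search of this kind can be illustrated with a minimal sketch; the one-dimensional error function, step size, and starting points below are illustrative assumptions, not taken from the text.

```python
# Minimal sketch of gradient descent on a one-dimensional,
# multi-modal error function E(w) = w**4 - 3*w**2 + w (an
# illustrative choice); the search only moves downhill, so the
# minimum it finds depends on where it starts.

def E(w):
    return w**4 - 3 * w**2 + w

def grad_E(w):
    return 4 * w**3 - 6 * w + 1

def gradient_descent(w, eta=0.01, steps=2000):
    # Repeated weight update: w <- w - eta * dE/dw.
    for _ in range(steps):
        w -= eta * grad_E(w)
    return w

# Starting on opposite sides of the central maximum, the search
# settles into different minima -- gradient descent is local.
w_left = gradient_descent(-0.5)   # descends to the deeper (global) minimum
w_right = gradient_descent(0.5)   # descends to the shallower (local) minimum
```

This locality is exactly the limitation the chapter addresses: the final weight configuration is determined by the initial one.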

This chapter discusses search methods capable of finding global optima of multi-modal, multi-dimensional functions; in particular, it discusses methods that are compatible with neural network learning and retrieval. These methods are expected to lead to "optimal" or "near-optimal" weight configurations by allowing the network to escape local minima during training. They can also be used to modify the gradient-type dynamics of recurrent neural nets (e.g., Hopfield's energy-minimizing net) so that the network is able to escape "poor" attractors.
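One simple way to give a gradient search a chance of escaping local minima is to perturb each update with noise whose amplitude decays over time. The sketch below is a hypothetical illustration of this idea; the error function, noise schedule, and constants are all assumptions, not the chapter's algorithm.

```python
import math
import random

# Hypothetical sketch: gradient descent with additive, decaying
# noise on the multi-modal function E(w) = w**4 - 3*w**2 + w.
# Early on, the noise can kick the search out of a shallow basin;
# as it decays, the dynamics revert to plain gradient descent and
# settle into a minimum.

def grad_E(w):
    return 4 * w**3 - 6 * w + 1

def noisy_descent(w, eta=0.01, steps=5000, seed=0):
    rng = random.Random(seed)
    for t in range(1, steps + 1):
        sigma = 1.0 / math.sqrt(t)         # noise amplitude decays with time
        noise = sigma * rng.gauss(0.0, math.sqrt(eta))
        w -= eta * grad_E(w) + noise       # perturbed weight update
    return w

w_final = noisy_descent(0.5)  # may leave the shallow basin it starts in
```

The decay schedule matters: noise that dies out too quickly reduces the method to local search, while noise that persists prevents convergence.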

First, a general discussion of the difference between local and global search is presented, and a stochastic gradient descent algorithm is introduced that extends local gradient search to global search. This is followed by a general discussion of stochastic simulated annealing for locating globally optimal solutions. Next, simulated annealing is discussed in the context of stochastic neural nets for improved retrieval and training. A mean-field approximation of simulated annealing for networks with deterministic units is then presented, which offers a substantial speedup in convergence over stochastic simulated annealing. The chapter also reviews the fundamentals of genetic algorithms and their application to the training of multilayer neural nets. Finally, an improved hybrid genetic algorithm/gradient search method for feedforward neural net training is presented, along with simulations and comparisons to backprop.
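As a preview of the simulated annealing idea, the following sketch applies a Metropolis-style acceptance rule with a geometric cooling schedule to a one-dimensional multi-modal function. The function, schedule, and constants are illustrative assumptions, not the chapter's algorithm.

```python
import math
import random

# Illustrative simulated annealing sketch on the multi-modal
# function E(w) = w**4 - 3*w**2 + w: downhill moves are always
# accepted, uphill moves with probability exp(-dE / T), so the
# search can climb out of local minima while the temperature T
# is still high.

def E(w):
    return w**4 - 3 * w**2 + w  # two minima of different depths

def anneal(w, T=2.0, cooling=0.999, steps=5000, step_size=0.3, seed=0):
    rng = random.Random(seed)
    for _ in range(steps):
        w_new = w + rng.uniform(-step_size, step_size)
        dE = E(w_new) - E(w)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            w = w_new                # Metropolis acceptance rule
        T *= cooling                 # geometric cooling schedule (an assumption)
    return w

w_final = anneal(0.5)  # started near the shallower (local) minimum
```

As T falls, uphill moves become rare and the search freezes into one basin; slower cooling raises the chance that this basin contains the global minimum, at the cost of more iterations.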
