Title
Beyond backpropagation: Using simulated annealing for training neural networks
Abstract
The vast majority of neural network research relies on a gradient algorithm, typically a variation of backpropagation, to obtain the weights of the model. Because training an artificial neural network is a complex nonlinear optimization problem, this technique often produces inconsistent and unpredictable results. Because backpropagation typically converges to local solutions, simulated annealing, a global search technique, is proposed as an alternative training method. In this research, backpropagation is directly compared with this global search technique via an intensive Monte Carlo study on seven test functions.
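The paper does not reproduce its implementation here, but the core idea of training a network by simulated annealing rather than gradient descent can be sketched as follows. This is a minimal illustration under assumed choices (a single-hidden-layer network with three tanh units, Gaussian weight perturbations scaled by the temperature, a geometric cooling schedule, and a toy target function); the authors' actual cooling schedule, move generator, and test functions may differ.

```python
import math
import random

random.seed(0)

def forward(weights, x):
    # Tiny network: 3 tanh hidden units, linear output.
    # weights: [w_in*3, b_in*3, w_out*3, b_out] packed as a flat list of 10.
    h = [math.tanh(weights[i] * x + weights[3 + i]) for i in range(3)]
    return sum(weights[6 + i] * h[i] for i in range(3)) + weights[9]

def sse(weights, data):
    # Sum-of-squared-errors objective over the training set.
    return sum((forward(weights, x) - y) ** 2 for x, y in data)

def anneal(data, n_weights=10, t0=1.0, t_min=1e-3, cooling=0.95, moves=200):
    # Simulated annealing over the weight vector: accept any improving move,
    # and accept worsening moves with probability exp(-delta / T).
    w = [random.uniform(-1, 1) for _ in range(n_weights)]
    err = sse(w, data)
    best, best_err = list(w), err
    t = t0
    while t > t_min:
        for _ in range(moves):
            cand = [wi + random.gauss(0, t) for wi in w]  # perturbation shrinks as T cools
            cand_err = sse(cand, data)
            delta = cand_err - err
            if delta < 0 or random.random() < math.exp(-delta / t):
                w, err = cand, cand_err
                if err < best_err:
                    best, best_err = list(w), err
        t *= cooling  # geometric cooling schedule (assumed)
    return best, best_err

# Toy test function: y = x^2 on [-1, 1] (stand-in for the paper's test functions).
data = [(x / 10.0, (x / 10.0) ** 2) for x in range(-10, 11)]
w, err = anneal(data)
```

Because the annealer evaluates only the objective value, it needs no gradient information and can escape local minima early on by accepting uphill moves, which is the property the study contrasts against backpropagation's local convergence.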
Document Type
Article
Publication Date
12-1-1999
Recommended Citation
Sexton, Randall S., Robert E. Dorsey, and John D. Johnson. "Beyond backpropagation: using simulated annealing for training neural networks." Journal of Organizational and End User Computing 11, no. 3 (1999): 3-10.
Journal Title
Journal of End User Computing