Title

Beyond backpropagation: Using simulated annealing for global optimization for neural networks

Abstract

The vast majority of neural network research relies on a gradient algorithm, typically a variation of backpropagation, to obtain the weights of the model. Because of the complex, nonlinear nature of the optimization problem posed by artificial neural networks, this technique has often produced inconsistent and unpredictable results. To go beyond backpropagation's typical convergence to local solutions, simulated annealing is suggested as an alternative optimization technique that searches globally. In this research, backpropagation is directly compared to this global search technique via an intensive Monte Carlo study on seven test functions.
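The abstract contrasts gradient descent with simulated annealing as a global search over network weights. The sketch below is an illustrative, minimal implementation of that idea and is not taken from the paper: the `anneal` function, its parameters (geometric cooling, Gaussian perturbations), and the toy single-weight fitting problem are all assumptions chosen for clarity.

```python
import math
import random

def anneal(loss, w0, t0=1.0, cooling=0.95, steps=2000, step_size=0.5, seed=0):
    """Minimize `loss` over a weight vector via simulated annealing.

    Unlike gradient descent, this needs no derivatives and can escape
    local minima by occasionally accepting uphill moves.
    """
    rng = random.Random(seed)
    w = list(w0)
    cur_loss = loss(w)
    best_w, best_loss = list(w), cur_loss
    t = t0
    for _ in range(steps):
        # Propose a random Gaussian perturbation of one weight.
        cand = list(w)
        i = rng.randrange(len(cand))
        cand[i] += rng.gauss(0.0, step_size)
        cand_loss = loss(cand)
        # Always accept improvements; accept worse moves with
        # Boltzmann probability exp(-delta / temperature).
        if cand_loss < cur_loss or rng.random() < math.exp((cur_loss - cand_loss) / t):
            w, cur_loss = cand, cand_loss
            if cur_loss < best_loss:
                best_w, best_loss = list(w), cur_loss
        t *= cooling  # geometric cooling schedule
    return best_w, best_loss

# Toy example: fit y = 2x with a single-weight linear "network",
# minimizing squared error over a few sample points.
data = [(x, 2.0 * x) for x in (-1.0, 0.5, 1.0, 2.0)]
squared_error = lambda w: sum((w[0] * x - y) ** 2 for x, y in data)
weights, final_loss = anneal(squared_error, [0.0])
```

At high temperature the search wanders broadly across the weight space; as the temperature cools it behaves increasingly like a greedy local search, which is the mechanism that motivates comparing it against backpropagation on multimodal test functions.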

Document Type

Conference Proceeding

Publication Date

1-1-1998

Journal Title

Proceedings - Annual Meeting of the Decision Sciences Institute

Citation-only
