Global optimization for artificial neural networks: A tabu search application
The ability of neural networks to closely approximate unknown functions to any desired degree of accuracy has generated considerable demand for neural network research in business. The attractiveness of neural network research stems from researchers' need to approximate models within the business environment without a priori knowledge of the true underlying function. Gradient techniques, such as backpropagation, are currently the most widely used methods for neural network optimization. Because these techniques search for local solutions, they are subject to local convergence and can therefore perform poorly, even on simple problems, when forecasting out-of-sample. Consequently, a global search algorithm is warranted. In this paper we examine tabu search (TS) as a possible alternative to the problematic backpropagation approach. A Monte Carlo study was conducted to test the appropriateness of TS as a global search technique for optimizing neural networks. Holding the neural network architecture constant, 530 independent runs were conducted for each of seven test functions, including a production function that exhibits both increasing and diminishing marginal returns and the Mackey-Glass chaotic time series. In the resulting comparison, TS derived solutions significantly superior to backpropagation solutions on in-sample, interpolation, and extrapolation test data for all seven test functions. It was also shown that fewer function evaluations were needed to find these superior solutions. © 1998 Published by Elsevier Science B.V. All rights reserved.
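The paper does not include code, but the approach it evaluates can be illustrated with a minimal sketch: treat the network's weight vector as the solution, generate neighboring solutions by perturbing individual weights, and use an attribute-based tabu list (with an aspiration criterion) to avoid cycling back to recently changed weights. The network size, tenure, step size, and other parameters below are illustrative assumptions, not the paper's actual experimental settings.

```python
import math
import random

HIDDEN = 3  # illustrative: a small single-hidden-layer network

def forward(weights, x, hidden=HIDDEN):
    # Assumed weight layout: [w1(hidden), b1(hidden), w2(hidden), b2]
    w1 = weights[:hidden]
    b1 = weights[hidden:2 * hidden]
    w2 = weights[2 * hidden:3 * hidden]
    b2 = weights[3 * hidden]
    return sum(w2[j] * math.tanh(w1[j] * x + b1[j]) for j in range(hidden)) + b2

def mse(weights, data):
    return sum((forward(weights, x) - y) ** 2 for x, y in data) / len(data)

def tabu_search(data, dim, iters=200, neighbors=20, tenure=15, step=0.5, seed=0):
    """Minimize training MSE over the weight vector via tabu search."""
    rng = random.Random(seed)
    current = [rng.uniform(-1, 1) for _ in range(dim)]
    best, best_err = current[:], mse(current, data)
    tabu = {}  # weight index -> iteration until which moves on it are tabu
    for it in range(iters):
        # Sample a neighborhood by perturbing one randomly chosen weight.
        candidates = []
        for _ in range(neighbors):
            i = rng.randrange(dim)
            cand = current[:]
            cand[i] += rng.gauss(0, step)
            candidates.append((mse(cand, data), i, cand))
        candidates.sort(key=lambda t: t[0])
        for err, i, cand in candidates:
            # Aspiration criterion: a tabu move is allowed if it beats the best.
            if tabu.get(i, -1) < it or err < best_err:
                current = cand
                tabu[i] = it + tenure
                if err < best_err:
                    best, best_err = cand[:], err
                break
    return best, best_err
```

Because accepted moves may worsen the objective while the tabu list blocks immediate reversals, the search can escape the local minima that trap gradient descent, which is the property the paper's comparison against backpropagation exploits.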
Neural networks, Optimization, Tabu search
Sexton, Randall S., Bahram Alidaee, Robert E. Dorsey, and John D. Johnson. "Global optimization for artificial neural networks: A tabu search application." European Journal of Operational Research 106, no. 2-3 (1998): 570-584.