Title

Simultaneous optimization of neural network function and architecture algorithm

Abstract

A major limitation of current artificial neural network (NN) research is the inability to reliably identify unnecessary weights in a solution. A method for identifying such weights would give decision-makers crucial information about the problem at hand and yield networks that are more effective and efficient. The Neural Network Simultaneous Optimization Algorithm (NNSOA) is proposed for supervised training in multilayer feedforward neural networks. We demonstrate with Monte Carlo studies that the NNSOA can be used to obtain a global solution while simultaneously identifying a parsimonious network structure. © 2002 Elsevier Science B.V. All rights reserved.
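The abstract describes a genetic algorithm that trains a feedforward network while penalizing unnecessary weights. A minimal sketch of that idea is below; it is not the paper's actual algorithm. The toy task, network size, penalty weight `lam`, pruning threshold `eps`, and all GA settings (population size, uniform crossover, Gaussian mutation, elitism) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (assumed for illustration): approximate y = x1 + x2.
X = rng.uniform(-1.0, 1.0, size=(20, 2))
y = X[:, 0] + X[:, 1]

N_HIDDEN = 3
N_W = 2 * N_HIDDEN + N_HIDDEN  # input->hidden plus hidden->output weights

def predict(w, X):
    """Forward pass of a 2-N_HIDDEN-1 feedforward network (no biases)."""
    W1 = w[:2 * N_HIDDEN].reshape(2, N_HIDDEN)
    W2 = w[2 * N_HIDDEN:].reshape(N_HIDDEN, 1)
    return (np.tanh(X @ W1) @ W2).ravel()

def fitness(w, lam=0.01, eps=0.05):
    """SSE plus a penalty on the count of 'active' weights (|w| > eps),
    so evolution favors both fit and parsimonious networks."""
    sse = np.sum((predict(w, X) - y) ** 2)
    return sse + lam * np.sum(np.abs(w) > eps)

POP, GENS = 60, 200
pop = rng.normal(0.0, 1.0, size=(POP, N_W))
init_best = min(fitness(ind) for ind in pop)

for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    pop = pop[np.argsort(scores)]     # rank by fitness (lower is better)
    elite = pop[: POP // 2]           # elitism: keep the better half unchanged
    children = []
    for _ in range(POP - len(elite)):
        a, b = elite[rng.integers(len(elite), size=2)]
        mask = rng.random(N_W) < 0.5  # uniform crossover between two parents
        child = np.where(mask, a, b)
        mut = rng.random(N_W) < 0.2   # sparse Gaussian mutation
        child = child + mut * rng.normal(0.0, 0.1, N_W)
        children.append(child)
    pop = np.vstack([elite, np.array(children)])

scores = np.array([fitness(ind) for ind in pop])
best = pop[np.argmin(scores)].copy()
best[np.abs(best) < 0.05] = 0.0  # prune weights the GA drove toward zero
```

Because the penalty term charges a fixed cost per active weight, the search pressure pushes redundant weights toward zero, where the final thresholding step removes them; this mirrors, in spirit, how the NNSOA identifies a parsimonious structure during training rather than in a separate pruning phase.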

Department(s)

Information Technology and Cybersecurity

Document Type

Article

DOI

https://doi.org/10.1016/S0167-9236(02)00147-1

Keywords

Artificial intelligence, Backpropagation, Genetic algorithm, Neural networks, Parsimonious

Publication Date

1-1-2004

Journal Title

Decision Support Systems
