Simultaneous optimization of neural network function and architecture algorithm
A major limitation of current artificial neural network (NN) research is the inability to reliably identify unnecessary weights in a trained solution. A method that identified such weights would give decision-makers crucial information about the problem at hand and also yield a network that is more effective and efficient. The Neural Network Simultaneous Optimization Algorithm (NNSOA) is proposed for supervised training of multilayer feedforward neural networks. We demonstrate with Monte Carlo studies that the NNSOA can obtain a global solution while simultaneously identifying a parsimonious network structure. © 2002 Elsevier Science B.V. All rights reserved.
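The abstract describes a genetic algorithm that trains network weights and prunes unnecessary ones in the same search. The sketch below is a minimal illustration of that idea, not the authors' NNSOA implementation: it evolves the weight vector of a one-hidden-layer network, hard-zeroes near-zero weights each generation, and adds a penalty on the count of non-zero weights so parsimonious candidates are favored. All function names, the pruning tolerance, and the GA settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, X, n_hidden):
    # Unpack a flat weight vector into a one-hidden-layer tanh network.
    n_in = X.shape[1]
    W1 = w[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = w[n_in * n_hidden : n_in * n_hidden + n_hidden]
    W2 = w[n_in * n_hidden + n_hidden : -1]
    b2 = w[-1]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def fitness(w, X, y, n_hidden, penalty=1e-3):
    # Sum-of-squared errors plus a parsimony penalty on non-zero weights
    # (illustrative stand-in for the paper's simultaneous objective).
    err = forward(w, X, n_hidden) - y
    return np.sum(err ** 2) + penalty * np.count_nonzero(w)

def prune(w, tol=0.05):
    # Hard-zero weights with negligible magnitude; these are the
    # "unnecessary" weights the search is meant to expose.
    w = w.copy()
    w[np.abs(w) < tol] = 0.0
    return w

def ga_train(X, y, n_hidden=3, pop=40, gens=200):
    dim = X.shape[1] * n_hidden + n_hidden + n_hidden + 1
    P = rng.normal(0.0, 1.0, (pop, dim))
    for _ in range(gens):
        P = np.array([prune(w) for w in P])
        f = np.array([fitness(w, X, y, n_hidden) for w in P])
        elite = P[np.argsort(f)[: pop // 2]]
        # Uniform crossover between two randomly chosen elite parents.
        pa = elite[rng.integers(0, len(elite), pop)]
        pb = elite[rng.integers(0, len(elite), pop)]
        mask = rng.random((pop, dim)) < 0.5
        children = np.where(mask, pa, pb)
        # Sparse Gaussian mutation, plus elitism in slot 0.
        children += rng.normal(0.0, 0.1, (pop, dim)) * (rng.random((pop, dim)) < 0.1)
        children[0] = elite[0]
        P = children
    f = np.array([fitness(w, X, y, n_hidden) for w in P])
    return prune(P[np.argmin(f)])
```

Run on a toy regression with one relevant and one irrelevant input, the evolved network drives the error well below that of a trivial all-zero network, and pruning zeroes some of the weights tied to the irrelevant input; the paper's Monte Carlo studies make the analogous comparison at scale.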
Information Technology and Cybersecurity
Artificial intelligence, Backpropagation, Genetic algorithm, Neural networks, Parsimonious
Sexton, Randall S., Robert E. Dorsey, and Naheel A. Sikander. "Simultaneous optimization of neural network function and architecture algorithm." Decision Support Systems 36, no. 3 (2004): 283-296.
Decision Support Systems