Massively Scalable Parallel Neural Networks: A Modified Genetic Algorithm Approach


Parallelizing neural networks is an active area of research. Current approaches center on parallelizing the widely used back-propagation (BP) algorithm, which incurs substantial communication overhead and is therefore less than ideal for parallelization. An algorithm that does not depend on the backward propagation of errors lends itself better to a parallel implementation. One well-known training approach for neural networks explicitly incorporates network structure into the objective function to be minimized, which yields simpler neural networks. Prior work implemented this approach with a modified genetic algorithm in a serial fashion that does not scale, limiting its usefulness. This research developed a parallel version of that algorithm and compared its performance against the existing serial algorithm on a variety of benchmark problems. Computational experiments with benchmark datasets indicate that the proposed parallel algorithm outperforms the serial version, achieving better predictive accuracy in the same running time while also identifying simpler architectures.
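The abstract does not give implementation details, so the following is only a rough sketch of the general idea it describes: evolving a neural network's weights with a genetic algorithm whose fitness combines prediction error with a structural-simplicity penalty, and evaluating fitness for the whole population in parallel. The XOR task, the 2-2-1 network, the penalty form, the pruning mutation, and the thread-pool parallelism are all illustrative assumptions, not the authors' method.

```python
import math
import random
from concurrent.futures import ThreadPoolExecutor

# Toy stand-in for the paper's benchmarks: learn XOR with a 2-2-1 sigmoid
# network whose 9 weights are evolved rather than trained by back-propagation.
XOR = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
       ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]

def sigmoid(s):
    s = max(-60.0, min(60.0, s))          # clamp to avoid overflow
    return 1.0 / (1.0 + math.exp(-s))

def forward(w, x):
    """2-input, 2-hidden, 1-output network; w holds all 9 weights/biases."""
    h = [sigmoid(w[2 * j] * x[0] + w[2 * j + 1] * x[1] + w[4 + j])
         for j in range(2)]
    return sigmoid(w[6] * h[0] + w[7] * h[1] + w[8])

def fitness(w, lam=0.01):
    """Error plus a structural penalty: fewer active weights -> simpler net.
    The penalty form is an assumption standing in for the paper's objective."""
    mse = sum((forward(w, x) - y) ** 2 for x, y in XOR) / len(XOR)
    active = sum(1 for wi in w if abs(wi) > 0.1)   # crude structure measure
    return mse + lam * active

def evolve(pop_size=40, gens=30, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-2, 2) for _ in range(9)] for _ in range(pop_size)]
    with ThreadPoolExecutor(max_workers=4) as ex:
        fits = list(ex.map(fitness, pop))          # parallel fitness evaluation
        first_best = min(fits)
        for _ in range(gens):
            order = sorted(range(pop_size), key=fits.__getitem__)
            new_pop = [pop[order[0]][:]]           # elitism keeps the best net
            while len(new_pop) < pop_size:
                a, b = (min(rng.sample(range(pop_size), 3),
                            key=fits.__getitem__) for _ in range(2))
                cut = rng.randrange(1, 9)          # one-point crossover
                child = pop[a][:cut] + pop[b][cut:]
                for i in range(9):
                    r = rng.random()
                    if r < 0.05:
                        child[i] = 0.0             # pruning move: drop a weight
                    elif r < 0.15:
                        child[i] += rng.gauss(0, 0.5)
                new_pop.append(child)
            pop = new_pop
            fits = list(ex.map(fitness, pop))      # parallel fitness evaluation
    return first_best, min(fits)
```

Because each fitness evaluation is independent, the population can be scored concurrently; elitism guarantees the best fitness never worsens across generations. A serial version would simply replace `ex.map` with the built-in `map`, which is the bottleneck the research aims to remove.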


Information Technology and Cybersecurity

Document Type


Publication Date


Journal Title

Insights to a Changing World Journal