Stochastic Resonance Enables BPP/log∗ Complexity and Universal Approximation in Analog Recurrent Neural Networks

Abstract

Stochastic resonance (SR) is a natural process that can increase the precision of signal measurements without limit, and it appears throughout the biological and physical sciences. Most artificial neural networks (NNs) are implemented on digital computers with fixed precision. A NN that accesses universal approximation and a computational complexity class more powerful than that of a Turing machine needs analog signals exploiting SR's limitless precision increase. This paper links an analog recurrent (AR) NN theorem, SR, BPP/log∗ (a physically realizable, super-Turing computation class), and universal approximation, so that NNs built on them can be made computationally more powerful. An optical neural network mimicking chaos indicates that super-Turing computation has been achieved. Additional tests are needed to verify super-Turing computation, show its superiority, and demonstrate its practical benefits. Truly powerful, cognitively inspired computation needs access to the combination of ARNNs, SR, super-Turing mathematical complexity, and universal approximation.
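The core SR effect the abstract relies on can be illustrated with a minimal numerical sketch (not from the paper; the threshold, signal amplitude, and noise levels below are illustrative assumptions): a sub-threshold signal is invisible to a simple threshold detector, but adding a moderate amount of noise makes the detector's output track the hidden signal, while too much noise washes it out again.

```python
import numpy as np

rng = np.random.default_rng(0)

def detector_correlation(noise_std, threshold=1.0, n=20000):
    """Correlation between a sub-threshold sine wave and the output of a
    noisy threshold detector -- a toy model of stochastic resonance."""
    t = np.linspace(0.0, 20.0, n)
    signal = 0.8 * np.sin(2.0 * np.pi * t)      # peak 0.8: never crosses 1.0 alone
    noise = rng.normal(0.0, noise_std, size=n)
    fired = (signal + noise > threshold).astype(float)
    if fired.std() == 0.0:                       # detector silent: carries no information
        return 0.0
    return float(np.corrcoef(signal, fired)[0, 1])

quiet = detector_correlation(0.01)   # too little noise: detector almost never fires
tuned = detector_correlation(0.5)    # moderate noise: output correlates with the signal
loud = detector_correlation(5.0)     # too much noise: output is mostly random
```

With the detector tuned this way, the correlation peaks at an intermediate noise level rather than at zero noise, which is the signature of stochastic resonance the paper invokes.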

Department(s)

Physics, Astronomy, and Materials Science
MSU JVIC-CASE
Engineering Program

Document Type

Conference Proceeding

DOI

https://doi.org/10.1109/IJCNN.2019.8851775

Keywords

cognitive theory, neural networks, stochastic resonance theory, super-Turing theory, universal approximation

Publication Date

7-1-2019

Journal Title

Proceedings of the International Joint Conference on Neural Networks
