Abstract

This paper explores how added noise increases the computational complexity of digital recurrent neural networks (RNNs). The physically accepted model of the universe imposes rational-number, stochastic limits on all calculations. An analog RNN subject to those limits calculates at the super-Turing complexity level BPP/log*. We demonstrate how noise helps digital RNNs attain super-Turing operation similar to that of analog RNNs. We investigate how added noise moves limited-precision systems from nonchaotic behavior at small noise amplitudes, through behavior consistent with chaos, to noise-dominated dynamics at large amplitudes. A Kolmogorov-complexity-based proof shows that an infinite hierarchy of computational classes exists between P, the Turing class, and BPP/log*. This hierarchy raises the possibility that noise-enhanced digital RNNs could operate at a super-Turing level less complex than BPP/log*. As the uniform noise increases, the digital RNNs develop positive Lyapunov exponents, suggesting that chaos is being mimicked. The exponents peak at the accepted values for the logistic and Hénon maps when the noise amplitude equals eight times the least significant bit of the noisy recurrent signals for the logistic digital RNN and four times for the Hénon digital RNN.
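To make the noise scaling concrete, the following minimal Python sketch estimates the largest Lyapunov exponent of a quantized (limited-precision) logistic map whose recurrent signal carries additive uniform noise. This is a sketch under stated assumptions, not the paper's implementation: the function name lyapunov_noisy_logistic, the 16-bit rounding grid, and the convention that the uniform noise half-width equals noise_lsb_multiple times the least significant bit are all illustrative choices.

import numpy as np

def lyapunov_noisy_logistic(bits=16, noise_lsb_multiple=8.0, r=4.0,
                            n_steps=100_000, n_transient=1_000, seed=0):
    """Estimate the largest Lyapunov exponent of a quantized logistic map
    with additive uniform noise on the recurrent signal.

    Assumptions (illustrative, not the paper's exact conventions): the
    quantization step (LSB) is 2**-bits, and the uniform noise is drawn
    from [-noise_lsb_multiple * LSB, +noise_lsb_multiple * LSB].
    """
    rng = np.random.default_rng(seed)
    lsb = 2.0 ** -bits
    half_width = noise_lsb_multiple * lsb

    x = 0.3
    log_sum = 0.0
    for k in range(n_steps + n_transient):
        # Logistic update, then quantize onto the limited-precision grid.
        x = r * x * (1.0 - x)
        x = np.round(x / lsb) * lsb
        # Add uniform noise to the recurrent signal; clip to stay in (0, 1).
        x = np.clip(x + rng.uniform(-half_width, half_width), lsb, 1.0 - lsb)
        if k >= n_transient:
            # Accumulate ln|f'(x)| with f'(x) = r (1 - 2x) for the logistic map.
            log_sum += np.log(abs(r * (1.0 - 2.0 * x)))
    return log_sum / n_steps

print(lyapunov_noisy_logistic(noise_lsb_multiple=8.0))

For r = 4 the accepted exponent is ln 2 ≈ 0.693; with no noise the quantized orbit typically collapses onto a short periodic cycle, and sweeping noise_lsb_multiple should show the exponent rising toward the accepted value, in the spirit of the behavior the abstract describes for the logistic digital RNN.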

Department(s)

Mathematics

Document Type

Article

DOI

https://doi.org/10.1103/PhysRevResearch.3.013120

Rights Information

Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the authors and the published article’s title, journal citation, and DOI.

Publication Date

2-8-2021

Journal Title

Physical Review Research
