Noise Quality and Super-Turing Computation in Recurrent Neural Networks

Emmett Redd, Missouri State University
Tayo Obafemi-Ajayi, Missouri State University

Abstract

Noise and stochasticity can benefit the performance of neural networks. Recent studies show that digital recurrent neural networks enhanced with optimized-magnitude noise are consistent with super-Turing operation, regardless of whether the noise is implemented with a true-random or a sufficiently long pseudo-random number time series. This paper extends that work by providing additional insight into how shortened, repeating pseudo-noise sequences degrade super-Turing operation. Shortening the repeat length of the noise produced fewer chaotic time series, as measured by autocorrelation-detected repetitions in the output. Similar rates of chaos inhibition under shortened noise repeat lengths across different maps, noise magnitudes, and pseudo-noise functions hint at an unknown, underlying commonality in noise-induced chaos. Repeat lengths in the chaos-failed outputs were predominantly integer multiples of the noise repeat lengths. Noise repeat lengths only marginally shorter than the output sequences cause the noise-enhanced digital recurrent neural networks to repeat and thereby fail to be consistent with chaos and super-Turing computation. This implies that any noise sequence used to improve neural network operation should be at least as long as any output sequence the network produces.
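The abstract describes detecting repetition in network outputs via autocorrelation. As a minimal illustrative sketch (not the authors' implementation; the function name, the 0.9 correlation threshold, and the example data are assumptions), the idea of recovering a repeat length from the strongest autocorrelation peak can be expressed as:

```python
import numpy as np

def detect_repeat_length(x, min_lag=1):
    """Estimate the period of a repeating time series via autocorrelation.

    Returns the lag of the strongest autocorrelation peak if it correlates
    strongly enough to indicate repetition, otherwise None.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    # Full autocorrelation; keep non-negative lags only.
    ac = np.correlate(x, x, mode="full")[n - 1:]
    ac /= ac[0]  # normalize so the lag-0 correlation is 1
    # The strongest peak at a positive lag marks the repeat length.
    lag = min_lag + int(np.argmax(ac[min_lag:n // 2]))
    return lag if ac[lag] > 0.9 else None  # threshold is an assumption

# A sequence that repeats with period 7 yields a strong peak at lag 7.
periodic = np.tile([0.1, 0.9, 0.4, 0.7, 0.2, 0.8, 0.5], 30)
print(detect_repeat_length(periodic))  # → 7
```

A chaotic output, by contrast, would show no strong peak at any positive lag, and the detector would return None; in the paper's terms, outputs whose detected lag is an integer multiple of the noise repeat length are the chaos-failed cases.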