Nonlinear engineering and robotics
Noise Impact on a Recurrent Neural Network with
a Linear Activation Function
V. M. Moskvitin, N. I. Semenova
Saratov State University, ul. Astrakhanskaya 1, Saratov, 410012 Russia
Abstract:
In recent years, more and more researchers in the field of artificial neural networks have become interested in creating hardware implementations in which the neurons and the connections between them are realized physically. Such networks solve the problem of scaling and increase the speed of obtaining and processing information, but they can be affected by internal noise.
In this paper we analyze an echo state neural network (ESN) in the presence of uncorrelated additive and multiplicative white Gaussian noise. We consider the case where the artificial neurons have a linear activation function with different slope coefficients, and we examine the influence of the input signal, the memory, and the connection matrices on the accumulation of noise. We have found that the overall behavior of the variance and of the signal-to-noise ratio of the ESN output signal is similar to that of a single neuron. Noise accumulates less in an ESN with a diagonal reservoir connection matrix with a large "blurring" coefficient. This is especially true of uncorrelated multiplicative noise.
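The sketch below is an illustrative reconstruction of the setup described in the abstract, not the authors' code: a small echo state network with a linear activation of slope alpha, a diagonal reservoir connection matrix, and uncorrelated additive and multiplicative white Gaussian noise injected at each neuron. The reservoir size, noise intensities, input value, and the exact point where the noise enters the update are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters (for illustration only)
N = 100            # reservoir size
alpha = 0.5        # slope of the linear activation f(x) = alpha * x
sigma_add = 1e-3   # additive noise intensity
sigma_mul = 1e-3   # multiplicative noise intensity

W_in = rng.uniform(-1, 1, size=N)        # input weights
W = np.diag(rng.uniform(-1, 1, size=N))  # diagonal reservoir connection matrix

def step(x, u):
    """One noisy reservoir update: linear activation plus uncorrelated noise."""
    pre = W_in * u + W @ x
    clean = alpha * pre                               # noise-free linear response
    mul = 1.0 + sigma_mul * rng.standard_normal(N)    # multiplicative white Gaussian noise
    add = sigma_add * rng.standard_normal(N)          # additive white Gaussian noise
    return clean * mul + add

# Drive the reservoir with a constant input and estimate the per-neuron variance
# of the states, which is the kind of statistic the paper studies.
x = np.zeros(N)
states = []
for t in range(2000):
    x = step(x, u=0.5)
    states.append(x.copy())
states = np.asarray(states[500:])  # discard the transient
print("mean state variance per neuron:", states.var(axis=0).mean())
```

In this toy setting, increasing the spread of the diagonal entries of W (the "blurring" of the connection matrix) or changing alpha lets one probe how the noise accumulates in the reservoir states, which is the effect the paper quantifies via the variance and the signal-to-noise ratio of the output.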
Keywords:
artificial neural networks, recurrent neural network, echo state network, noise,
dispersion, statistics, white Gaussian noise.
Received: 28.02.2023 Accepted: 26.04.2023
Citation:
V. M. Moskvitin, N. I. Semenova, “Noise Impact on a Recurrent Neural Network with
a Linear Activation Function”, Rus. J. Nonlin. Dyn., 19:2 (2023), 281–293
Linking options:
https://www.mathnet.ru/eng/nd853
https://www.mathnet.ru/eng/nd/v19/i2/p281