This article is cited in 4 scientific papers.
Reviews
Deep neural networks: origins, development, current status
A. V. Makarenko, V. A. Trapeznikov Institute of Control Sciences of the Russian Academy of Sciences, Moscow, Russia
Abstract:
The article covers the development of neural networks, from their origin in the form of the McCulloch-Pitts neuron to modern deep architectures. Major “neural network crises” are listed, together with the reasons that led to them. The main attention is paid to neural architectures trained by supervised learning on labeled datasets. References are given to the original publications and mathematical theorems that lay the theoretical foundation of artificial neural networks. The challenges of building effective deep neural architectures are analyzed, ways to address these challenges are considered, and success factors are identified. The main layers of convolutional and recurrent neural networks are listed, as well as their architectural combinations. Examples with references are given to demonstrate that deep neural networks are effective not only in applications with distinct structural patterns in the data (images, voice, music, etc.) but also in applications with signals of a stochastic/chaotic nature. A major direction in the development of convolutional networks is also identified: the incorporation of trainable integral transforms into the layers. A basic overview is provided of the modern Transformer architecture, which has become mainstream for sequence processing tasks (including tasks of computational linguistics). The key goals and objectives of the current theory of artificial neural networks are outlined.
Keywords:
deep learning, convolutional neural networks, recurrent neural networks.
Received: 17.12.2019 Revised: 25.12.2019 Accepted: 25.12.2019
Citation:
A. V. Makarenko, “Deep neural networks: origins, development, current status”, Probl. Upr., 2020, no. 2, 3–19
Linking options:
https://www.mathnet.ru/eng/pu1178 https://www.mathnet.ru/eng/pu/v2/p3