Problemy Upravleniya
Problemy Upravleniya, 2020, Issue 2, Pages 3–19
DOI: https://doi.org/10.25728/pu.2020.2.1
(Mi pu1178)
 

This article is cited in 4 scientific papers (total in 4 papers)

Reviews

Deep neural networks: origins, development, current status

A. V. Makarenko

V.A. Trapeznikov Institute of Control Sciences of Russian Academy of Sciences, Moscow, Russia
Abstract: The article traces the development of neural networks from their origin in the McCulloch–Pitts neuron to modern deep architectures. Major “neural network crises” are listed together with the reasons that caused them. The main attention is paid to neural architectures trained by supervised learning on labeled datasets. References are given to the original publications and mathematical theorems that lay the theoretical foundation of artificial neural networks. The challenges of building effective deep neural architectures are analyzed, ways to address these challenges are considered, and success factors are identified. The main layers of convolutional and recurrent neural networks are listed, as well as their architectural combinations. Examples with references demonstrate that deep neural networks are effective not only in applications with distinct structural patterns in the data (images, voice, music, etc.) but also in applications with signals of a stochastic or chaotic nature. A major direction in the development of convolutional networks is also identified: the introduction of trainable integral transforms into the layers. A basic overview is given of the modern Transformer architecture, which has become mainstream for sequence processing tasks (including tasks of computational linguistics). The key goals and objectives of the current theory of artificial neural networks are outlined.
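The McCulloch–Pitts neuron mentioned as the origin point of the field can be sketched in a few lines: binary inputs, fixed weights, and a hard threshold. This is a minimal illustrative implementation (the function name and gate example are ours, not from the article):

```python
def mcculloch_pitts(inputs, weights, threshold):
    """Fire (return 1) iff the weighted sum of binary inputs reaches the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Example: with unit weights and threshold 2, the neuron realizes a two-input AND gate.
print(mcculloch_pitts([1, 1], [1, 1], 2))  # -> 1
print(mcculloch_pitts([1, 0], [1, 1], 2))  # -> 0
```

Replacing the hard threshold with a differentiable activation and learning the weights from data is precisely the step that leads from this 1943 model toward the trainable deep architectures the review surveys.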
Keywords: deep learning, convolutional neural networks, recurrent neural networks.
Received: 17.12.2019
Revised: 25.12.2019
Accepted: 25.12.2019
Document Type: Article
UDC: 004.852
Language: Russian
Citation: A. V. Makarenko, “Deep neural networks: origins, development, current status”, Probl. Upr., 2020, no. 2, 3–19
Citation in format AMSBIB
\Bibitem{Mak20}
\by A.~V.~Makarenko
\paper Deep neural networks: origins, development, current status
\jour Probl. Upr.
\yr 2020
\issue 2
\pages 3--19
\mathnet{http://mi.mathnet.ru/pu1178}
\crossref{https://doi.org/10.25728/pu.2020.2.1}
Linking options:
  • https://www.mathnet.ru/eng/pu1178
  • https://www.mathnet.ru/eng/pu/v2/p3