Computer Optics
Computer Optics, 2023, Volume 47, Issue 1, Pages 160–169
DOI: https://doi.org/10.18287/2412-6179-CO-1147
(Mi co1113)
 

This article is cited in 3 scientific papers.

NUMERICAL METHODS AND DATA ANALYSIS

A new approach to training neural networks using natural gradient descent with momentum based on Dirichlet distributions

R. I. Abdulkadirov (a), P. A. Lyakhov (b)

(a) North-Caucasus Center for Mathematical Research, North-Caucasus Federal University, Stavropol
(b) North-Caucasus Federal University
Abstract: In this paper, we propose a natural gradient descent algorithm with momentum based on Dirichlet distributions to speed up the training of neural networks. This approach takes into account not only the direction of the gradients but also the convexity of the minimized function, which significantly accelerates the search for extrema. Calculations of natural gradients based on Dirichlet distributions are presented, and the proposed approach is incorporated into the error backpropagation scheme. The results of image recognition and time series forecasting experiments show that the proposed approach achieves higher accuracy and requires fewer iterations to minimize the loss functions than stochastic gradient descent, adaptive moment estimation, and the adaptive parameter-wise diagonal quasi-Newton method for nonconvex stochastic optimization.
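The abstract describes the optimizer only in general terms. As a rough illustration of the underlying technique, the sketch below combines heavy-ball momentum with a natural-gradient step preconditioned by the Fisher information matrix of a Dirichlet distribution. This is an assumption about the general approach, not the authors' implementation: the function names, the learning-rate and momentum values, and the way the concentration parameters `alpha` are chosen are all hypothetical.

```python
# Minimal sketch (not the authors' code): natural gradient descent with momentum,
# preconditioned by the Fisher information matrix of a Dirichlet distribution
# with concentration parameters `alpha`.
import numpy as np
from scipy.special import polygamma  # polygamma(1, x) is the trigamma function

def dirichlet_fisher(alpha):
    """Fisher information matrix of Dir(alpha):
    F_ij = delta_ij * psi'(alpha_i) - psi'(sum(alpha))."""
    trigamma = polygamma(1, alpha)
    return np.diag(trigamma) - polygamma(1, alpha.sum())

def ngd_momentum_step(params, grad, alpha, velocity, lr=0.01, beta=0.9):
    """One update: v <- beta * v + F^{-1} grad;  params <- params - lr * v."""
    F = dirichlet_fisher(alpha)
    nat_grad = np.linalg.solve(F, grad)   # F^{-1} grad without forming the inverse
    velocity = beta * velocity + nat_grad  # heavy-ball momentum on the natural gradient
    return params - lr * velocity, velocity

# Hypothetical usage on a 3-dimensional parameter vector
alpha = np.array([2.0, 3.0, 5.0])
params, velocity = np.zeros(3), np.zeros(3)
grad = np.array([0.1, -0.2, 0.05])
params, velocity = ngd_momentum_step(params, grad, alpha, velocity)
```

Replacing the Euclidean gradient with F^{-1} grad rescales the step by the local curvature of the statistical manifold, which is the sense in which the abstract says the method accounts for more than the gradient direction alone.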
Keywords: pattern recognition, machine learning, optimization, Dirichlet distributions, natural gradient descent
Funding agency — Grant number
Ministry of Science and Higher Education of the Russian Federation — 075-02-2022-892
Russian Science Foundation — 21-71-00017, 22-71-00009
The authors would like to thank the North-Caucasus Federal University for funding awarded in the competition for projects by scientific groups and individual scientists of the North-Caucasus Federal University. The research in section 2 was supported by the North-Caucasus Center for Mathematical Research through the Ministry of Science and Higher Education of the Russian Federation (Project No. 075-02-2022-892). The research in section 3 was supported by the Russian Science Foundation (Project No. 21-71-00017). The research in section 4 was supported by the Russian Science Foundation (Project No. 22-71-00009).
Received: 07.04.2022
Accepted: 24.08.2022
Document Type: Article
Language: Russian
Citation: R. I. Abdulkadirov, P. A. Lyakhov, “A new approach to training neural networks using natural gradient descent with momentum based on Dirichlet distributions”, Computer Optics, 47:1 (2023), 160–169
Citation in format AMSBIB
\Bibitem{AbdLya23}
\by R.~I.~Abdulkadirov, P.~A.~Lyakhov
\paper A new approach to training neural networks using natural gradient descent with momentum based on Dirichlet distributions
\jour Computer Optics
\yr 2023
\vol 47
\issue 1
\pages 160--169
\mathnet{http://mi.mathnet.ru/co1113}
\crossref{https://doi.org/10.18287/2412-6179-CO-1147}
Linking options:
  • https://www.mathnet.ru/eng/co1113
  • https://www.mathnet.ru/eng/co/v47/i1/p160