This article is cited in 3 scientific papers.
A heuristic adaptive fast gradient method in stochastic optimization problems
A. V. Ogal'tsov, A. I. Turin State University – Higher School of Economics, Moscow, 101000 Russia
Abstract:
A fast adaptive heuristic stochastic gradient descent method is proposed. On practical problems, this algorithm is shown to converge faster than currently popular optimization methods. A justification of the method is given, and the difficulties that prevent obtaining optimal convergence estimates for the proposed algorithm are described.
Key words:
fast gradient descent, stochastic optimization, adaptive optimization.
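The paper itself is not reproduced here, but the combination named in the keywords — a fast (accelerated) gradient method whose stepsize adapts to the local smoothness — can be illustrated with a generic textbook sketch. The code below implements Nesterov-style acceleration with backtracking estimation of the Lipschitz constant; it is a standard construction and an assumption for illustration, not the authors' specific heuristic, and the function names (`adaptive_fast_gradient`, `f`, `grad`) are hypothetical.

```python
import numpy as np

def adaptive_fast_gradient(grad, f, x0, n_iters=200, L0=1.0):
    """Nesterov-style fast gradient with a backtracking estimate of the
    local Lipschitz constant L.  A generic illustrative sketch, not the
    specific algorithm proposed in the paper."""
    x = y = np.asarray(x0, dtype=float)
    L = L0
    t = 1.0
    for _ in range(n_iters):
        g = grad(y)
        # Backtracking: double L until the quadratic upper bound holds.
        while True:
            x_next = y - g / L
            if f(x_next) <= f(y) - np.dot(g, g) / (2 * L):
                break
            L *= 2.0
        # Standard momentum-coefficient update for acceleration.
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
        L /= 2.0  # let the estimate shrink again (adaptivity)
    return x
```

On a smooth convex problem, e.g. a quadratic `f(x) = 0.5 * x @ A @ x - b @ x`, this sketch converges to the minimizer without the user supplying the Lipschitz constant in advance, which is the practical appeal of adaptive fast gradient schemes.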
Received: 07.10.2019; Revised: 12.11.2019; Accepted: 10.03.2020
Citation:
A. V. Ogal'tsov, A. I. Turin, “A heuristic adaptive fast gradient method in stochastic optimization problems”, Zh. Vychisl. Mat. Mat. Fiz., 60:7 (2020), 1143–1150; Comput. Math. Math. Phys., 60:7 (2020), 1108–1115
Linking options:
https://www.mathnet.ru/eng/zvmmf11101
https://www.mathnet.ru/eng/zvmmf/v60/i7/p1143