Avtomatika i Telemekhanika, 2018, Issue 1, Pages 100–112
(Mi at14713)
This article is cited in 15 scientific papers.
Topical issue
Algorithms of inertial mirror descent in convex problems of stochastic optimization
A. V. Nazin, Trapeznikov Institute of Control Sciences, Russian Academy of Sciences, Moscow, Russia
Abstract:
The problem of minimizing the mathematical expectation of a convex loss function over a given convex compact set $X\subset\mathbb{R}^N$ is considered. It is assumed that an oracle sequentially returns stochastic subgradients of the loss function at the current points, with uniformly bounded second moments. The aim is to modify the well-known mirror descent method proposed by A. S. Nemirovsky and D. B. Yudin in 1979, which extends the standard gradient method. First, the idea of the new method, called Inertial Mirror Descent (IMD), is demonstrated on a deterministic optimization problem in $\mathbb{R}^N$ with continuous time. In particular, in the Euclidean case the method reduces to the heavy-ball method; notably, the new method requires no additional averaging of the iterates. Then a discrete IMD algorithm is described, and an upper bound on the error in the objective function (i.e., on the difference between the current mean losses and their minimum) is proved.
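The abstract describes the IMD scheme only in general terms; the exact update rule is not reproduced here. The following Python sketch is a minimal, hypothetical illustration of the underlying idea: stochastic mirror descent on the probability simplex with an entropic mirror map and a heavy-ball-style inertial term. The function name, the step-size schedule eta/sqrt(t), the momentum coefficient beta, and the toy linear objective are illustrative assumptions, not the algorithm analyzed in the paper.

    import numpy as np

    def inertial_mirror_descent(stoch_subgrad, x0, n_steps=1000, eta=0.1, beta=0.9):
        # Entropic (simplex) mirror descent with a simple inertial/momentum term.
        x = np.asarray(x0, dtype=float)
        m = np.zeros_like(x)                 # inertial accumulator (heavy-ball analogue)
        for t in range(1, n_steps + 1):
            g = stoch_subgrad(x)             # stochastic subgradient at the current point
            m = beta * m + (1.0 - beta) * g  # exponential averaging of subgradients
            step = eta / np.sqrt(t)          # decaying step size (illustrative choice)
            w = x * np.exp(-step * m)        # multiplicative (entropic) update
            x = w / w.sum()                  # renormalize back onto the simplex
        return x

    # Usage: minimize E[<c + noise, x>] over the simplex.
    rng = np.random.default_rng(0)
    c = np.array([0.3, 0.1, 0.6])
    print(inertial_mirror_descent(lambda x: c + 0.05 * rng.standard_normal(c.size),
                                  np.ones(3) / 3))

The entropic update is one standard instance of a mirror map; the momentum term plays the role of the inertial component, which in the Euclidean setting corresponds to the heavy-ball method mentioned in the abstract.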
Keywords:
stochastic optimization problem, convex optimization, mirror descent, heavy ball method, inertial mirror descent.
Citation:
A. V. Nazin, “Algorithms of inertial mirror descent in convex problems of stochastic optimization”, Avtomat. i Telemekh., 2018, no. 1, 100–112; Autom. Remote Control, 79:1 (2018), 78–88
Linking options:
https://www.mathnet.ru/eng/at14713 https://www.mathnet.ru/eng/at/y2018/i1/p100