Computer Research and Modeling
Computer Research and Modeling, 2021, Volume 13, Issue 6, Pages 1137–1147
DOI: https://doi.org/10.20537/2076-7633-2021-13-6-1137-1147
(Mi crm940)
 

This article is cited in 1 scientific paper.

NUMERICAL METHODS AND THE BASIS FOR THEIR APPLICATION

Ellipsoid method for convex stochastic optimization in small dimension

E. L. Gladin (a, b, c), K. E. Zainullina (a)

a National Research University Moscow Institute of Physics and Technology, 9, Institutskiy per., Dolgoprudny, Moscow Region, 141701, Russia
b Institute for Information Transmission Problems RAS, 9, B. Karetny lane, Moscow, 127051, Russia
c Skolkovo Institute of Science and Technology, 30/1, Bolshoy Boulevard, Moscow, 121205, Russia
Abstract: The article considers minimization of the expectation of a convex function. Problems of this type often arise in machine learning and a variety of other applications. In practice, stochastic gradient descent (SGD) and similar procedures are usually used to solve such problems. We propose to use the ellipsoid method with mini-batching, which converges linearly and can be more efficient than SGD for a class of problems. This is verified by our experiments, which are publicly available. The algorithm requires neither smoothness nor strong convexity of the objective to achieve linear convergence; thus, its complexity does not depend on the condition number of the problem. We prove that the method reaches an approximate solution with a given probability when mini-batches of size proportional to the inverse square of the desired accuracy are used. This enables efficient parallel execution of the algorithm, whereas the possibilities for batch parallelization of SGD are rather limited. Despite its fast convergence, the ellipsoid method can require a greater total number of oracle calls than SGD, which works decently with small batches. The complexity is quadratic in the dimension of the problem, so the method is suitable for relatively small dimensions.
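The idea described in the abstract — the classical ellipsoid method where the subgradient at the current center is replaced by a mini-batch average of stochastic subgradients — can be sketched as follows. This is an illustrative sketch only, not the paper's exact algorithm or constants: the function names, the mini-batch size, and the test objective (a noisy quadratic, whose expectation minimizer is known) are all assumptions made for the example.

```python
import numpy as np

def ellipsoid_minibatch(grad_oracle, x0, R, n_iter=300, batch=64, rng=None):
    """Classical ellipsoid method with mini-batched stochastic subgradients.

    grad_oracle(x, rng) returns one stochastic subgradient at x; a batch of
    them is averaged to form the cutting direction. Start from the ball of
    radius R centered at x0. Requires dimension >= 2.
    """
    rng = np.random.default_rng(rng)
    n = x0.size
    assert n >= 2, "the classical ellipsoid update needs dimension >= 2"
    x = x0.astype(float).copy()
    H = (R ** 2) * np.eye(n)  # matrix of the current ellipsoid {z: (z-x)^T H^{-1} (z-x) <= 1}
    for _ in range(n_iter):
        # mini-batched estimate of a subgradient at the center
        g = np.mean([grad_oracle(x, rng) for _ in range(batch)], axis=0)
        Hg = H @ g
        denom = np.sqrt(g @ Hg)
        if denom < 1e-12:  # (near-)zero gradient: center is (near-)optimal
            break
        gn = Hg / denom
        # standard center and shape updates for a central cut
        x = x - gn / (n + 1)
        H = (n * n / (n * n - 1.0)) * (H - (2.0 / (n + 1)) * np.outer(gn, gn))
    return x

# Example: minimize E ||x - xi||^2 with xi ~ N(mu, 0.1^2 I); the minimizer is mu.
mu = np.array([1.0, -2.0, 0.5])

def quad_oracle(x, rng):
    xi = mu + 0.1 * rng.standard_normal(mu.size)
    return 2.0 * (x - xi)  # stochastic gradient of ||x - xi||^2

x_hat = ellipsoid_minibatch(quad_oracle, np.zeros(3), R=5.0, rng=0)
```

Note how the mini-batch loop at each iteration is embarrassingly parallel: the `batch` oracle calls are independent, which is the parallelization opportunity the abstract contrasts with SGD.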
Keywords: stochastic optimization, convex optimization, ellipsoid method, mini-batching.
Received: 09.11.2020
Revised: 15.11.2021
Accepted: 16.11.2021
Document Type: Article
UDC: 519.85
Language: Russian
Citation: E. L. Gladin, K. E. Zainullina, “Ellipsoid method for convex stochastic optimization in small dimension”, Computer Research and Modeling, 13:6 (2021), 1137–1147
Citation in format AMSBIB
\Bibitem{GlaZai21}
\by E.~L.~Gladin, K.~E.~Zainullina
\paper Ellipsoid method for convex stochastic optimization in small dimension
\jour Computer Research and Modeling
\yr 2021
\vol 13
\issue 6
\pages 1137--1147
\mathnet{http://mi.mathnet.ru/crm940}
\crossref{https://doi.org/10.20537/2076-7633-2021-13-6-1137-1147}
Linking options:
  • https://www.mathnet.ru/eng/crm940
  • https://www.mathnet.ru/eng/crm/v13/i6/p1137