Computer Research and Modeling

Computer Research and Modeling, 2021, Volume 13, Issue 5, Pages 947–963
DOI: https://doi.org/10.20537/2076-7633-2021-13-5-947-963
(Mi crm927)
 

NUMERICAL METHODS AND THE BASIS FOR THEIR APPLICATION

Fast adaptive by constants of strong-convexity and Lipschitz for gradient first order methods

N. V. Pletnev a,b

a Moscow Institute of Physics and Technology, 9 Institute lane, Dolgoprudny, 141701, Russia
b Dorodnicyn Computing Centre of RAS, 40 Vavilov st., Moscow, 119333, Russia
Abstract: The work is devoted to the construction of first-order convex optimization methods that are efficient and applicable to real problems, i.e., methods that use only the values of the objective function and its derivatives. The construction is based on OGM-G, a fast gradient method that is optimal in terms of complexity but requires knowledge of the Lipschitz constant of the gradient and of the strong convexity constant in order to determine the number of steps and the step length; this requirement makes practical use very difficult. The algorithm ACGM, adaptive in the strong convexity constant, is proposed; it is based on restarts of OGM-G with an updated estimate of the strong convexity constant. Also proposed is ALGM, adaptive in the Lipschitz constant of the gradient, in which the restarts of OGM-G are supplemented by a selection of the Lipschitz constant with verification of the smoothness conditions used in the universal gradient descent method. This removes the drawbacks of the original method associated with the need to know these constants and makes practical use possible. The optimality of the complexity estimates for the constructed algorithms is proved. To verify the results obtained, experiments on model functions and real machine learning problems are carried out.
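Below is a minimal Python sketch of the two adaptive mechanisms described in the abstract: backtracking on the Lipschitz estimate L via the smoothness (descent-lemma) check used in universal gradient methods, and restarts that shrink the strong convexity estimate mu when the expected decrease is not observed. It is not the paper's OGM-G/ACGM/ALGM; the inner accelerated method, the halving/doubling factors, the gradient-norm restart test, and the name `adaptive_restart_fgm` are illustrative assumptions.

```python
import numpy as np

def adaptive_restart_fgm(f, grad, x0, L0=1.0, mu0=1.0, outer_iters=20, tol=1e-8):
    """Illustrative sketch only: an accelerated gradient method with
    (i) backtracking on the Lipschitz estimate L (descent-lemma check, as in
        universal gradient methods) and
    (ii) restarts that shrink the strong convexity estimate mu when the
        gradient norm does not decrease as expected.
    This is not the exact OGM-G/ACGM/ALGM of the paper."""
    x = np.asarray(x0, dtype=float).copy()
    L, mu = L0, mu0
    for _ in range(outer_iters):
        # number of inner steps suggested by the current condition-number estimate
        n_inner = int(min(1000, max(1, np.ceil(np.sqrt(L / mu)))))
        g_start = np.linalg.norm(grad(x))
        y, x_prev = x.copy(), x.copy()
        for _ in range(n_inner):
            g = grad(y)
            # backtracking: double L until the smoothness inequality holds
            while True:
                x_new = y - g / L
                if f(x_new) <= f(y) + g @ (x_new - y) + 0.5 * L * np.sum((x_new - y) ** 2) + 1e-12:
                    break
                L *= 2.0
            # momentum coefficient from the current (L, mu) estimates
            theta = max(0.0, (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu)))
            y = x_new + theta * (x_new - x_prev)
            x_prev = x_new
            L /= 2.0  # optimistically relax L before the next step
        x = x_prev
        g_end = np.linalg.norm(grad(x))
        if g_end < tol:
            break
        # restart test (assumed criterion): if the gradient norm did not at
        # least halve, the strong convexity estimate was too optimistic
        if g_end > 0.5 * g_start:
            mu /= 2.0
    return x, L, mu

# usage on an ill-conditioned quadratic f(x) = 0.5 * x^T A x
A = np.diag(np.linspace(1.0, 1e3, 50))
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_sol, L_est, mu_est = adaptive_restart_fgm(f, grad, np.ones(50))
print(np.linalg.norm(grad(x_sol)), L_est, mu_est)
```

In schemes of this kind the delicate part is the restart test and the update rules for L and mu; the paper's contribution is to choose them so that the overall complexity estimates remain optimal.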
Keywords: fast gradient method, adaptivity in the strong convexity constant, adaptivity in the gradient Lipschitz constant.
Received: 19.05.2020
Revised: 03.09.2021
Accepted: 03.09.2021
Document Type: Article
UDC: 519.85
Language: Russian
Citation: N. V. Pletnev, “Fast adaptive by constants of strong-convexity and Lipschitz for gradient first order methods”, Computer Research and Modeling, 13:5 (2021), 947–963
Citation in format AMSBIB
\Bibitem{Ple21}
\by N.~V.~Pletnev
\paper Fast adaptive by constants of strong-convexity and Lipschitz for gradient first order methods
\jour Computer Research and Modeling
\yr 2021
\vol 13
\issue 5
\pages 947--963
\mathnet{http://mi.mathnet.ru/crm927}
\crossref{https://doi.org/10.20537/2076-7633-2021-13-5-947-963}
Linking options:
  • https://www.mathnet.ru/eng/crm927
  • https://www.mathnet.ru/eng/crm/v13/i5/p947