Computer Research and Modeling

Computer Research and Modeling, 2018, Volume 10, Issue 3, Pages 305–314
DOI: https://doi.org/10.20537/2076-7633-2018-10-3-305-314
(Mi crm253)
 

This article is cited in 3 scientific papers.

SPECIAL ISSUE

A hypothesis about the rate of global convergence for optimal methods (Newton's type) in smooth convex optimization

A. V. Gasnikov [a], D. A. Kovalev [b]

[a] Moscow Institute of Physics and Technology, 9 Institutskij per., Dolgoprudny, Moscow Region, 141707, Russia
[b] Institute for Information Transmission Problems of the Russian Academy of Sciences, 19 Bolshoy Karetny per., Moscow, 127051, Russia
Abstract: In this paper we discuss lower bounds on the convergence rate of high-order convex optimization methods and the attainability of these bounds. We formulate a hypothesis that covers all the cases; note that we state it without proof. Newton's method is the best-known method that uses both the gradient and the Hessian of the objective function. However, even for strongly convex functions it converges only locally. Global convergence can be achieved with the cubic regularization of Newton's method [Nesterov, Polyak, 2006], whose iteration cost is comparable to that of Newton's method and reduces to inverting the Hessian of the objective function. In 2008 Yu. Nesterov proposed an accelerated variant of Newton's method with cubic regularization [Nesterov, 2008]. In 2013 R. Monteiro and B. Svaiter improved the global convergence rate of the cubic-regularized method [Monteiro, Svaiter, 2013]. In 2017 Y. Arjevani, O. Shamir and R. Shiff showed that the bound of Monteiro and Svaiter is optimal: it cannot be improved by more than a logarithmic factor by any second-order method [Arjevani et al., 2017]. They also obtained lower bounds for convex optimization methods of order $p\geq2$; for strongly convex functions, however, they obtained bounds only for first- and second-order methods. In 2018 Yu. Nesterov proposed third-order convex optimization methods whose convergence rate is close to these lower bounds and whose iteration cost is similar to that of Newton's method [Nesterov, 2018]. This showed that high-order methods can be practical. In this paper we formulate lower bounds for methods of order $p\geq3$ for strongly convex unconstrained optimization problems. The paper can also be viewed as a short survey of the state of the art in high-order optimization methods.
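The cubic-regularized Newton step discussed in the abstract can be sketched in a few lines. The following illustrative implementation (not from the paper; the function and parameter names are our own) minimizes the cubic model $g^\top h + \tfrac12 h^\top H h + \tfrac{M}{6}\|h\|^3$ by bisection on $r = \|h\|$, using the optimality condition $h = -(H + \tfrac{Mr}{2}I)^{-1} g$, which is valid when the Hessian is positive semidefinite (the convex case):

```python
import numpy as np

def cubic_newton_step(grad, hess, x, M, n_bisect=60):
    """One step of cubic-regularized Newton's method [Nesterov, Polyak, 2006].

    Minimizes the model  g^T h + 0.5 h^T H h + (M/6) ||h||^3  over h.
    The minimizer satisfies h = -(H + (M r / 2) I)^{-1} g with r = ||h||,
    and ||h(r)|| - r is strictly decreasing in r, so bisection finds r.
    """
    g, H = grad(x), hess(x)
    eye = np.eye(len(x))

    def step_for(r):
        # Regularized Newton system; PD for r > 0 when H is PSD.
        return np.linalg.solve(H + 0.5 * M * r * eye, -g)

    # Grow the bracket until ||h(hi)|| <= hi, then bisect on r.
    lo, hi = 0.0, 1.0
    while np.linalg.norm(step_for(hi)) > hi:
        hi *= 2.0
    for _ in range(n_bisect):
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(step_for(mid)) > mid:
            lo = mid
        else:
            hi = mid
    return x + step_for(hi)
```

Unlike the plain Newton step, each iterate here decreases the objective globally (for an appropriate $M$ bounding the Lipschitz constant of the Hessian), which is what yields the global convergence guarantee mentioned above.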
Keywords: Newton's method, Hessian matrix, lower bounds, Chebyshev-type methods, superlinear rate of convergence.
Funding: This work was supported by the Russian Science Foundation (RSF), grant No. 17-11-01027.
Received: 28.02.2018
Revised: 12.05.2018
Accepted: 24.05.2018
Document Type: Article
UDC: 519.86
Language: Russian
Citation: A. V. Gasnikov, D. A. Kovalev, “A hypothesis about the rate of global convergence for optimal methods (Newton's type) in smooth convex optimization”, Computer Research and Modeling, 10:3 (2018), 305–314
Citation in format AMSBIB
\Bibitem{GasKov18}
\by A.~V.~Gasnikov, D.~A.~Kovalev
\paper A hypothesis about the rate of global convergence for optimal methods (Newton's type) in smooth convex optimization
\jour Computer Research and Modeling
\yr 2018
\vol 10
\issue 3
\pages 305--314
\mathnet{http://mi.mathnet.ru/crm253}
\crossref{https://doi.org/10.20537/2076-7633-2018-10-3-305-314}
Linking options:
  • https://www.mathnet.ru/eng/crm253
  • https://www.mathnet.ru/eng/crm/v10/i3/p305