Computer Research and Modeling, 2022, Volume 14, Issue 2, Pages 497–515
DOI: https://doi.org/10.20537/2076-7633-2022-14-2-497-515
(Mi crm979)
 

MATHEMATICAL MODELING AND NUMERICAL SIMULATION

On accelerated adaptive methods and their modifications for alternating minimization

N. K. Tupitsa$^{a,b,c}$

a Moscow Institute of Physics and Technology, 9 Institutskiy per., Dolgoprudny, Moscow region, 141701, Russia
b Institute for Information Transmission Problems of the Russian Academy of Sciences (Kharkevich Institute), 19/1 Bol’shoy Karetnyy pereulok, Moscow, 127051, Russia
c HSE University, 20 Myasnitskaya st., Moscow, 101000, Russia
Abstract: In the first part of the paper we present a convergence analysis of the AGMsDR method on a new class of functions: in general non-convex functions with $M$-Lipschitz-continuous gradients that satisfy the Polyak–Łojasiewicz condition. The method does not need the value of $\mu^{PL}>0$ in the condition and converges linearly with the scale factor $\left(1-\frac{\mu^{PL}}{M}\right)$. It was previously proved that the method converges as $O\left(\frac{1}{k^2}\right)$ if the function is convex and has an $M$-Lipschitz-continuous gradient, and converges linearly with the scale factor $\left(1-\sqrt{\frac{\mu^{SC}}{M}}\right)$ if the value of the strong convexity parameter $\mu^{SC}>0$ is known. The novelty is that linear convergence is retained even when $\mu^{PL}$ (or $\mu^{SC}$) is not known, albeit without the square root in the scale factor.
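For reference, the Polyak–Łojasiewicz condition and the classical one-step argument behind the $\left(1-\frac{\mu^{PL}}{M}\right)$ factor can be stated as follows (a textbook derivation for plain gradient descent, included only to fix notation; the accelerated analysis in the paper is more involved). With $f^*$ the minimal value of $f$, the condition requires
$$\frac{1}{2}\|\nabla f(x)\|_2^2 \ge \mu^{PL}\left(f(x)-f^*\right) \quad \text{for all } x.$$
Together with the descent lemma for an $M$-Lipschitz-continuous gradient, $f\left(x-\frac{1}{M}\nabla f(x)\right) \le f(x)-\frac{1}{2M}\|\nabla f(x)\|_2^2$, one gradient step with step size $\frac{1}{M}$ yields
$$f(x_{k+1})-f^* \le \left(1-\frac{\mu^{PL}}{M}\right)\left(f(x_k)-f^*\right).$$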
The second part presents a modification of the AGMsDR method for solving problems that allow alternating minimization (Alternating AGMsDR). Similar convergence results are proved.
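To make the alternating-minimization setting concrete, below is a minimal sketch of plain (non-accelerated) two-block alternating minimization; the function names, the two-block structure, and the quadratic example are illustrative assumptions, not the paper's Alternating AGMsDR.

import numpy as np

def alternating_minimization(argmin_x, argmin_y, x0, y0, n_iters=100):
    # Plain alternating minimization of f(x, y) over two blocks.
    # argmin_x(y) returns argmin over x with y fixed; argmin_y(x) the reverse.
    # Both exact block minimizations are assumed to be cheap to compute.
    x, y = x0, y0
    for _ in range(n_iters):
        x = argmin_x(y)   # minimize over the first block, second fixed
        y = argmin_y(x)   # minimize over the second block, first fixed
    return x, y

# Toy example: f(x, y) = ||A x - y||^2 + ||y - b||^2, both block
# minimizations have closed-form solutions.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, -1.0])
x, y = alternating_minimization(
    argmin_x=lambda y: np.linalg.solve(A.T @ A, A.T @ y),
    argmin_y=lambda x: (A @ x + b) / 2.0,
    x0=np.zeros(2), y0=np.zeros(2))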
As a result, we present adaptive accelerated methods that converge as $O\left(\min\left\{\frac{M}{k^2},\, \left(1-\frac{\mu^{PL}}{M}\right)^{k-1}\right\}\right)$ on the class of convex functions with an $M$-Lipschitz-continuous gradient that satisfy the Polyak–Łojasiewicz condition. The algorithms do not need the values of $M$ and $\mu^{PL}$. If the Polyak–Łojasiewicz condition does not hold, the convergence is $O\left(\frac{1}{k^2}\right)$, and no tuning is needed.
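"Adaptive" here means that $M$ need not be known in advance. One standard device of this kind is a backtracking estimate of the local Lipschitz constant; the sketch below applies it to plain gradient descent as an assumption-labeled illustration (the paper's accelerated methods use a more involved adaptive rule).

import numpy as np

def adaptive_gradient_descent(f, grad, x0, L0=1.0, n_iters=100):
    # Gradient descent that estimates the Lipschitz constant on the fly:
    # L is halved optimistically each iteration and doubled until the
    # standard sufficient-decrease inequality holds, so neither M nor
    # mu^PL has to be supplied.
    x, L = x0, L0
    for _ in range(n_iters):
        g = grad(x)
        L = max(L / 2.0, 1e-12)          # optimistic decrease of the estimate
        while f(x - g / L) > f(x) - np.dot(g, g) / (2.0 * L):
            L *= 2.0                     # backtrack: the estimate was too small
        x = x - g / L
    return x

# Usage on a simple quadratic (true Lipschitz constant M = 1).
x_star = adaptive_gradient_descent(
    f=lambda x: 0.5 * np.dot(x, x),
    grad=lambda x: x,
    x0=np.array([3.0, -4.0]))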
We also consider the adaptive catalyst envelope for non-accelerated gradient methods. The envelope allows acceleration up to $O\left(\frac{1}{k^2}\right)$. We present a numerical comparison of non-accelerated adaptive gradient descent accelerated by the adaptive catalyst envelope against AGMsDR, Alternating AGMsDR, APDAGD (Adaptive Primal-Dual Accelerated Gradient Descent), and Sinkhorn's algorithm on the problem dual to the optimal transport problem.
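For context, Sinkhorn's algorithm solves the entropically regularized optimal transport problem, and its iterations can be viewed as alternating minimization on the dual. A minimal sketch follows; the regularization value, sizes, and variable names are illustrative.

import numpy as np

def sinkhorn(C, r, c, gamma=0.1, n_iters=1000):
    # Sinkhorn iterations for entropically regularized optimal transport.
    # C: cost matrix; r, c: source/target marginals (positive, sum to 1);
    # gamma: entropic regularization strength. Each pair of updates is
    # exactly one alternating-minimization step on the dual problem.
    K = np.exp(-C / gamma)               # Gibbs kernel
    u = np.ones_like(r)
    v = np.ones_like(c)
    for _ in range(n_iters):
        u = r / (K @ v)                  # match the row marginals
        v = c / (K.T @ u)                # match the column marginals
    return u[:, None] * K * v[None, :]   # transport plan

# Toy usage with uniform marginals and a random cost matrix.
rng = np.random.default_rng(0)
C = rng.random((5, 5))
P = sinkhorn(C, r=np.full(5, 0.2), c=np.full(5, 0.2))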
The conducted experiments show faster convergence of Alternating AGMsDR in comparison with the described catalyst approach and with AGMsDR, despite the same asymptotic rate $O\left(\frac{1}{k^2}\right)$. Such behavior can be explained by the linear convergence of the AGMsDR method and was verified on quadratic functions. Alternating AGMsDR demonstrated better performance in comparison with AGMsDR.
Keywords: convex optimization, alternating minimization, accelerated methods, adaptive methods, Polyak–Łojasiewicz condition.
Funding: This research was funded by the Russian Science Foundation (project 18-71-10108).
Received: 15.03.2020
Revised: 12.12.2021
Accepted: 13.02.2022
Document Type: Article
UDC: 519.8
Language: Russian
Citation: N. K. Tupitsa, “On accelerated adaptive methods and their modifications for alternating minimization”, Computer Research and Modeling, 14:2 (2022), 497–515
Citation in format AMSBIB
\Bibitem{Tup22}
\by N.~K.~Tupitsa
\paper On accelerated adaptive methods and their modifications for alternating minimization
\jour Computer Research and Modeling
\yr 2022
\vol 14
\issue 2
\pages 497--515
\mathnet{http://mi.mathnet.ru/crm979}
\crossref{https://doi.org/10.20537/2076-7633-2022-14-2-497-515}
Linking options:
  • https://www.mathnet.ru/eng/crm979
  • https://www.mathnet.ru/eng/crm/v14/i2/p497