Computer Research and Modeling, 2022, Volume 14, Issue 2, Pages 473–495
DOI: https://doi.org/10.20537/2076-7633-2022-14-2-473-495
(Mi crm978)
 

This article is cited in 1 scientific paper.

MATHEMATICAL MODELING AND NUMERICAL SIMULATION

Subgradient methods for non-smooth optimization problems with some relaxation of sharp minimum

S. S. Ablaev (a, b), D. V. Makarenko (a), F. S. Stonyakin (a, b), M. S. Alkousa (a, c), I. V. Baran (b)

a Moscow Institute of Physics and Technology, 9 Institutskiy per., Dolgoprudny, Moscow region, 141701, Russia
b V. I. Vernadsky Crimean Federal University, 4 Academician Vernadsky Avenue, Simferopol, Republic of Crimea, 295007, Russia
c HSE University, 20 Myasnitskaya st., Moscow, 101000, Russia
Abstract: Non-smooth optimization arises in many applied problems, and developing efficient computational procedures for such problems in high-dimensional spaces is a topical issue. First-order (subgradient) methods are well suited here, but in fairly general settings they come with weak rate guarantees for large-scale problems. One approach to this type of problem is to identify a subclass of non-smooth problems that admits relatively optimistic convergence-rate results. For example, one option for an additional assumption is the sharp-minimum condition proposed in the late 1960s by B. T. Polyak. When the minimal value of the function is available, for Lipschitz-continuous problems with a sharp minimum it turns out to be possible to propose a subgradient method with the Polyak step-size, which guarantees a linear rate of convergence in the argument. This approach covers a number of important applied problems (for example, the problem of projecting onto a convex compact set). However, both the availability of the minimal value of the function and the sharp-minimum condition itself look rather restrictive. In this regard, we propose a generalized sharp-minimum condition, somewhat similar to the inexact oracle recently proposed by Devolder–Glineur–Nesterov. The proposed approach extends the applicability of subgradient methods with the Polyak step-size to the setting of inexact information about the value of the minimum, as well as an unknown Lipschitz constant of the objective function. Moreover, the use of local analogues of the global characteristics of the objective function makes it possible to apply results of this type to wider classes of problems. We show that the proposed approach applies to strongly convex non-smooth problems, and we carry out an experimental comparison with the known optimal subgradient method for this class of problems. We also obtain results on the applicability of the proposed technique to some types of problems with relaxed convexity: the recently proposed notion of weak $\beta$-quasi-convexity and ordinary quasi-convexity. Further, we study a generalization of the described technique to the situation where a $\delta$-subgradient of the objective function is available instead of the usual subgradient. For one of the considered methods, we find conditions under which, in practice, one can dispense with the projection of the iterative sequence onto the feasible set of the problem.
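
The classical building block the abstract starts from, the subgradient method with the Polyak step-size, can be sketched as follows. This is a minimal Python illustration, not the authors' generalized scheme: the function names and the $\ell_1$ test problem are ours, and the exact minimal value f_star is assumed known, which is precisely the assumption the paper relaxes.

import numpy as np

def polyak_subgradient(f, subgrad, x0, f_star, n_iters=500):
    # Subgradient method with the Polyak step-size:
    #   x_{k+1} = x_k - ((f(x_k) - f_star) / ||g_k||^2) * g_k,  g_k in df(x_k).
    # Under a sharp minimum (f(x) - f_star >= alpha * dist(x, X_*)) for a
    # Lipschitz-continuous f, this step choice gives linear convergence
    # in the argument.
    x = x0.copy()
    gaps = [f(x) - f_star]
    for _ in range(n_iters):
        g = subgrad(x)
        g_norm2 = float(np.dot(g, g))
        if g_norm2 == 0.0:          # 0 is a subgradient: x is a minimizer
            break
        x = x - ((f(x) - f_star) / g_norm2) * g
        gaps.append(f(x) - f_star)
    return x, gaps

# Toy sharp-minimum problem: f(x) = ||x - c||_1 is Lipschitz and has a
# sharp minimum at x = c with f_star = 0.
rng = np.random.default_rng(0)
c = rng.standard_normal(50)
f = lambda x: np.linalg.norm(x - c, 1)
subgrad = lambda x: np.sign(x - c)   # a valid subgradient of the l1-norm
x_best, gaps = polyak_subgradient(f, subgrad, np.zeros(50), f_star=0.0)
print(f"gap after {len(gaps) - 1} steps: {gaps[-1]:.3e}")

Roughly speaking, if f_star above is replaced by an inexact estimate of the minimal value, the iterates can only be expected to reach a neighbourhood of the solution whose size depends on the error of that estimate; this is the situation addressed by the generalized sharp-minimum condition studied in the paper.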
Keywords: subgradient method, sharp minimum, quasi-convex function, weak $\beta$-quasi-convex function, Lipschitz-continuous function, $\delta$-subgradient.
Funding: Russian Science Foundation, grant 22-21-20065.
The results of Theorems 5, 6, and 7 were supported by the Russian Science Foundation and the city of Moscow (project 22-21-20065, https://rscf.ru/project/22-21-20065/).
Received: 12.02.2022
Revised: 30.03.2022
Accepted: 05.04.2022
Document Type: Article
UDC: 519.85
Language: Russian
Citation: S. S. Ablaev, D. V. Makarenko, F. S. Stonyakin, M. S. Alkousa, I. V. Baran, “Subgradient methods for non-smooth optimization problems with some relaxation of sharp minimum”, Computer Research and Modeling, 14:2 (2022), 473–495
Citation in format AMSBIB
\Bibitem{AblMakSto22}
\by S.~S.~Ablaev, D.~V.~Makarenko, F.~S.~Stonyakin, M.~S.~Alkousa, I.~V.~Baran
\paper Subgradient methods for non-smooth optimization problems with some relaxation of sharp minimum
\jour Computer Research and Modeling
\yr 2022
\vol 14
\issue 2
\pages 473--495
\mathnet{http://mi.mathnet.ru/crm978}
\crossref{https://doi.org/10.20537/2076-7633-2022-14-2-473-495}
Linking options:
  • https://www.mathnet.ru/eng/crm978
  • https://www.mathnet.ru/eng/crm/v14/i2/p473