Diskretnyi Analiz i Issledovanie Operatsii, 2022, Volume 29, Issue 3, Pages 24–44
DOI: https://doi.org/10.33048/daio.2022.29.739
(Mi da1301)
 

Optimization of subgradient method parameters on the base of rank-two correction of metric matrices

V. N. Krutikov^a, P. S. Stanimirović^b, O. N. Indenko^a, E. M. Tovbis^c, L. A. Kazakovtsev^c

a Kemerovo State University, 6 Krasnaya Street, 650043 Kemerovo, Russia
b Faculty of Sciences and Mathematics, University of Niš, 33 Višegradska Street, 18000 Niš, Serbia
c Reshetnev Siberian State University of Science and Technology, 31 Krasnoyarskiy Rabochiy Avenue, 660031 Krasnoyarsk, Russia
Abstract: We propose a relaxation subgradient method (RSM) with parameter optimization based on rank-two corrections of metric matrices whose structure is analogous to that of quasi-Newton (QN) methods. The metric-matrix transformation suppresses the components orthogonal to, and amplifies the components collinear with, the minimal-length subgradient vector. Constructing the metric matrix is formulated as the problem of solving a system of inequalities; this system is solved by a new learning algorithm, for which an estimate of the convergence rate is obtained in terms of the parameters of the subgradient set. A new RSM is developed and investigated on this basis. Computational experiments on complex large-scale functions confirm the effectiveness of the proposed algorithm. Tab. 4, bibliogr. 32.
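For orientation only, the sketch below shows the general shape of such a scheme in Python/NumPy: the step is taken along -H g for a subgradient g, and the metric matrix H is then corrected. A standard quasi-Newton DFP rank-two formula is used here purely as a placeholder, since the abstract says only that the matrices have a QN-like structure; the function name metric_subgradient_sketch, the step rule, and the tolerances are hypothetical and are not taken from the article.

import numpy as np

def metric_subgradient_sketch(f, subgrad, x0, n_iter=300, step=1.0):
    # Illustrative sketch of a subgradient method scaled by a metric matrix H.
    # The DFP rank-two update below is a stand-in for the learning algorithm
    # constructed in the paper (NOT the authors' formula).
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)                          # metric matrix, initially the identity
    g = np.asarray(subgrad(x), dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(n_iter):
        d = H @ g                               # metric-scaled subgradient direction
        nd = np.linalg.norm(d)
        if nd < 1e-12:
            break
        s = -(step / np.sqrt(k + 1.0)) * d / nd # diminishing step along -H g
        x_new = x + s
        g_new = np.asarray(subgrad(x_new), dtype=float)
        y = g_new - g
        sy = float(s @ y)
        yHy = float(y @ H @ y)
        if sy > 1e-12 and yHy > 1e-12:          # keep H symmetric positive definite
            H = H + np.outer(s, s) / sy - (H @ np.outer(y, y) @ H) / yHy
        x, g = x_new, g_new
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Usage example on a nonsmooth convex function, f(x) = ||x - 1||_1 in R^5:
f = lambda x: float(np.abs(x - 1.0).sum())
sg = lambda x: np.sign(x - 1.0)                 # a valid subgradient of f at x
x_best, f_best = metric_subgradient_sketch(f, sg, np.zeros(5))

The paper's actual correction, which suppresses the components of the metric orthogonal to the minimal-length subgradient and amplifies the collinear ones, would replace the DFP placeholder in this loop.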
Keywords: convex optimization, nonsmooth optimization, relaxation subgradient method.
Funding agency / Grant number:
  • Ministry of Science and Higher Education of the Russian Federation: FEFE-2020-0013
  • Science Fund of the Republic of Serbia: 7750185
  • Ministry of Education, Science and Technological Development of Serbia: 451-03-68/2020-14/200124
This research is supported by the Ministry of Science and Higher Education of Russia (State Contract FEFE-2020-0013). The work of the second author is supported by the Science Fund of the Republic of Serbia (Grant 7750185) and the Ministry of Education, Science and Technological Development of the Republic of Serbia (Contract 451-03-68/2020-14/200124).
Received: 10.05.2022
Revised: 10.05.2022
Accepted: 12.05.2022
Document Type: Article
UDC: 519.8
Language: Russian
Citation: V. N. Krutikov, P. S. Stanimirović, O. N. Indenko, E. M. Tovbis, L. A. Kazakovtsev, “Optimization of subgradient method parameters on the base of rank-two correction of metric matrices”, Diskretn. Anal. Issled. Oper., 29:3 (2022), 24–44
Citation in format AMSBIB
\Bibitem{KruStaInd22}
\by V.~N.~Krutikov, P.~S.~Stanimirovi{\'c}, O.~N.~Indenko, E.~M.~Tovbis, L.~A.~Kazakovtsev
\paper Optimization of subgradient method parameters on the base of rank-two correction of~metric~matrices
\jour Diskretn. Anal. Issled. Oper.
\yr 2022
\vol 29
\issue 3
\pages 24--44
\mathnet{http://mi.mathnet.ru/da1301}
\crossref{https://doi.org/10.33048/daio.2022.29.739}
\mathscinet{http://mathscinet.ams.org/mathscinet-getitem?mr=4497550}
Linking options:
  • https://www.mathnet.ru/eng/da1301
  • https://www.mathnet.ru/eng/da/v29/i3/p24