Vestnik Tomskogo Gosudarstvennogo Universiteta. Matematika i Mekhanika, 2014, Number 3(29), Pages 5–19 (Mi vtgu388)  

This article is cited in 3 scientific papers.

MATHEMATICS

The subgradient multistep minimization method for nonsmooth high-dimensional problems

V. N. Krutikov, Ya. N. Vershinin

Kemerovo State University, Kemerovo, Russian Federation
Abstract: In this paper, a new multistep relaxation subgradient minimization method is proposed. It is based on the principles underlying "conjugate subgradient" methods. The presented method belongs to the class of relaxation methods of the $\varepsilon$-subgradient type (RSM) and is intended for solving nonsmooth high-dimensional problems.
The space-tension RSMs available at present are comparable with quasi-Newton methods in the rate of convergence on smooth functions and are efficient in solving nonsmooth problems of the ravine type. On high-dimensional problems, however, their effectiveness is reduced by the need to store and transform the metric matrix. In the smooth case, the conjugate gradient method replaces quasi-Newton methods for high-dimensional problems. Existing multistep RSMs are significantly inferior to subgradient space-tension methods in the rate of convergence and tend to loop on ravine-type nonsmooth optimization problems; for this reason, they are practically not applied even to problems of small dimension. These circumstances make it important to develop effective multistep RSMs. In the proposed relaxation subgradient method, additional learning relations are used at each iteration to improve the efficiency of the learning algorithm of a known method based on extending the Kaczmarz algorithm to systems of inequalities. This innovation expands the range of solvable nonsmooth optimization problems and increases the rate of convergence in solving both smooth and nonsmooth minimization problems.
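For orientation only, a minimal sketch of the classical extension of the Kaczmarz projection step to an inequality system (the Agmon-Motzkin-Schoenberg relaxation scheme) is given below. It illustrates the kind of learning iteration the abstract refers to; the paper's own learning algorithm embeds additional learning relations into the subgradient iteration and is not reproduced here. All names and parameters are illustrative assumptions.

import numpy as np

def kaczmarz_inequalities(A, b, x0, sweeps=100, tol=1e-10):
    """Relaxation (Kaczmarz-type) iteration for the inequality system A x >= b.

    A violated row (a_i^T x < b_i) is corrected by projecting x onto the
    half-space {x : a_i^T x >= b_i}; satisfied rows are left unchanged.
    Illustrative sketch only, not the method of the paper.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(sweeps):
        max_violation = 0.0
        for a_i, b_i in zip(A, b):
            residual = b_i - a_i @ x                      # positive if the row is violated
            if residual > tol:
                x += (residual / (a_i @ a_i)) * a_i       # project onto the half-space
                max_violation = max(max_violation, residual)
        if max_violation <= tol:                          # all inequalities satisfied
            break
    return x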
Numerical results indicate an increase in the efficiency of the minimization method due to the orthogonalization of the learning vectors in the algorithm that solves the inequalities, which is particularly evident in nonsmooth problems of high dimension with strongly elongated level surfaces. In terms of convergence on high-dimensional quadratic functions with a large spread of eigenvalues, the developed algorithm is superior to existing multistep relaxation subgradient methods and is comparable in effectiveness to the conjugate gradient method.
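The orthogonalization mentioned above can be pictured, under the assumption that it resembles a standard Gram-Schmidt sweep over the accumulated learning vectors, as in the following sketch. The paper's own scheme is built into its subgradient iteration and may differ in detail; the function below is purely illustrative.

import numpy as np

def orthogonalize(vectors, tol=1e-12):
    """Sequential Gram-Schmidt orthonormalization of a list of learning vectors.

    Nearly dependent vectors are discarded. Illustrative assumption only;
    it is not taken from the paper.
    """
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float).copy()
        for q in basis:
            w -= (q @ w) * q                  # remove the component along q
        norm = np.linalg.norm(w)
        if norm > tol:                        # keep only independent directions
            basis.append(w / norm)
    return basis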
Keywords: Kaczmarz algorithm, multistep algorithm, minimization method, convergence rate.
Received: 13.04.2013
Document Type: Article
UDC: 519.6
Language: Russian
Citation: V. N. Krutikov, Ya. N. Vershinin, “The subgradient multistep minimization method for nonsmooth high-dimensional problems”, Vestn. Tomsk. Gos. Univ. Mat. Mekh., 2014, no. 3(29), 5–19
Citation in format AMSBIB
\Bibitem{KruVer14}
\by V.~N.~Krutikov, Ya.~N.~Vershinin
\paper The subgradient multistep minimization method for nonsmooth high-dimensional problems
\jour Vestn. Tomsk. Gos. Univ. Mat. Mekh.
\yr 2014
\issue 3(29)
\pages 5--19
\mathnet{http://mi.mathnet.ru/vtgu388}
Linking options:
  • https://www.mathnet.ru/eng/vtgu388
  • https://www.mathnet.ru/eng/vtgu/y2014/i3/p5