Preprints of the Keldysh Institute of Applied Mathematics

Preprints of the Keldysh Institute of Applied Mathematics, 2018, 164, 26 pp.
DOI: https://doi.org/10.20948/prepr-2018-164
(Mi ipmp2523)
 

How to optimize preconditioners for the conjugate gradient method: a stochastic approach

I. V. Oseledets, M. A. Botchev, A. M. Katrutsa, G. V. Ovchinnikov
Abstract: The conjugate gradient method (CG) is usually used with a preconditioner, which improves the efficiency and robustness of the method. Many preconditioners include parameters, and a proper choice of a preconditioner and its parameters is often a nontrivial task. Although many convergence estimates exist that can be used for optimizing preconditioners, they typically hold for all initial guess vectors and hence reflect the worst-case convergence rate. To account for the mean convergence rate instead, in this paper we follow a simple stochastic approach: it is based on trial runs with random initial guess vectors and leads to a functional that can be used to monitor convergence and to optimize preconditioner parameters in CG. The presented numerical experiments show that optimizing this new functional usually yields a better parameter value than optimizing the functional based on the spectral condition number.
Keywords: conjugate gradient method, preconditioners, condition number, eigenvalue clustering, relaxed incomplete Cholesky preconditioner.
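The stochastic recipe described in the abstract — averaging CG behaviour over random initial guesses and minimizing the resulting functional over preconditioner parameters — can be sketched as follows. This is a minimal illustration, not the preprint's implementation: the 1D Poisson test matrix, the SSOR preconditioner (standing in for the relaxed incomplete Cholesky preconditioner studied in the paper), and the plain mean-iteration-count functional are all assumptions made for the sketch.

```python
import numpy as np
from scipy.sparse import diags, tril
from scipy.sparse.linalg import cg, LinearOperator, spsolve_triangular

# Hypothetical test problem (not from the preprint): 1D Poisson matrix.
n = 100
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

def ssor_preconditioner(omega):
    # SSOR: M(omega) ~ (D/omega + L) (D/omega)^{-1} (D/omega + L)^T,
    # where A = L + D + L^T; the scalar prefactor is dropped, since
    # CG iterates are invariant under scaling of M.
    D = diags(A.diagonal())
    Lw = (tril(A, k=-1) + D / omega).tocsr()   # D/omega + L, lower triangular
    Uw = Lw.T.tocsr()                          # its transpose, upper triangular
    dw = A.diagonal() / omega
    def apply(v):
        y = spsolve_triangular(Lw, v, lower=True)            # forward solve
        return spsolve_triangular(Uw, dw * y, lower=False)   # backward solve
    return LinearOperator((n, n), matvec=apply)

def mean_iterations(omega, n_trials=10, seed=0):
    # Average PCG iteration count over random initial guesses: a crude
    # stand-in for the mean-convergence functional proposed in the paper.
    M = ssor_preconditioner(omega)
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(n_trials):
        count = 0
        def cb(xk):
            nonlocal count
            count += 1
        cg(A, b, x0=rng.standard_normal(n), M=M, callback=cb)
        total += count
    return total / n_trials

# Optimize the relaxation parameter by a simple grid search.
omegas = [0.6, 0.8, 1.0, 1.2, 1.4]
best = min(omegas, key=mean_iterations)
```

A grid search is used here only for brevity; any derivative-free optimizer over the preconditioner parameter would play the same role, and the functional could equally average residual-reduction factors rather than raw iteration counts.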
Funding agency: Russian Foundation for Basic Research (grant 17-01-00854_a)
Document Type: Preprint
Language: Russian
Citation: I. V. Oseledets, M. A. Botchev, A. M. Katrutsa, G. V. Ovchinnikov, “How to optimize preconditioners for the conjugate gradient method: a stochastic approach”, Keldysh Institute preprints, 2018, 164, 26 pp.
Citation in format AMSBIB
\Bibitem{OseIzoKat18}
\by I.~V.~Oseledets, M.~A.~Botchev, A.~M.~Katrutsa, G.~V.~Ovchinnikov
\paper How to optimize preconditioners for the conjugate gradient method: a stochastic approach
\jour Keldysh Institute preprints
\yr 2018
\papernumber 164
\totalpages 26
\mathnet{http://mi.mathnet.ru/ipmp2523}
\crossref{https://doi.org/10.20948/prepr-2018-164}
\elib{https://elibrary.ru/item.asp?id=35421405}
Linking options:
  • https://www.mathnet.ru/eng/ipmp2523
  • https://www.mathnet.ru/eng/ipmp/y2018/p164