Proceedings of the Institute for System Programming of the RAS
Proceedings of the Institute for System Programming of the RAS, 2019, Volume 31, Issue 4, Pages 113–120
DOI: https://doi.org/10.15514/ISPRAS-2019-31(4)-7
(Mi tisp442)
 

This article is cited in 3 scientific papers.

Bayes regularization in the selection of weight coefficients in the predictor ensembles

A. S. Nuzhny

Nuclear Safety Institute of the Russian Academy of Sciences
Full-text PDF (857 kB) Citations (3)
Abstract: The article discusses the supervised learning problem: from a finite set of examples of a mapping (a training sample), one must recover the dependence that maps a vector set to a scalar. This problem belongs to the class of inverse problems and, like most inverse problems, is mathematically ill-posed. In practice this means that a solution constructed by least squares over the points of the training sample may suffer from overfitting: the model describes the training set well but gives a large error on the test set. We apply an approach in which the solution is sought in the form of an ensemble of predictive models. The ensembles are built by the bagging method, with perceptrons and decision trees considered as the base learners. The final prediction is obtained by weighted voting of the predictors, where the weights are selected by minimizing the model error on the training set. To avoid overfitting in the weight selection, Bayesian regularization of the solution is applied. To choose the regularization parameters, we propose the method of orthogonalized basis functions, which yields their optimal values without expensive iterative procedures.
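The weighted-voting scheme described in the abstract can be sketched as follows. This is not the paper's implementation: the base learners, the data, and the fixed regularization parameter `lam` are illustrative assumptions. The ensemble weights are the minimizer of the regularized training error, which under a Gaussian prior on the weights (Bayesian regularization) reduces to a ridge-type closed-form solution.

```python
# Sketch, assuming: toy 1-D regression data, degree-3 polynomial fits as the
# bagged base predictors, and a hand-picked regularization parameter lam.
import numpy as np

rng = np.random.default_rng(0)

# Toy training sample: y = sin(x) + noise.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

def fit_poly(Xb, yb, deg=3):
    # One base predictor: a least-squares polynomial fit.
    coeffs = np.polyfit(Xb[:, 0], yb, deg)
    return lambda Xq: np.polyval(coeffs, Xq[:, 0])

# Bagging: train each base predictor on a bootstrap resample.
predictors = []
for _ in range(10):
    idx = rng.integers(0, len(X), size=len(X))
    predictors.append(fit_poly(X[idx], y[idx]))

# Matrix of predictor outputs on the training set (n_samples x n_models).
F = np.column_stack([p(X) for p in predictors])

# Weighted voting: w minimizes ||F w - y||^2 + lam ||w||^2, i.e. the MAP
# estimate under a zero-mean Gaussian prior on w:
#   w = (F^T F + lam I)^{-1} F^T y
lam = 1e-2
w = np.linalg.solve(F.T @ F + lam * np.eye(F.shape[1]), F.T @ y)

ensemble_pred = F @ w
print("train MSE:", np.mean((ensemble_pred - y) ** 2))
```

The paper additionally selects the regularization parameter itself via orthogonalized basis functions rather than fixing it by hand, as done here for brevity.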
Keywords: supervised learning, bagging, ill-posed problem, Bayesian regularization of learning.
Document Type: Article
Language: Russian
Citation: A. S. Nuzhny, “Bayes regularization in the selection of weight coefficients in the predictor ensembles”, Proceedings of ISP RAS, 31:4 (2019), 113–120
Citation in format AMSBIB
\Bibitem{Nuz19}
\by A.~S.~Nuzhny
\paper Bayes regularization in the selection of weight coefficients in the predictor ensembles
\jour Proceedings of ISP RAS
\yr 2019
\vol 31
\issue 4
\pages 113--120
\mathnet{http://mi.mathnet.ru/tisp442}
\crossref{https://doi.org/10.15514/ISPRAS-2019-31(4)-7}
Linking options:
  • https://www.mathnet.ru/eng/tisp442
  • https://www.mathnet.ru/eng/tisp/v31/i4/p113