Informatics and Automation

Informatics and Automation, 2023, Volume 22, Issue 2, Pages 393–415
DOI: https://doi.org/10.15622/ia.22.2.6
(Mi trspy1242)
 

This article is cited in 1 scientific paper

Mathematical Modeling, Numerical Methods

Optimization of the regression ensemble size

Yu. Zelenkov

HSE University
Abstract: Ensemble learning algorithms such as bagging often generate unnecessarily large models, which consume extra computational resources and may degrade the generalization ability. Pruning can potentially reduce the ensemble size as well as improve performance; however, researchers have previously focused more on pruning classifiers than on pruning regressors. This is because, in general, ensemble pruning is based on two metrics: diversity and accuracy. Many diversity metrics are known for problems dealing with a finite set of classes defined by discrete labels. Therefore, most work on ensemble pruning is focused on such problems: classification, clustering, and feature selection. For the regression problem, it is much more difficult to introduce a diversity metric. In fact, the only such metric known to date is a correlation matrix based on regressor predictions. This study seeks to address this gap. First, we introduce a mathematical condition for checking whether the regression ensemble includes redundant estimators, i.e., estimators whose removal improves the ensemble performance. Developing this approach, we propose a new ambiguity-based pruning (AP) algorithm that is based on the error-ambiguity decomposition formulated for the regression problem. To assess the quality of AP, we compare it with two methods that directly minimize the error by sequentially including and excluding regressors, as well as with the state-of-the-art Ordered Aggregation algorithm. Experimental studies confirm that the proposed approach reduces the size of the regression ensemble while simultaneously improving its performance, and that it surpasses all compared methods.
Keywords: ensemble pruning, regression, ensemble learning, error-ambiguity decomposition, diversity of regressors.
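The error-ambiguity decomposition the abstract refers to is the Krogh–Vedelsby identity: for a convex combination of regressors, the ensemble squared error equals the weighted average of individual errors minus the weighted average ambiguity (squared deviation of each regressor from the ensemble prediction). A minimal sketch of this decomposition, together with a generic greedy removal loop (hypothetical helper names; not the paper's AP algorithm itself), could look like:

```python
import numpy as np

def error_ambiguity(preds, y, w=None):
    """Krogh-Vedelsby decomposition: ensemble MSE = mean error - mean ambiguity.

    preds: (m, n) array of predictions from m regressors on n samples.
    y:     (n,) array of targets.
    w:     (m,) convex weights (uniform if None).
    """
    m = preds.shape[0]
    w = np.full(m, 1.0 / m) if w is None else w
    f_bar = w @ preds                                 # ensemble prediction, shape (n,)
    e_bar = w @ ((preds - y) ** 2).mean(axis=1)       # weighted mean individual error
    a_bar = w @ ((preds - f_bar) ** 2).mean(axis=1)   # weighted mean ambiguity
    return e_bar - a_bar                              # identical to MSE of f_bar

def greedy_prune(preds, y):
    """Drop any regressor whose removal lowers the ensemble MSE (greedy sketch)."""
    keep = list(range(preds.shape[0]))
    improved = True
    while improved and len(keep) > 1:
        improved = False
        base = error_ambiguity(preds[keep], y)
        for i in list(keep):
            trial = [j for j in keep if j != i]
            if error_ambiguity(preds[trial], y) < base:
                keep.remove(i)
                improved = True
                break
    return keep
```

Because the decomposition holds exactly for convex weights, a regressor is redundant precisely when the error it contributes outweighs the ambiguity (diversity) it adds, which is the kind of removal condition the paper formalizes.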
Received: 27.11.2022
Document Type: Article
Language: English
Citation: Yu. Zelenkov, “Optimization of the regression ensemble size”, Informatics and Automation, 22:2 (2023), 393–415
Citation in format AMSBIB
\Bibitem{Zel23}
\by Yu.~Zelenkov
\paper Optimization of the regression ensemble size
\jour Informatics and Automation
\yr 2023
\vol 22
\issue 2
\pages 393--415
\mathnet{http://mi.mathnet.ru/trspy1242}
\crossref{https://doi.org/10.15622/ia.22.2.6}
Linking options:
  • https://www.mathnet.ru/eng/trspy1242
  • https://www.mathnet.ru/eng/trspy/v22/i2/p393