Two-level regression method using ensembles of trees with optimal divergence
Doklady Rossijskoj akademii nauk. Matematika, informatika, processy upravleniâ, Volume 499 (2021), pp. 63-66.

See the article's record at its source, Math-Net.Ru

The article presents a new two-level regression analysis method in which a corrective procedure is applied to optimal ensembles of regression trees. The ensembles are optimized to simultaneously achieve divergence of the algorithms in the forecast space and a good approximation of the data by the individual algorithms of the ensemble. Simple averaging, random regression forest, and gradient boosting serve as corrective procedures. Experiments compare the proposed method with the standard decision forest and the standard gradient boosting method for decision trees.
Keywords: regression, collective methods, bagging, gradient boosting.
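The two-level scheme described in the abstract can be sketched in Python with scikit-learn (which the authors cite in [8]). This is a hypothetical illustration, not the authors' implementation: the first level builds an ensemble of regression trees on bootstrap samples (a simple stand-in for the paper's divergence-optimized ensembles), and the second level applies a corrective procedure to the first-level forecasts, here either simple averaging or gradient boosting, as named in the abstract.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic regression data standing in for a real dataset.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(300, 3))
y = X[:, 0] ** 2 + np.sin(X[:, 1]) + 0.1 * rng.standard_normal(300)

# Level 1: ensemble of trees trained on bootstrap samples.
# (The paper optimizes ensembles for divergence in the forecast space;
# bootstrap resampling is used here only as a crude source of diversity.)
trees = []
for _ in range(10):
    idx = rng.integers(0, len(X), size=len(X))
    trees.append(DecisionTreeRegressor(max_depth=4).fit(X[idx], y[idx]))

# The level-1 forecasts form the input space for the corrective procedure.
P = np.column_stack([t.predict(X) for t in trees])

# Corrective procedure A: simple averaging of the forecasts.
avg_pred = P.mean(axis=1)

# Corrective procedure B: gradient boosting over the forecast space.
corrector = GradientBoostingRegressor(n_estimators=50, max_depth=2,
                                      random_state=0).fit(P, y)
gb_pred = corrector.predict(P)
```

In practice the second-level model should be fit on out-of-fold level-1 forecasts (as in stacked generalization [6, 7]) to avoid leaking training data into the corrective procedure; the in-sample fit above is kept only for brevity.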
@article{DANMA_2021_499_a13,
     author = {Yu. I. Zhuravlev and O. V. Sen'ko and A. A. Dokukin and N. N. Kiselyova and I. A. Saenko},
     title = {Two-level regression method using ensembles of trees with optimal divergence},
     journal = {Doklady Rossijskoj akademii nauk. Matematika, informatika, processy upravleni\^a},
     pages = {63--66},
     publisher = {mathdoc},
     volume = {499},
     year = {2021},
     language = {ru},
     url = {http://geodesic.mathdoc.fr/item/DANMA_2021_499_a13/}
}
TY  - JOUR
AU  - Yu. I. Zhuravlev
AU  - O. V. Sen'ko
AU  - A. A. Dokukin
AU  - N. N. Kiselyova
AU  - I. A. Saenko
TI  - Two-level regression method using ensembles of trees with optimal divergence
JO  - Doklady Rossijskoj akademii nauk. Matematika, informatika, processy upravleniâ
PY  - 2021
SP  - 63
EP  - 66
VL  - 499
PB  - mathdoc
UR  - http://geodesic.mathdoc.fr/item/DANMA_2021_499_a13/
LA  - ru
ID  - DANMA_2021_499_a13
ER  - 
%0 Journal Article
%A Yu. I. Zhuravlev
%A O. V. Sen'ko
%A A. A. Dokukin
%A N. N. Kiselyova
%A I. A. Saenko
%T Two-level regression method using ensembles of trees with optimal divergence
%J Doklady Rossijskoj akademii nauk. Matematika, informatika, processy upravleniâ
%D 2021
%P 63-66
%V 499
%I mathdoc
%U http://geodesic.mathdoc.fr/item/DANMA_2021_499_a13/
%G ru
%F DANMA_2021_499_a13
Yu. I. Zhuravlev; O. V. Sen'ko; A. A. Dokukin; N. N. Kiselyova; I. A. Saenko. Two-level regression method using ensembles of trees with optimal divergence. Doklady Rossijskoj akademii nauk. Matematika, informatika, processy upravleniâ, Volume 499 (2021), pp. 63-66. http://geodesic.mathdoc.fr/item/DANMA_2021_499_a13/

[1] Dokukin A.A., Senko O.V., "A regression model based on convex combinations maximally correlating with the response" (in Russian), ZhVMiMF, 55:3 (2015), 530–544 | MR | Zbl

[2] Zhuravlev Yu.I., "Correct algebras over sets of incorrect (heuristic) algorithms I" (in Russian), Kibernetika, 1977, no. 4, 14–21 | Zbl

[3] Zhuravlev Yu.I., "Correct algebras over sets of incorrect (heuristic) algorithms II" (in Russian), Kibernetika, 1977, no. 6, 21–27 | Zbl

[4] Breiman L., “Bagging predictors”, Machine Learning, 24 (1996), 123–140 | Zbl

[5] Hastie T., Tibshirani R., Friedman J.H., “10 Boosting and Additive Trees”, The Elements of Statistical Learning, 2nd ed., Springer, New York, 2009, 337–384 | DOI | MR

[6] Wolpert D.H., “Stacked Generalization”, Neural Networks, 5:2 (1992), 241–259 | DOI | MR

[7] Breiman L., “Stacked Regressions”, Machine Learning, 24 (1996), 49–64 | Zbl

[8] Pedregosa F., Varoquaux G., Gramfort A., Michel V., Thirion B., Grisel O., Blondel M., Prettenhofer P., Weiss R., Dubourg V., Vanderplas J., Passos A., Cournapeau D., Brucher M., Perrot M., Duchesnay E., “Scikit-learn: Machine Learning in Python”, J. Machine Learning Research, 2011, no. 12, 2825–2830 | MR | Zbl