Keywords: bias-variance decomposition
@article{IIGUM_2023_43_a7,
author = {Victor M. Nedel'ko},
title = {On the properties of bias-variance decomposition for {kNN} regression},
journal = {The Bulletin of Irkutsk State University. Series Mathematics},
pages = {110--121},
year = {2023},
volume = {43},
language = {en},
url = {http://geodesic.mathdoc.fr/item/IIGUM_2023_43_a7/}
}
Victor M. Nedel'ko. On the properties of bias-variance decomposition for kNN regression. The Bulletin of Irkutsk State University. Series Mathematics, Vol. 43 (2023), pp. 110–121. http://geodesic.mathdoc.fr/item/IIGUM_2023_43_a7/
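Background note (not part of the cited paper's text): the bias-variance decomposition writes the expected squared error of a regressor at a point x as E[(f̂(x) − y)²] = noise + bias² + variance, and for kNN regression the classical analysis in [4] gives pointwise variance σ²/k. The sketch below is a minimal Monte Carlo illustration of that decomposition, assuming a synthetic 1-D target, Gaussian label noise, and scikit-learn's KNeighborsRegressor; the function target, the noise level, and the sample sizes are illustrative choices, not the paper's experimental setup.

import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
NOISE_SD = 0.3  # sigma: standard deviation of the additive label noise (assumed)

def target(x):
    # Illustrative (assumed) regression function f(x).
    return np.sin(2 * np.pi * x)

def sample_training_set(n=100):
    # Fresh training sample: uniform inputs, y = f(x) + Gaussian noise.
    x = rng.uniform(0.0, 1.0, size=(n, 1))
    y = target(x).ravel() + rng.normal(0.0, NOISE_SD, size=n)
    return x, y

def bias_variance_knn(k, x_test, n_repeats=200):
    # Retrain kNN on n_repeats independent training samples, then decompose
    # the expected squared error pointwise and average over x_test.
    preds = np.empty((n_repeats, len(x_test)))
    for r in range(n_repeats):
        x_tr, y_tr = sample_training_set()
        preds[r] = KNeighborsRegressor(n_neighbors=k).fit(x_tr, y_tr).predict(x_test)
    bias2 = ((preds.mean(axis=0) - target(x_test).ravel()) ** 2).mean()
    variance = preds.var(axis=0).mean()
    return bias2, variance

x_test = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
for k in (1, 5, 25):
    b2, var = bias_variance_knn(k, x_test)
    print(f"k={k:2d}  bias^2={b2:.4f}  variance={var:.4f}  noise={NOISE_SD**2:.4f}")

With these settings the printed variance should shrink roughly like σ²/k as k grows, while bias² rises as the k-neighbourhood stops tracking the curvature of the target: the trade-off whose properties the paper studies for kNN regression.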
[1] d'Ascoli S., Refinetti M., Biroli G., Krzakala F., “Double trouble in double descent: bias and variance(s) in the lazy regime”, Proceedings of the 37th International Conference on Machine Learning (ICML'20), JMLR.org, 2020, Article 213, 2280–2290
[2] Belkin M., Hsu D., Ma S., Mandal S., “Reconciling modern machine learning practice and the classical bias-variance trade-off”, Proceedings of the National Academy of Sciences, 116:32 (2019), 15849–15854 | DOI
[3] Berikov V., “Semi-supervised classification using multiple clustering and low-rank matrix operations”, Lecture Notes in Computer Science, 11548, 2019, 529–540 | DOI
[4] Hastie T., Tibshirani R., Friedman J., The Elements of Statistical Learning, 2nd ed., Springer, 2009
[5] Heskes T., “Bias/variance decompositions for likelihood-based estimators”, Neural Computation, 10:6 (1998), 1425–1433 | DOI
[6] Kanevskiy D., Vorontsov K., “Cooperative coevolutionary ensemble learning”, Multiple Classifier Systems, 7th International Workshop, MCS 2007 (Prague, Czech Republic, May 23–25, 2007), 469–478 | DOI
[7] Kotsiantis S., “Bagging and boosting variants for handling classification problems: A survey”, The Knowledge Engineering Review, 29:1 (2014), 78–100 | DOI
[8] Lbov G., Startseva N., “On a concept of complexity of a strategy of nature in pattern recognition”, Data Analysis in Expert Systems, Computational Systems, 117, 1986, 91–102 (in Russian)
[9] Lbov G. S., Startseva N. G., “Complexity of distributions in classification problems”, Russ. Acad. Sci., Dokl. Math., 50:2 (1994)
[10] Lbov G., Startseva N., Logical Decision Functions and the Problem of Statistical Robustness of Solutions, Institute of Mathematics SB RAS, Novosibirsk, 1999 (in Russian)
[11] Nakkiran P., Kaplun G., Bansal Y., Yang T., Barak B., Sutskever I., “Deep double descent: where bigger models and more data hurt”, Journal of Statistical Mechanics: Theory and Experiment, 2021
[12] Neal B., Mittal S., Baratin A., Tantia V., Scicluna M., Lacoste-Julien S., Mitliagkas I., A Modern Take on the Bias-Variance Tradeoff in Neural Networks, 2018, arXiv: 1810.08591
[13] Nedel'ko V., “Some aspects of estimating a quality of decision functions construction methods”, Tomsk State University Journal of Control and Computer Science, 3:24 (2013), 123–132 (in Russian)
[14] Nedel'ko V., “Statistical fitting criterion on the basis of cross-validation estimation”, Pattern Recognition and Image Analysis, 28 (2018), 510–515 | DOI
[15] Nedel'ko V., “On decompositions of decision function quality measure”, Bulletin of Irkutsk State University. Series Mathematics, 33 (2020), 64–79 | DOI
[16] Nedel'ko V., “Tight risk bounds for histogram classifier”, Proceedings of IFOST-2016, 11th International Forum on Strategic Technology, 2016, 267–271
[17] Rachakonda A. R., Bhatnagar A., “ARatio: Extending area under the ROC curve for probabilistic labels”, Pattern Recognition Letters, 150 (2021), 265–271 | DOI
[18] Rudakov K., “Mathematical Foundations for Processing High Data Volume, Machine Learning, and Artificial Intelligence”, Pattern Recognit. Image Anal., 29 (2019), 339–343 | DOI
[19] Geman S., Bienenstock E., Doursat R., “Neural networks and the bias/variance dilemma”, Neural Computation, 4:1 (1992), 1–58 | DOI
[20] Yang Z., Yu Y., You C., Steinhardt J., Ma Y., “Rethinking bias-variance trade-off for generalization of neural networks”, International Conference on Machine Learning, 2020, 10767–10777