Highly robust training of regularized radial basis function networks
Kybernetika, Volume 60 (2024) no. 1, pp. 38-59
This article was harvested from the Czech Digital Mathematics Library.
Radial basis function (RBF) networks are established tools for nonlinear regression modeling with numerous applications in various fields. Because their standard training is vulnerable to outliers in the data, several robust methods for RBF network training have recently been proposed. This paper focuses on robust regularized RBF networks. A robust inter-quantile version of RBF networks based on trimmed least squares is proposed here. A systematic comparison of robust regularized RBF networks follows, evaluated over a set of 405 networks trained using various combinations of robustness and regularization types. The experiments pay particular attention to the effect of variable selection, performed by means of a backward procedure, on the optimal number of RBF units. The regularized inter-quantile RBF networks based on trimmed least squares turn out to outperform the competing approaches in the experiments when a highly robust prediction error measure is considered.
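To make the idea concrete, below is a minimal Python sketch of RBF network training made robust in the trimmed-least-squares spirit the abstract describes: output weights are repeatedly refit on the fraction of observations with the smallest squared residuals (a concentration loop) until the retained subset stabilizes. This is an illustration only, not the authors' exact algorithm; the Gaussian kernel, fixed k-means-free centers, ridge penalty lam, width, and trimming level trim are all assumptions chosen for the example.

import numpy as np

def rbf_features(X, centers, width):
    # Gaussian RBF design matrix with an intercept column.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    Phi = np.exp(-d2 / (2.0 * width ** 2))
    return np.hstack([np.ones((X.shape[0], 1)), Phi])

def fit_ridge(Phi, y, lam):
    # Ridge-regularized least squares for the output-layer weights.
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

def trimmed_rbf_fit(X, y, centers, width=1.0, lam=1e-2, trim=0.75, max_iter=50):
    # Trimmed-least-squares style fit: keep the trim-fraction of points
    # with the smallest squared residuals and refit until convergence.
    Phi = rbf_features(X, centers, width)
    n = len(y)
    h = int(np.ceil(trim * n))      # number of observations retained
    keep = np.arange(n)             # start from the full sample
    for _ in range(max_iter):
        w = fit_ridge(Phi[keep], y[keep], lam)
        r2 = (y - Phi @ w) ** 2     # squared residuals on all points
        new_keep = np.argsort(r2)[:h]   # concentration step
        if np.array_equal(np.sort(new_keep), np.sort(keep)):
            break
        keep = new_keep
    return w, keep

# Toy usage: a noisy sine curve with a few gross outliers injected.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
y[:10] += 5.0                       # contaminate 5 % of the responses
centers = np.linspace(-3, 3, 10)[:, None]
w, kept = trimmed_rbf_fit(X, y, centers)

The trimming step is what confers robustness: the injected outliers end up outside the retained subset, so they do not influence the final weights, unlike in plain (regularized) least-squares training.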
DOI: 10.14736/kyb-2024-1-0038
Classification: 62J02, 68T37, 68W25
Keywords: regression neural networks; robust training; effective regularization; quantile regression; robustness
@article{10_14736_kyb_2024_1_0038,
author = {Kalina, Jan and Vidnerov\'a, Petra and Jan\'a\v{c}ek, Patrik},
title = {Highly robust training of regularized radial basis function networks},
journal = {Kybernetika},
pages = {38--59},
year = {2024},
volume = {60},
number = {1},
doi = {10.14736/kyb-2024-1-0038},
mrnumber = {4730699},
zbl = {07893446},
language = {en},
url = {http://geodesic.mathdoc.fr/articles/10.14736/kyb-2024-1-0038/}
}
TY - JOUR
AU - Kalina, Jan
AU - Vidnerová, Petra
AU - Janáček, Patrik
TI - Highly robust training of regularized radial basis function networks
JO - Kybernetika
PY - 2024
SP - 38
EP - 59
VL - 60
IS - 1
UR - http://geodesic.mathdoc.fr/articles/10.14736/kyb-2024-1-0038/
DO - 10.14736/kyb-2024-1-0038
LA - en
ID - 10_14736_kyb_2024_1_0038
ER -
%0 Journal Article
%A Kalina, Jan
%A Vidnerová, Petra
%A Janáček, Patrik
%T Highly robust training of regularized radial basis function networks
%J Kybernetika
%D 2024
%P 38-59
%V 60
%N 1
%U http://geodesic.mathdoc.fr/articles/10.14736/kyb-2024-1-0038/
%R 10.14736/kyb-2024-1-0038
%G en
%F 10_14736_kyb_2024_1_0038
Kalina, Jan; Vidnerová, Petra; Janáček, Patrik. Highly robust training of regularized radial basis function networks. Kybernetika, Volume 60 (2024) no. 1, pp. 38-59. doi: 10.14736/kyb-2024-1-0038