Exploring the impact of post-training rounding in regression models
Applications of Mathematics, Volume 69 (2024) no. 2, pp. 257-271
This article was harvested from the source Czech Digital Mathematics Library.
Post-training rounding, also known as quantization, of estimated parameters is a widely adopted technique for reducing the energy consumption and latency of machine learning models. This theoretical work examines the impact of rounding estimated parameters in key regression methods of statistics and machine learning. The proposed approach models the perturbation of parameters as an additive error with values in a specified interval. The method is first elucidated for linear regression and then extended, with a consistent approach throughout, to radial basis function networks, multilayer perceptrons, regularization networks, and logistic regression.
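The following is a minimal illustrative sketch, not the paper's derivation: it shows post-training rounding of ordinary least-squares coefficients to a uniform grid of step delta, so that each parameter is perturbed by an additive error in [-delta/2, delta/2], and checks an elementary bound on the resulting shift of fitted values. All names (delta, beta_hat, the simulated data) are assumptions chosen for the example.

```python
# Illustrative sketch of post-training rounding of regression coefficients.
# Assumes ordinary least squares and a uniform rounding grid of step `delta`.
import numpy as np

rng = np.random.default_rng(0)

# Simulated data and ordinary least-squares fit
n, p = 200, 5
X = rng.standard_normal((n, p))
beta_true = rng.standard_normal(p)
y = X @ beta_true + 0.1 * rng.standard_normal(n)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Post-training rounding: store coefficients on a grid with step `delta`,
# i.e. perturb each coefficient by an additive error in [-delta/2, delta/2].
delta = 0.01
beta_rounded = delta * np.round(beta_hat / delta)
error = beta_rounded - beta_hat
assert np.all(np.abs(error) <= delta / 2)

# Effect on predictions: |x^T(beta_rounded - beta_hat)| <= (delta/2) * ||x||_1
pred_shift = X @ error
bound = (delta / 2) * np.abs(X).sum(axis=1)
print("max prediction shift:", float(np.max(np.abs(pred_shift))))
print("max elementary bound:", float(np.max(bound)))
assert np.all(np.abs(pred_shift) <= bound + 1e-12)
```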
DOI: 10.21136/AM.2024.0090-23
Classification: 62H12, 62M45, 68Q87
Keywords: supervised learning; trained model; perturbations; effect of rounding; low-precision arithmetic
@article{10_21136_AM_2024_0090_23,
author = {Kalina, Jan},
title = {Exploring the impact of post-training rounding in regression models},
journal = {Applications of Mathematics},
pages = {257--271},
year = {2024},
volume = {69},
number = {2},
doi = {10.21136/AM.2024.0090-23},
mrnumber = {4728194},
zbl = {07893334},
language = {en},
url = {http://geodesic.mathdoc.fr/articles/10.21136/AM.2024.0090-23/}
}
TY - JOUR
AU - Kalina, Jan
TI - Exploring the impact of post-training rounding in regression models
JO - Applications of Mathematics
PY - 2024
SP - 257
EP - 271
VL - 69
IS - 2
UR - http://geodesic.mathdoc.fr/articles/10.21136/AM.2024.0090-23/
DO - 10.21136/AM.2024.0090-23
LA - en
ID - 10_21136_AM_2024_0090_23
ER -
Kalina, Jan. Exploring the impact of post-training rounding in regression models. Applications of Mathematics, Volume 69 (2024) no. 2, pp. 257-271. doi: 10.21136/AM.2024.0090-23