Towards a loss function for training neural network models of time series imputation
Vestnik Ûžno-Uralʹskogo gosudarstvennogo universiteta. Seriâ Vyčislitelʹnaâ matematika i informatika, Vol. 13 (2024), no. 4, pp. 53-73

See the record of the article at its source, Math-Net.Ru

In this article, we address the problem of choosing a loss function for training neural network models that impute missing values in multidimensional time series, and we introduce a novel loss function called MPDE (Mean Profile Distance Error). The MPDE value for a real and a reconstructed $m$-length subsequence is calculated as the average of the distances between all pairs of $\ell$-length sliding windows of these subsequences, where $\ell\leqslant m$ and the windows in each pair have the same starting point. The distance between two windows is a modification of the MPdist (matrix profile distance) similarity measure and is defined as a weighted sum of the Euclidean and z-normalized Euclidean distances between these windows. The weights are taken from the range $[0,1]$ and are hyperparameters of the loss function. When training a neural network model, MPDE takes into account the behavioral similarity of the compared subsequences through the presence of similar windows in them, regardless of the relative positions of these windows. Since MPDE has high computational complexity, we implement a parallel algorithm that computes it on a GPU so that MPDE can be incorporated into deep learning frameworks. The algorithm is implemented in the PyTorch framework, where MPDE is expressed as a sequence of automatically parallelizable operations on multidimensional tensors. Experiments on multidimensional time series from various subject domains showed that in 78% of cases state-of-the-art neural network models achieve their highest imputation accuracy (in terms of the RMSE metric) when trained with the proposed loss function; moreover, these models demonstrate imputation accuracy 40% higher than the average achieved with the other loss functions.
Keywords: time series, imputation of missing values, neural networks, loss function, PyTorch.
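
The abstract does not fully specify the paper's MPdist modification or its GPU algorithm, but the computation it describes (aligned $\ell$-length sliding windows, a per-pair distance equal to a weighted sum of the Euclidean and z-normalized Euclidean distances with weights from $[0,1]$, averaged over all window positions) can be sketched with PyTorch tensor operations. The sketch below is a minimal illustration under these assumptions; the function name, parameter names, and default weights are illustrative and are not taken from the paper.

import torch

def mpde_loss(x, y, ell, alpha=0.5, beta=0.5, eps=1e-8):
    # Minimal sketch, assuming aligned sliding windows and a weighted sum of
    # the Euclidean (ED) and z-normalized Euclidean (znED) window distances.
    # x, y: tensors of shape (batch, m) with the real and reconstructed
    # m-length subsequences; ell: sliding-window length, ell <= m.

    # All ell-length sliding windows: shape (batch, m - ell + 1, ell)
    wx = x.unfold(-1, ell, 1)
    wy = y.unfold(-1, ell, 1)

    # Euclidean distance between windows with the same starting point
    ed = torch.linalg.vector_norm(wx - wy, dim=-1)

    # z-normalize each window, then take the Euclidean distance again
    zx = (wx - wx.mean(dim=-1, keepdim=True)) / (wx.std(dim=-1, keepdim=True) + eps)
    zy = (wy - wy.mean(dim=-1, keepdim=True)) / (wy.std(dim=-1, keepdim=True) + eps)
    zed = torch.linalg.vector_norm(zx - zy, dim=-1)

    # Weighted per-pair distance, averaged over all window positions
    return (alpha * ed + beta * zed).mean()

# Usage: the loss is differentiable, so it can drive gradient-based training.
x = torch.randn(8, 64)                               # real subsequences, m = 64
y = (x + 0.1 * torch.randn(8, 64)).requires_grad_()  # stand-in for a model's output
loss = mpde_loss(x, y, ell=16, alpha=0.7, beta=0.3)
loss.backward()

All operations above (unfold, mean, std, vector_norm) are element-wise or batched tensor operations that PyTorch parallelizes automatically on a GPU, which is the general approach the abstract describes for incorporating MPDE into deep learning frameworks.
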
@article{VYURV_2024_13_4_a3,
     author = {A. A. Yurtin},
     title = {Towards a loss function for training neural network models of time series imputation},
     journal = {Vestnik \^U\v{z}no-Uralʹskogo gosudarstvennogo universiteta. Seri\^a Vy\v{c}islitelʹna\^a matematika i informatika},
     pages = {53--73},
     publisher = {mathdoc},
     volume = {13},
     number = {4},
     year = {2024},
     language = {ru},
     url = {http://geodesic.mathdoc.fr/item/VYURV_2024_13_4_a3/}
}
TY  - JOUR
AU  - A. A. Yurtin
TI  - Towards a loss function for training neural network models of time series imputation
JO  - Vestnik Ûžno-Uralʹskogo gosudarstvennogo universiteta. Seriâ Vyčislitelʹnaâ matematika i informatika
PY  - 2024
SP  - 53
EP  - 73
VL  - 13
IS  - 4
PB  - mathdoc
UR  - http://geodesic.mathdoc.fr/item/VYURV_2024_13_4_a3/
LA  - ru
ID  - VYURV_2024_13_4_a3
ER  - 
%0 Journal Article
%A A. A. Yurtin
%T Towards a loss function for training neural network models of time series imputation
%J Vestnik Ûžno-Uralʹskogo gosudarstvennogo universiteta. Seriâ Vyčislitelʹnaâ matematika i informatika
%D 2024
%P 53-73
%V 13
%N 4
%I mathdoc
%U http://geodesic.mathdoc.fr/item/VYURV_2024_13_4_a3/
%G ru
%F VYURV_2024_13_4_a3
A. A. Yurtin. Towards a loss function for training neural network models of time series imputation. Vestnik Ûžno-Uralʹskogo gosudarstvennogo universiteta. Seriâ Vyčislitelʹnaâ matematika i informatika, Vol. 13 (2024), no. 4, pp. 53-73. http://geodesic.mathdoc.fr/item/VYURV_2024_13_4_a3/