On the convergence rate of the subgradient method with metric variation and its applications in neural network approximation schemes
Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika, no. 55 (2018), pp. 22-37

See the article record at its source, Math-Net.Ru

In this paper, the relaxation subgradient method with rank-2 correction of metric matrices is studied. It is proven that, on strongly convex functions, when there exists a linear coordinate transformation that reduces the degree of degeneracy of the problem, the method has a linear convergence rate corresponding to that degree of degeneracy. The paper offers a new efficient tool for choosing the initial approximation of an artificial neural network. The use of regularization makes it possible to eliminate overfitting and to efficiently remove low-significance neurons and interneuron connections. The ability to solve such problems efficiently is ensured by the subgradient method with rank-2 correction of the metric matrix. It is experimentally shown that the convergence rates of the quasi-Newton method and of the method under study are virtually identical on smooth functions; on nonsmooth functions the method retains a high convergence rate. These computational capabilities are used to build efficient neural network training algorithms. The paper describes an artificial neural network training algorithm which, together with suppression of redundant neurons, yields reliable approximations in a single run.
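The abstract describes a subgradient method whose search direction is scaled by an adaptively corrected metric matrix. The paper's actual rank-2 correction is not reproduced here; as a rough illustration of the general idea only, the sketch below runs a subgradient iteration in a variable metric on a simple nonsmooth function. The symmetric rank-1 metric shrink, the diminishing step size, and the test function are all illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def subgradient_metric(f, subgrad, x0, steps=500, lr=0.5):
    """Illustrative subgradient descent in a variable metric H.

    NOT the paper's rank-2 method: the metric update below is a toy
    symmetric rank-1 shrink chosen only to show the mechanism.
    """
    x = x0.astype(float)
    H = np.eye(len(x))            # metric matrix, identity to start
    best = f(x)                   # best objective value seen so far
    for k in range(1, steps + 1):
        g = subgrad(x)
        d = H @ g                 # metric-scaled subgradient direction
        x = x - (lr / k) * d      # diminishing step size (standard choice)
        # shrink the metric along the current subgradient direction,
        # so repeated subgradient directions get damped over time
        gn = g / (np.linalg.norm(g) + 1e-12)
        Hg = H @ gn
        H = H - 0.1 * np.outer(Hg, Hg) / (gn @ Hg + 1e-12)
        best = min(best, f(x))
    return x, best

# nonsmooth convex test function f(x) = |x1| + 5*|x2|, minimum 0 at the origin
f = lambda x: abs(x[0]) + 5 * abs(x[1])
sg = lambda x: np.array([np.sign(x[0]), 5 * np.sign(x[1])])
x, best = subgradient_metric(f, sg, np.array([3.0, -2.0]))
```

The shrink factor keeps `H` symmetric positive definite, so `H @ g` remains a descent-scaled direction; a production implementation would instead use the paper's rank-2 correction and a relaxation step-size rule.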
Keywords: method, subgradient, minimization, rate of convergence, neural networks, regularization.
@article{VTGU_2018_55_a2,
     author = {V. N. Krutikov and N. S. Samoilenko},
     title = {On the convergence rate of the subgradient method with metric variation and its applications in neural network approximation schemes},
     journal = {Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika},
     pages = {22--37},
     publisher = {mathdoc},
     number = {55},
     year = {2018},
     language = {ru},
     url = {http://geodesic.mathdoc.fr/item/VTGU_2018_55_a2/}
}
TY  - JOUR
AU  - V. N. Krutikov
AU  - N. S. Samoilenko
TI  - On the convergence rate of the subgradient method with metric variation and its applications in neural network approximation schemes
JO  - Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika
PY  - 2018
SP  - 22
EP  - 37
IS  - 55
PB  - mathdoc
UR  - http://geodesic.mathdoc.fr/item/VTGU_2018_55_a2/
LA  - ru
ID  - VTGU_2018_55_a2
ER  - 
%0 Journal Article
%A V. N. Krutikov
%A N. S. Samoilenko
%T On the convergence rate of the subgradient method with metric variation and its applications in neural network approximation schemes
%J Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika
%D 2018
%P 22-37
%N 55
%I mathdoc
%U http://geodesic.mathdoc.fr/item/VTGU_2018_55_a2/
%G ru
%F VTGU_2018_55_a2
V. N. Krutikov; N. S. Samoilenko. On the convergence rate of the subgradient method with metric variation and its applications in neural network approximation schemes. Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika, no. 55 (2018), pp. 22-37. http://geodesic.mathdoc.fr/item/VTGU_2018_55_a2/