Local methods with adaptivity via scaling
Trudy Matematicheskogo Instituta imeni V.A. Steklova, Volume 79 (2024), no. 6, pp. 1051-1091

See the article record at the source, Math-Net.Ru

The rapid development of machine learning and deep learning has introduced increasingly complex optimization challenges. Indeed, training modern, advanced models has become practically infeasible without leveraging multiple computing nodes in a distributed environment. Distributed optimization is also fundamental to emerging fields such as federated learning. In particular, the training process must be organized so as to minimize the time lost to communication. A widely used and extensively studied technique for mitigating the communication bottleneck is to perform local training steps between communication rounds; this approach is the focus of our paper. Concurrently, adaptive methods that incorporate scaling, most notably Adam, have gained significant popularity in recent years. This paper therefore aims to merge the local-training technique with the adaptive approach to develop efficient distributed learning methods. We consider the classical Local SGD method and enhance it with a scaling feature. Crucially, the scaling is described generically, which allows us to analyze various approaches, including Adam, RMSProp, and OASIS, in a unified manner. In addition to the theoretical analysis, we validate the performance of our methods in practice by training a neural network. Bibliography: 49 titles.
Keywords: convex optimization, distributed optimization, adaptive methods, preconditioning.
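To make the idea concrete, here is a minimal sketch (not the authors' exact algorithm) of Local SGD augmented with a generic diagonal preconditioner: each worker performs several local preconditioned SGD steps, after which the server averages the local iterates. The RMSProp-style accumulator below is just one instance of the generic scaling the paper analyzes (Adam or OASIS would maintain different statistics); all function names, parameters, and the toy problem are illustrative assumptions.

```python
import numpy as np

def local_sgd_scaled(grads, x0, n_workers=4, local_steps=5, rounds=20,
                     lr=0.1, beta=0.99, eps=1e-8):
    """Local SGD with diagonal scaling (illustrative sketch).

    grads: per-worker stochastic gradient oracle, grads(i, x) -> ndarray.
    """
    x = x0.copy()
    # Per-worker scaling statistics (here: RMSProp-style second moments).
    v = [np.zeros_like(x0) for _ in range(n_workers)]
    for _ in range(rounds):
        local_iterates = []
        for i in range(n_workers):
            xi = x.copy()
            for _ in range(local_steps):
                g = grads(i, xi)
                # Update the scaling accumulator; this is the "generic
                # scaling" slot -- Adam/OASIS would differ here.
                v[i] = beta * v[i] + (1 - beta) * g * g
                # Preconditioned local step.
                xi -= lr * g / (np.sqrt(v[i]) + eps)
            local_iterates.append(xi)
        # Communication round: the server averages the local iterates.
        x = np.mean(local_iterates, axis=0)
    return x

# Toy heterogeneous problem: worker i minimizes ||x - c_i||^2 / 2,
# so the global minimizer is the mean of the centers, (0.5, 0.5).
centers = [np.array([1.0, -1.0]), np.array([-1.0, 1.0]),
           np.array([2.0, 0.0]), np.array([0.0, 2.0])]
grads = lambda i, x: x - centers[i]
x_star = local_sgd_scaled(grads, np.zeros(2))
```

With heterogeneous local objectives, the local steps pull each worker toward its own minimizer, while the periodic averaging keeps the iterates near the global solution; the scaling only changes the per-coordinate step sizes.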
@article{RM_2024_79_6_a5,
     author = {S. A. Chezhegov and S. N. Skorik and N. Khachaturov and D. S. Shalagin and A. A. Avetisyan and M. Tak\'a\v{c} and Y. A. Kholodov and A. N. Beznosikov},
     title = {Local methods with adaptivity via scaling},
     journal = {Trudy Matematicheskogo Instituta imeni V.A. Steklova},
     pages = {1051--1091},
     publisher = {mathdoc},
     volume = {79},
     number = {6},
     year = {2024},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/RM_2024_79_6_a5/}
}