See the book chapter record
Keywords: gradient dominance condition.
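For context, the gradient dominance (Polyak–Łojasiewicz) condition named in the keywords is usually stated as follows (a standard formulation given here for reference, not quoted from the article itself): a differentiable function $f$ with minimal value $f^*$ satisfies it with constant $\mu > 0$ if
$$\frac{1}{2}\,\|\nabla f(x)\|^2 \ \ge\ \mu\,\bigl(f(x) - f^*\bigr) \quad \text{for all } x,$$
which suffices for linear convergence of gradient-type methods even without convexity; see [9] and [11] in the reference list below.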
@article{TIMM_2019_25_4_a22,
author = {F. S. Stonyakin},
title = {Adaptation to inexactness for some gradient-type optimization methods},
journal = {Trudy Instituta matematiki i mehaniki},
pages = {210--225},
year = {2019},
volume = {25},
number = {4},
language = {ru},
url = {http://geodesic.mathdoc.fr/item/TIMM_2019_25_4_a22/}
}
F. S. Stonyakin. Adaptation to inexactness for some gradient-type optimization methods. Trudy Instituta Matematiki i Mekhaniki UrO RAN, Vol. 25 (2019), no. 4, pp. 210–225. http://geodesic.mathdoc.fr/item/TIMM_2019_25_4_a22/
[1] Necoara I., Nesterov Y., Glineur F., “Linear convergence of first order methods for non-strongly convex optimization”, Math. Program., 175 (2019), 69–107 | DOI | MR | Zbl
[2] Tyurin A. I., Gasnikov A. V., “Fast gradient descent for convex minimization problems with an oracle producing a $(\delta,L)$-model of a function at the requested point”, Zhurn. vychisl. matematiki i mat. fiziki, 59:7 (2019), 1137–1150 (in Russian) | DOI | Zbl
[3] Stonyakin F. S., Dvinskikh D., Dvurechensky P., Kroshnin A., Kuznetsova O., Agafonov A., Gasnikov A., Tyurin A., Uribe C. A., Pasechnyuk D., Artamonov S., “Gradient methods for problems with inexact model of the objective”, Intern. Conf. on Mathematical optimization theory and operations research (MOTOR 2019), extended conference abstracts, Lecture Notes in Computer Science, 11548, eds. M. Khachay, Y. Kochetov, P. Pardalos, Springer, Cham, 2019, 97–114 | DOI
[4] Lu H., Freund R. M., Nesterov Y., “Relatively smooth convex optimization by first-order methods, and applications”, SIAM J. Optim., 28:1 (2018), 333–354 | DOI | MR | Zbl
[5] Devolder O., Glineur F., Nesterov Yu., “First-order methods of smooth convex optimization with inexact oracle”, Math. Program., 146:1–2 (2014), 37–75 | DOI | MR | Zbl
[6] Devolder O., Exactness, inexactness and stochasticity in first-order methods for large-scale convex optimization, PhD thesis, 2013, 320 pp.
[7] Nesterov Yu., “Gradient methods for minimizing composite functions”, Math. Program., 140:1 (2013), 125–161 | DOI | MR | Zbl
[8] Nesterov Yu. E., Algorithmic Convex Optimization, Dr. Sci. (Phys.-Math.) dissertation: 01.01.07, MFTI, Moscow, 2013, 367 pp. (in Russian)
[9] Karimi H., Nutini J., Schmidt M., “Linear convergence of gradient and proximal-gradient methods under the Polyak–Łojasiewicz condition”, Machine Learning and Knowledge Discovery in Databases, Lecture Notes in Computer Science, 9851, eds. B. Berendt et al., Springer, Cham, 2016, 795–811 | DOI
[10] Mordukhovich B. S., Variational Analysis and Generalized Differentiation I: Basic Theory, Grundlehren der mathematischen Wissenschaften, 330, Springer-Verlag, Berlin; Heidelberg, 2006, 579 pp. | DOI | MR
[11] Polyak B. T., “Gradient methods for minimizing functionals”, Zhurn. vychisl. matematiki i mat. fiziki, 3:4 (1963), 643–653 (in Russian) | DOI | Zbl
[12] Polyak B. T., Introduction to Optimization, Nauka, Moscow, 1983, 384 pp. (in Russian) | MR
[13] Stonyakin F. S., “An analogue of quadratic interpolation for a special class of non-smooth functionals and one of its applications to the adaptive mirror descent method”, Dinamicheskie sistemy, 9 (37):1 (2019), 3–16 (in Russian) | MR
[14] Mordukhovich B. S., Nam N. M., “Applications of variational analysis to a generalized Fermat-Torricelli problem”, J. Optim. Theory Appl., 148:3 (2011), 431–454 | DOI | MR | Zbl
[15] Nemirovskii A. S., Yudin D. B., Problem Complexity and Method Efficiency in Optimization, Nauka, Moscow, 1979, 384 pp. (in Russian)
[16] Nesterov Yu., “Universal gradient methods for convex optimization problems”, Math. Program. Ser. A, 152:1–2 (2015), 381–404 | DOI | MR | Zbl
[17] D'Aspremont A., “Smooth optimization with approximate gradient”, SIAM J. Optim., 19:3 (2008), 1171–1183 | DOI | MR | Zbl