The gradient projection algorithm for a proximally smooth set and a function with Lipschitz continuous gradient
Sbornik. Mathematics, Volume 211 (2020), no. 4, pp. 481–504. This article was harvested from the source Math-Net.Ru.


We consider the minimization problem for a nonconvex function with Lipschitz continuous gradient on a proximally smooth (possibly nonconvex) subset of a finite-dimensional Euclidean space. We introduce the error bound condition with exponent $\alpha\in(0,1]$ for the gradient mapping. Under this condition, it is shown that the standard gradient projection algorithm converges to a solution of the problem linearly or sublinearly, depending on the value of the exponent $\alpha$. This paper is theoretical. Bibliography: 23 titles.
Keywords: gradient mapping, error bound condition, proximal smoothness, nonconvex extremal problem, gradient projection algorithm.
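The iteration analyzed in the paper is the standard gradient projection step $x_{k+1} = P_S(x_k - t\,\nabla f(x_k))$, where $P_S$ is the metric projection onto the constraint set $S$. As a minimal illustrative sketch (not the paper's own experiments), the following assumes $S$ is the unit sphere, a classical proximally smooth nonconvex set, and $f(x) = \tfrac12 x^\top A x$, whose gradient $Ax$ is Lipschitz with constant $\|A\|$; the matrix `A` and step size are hypothetical choices.

```python
import numpy as np

# Illustrative sketch of the gradient projection iteration
#     x_{k+1} = P_S(x_k - t * grad f(x_k))
# on a proximally smooth nonconvex set S: here the unit sphere,
# with f(x) = 0.5 * x^T A x (gradient A x, Lipschitz constant ||A||).

A = np.diag([3.0, 1.0, 0.5])  # example matrix; a hypothetical choice


def grad_f(x):
    return A @ x


def project_sphere(x):
    # Metric projection onto the unit sphere; single-valued away from
    # the origin, which is where proximal smoothness is available.
    return x / np.linalg.norm(x)


def gradient_projection(x0, step, iters=500):
    x = project_sphere(np.asarray(x0, dtype=float))
    for _ in range(iters):
        x = project_sphere(x - step * grad_f(x))
    return x


x_star = gradient_projection([1.0, 1.0, 1.0], step=0.1)
```

For this quadratic on the sphere the iterates approach an eigenvector of the smallest eigenvalue of `A`, i.e. the constrained minimizer; the linear rate observed here is the kind of behavior the paper establishes under the error bound condition with exponent $\alpha = 1$.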
@article{SM_2020_211_4_a0,
     author = {M. V. Balashov},
     title = {The gradient projection algorithm for a~proximally smooth set and a~function with {Lipschitz} continuous gradient},
     journal = {Sbornik. Mathematics},
     pages = {481--504},
     year = {2020},
     volume = {211},
     number = {4},
     language = {en},
     url = {http://geodesic.mathdoc.fr/item/SM_2020_211_4_a0/}
}
M. V. Balashov. The gradient projection algorithm for a proximally smooth set and a function with Lipschitz continuous gradient. Sbornik. Mathematics, Volume 211 (2020), no. 4, pp. 481–504. http://geodesic.mathdoc.fr/item/SM_2020_211_4_a0/

[1] A. A. Goldstein, “Convex programming in Hilbert space”, Bull. Amer. Math. Soc., 70:5 (1964), 709–710 | DOI | MR | Zbl

[2] E. S. Levitin, B. T. Polyak, “Constrained minimization methods”, U.S.S.R. Comput. Math. Math. Phys., 6:5 (1966), 1–50 | DOI | MR | Zbl

[3] B. T. Polyak, “Gradient methods for the minimisation of functionals”, U.S.S.R. Comput. Math. Math. Phys., 3:4 (1963), 864–878 | DOI | MR | Zbl

[4] B. T. Polyak, Vvedenie v optimizatsiyu [Introduction to optimization], Nauka, Moscow, 1983, 384 pp. | MR | Zbl

[5] Yu. Nesterov, Introductory lectures on convex optimization. A basic course, Appl. Optim., 87, Kluwer Acad. Publ., Boston, MA, 2004, xviii+236 pp. | DOI | MR | Zbl

[6] H. Karimi, J. Nutini, M. Schmidt, “Linear convergence of gradient and proximal-gradient methods under the Polyak–Łojasiewicz condition”, Machine learning and knowledge discovery in databases. ECML PKDD 2016, Lecture Notes in Comput. Sci., 9851, Springer, Cham, 2016, 795–811 | DOI

[7] B. T. Polyak, “Gradient methods for the minimisation of functionals”, U.S.S.R. Comput. Math. Math. Phys., 3:4 (1963), 864–878 | DOI | MR | Zbl

[8] S. Łojasiewicz, “Une propriété topologique des sous-ensembles analytiques réels”, Les équations aux dérivées partielles (Paris, 1962), Éd. du CNRS, Paris, 1963, 87–89 | MR | Zbl

[9] Zhi-Quan Luo, “New error bounds and their applications to convergence analysis of iterative algorithms”, Error bounds in mathematical programming (Kowloon, 1998), Math. Program., 88:2, Ser. B (2000), 341–355 | DOI | MR | Zbl

[10] J. Bolte, Trong Phong Nguyen, J. Peypouquet, B. W. Suter, “From error bounds to the complexity of first-order descent methods for convex functions”, Math. Program., 165:2, Ser. A (2017), 471–507 ; arXiv: 1510.08234v3 | DOI | MR | Zbl

[11] D. Drusvyatskiy, A. S. Lewis, “Error bounds, quadratic growth, and linear convergence of proximal methods”, Math. Oper. Res., 43:3 (2018), 919–948 ; arXiv: 1602.06661v2 | DOI | MR

[12] Tianbao Yang, Qihang Lin, “RSG: beating subgradient method without smoothness and strong convexity”, J. Mach. Learn. Res., 19 (2018), 6, 33 pp. | MR | Zbl

[13] M. V. Balashov, M. O. Golubev, “About the Lipschitz property of the metric projection in the Hilbert space”, J. Math. Anal. Appl., 394:2 (2012), 545–551 | DOI | MR | Zbl

[14] M. V. Balashov, “Maximization of a function with Lipschitz continuous gradient”, J. Math. Sci. (N.Y.), 209:1 (2015), 12–18 | DOI | MR | Zbl

[15] M. V. Balashov, B. T. Polyak, A. A. Tremba, “Gradient projection and conditional gradient methods for constrained nonconvex minimization”, Numer. Funct. Anal. Optim., Publ. online: 2020 | DOI

[16] J.-Ph. Vial, “Strong and weak convexity of sets and functions”, Math. Oper. Res., 8:2 (1983), 231–259 | DOI | MR | Zbl

[17] F. H. Clarke, R. J. Stern, P. R. Wolenski, “Proximal smoothness and the lower-$C^{2}$ property”, J. Convex Anal., 2:1-2 (1995), 117–144 | MR | Zbl

[18] M. V. Balashov, G. E. Ivanov, “Weakly convex and proximally smooth sets in Banach spaces”, Izv. Math., 73:3 (2009), 455–499 | DOI | MR | Zbl

[19] R. A. Poliquin, R. T. Rockafellar, L. Thibault, “Local differentiability of distance functions”, Trans. Amer. Math. Soc., 352:11 (2000), 5231–5249 | DOI | MR | Zbl

[20] M. V. Balashov, “About the gradient projection algorithm for a strongly convex function and a proximally smooth set”, J. Convex Anal., 24:2 (2017), 493–500 | MR | Zbl

[21] A. D. Ioffe, “Metric regularity – a survey. Part 1. Theory”, J. Aust. Math. Soc., 101:2 (2016), 188–243 | DOI | MR | Zbl ; “Part 2. Applications”, J. Aust. Math. Soc., 101:3 (2016), 376–417 | DOI | MR | Zbl ; “Metric regularity. Theory and applications – a survey”, arXiv: 1505.07920v2

[22] M. Bounkhel, L. Thibault, “On various notions of regularity of sets in nonsmooth analysis”, Nonlinear Anal., 48:2, Ser. A: Theory Methods (2002), 223–246 | DOI | MR | Zbl

[23] E. S. Polovinkin, Mnogoznachnyi analiz i differentsialnye vklyucheniya [Multivalued analysis and differential inclusions], Fizmatlit, Moscow, 2015, 524 pp.