The subgradient multistep minimization method for nonsmooth high-dimensional problems
Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika, no. 3 (2014), pp. 5-19. This article was harvested from the Math-Net.Ru source.


In this paper, a new multistep relaxation subgradient minimization method is proposed. It is based on the organizational principles of “conjugate subgradient” methods, belongs to the class of relaxation methods of the $\varepsilon$-subgradient type (RSM), and is intended for solving nonsmooth high-dimensional problems. The space-dilation RSMs available at present are comparable in convergence rate with quasi-Newton methods on smooth functions and are efficient on nonsmooth problems of the ravine type. On high-dimensional problems, however, their effectiveness is reduced by the need to store and transform the metric matrix. In the smooth case, the conjugate gradient method replaces quasi-Newton methods on high-dimensional problems. Existing multistep RSMs are significantly inferior to the space-dilation subgradient methods in convergence rate and tend to loop on ravine-type nonsmooth optimization problems, which is why they are practically never applied even to problems of small dimension. These circumstances motivate the construction of effective multistep RSMs. In the relaxation subgradient method considered here, additional learning relations are used at each iteration to improve the efficiency of the learning algorithm of a known method based on extending the Kaczmarz algorithm to systems of inequalities. This innovation expands the range of nonsmooth optimization problems that can be solved and increases the convergence rate on both smooth and nonsmooth minimization problems. Numerical results indicate an increase in the efficiency of the minimization method due to the orthogonalization of learning vectors in the algorithm that solves the inequalities; the gain is particularly evident on nonsmooth high-dimensional problems with strongly elongated level surfaces.
In terms of convergence on high-dimensional quadratic functions with a large spread of eigenvalues, the developed algorithm is superior to existing multistep relaxation subgradient methods and is comparable in effectiveness to the conjugate gradient method.
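The abstract does not state the authors' learning relations explicitly; as a minimal illustration of the underlying idea only — the Kaczmarz projection step extended from equations to systems of linear inequalities (the classical Agmon–Motzkin–Schoenberg relaxation) — here is a sketch. The function name, stopping rule, and cyclic sweep order are my own assumptions, not the paper's algorithm:

```python
import numpy as np

def kaczmarz_inequalities(A, b, x0, sweeps=100, tol=1e-10):
    """Cyclic Kaczmarz-type relaxation for the system <a_i, x> >= b_i.

    A violated row is handled by projecting x onto the bounding
    hyperplane <a_i, x> = b_i; satisfied rows leave x unchanged.
    Illustrative sketch only, not the method from the article.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(sweeps):
        changed = False
        for a_i, b_i in zip(A, b):
            r = b_i - a_i @ x                    # r > 0 means the row is violated
            if r > tol:
                x += (r / (a_i @ a_i)) * a_i     # orthogonal projection step
                changed = True
        if not changed:                          # a full sweep made no correction:
            break                                # all inequalities hold
    return x
```

For instance, with the rows `x_1 >= 1` and `x_2 >= 1` and the start `x0 = [0, 0]`, one sweep of projections yields a feasible point `[1, 1]`. The paper's contribution, per the abstract, is to add learning relations and orthogonalize the learning vectors on top of such a scheme.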
Keywords: Kaczmarz algorithm, multistep algorithm, minimization method, convergence rate.
@article{VTGU_2014_3_a0,
     author = {V. N. Krutikov and Ya. N. Vershinin},
     title = {The subgradient multistep minimization method for nonsmooth high-dimensional problems},
     journal = {Vestnik Tomskogo gosudarstvennogo universiteta. Matematika i mehanika},
     pages = {5--19},
     year = {2014},
     number = {3},
     language = {ru},
     url = {http://geodesic.mathdoc.fr/item/VTGU_2014_3_a0/}
}
