A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization
Applications of Mathematics, Volume 69 (2024) no. 6, pp. 847-866
This article was harvested from the source Czech Digital Mathematics Library
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, the conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager and H. Zhang (2005) and by Y. Dai and C. Kou (2013), respectively. Both conjugate gradient methods perform more efficiently than the SSML-BFGS method. Motivated by this, C. Kou and Y. Dai (2015) proposed suitable modifications of the SSML-BFGS method such that the sufficient descent condition holds. To further improve the modified SSML-BFGS method, we present in this paper an efficient SSML-BFGS-type three-term conjugate gradient method for unconstrained minimization that uses the Ford-Moghrabi secant equation instead of the usual secant equation. The method is shown to be globally convergent under certain assumptions. Numerical results comparing the method with methods based on the usual secant equation are reported.
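For orientation, the sketch below gives a standard Perry-Shanno form of the self-scaling memoryless BFGS search direction together with a common two-step instance of a Ford-Moghrabi-type multi-step secant condition; the scaling parameter \tau_k and the weight \psi_k are generic placeholders, and the exact choices made in the paper may differ.

\[
d_{k+1} = -\tau_k g_{k+1}
+ \tau_k \frac{g_{k+1}^{\top} s_k}{s_k^{\top} y_k}\, y_k
- \left[ \left( 1 + \tau_k \frac{y_k^{\top} y_k}{s_k^{\top} y_k} \right)
\frac{g_{k+1}^{\top} s_k}{s_k^{\top} y_k}
- \tau_k \frac{g_{k+1}^{\top} y_k}{s_k^{\top} y_k} \right] s_k,
\qquad s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k.
\]

A multi-step (here two-step) secant condition of Ford-Moghrabi type replaces the usual condition B_{k+1} s_k = y_k by

\[
B_{k+1} r_k = w_k, \qquad r_k = s_k - \psi_k s_{k-1}, \quad w_k = y_k - \psi_k y_{k-1},
\]

where \psi_k is determined from interpolation over the two most recent steps. A frequently used self-scaling choice is the Oren-Luenberger parameter \tau_k = s_k^{\top} y_k / y_k^{\top} y_k.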
Classification: 65K05, 90C06
Keywords: unconstrained optimization; conjugate gradient method; multi-step secant condition; self-scaling; improved Wolfe line search
@article{10_21136_AM_2024_0204_23,
author = {Kim, Yongjin and Jong, Yunchol and Kim, Yong},
title = {A self-scaling memoryless {BFGS} based conjugate gradient method using multi-step secant condition for unconstrained minimization},
journal = {Applications of Mathematics},
pages = {847--866},
year = {2024},
volume = {69},
number = {6},
doi = {10.21136/AM.2024.0204-23},
language = {en},
url = {http://geodesic.mathdoc.fr/articles/10.21136/AM.2024.0204-23/}
}
TY  - JOUR
AU  - Kim, Yongjin
AU  - Jong, Yunchol
AU  - Kim, Yong
TI  - A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization
JO  - Applications of Mathematics
PY  - 2024
SP  - 847
EP  - 866
VL  - 69
IS  - 6
UR  - http://geodesic.mathdoc.fr/articles/10.21136/AM.2024.0204-23/
DO  - 10.21136/AM.2024.0204-23
LA  - en
ID  - 10_21136_AM_2024_0204_23
ER  -
%0 Journal Article
%A Kim, Yongjin
%A Jong, Yunchol
%A Kim, Yong
%T A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization
%J Applications of Mathematics
%D 2024
%P 847-866
%V 69
%N 6
%U http://geodesic.mathdoc.fr/articles/10.21136/AM.2024.0204-23/
%R 10.21136/AM.2024.0204-23
%G en
%F 10_21136_AM_2024_0204_23
Kim, Yongjin; Jong, Yunchol; Kim, Yong. A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization. Applications of Mathematics, Volume 69 (2024) no. 6, pp. 847-866. doi: 10.21136/AM.2024.0204-23