TY - JOUR
T1 - Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
AU - Narushima, Yasushi
AU - Yabe, Hiroshi
N1 - Funding Information:
The authors are supported in part by the Grant-in-Aid for Scientific Research (C) 21510164 of Japan Society for the Promotion of Science.
PY - 2012/11
Y1 - 2012/11
N2 - Conjugate gradient methods have attracted attention because they can be applied directly to large-scale unconstrained optimization problems. In order to incorporate second-order information of the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition. However, their method does not necessarily generate a descent search direction. On the other hand, Hager and Zhang (2005) proposed another conjugate gradient method which always generates a descent search direction. In this paper, combining the ideas of Dai and Liao and of Hager and Zhang, we propose conjugate gradient methods based on secant conditions that generate descent search directions. In addition, we prove global convergence properties of the proposed methods. Finally, preliminary numerical results are given.
AB - Conjugate gradient methods have attracted attention because they can be applied directly to large-scale unconstrained optimization problems. In order to incorporate second-order information of the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition. However, their method does not necessarily generate a descent search direction. On the other hand, Hager and Zhang (2005) proposed another conjugate gradient method which always generates a descent search direction. In this paper, combining the ideas of Dai and Liao and of Hager and Zhang, we propose conjugate gradient methods based on secant conditions that generate descent search directions. In addition, we prove global convergence properties of the proposed methods. Finally, preliminary numerical results are given.
KW - Conjugate gradient method
KW - Descent search direction
KW - Global convergence
KW - Secant condition
KW - Unconstrained optimization
UR - http://www.scopus.com/inward/record.url?scp=84862841641&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84862841641&partnerID=8YFLogxK
U2 - 10.1016/j.cam.2012.01.036
DO - 10.1016/j.cam.2012.01.036
M3 - Article
AN - SCOPUS:84862841641
SN - 0377-0427
VL - 236
SP - 4303
EP - 4317
JO - Journal of Computational and Applied Mathematics
JF - Journal of Computational and Applied Mathematics
IS - 17
ER -