TY - JOUR
T1 - Global convergence of a memory gradient method for unconstrained optimization
AU - Narushima, Yasushi
AU - Yabe, Hiroshi
PY - 2006/11
Y1 - 2006/11
AB - Memory gradient methods are used for unconstrained optimization, especially for large-scale problems. The idea of memory gradient methods was first proposed by Miele and Cantrell (1969) and Cragg and Levy (1969). In this paper, we present a new memory gradient method that generates a descent search direction for the objective function at every iteration. We show that the method converges globally to a solution when the step sizes satisfy the Wolfe conditions within a line search strategy. Our numerical results show that the proposed method is efficient on standard test problems, provided the parameter in the method is chosen appropriately.
KW - Descent search direction
KW - Global convergence
KW - Memory gradient method
KW - Unconstrained optimization
KW - Wolfe conditions
UR - http://www.scopus.com/inward/record.url?scp=33750913699&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=33750913699&partnerID=8YFLogxK
U2 - 10.1007/s10589-006-8719-z
DO - 10.1007/s10589-006-8719-z
M3 - Article
AN - SCOPUS:33750913699
SN - 0926-6003
VL - 35
SP - 325
EP - 346
JO - Computational Optimization and Applications
JF - Computational Optimization and Applications
IS - 3
ER -