Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization

Yasushi Narushima, Hiroshi Yabe

Research output: Article, peer-reviewed

38 Citations (Scopus)

Abstract

Conjugate gradient methods have attracted attention because they can be applied directly to large-scale unconstrained optimization problems. To incorporate second-order information about the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition; however, their method does not necessarily generate a descent search direction. On the other hand, Hager and Zhang (2005) proposed another conjugate gradient method that always generates a descent search direction. In this paper, by combining the ideas of Dai-Liao and Hager-Zhang, we propose conjugate gradient methods based on secant conditions that generate descent search directions. In addition, we prove global convergence properties of the proposed methods. Finally, preliminary numerical results are given.
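The record does not reproduce the paper's formulas, but a minimal sketch may help clarify the kind of iteration the abstract refers to. The Python code below sketches a Dai-Liao-type conjugate gradient loop, using the secant-condition-based parameter beta_k = g_{k+1}^T(y_k - t*s_k) / (d_k^T y_k) and a simple steepest-descent restart as a stand-in for a descent safeguard. The Armijo line search, the value of t, and the restart rule are assumptions made for illustration only; they are not the directions proposed by Narushima and Yabe, which satisfy the descent condition by construction.

```python
"""Illustrative Dai-Liao-type conjugate gradient sketch.

NOT the method proposed in the paper: it only shows the generic
structure d_{k+1} = -g_{k+1} + beta_k d_k with the Dai-Liao parameter
beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k), plus a naive restart
so that every direction is a descent direction.
"""
import numpy as np


def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search -- an assumption; the paper
        # analyzes Wolfe-type line searches instead.
        alpha, c, rho = 1.0, 1e-4, 0.5
        for _ in range(60):  # cap the number of backtracking steps
            if f(x + alpha * d) <= f(x) + c * alpha * (g @ d):
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x          # s_k = x_{k+1} - x_k
        y = g_new - g          # y_k = g_{k+1} - g_k
        denom = d @ y
        # Dai-Liao parameter derived from the secant condition.
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d_new = -g_new + beta * d
        # Naive safeguard: restart with steepest descent if d_new is not
        # a descent direction (the paper instead builds directions that
        # are guaranteed to be descent directions).
        if g_new @ d_new >= 0:
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x


if __name__ == "__main__":
    # Minimize a simple convex quadratic as a smoke test.
    x_star = dai_liao_cg(lambda x: x @ x, lambda x: 2 * x, np.ones(100))
    print(np.linalg.norm(x_star))  # should be close to 0
```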

Original language: English
Pages (from-to): 4303-4317
Number of pages: 15
Journal: Journal of Computational and Applied Mathematics
Volume: 236
Issue number: 17
DOI
Publication status: Published - Nov 2012
Externally published: Yes

ASJC Scopus subject areas

  • Computational Mathematics
  • Applied Mathematics

