A PROXIMAL QUASI-NEWTON METHOD BASED ON MEMORYLESS MODIFIED SYMMETRIC RANK-ONE FORMULA

Yasushi Narushima, Shummin Nakayama

Research output: Article › peer-review

Abstract

We consider proximal gradient methods for minimizing a composite function, namely the sum of a differentiable function and a convex function. To accelerate general proximal gradient methods, we focus on proximal quasi-Newton type methods based on proximal mappings scaled by quasi-Newton matrices. Although the scaled proximal mappings are usually difficult to compute, applying the memoryless symmetric rank-one (SR1) formula makes this easier. Since the scaling (quasi-Newton) matrices must be positive definite, we develop an algorithm using the memoryless SR1 formula based on a modified spectral scaling secant condition. We establish the subsequential convergence of the proposed method for general objective functions. In addition, we show R-linear convergence of the method under a strong convexity assumption. Finally, some numerical results are reported.
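As background for the acceleration the abstract describes, the sketch below shows the general (unscaled) proximal gradient iteration for a composite objective with an L1 regularizer. It is a minimal illustration only: the paper's method instead scales the proximal mapping by a memoryless SR1 quasi-Newton matrix, which is not reproduced here, and all names (A, b, lam, step) are illustrative assumptions rather than notation from the paper.

```python
import numpy as np


def soft_threshold(v, tau):
    """Proximal mapping of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)


def proximal_gradient(A, b, lam, step, x0, max_iter=500, tol=1e-8):
    """Basic proximal gradient iteration for 0.5*||Ax - b||^2 + lam*||x||_1.

    The paper's proximal quasi-Newton method replaces the fixed scalar
    step by a proximal mapping scaled with a memoryless SR1 matrix.
    """
    x = x0.copy()
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)                      # gradient of the smooth part
        x_new = soft_threshold(x - step * grad, step * lam)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x


# Usage example on random data; step <= 1/||A||_2^2 ensures the iteration is a descent method.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
x_hat = proximal_gradient(A, b, lam=0.1,
                          step=1.0 / np.linalg.norm(A, 2) ** 2,
                          x0=np.zeros(100))
```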

Original language: English
Pages (from-to): 4095-4111
Number of pages: 17
Journal: Journal of Industrial and Management Optimization
Volume: 19
Issue number: 6
DOI
Publication status: Published - June 2023

ASJC Scopus subject areas

  • Business and International Management
  • Strategy and Management
  • Control and Optimization
  • Applied Mathematics
