TY - JOUR
T1 - A PROXIMAL QUASI-NEWTON METHOD BASED ON MEMORYLESS MODIFIED SYMMETRIC RANK-ONE FORMULA
AU - Narushima, Yasushi
AU - Nakayama, Shummin
N1 - Funding Information:
2020 Mathematics Subject Classification. Primary: 90C30, 90C53, 90C06; Secondary: 65K05. Key words and phrases. Nonsmooth optimization, proximal quasi-Newton method, memoryless quasi-Newton method, symmetric rank one formula, convergence property. This research was supported by JSPS KAKENHI (grant numbers JP18K11179, JP20K11698, and JP20K14986). ∗Corresponding author: Yasushi Narushima.
Publisher Copyright:
© 2023, Journal of Industrial and Management Optimization. All Rights Reserved.
PY - 2023/6
Y1 - 2023/6
N2 - We consider proximal gradient methods for minimizing a composite function of a differentiable function and a convex function. To accelerate general proximal gradient methods, we focus on proximal quasi-Newton type methods based on proximal mappings scaled by quasi-Newton matrices. Although it is usually difficult to compute the scaled proximal mappings, applying the memoryless symmetric rank-one (SR1) formula makes this easier. Since the scaled (quasi-Newton) matrices must be positive definite, we develop an algorithm using the memoryless SR1 formula based on a modified spectral scaling secant condition. We give the subsequential convergence property of the proposed method for general objective functions. In addition, we show the R-linear convergence property of the method under a strong convexity assumption. Finally, some numerical results are reported.
AB - We consider proximal gradient methods for minimizing a composite function of a differentiable function and a convex function. To accelerate general proximal gradient methods, we focus on proximal quasi-Newton type methods based on proximal mappings scaled by quasi-Newton matrices. Although it is usually difficult to compute the scaled proximal mappings, applying the memoryless symmetric rank-one (SR1) formula makes this easier. Since the scaled (quasi-Newton) matrices must be positive definite, we develop an algorithm using the memoryless SR1 formula based on a modified spectral scaling secant condition. We give the subsequential convergence property of the proposed method for general objective functions. In addition, we show the R-linear convergence property of the method under a strong convexity assumption. Finally, some numerical results are reported.
KW - convergence property
KW - memoryless quasi-Newton method
KW - Nonsmooth optimization
KW - proximal quasi-Newton method
KW - symmetric rank one formula
UR - http://www.scopus.com/inward/record.url?scp=85151978311&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85151978311&partnerID=8YFLogxK
U2 - 10.3934/jimo.2022123
DO - 10.3934/jimo.2022123
M3 - Article
AN - SCOPUS:85151978311
SN - 1547-5816
VL - 19
SP - 4095
EP - 4111
JO - Journal of Industrial and Management Optimization
JF - Journal of Industrial and Management Optimization
IS - 6
ER -