Distributed Sparse Optimization with Minimax Concave Regularization

Kei Komuro, Masahiro Yukawa, Renato L.G. Cavalcante

Research output: Conference contribution

3 Citations (Scopus)

Abstract

We study the use of weakly convex minimax concave (MC) regularizers in distributed sparse optimization. The global cost function is the squared error penalized by the MC regularizer. While the global cost is convex as long as the whole system is overdetermined and the regularization parameter is sufficiently small, the local cost of each node is usually nonconvex because the systems formed from local measurements are underdetermined in practical applications. The Moreau decomposition is applied to the MC regularizer so that the total cost takes the form of a smooth function plus the rescaled ℓ1 norm. We propose two solvers: the first applies the proximal gradient exact first-order algorithm (PG-EXTRA) directly to our cost, while the second is based on a convex relaxation of the local costs to ensure convergence. Numerical examples show that the proposed approaches attain significant gains over the ℓ1-based PG-EXTRA.
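
As context for the reformulation described in the abstract, one standard way to write the (scalar) MC penalty is as an ℓ1 term minus a smooth Huber-type Moreau envelope; merging that smooth part with the squared-error term leaves only an ℓ1 term to be handled by its proximal operator inside PG-EXTRA. The NumPy sketch below is not the authors' code: the parameter names lam (threshold) and gamma (concavity scale, assumed greater than 1) are illustrative choices, and it only shows the elementwise decomposition together with the firm-thresholding proximal operator associated with the MC penalty.

# Minimal sketch (illustrative only, not from the paper) of the scalar MC penalty,
# its decomposition into an l1 term minus a smooth Huber-type envelope, and the
# firm-thresholding proximal operator of the MC penalty (valid for gamma > 1).
import numpy as np

def mc_penalty(x, lam, gamma):
    """Elementwise minimax concave penalty."""
    ax = np.abs(x)
    inner = lam * ax - ax**2 / (2.0 * gamma)   # concave branch, |x| <= gamma*lam
    outer = 0.5 * gamma * lam**2               # constant branch, |x| >  gamma*lam
    return np.where(ax <= gamma * lam, inner, outer)

def huber_envelope(x, lam, gamma):
    """Smooth part: lam*|x| - mc_penalty(x), a Huber-type Moreau envelope."""
    ax = np.abs(x)
    quad = ax**2 / (2.0 * gamma)               # quadratic near the origin
    lin = lam * ax - 0.5 * gamma * lam**2      # linear growth far from the origin
    return np.where(ax <= gamma * lam, quad, lin)

def firm_threshold(x, lam, gamma):
    """Proximal operator of mc_penalty (firm shrinkage), valid for gamma > 1."""
    ax = np.abs(x)
    shrink = gamma / (gamma - 1.0) * np.sign(x) * (ax - lam)
    return np.where(ax <= lam, 0.0, np.where(ax <= gamma * lam, shrink, x))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=5)
    lam, gamma = 0.5, 2.0
    # Decomposition check: MC penalty = l1 term minus smooth envelope (elementwise).
    lhs = mc_penalty(x, lam, gamma)
    rhs = lam * np.abs(x) - huber_envelope(x, lam, gamma)
    print(np.allclose(lhs, rhs))               # True
    print(firm_threshold(x, lam, gamma))

The check in the final block confirms numerically that the MC penalty equals the ℓ1 term minus the smooth envelope, which is the structural fact that lets the smooth part be absorbed into the data-fidelity term.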

Original language: English
Host publication title: 2021 IEEE Statistical Signal Processing Workshop, SSP 2021
Publisher: IEEE Computer Society
Pages: 31-35
Number of pages: 5
ISBN (electronic): 9781728157672
DOI
Publication status: Published - 11 Jul 2021
Externally published: Yes
Event: 21st IEEE Statistical Signal Processing Workshop, SSP 2021 - Virtual, Rio de Janeiro, Brazil
Duration: 11 Jul 2021 - 14 Jul 2021

Publication series

Name: IEEE Workshop on Statistical Signal Processing Proceedings
Volume: 2021-July

Conference

Conference: 21st IEEE Statistical Signal Processing Workshop, SSP 2021
Country/Territory: Brazil
City: Virtual, Rio de Janeiro
Period: 21/7/11 - 21/7/14

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Applied Mathematics
  • Signal Processing
  • Computer Science Applications
