Distributed Sparse Optimization With Weakly Convex Regularizer: Consensus Promoting and Approximate Moreau Enhanced Penalties Towards Global Optimality

Kei Komuro, Masahiro Yukawa, Renato Luis Garrido Cavalcante

Research output: Article, peer-reviewed

3 citations (Scopus)

Abstract

We propose a promising framework for distributed sparse optimization based on weakly convex regularizers. More specifically, we pose two distributed optimization problems to recover sparse signals in networks. The first problem formulation relies on statistical properties of the signals, and it uses an approximate Moreau enhanced penalty. In contrast, the second formulation does not rely on any statistical assumptions, and it uses an additional consensus promoting penalty (CPP) that convexifies the cost function over the whole network. To solve both problems, we propose a distributed proximal debiasing-gradient (DPD) method, which uses the exact first-order proximal gradient algorithm. The DPD method features a pair of proximity operators that play complementary roles: one sparsifies the estimate, and the other reduces the bias caused by the sparsification. Owing to the convexity of the overall cost functions, the proposed method is guaranteed to converge to a global minimizer, as also demonstrated by numerical examples. In addition, the use of CPP improves the convergence speed significantly.
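The DPD iteration itself is not reproduced here, but the complementary roles of the two proximity operators can be illustrated with a minimal, centralized proximal-gradient sketch. Everything below is an illustrative assumption rather than the authors' algorithm: the weakly convex regularizer is taken to be the minimax concave (MC) penalty, whose proximity operator (firm thresholding) sparsifies like soft thresholding near zero but leaves large entries unbiased.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximity operator of t*||.||_1: sparsifies, but shrinks
    (biases) every surviving entry toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def firm_threshold(x, t, mu):
    """Proximity operator of the minimax concave (MC) penalty, used
    here as an example weakly convex regularizer (requires mu > t):
    matches soft thresholding near zero but is the identity for
    |x| >= mu, so large entries are left unbiased."""
    return np.where(np.abs(x) < mu, mu / (mu - t) * soft_threshold(x, t), x)

def proximal_gradient(A, b, lam, mu, n_iter=2000):
    """Illustrative centralized stand-in (NOT the distributed DPD
    method): proximal-gradient iteration for
        minimize 0.5*||A x - b||^2 + (MC penalty with lam, mu)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the smooth data term
        x = firm_threshold(x - step * grad, step * lam, mu)
    return x
```

The debiasing effect is visible on a single large entry: soft thresholding shrinks it (e.g. 5 with threshold 1 becomes 4), whereas firm thresholding with mu = 2 returns it unchanged, which mirrors the bias reduction the abstract attributes to the second proximity operator.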

Original language: English
Pages (from-to): 514-527
Number of pages: 14
Journal: IEEE Transactions on Signal and Information Processing over Networks
Volume: 8
DOI
Publication status: Published - 2022

ASJC Scopus subject areas

  • Signal Processing
  • Information Systems
  • Computer Networks and Communications
