Distributed Sparse Optimization with Minimax Concave Regularization

Kei Komuro, Masahiro Yukawa, Renato L.G. Cavalcante

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)


We study the use of weakly convex minimax concave (MC) regularizers in distributed sparse optimization. The global cost function is the squared error penalized by the MC regularizer. While the global cost is convex as long as the whole system is overdetermined and the regularization parameter is sufficiently small, the local cost of each node is usually nonconvex, because the system formed from local measurements is underdetermined in practical applications. The Moreau decomposition is applied to the MC regularizer so that the total cost takes the form of a smooth function plus the rescaled ℓ1 norm. We propose two solvers: the first applies the proximal gradient exact first-order algorithm (PG-EXTRA) directly to our cost, while the second is based on a convex relaxation of the local costs to ensure convergence. Numerical examples show that the proposed approaches attain significant gains over ℓ1-based PG-EXTRA.
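To make the regularizer concrete, here is a minimal sketch (not taken from the paper) of the scalar MC penalty and its proximal operator, which is the well-known firm-thresholding operator. The parameter names `lam` (threshold) and `gamma` (concavity parameter, assumed > 1) are illustrative choices, not notation from the paper:

```python
import numpy as np

def mc_penalty(x, lam, gamma):
    """Scalar minimax concave (MC) penalty, applied elementwise.

    Equals lam*|x| - x^2/(2*gamma) for |x| <= gamma*lam and is
    constant (gamma*lam^2/2) beyond that, so it caps the penalty
    on large coefficients and reduces the bias of the l1 norm.
    """
    ax = np.abs(x)
    return np.where(ax <= gamma * lam,
                    lam * ax - ax**2 / (2.0 * gamma),
                    gamma * lam**2 / 2.0)

def firm_threshold(x, lam, gamma):
    """Proximal operator of the MC penalty (firm thresholding).

    Small inputs (|x| <= lam) are set to zero, large inputs
    (|x| > gamma*lam) pass through unchanged, and the middle
    range is shrunk linearly between the two regimes.
    """
    ax = np.abs(x)
    mid = gamma * (ax - lam) / (gamma - 1.0) * np.sign(x)
    return np.where(ax <= lam, 0.0,
                    np.where(ax <= gamma * lam, mid, x))
```

For example, with `lam = 1.0` and `gamma = 3.0`, an input of 0.5 is thresholded to 0, an input of 2.0 is shrunk to 1.5, and an input of 5.0 is left untouched; the last case is what distinguishes the MC penalty from soft thresholding, which would still subtract `lam` from large entries.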

Original language: English
Title of host publication: 2021 IEEE Statistical Signal Processing Workshop, SSP 2021
Publisher: IEEE Computer Society
Number of pages: 5
ISBN (Electronic): 9781728157672
Publication status: Published - 2021 Jul 11
Externally published: Yes
Event: 21st IEEE Statistical Signal Processing Workshop, SSP 2021 - Virtual, Rio de Janeiro, Brazil
Duration: 2021 Jul 11 – 2021 Jul 14

Publication series

Name: IEEE Workshop on Statistical Signal Processing Proceedings


Conference: 21st IEEE Statistical Signal Processing Workshop, SSP 2021
City: Virtual, Rio de Janeiro

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Applied Mathematics
  • Signal Processing
  • Computer Science Applications
