TY - JOUR
T1 - Distributed adaptive learning with multiple kernels in diffusion networks
AU - Shin, Ban-Sok
AU - Yukawa, Masahiro
AU - Cavalcante, Renato Luis Garrido
AU - Dekorsy, Armin
N1 - Funding Information:
Manuscript received January 19, 2018; revised July 10, 2018 and August 13, 2018; accepted August 21, 2018. Date of publication August 31, 2018; date of current version September 24, 2018. The associate editor coordinating the review of this manuscript and approving it for publication was Prof. Gesualdo Scutari. The work of M. Yukawa was supported by the Japan Society for the Promotion of Science Grants-in-Aid (15K06081, 15K13986, and 15H02757). (Corresponding author: Ban-Sok Shin.) B.-S. Shin and A. Dekorsy are with the Department of Communications Engineering, University of Bremen, 28359 Bremen, Germany (e-mail: shin@ant.uni-bremen.de; dekorsy@ant.uni-bremen.de).
Publisher Copyright:
© 2018 IEEE.
PY - 2018/11/1
Y1 - 2018/11/1
N2 - We propose an adaptive scheme for distributed learning of nonlinear functions by a network of nodes. The proposed algorithm consists of a local adaptation stage utilizing multiple kernels with projections onto hyperslabs and a diffusion stage to achieve consensus on the estimates over the whole network. Multiple kernels are incorporated to enhance the approximation of functions with several high- and low-frequency components, which are common in practical scenarios. We provide a thorough convergence analysis of the proposed scheme based on the metric of the Cartesian product of multiple reproducing kernel Hilbert spaces. To this end, we introduce a modified consensus matrix considering this specific metric and prove its equivalence to the ordinary consensus matrix. Moreover, the use of hyperslabs enables a significant reduction of the computational demand with only a minor loss in performance. Numerical evaluations with synthetic and real data demonstrate the efficacy of the proposed algorithm compared to state-of-the-art schemes.
AB - We propose an adaptive scheme for distributed learning of nonlinear functions by a network of nodes. The proposed algorithm consists of a local adaptation stage utilizing multiple kernels with projections onto hyperslabs and a diffusion stage to achieve consensus on the estimates over the whole network. Multiple kernels are incorporated to enhance the approximation of functions with several high- and low-frequency components, which are common in practical scenarios. We provide a thorough convergence analysis of the proposed scheme based on the metric of the Cartesian product of multiple reproducing kernel Hilbert spaces. To this end, we introduce a modified consensus matrix considering this specific metric and prove its equivalence to the ordinary consensus matrix. Moreover, the use of hyperslabs enables a significant reduction of the computational demand with only a minor loss in performance. Numerical evaluations with synthetic and real data demonstrate the efficacy of the proposed algorithm compared to state-of-the-art schemes.
KW - Distributed adaptive learning
KW - consensus
KW - kernel adaptive filter
KW - multiple kernels
KW - nonlinear regression
KW - spatial reconstruction
UR - http://www.scopus.com/inward/record.url?scp=85052811437&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85052811437&partnerID=8YFLogxK
U2 - 10.1109/TSP.2018.2868040
DO - 10.1109/TSP.2018.2868040
M3 - Article
AN - SCOPUS:85052811437
SN - 1053-587X
VL - 66
SP - 5505
EP - 5519
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
IS - 21
M1 - 8453003
ER -