TY - GEN
T1 - On kernel design for online model selection by Gaussian multikernel adaptive filtering
AU - Toda, Osamu
AU - Yukawa, Masahiro
N1 - Publisher Copyright:
© 2014 Asia-Pacific Signal and Information Processing Association.
PY - 2014/2/12
Y1 - 2014/2/12
N2 - In this paper, we highlight a design of Gaussian kernels for online model selection by the multikernel adaptive filtering approach. In typical multikernel adaptive filtering, the maximum value that each kernel function can take is one. This means that, if one employs multiple Gaussian kernels with multiple variances, the kernel with the largest variance becomes dominant in the kernelized input vector (or matrix). This makes the autocorrelation matrix of the kernelized input vector ill-conditioned, causing significant deterioration in convergence speed. To avoid this ill-conditioning, we consider normalizing the Gaussian kernels. The normalization improves the condition number of the autocorrelation matrix, and hence the convergence behavior improves considerably. As a possible alternative to the original multikernel-based online model selection approach using the Moreau-envelope approximation, we also study an adaptive extension of the generalized forward-backward splitting (GFBS) method to suppress the cost function without any approximation. Numerical examples show that the original approximate method tends to select the correct center points of the Gaussian kernels and thus outperforms the exact method.
AB - In this paper, we highlight a design of Gaussian kernels for online model selection by the multikernel adaptive filtering approach. In typical multikernel adaptive filtering, the maximum value that each kernel function can take is one. This means that, if one employs multiple Gaussian kernels with multiple variances, the kernel with the largest variance becomes dominant in the kernelized input vector (or matrix). This makes the autocorrelation matrix of the kernelized input vector ill-conditioned, causing significant deterioration in convergence speed. To avoid this ill-conditioning, we consider normalizing the Gaussian kernels. The normalization improves the condition number of the autocorrelation matrix, and hence the convergence behavior improves considerably. As a possible alternative to the original multikernel-based online model selection approach using the Moreau-envelope approximation, we also study an adaptive extension of the generalized forward-backward splitting (GFBS) method to suppress the cost function without any approximation. Numerical examples show that the original approximate method tends to select the correct center points of the Gaussian kernels and thus outperforms the exact method.
UR - http://www.scopus.com/inward/record.url?scp=84949924569&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84949924569&partnerID=8YFLogxK
U2 - 10.1109/APSIPA.2014.7041802
DO - 10.1109/APSIPA.2014.7041802
M3 - Conference contribution
AN - SCOPUS:84949924569
T3 - 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA 2014
BT - 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA 2014
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA 2014
Y2 - 9 December 2014 through 12 December 2014
ER -