TY - GEN
T1 - How are the centered kernel principal components relevant to regression task? - An exact analysis
AU - Yukawa, Masahiro
AU - Müller, Klaus-Robert
AU - Ogino, Yuto
N1 - Funding Information:
This work was supported in part by JSPS Grants-in-Aid (15K06081, 15K13986, 15H02757), in part by the National Research Foundation of Korea funded by the Ministry of Education, Science and Technology in the BK21 program, in part by the Institute for Information & Communications Technology Promotion (IITP) grant funded by the Korea government (No. 2017-0-00451), and in part by the German Research Foundation under Grant DFG MU 987/6-1, Grant SPP 1527, and Grant MU 987/14-1, as well as by BMBF (BBDC).
Publisher Copyright:
© 2018 IEEE.
PY - 2018/9/10
Y1 - 2018/9/10
AB - We present an exact analytic expression for the contributions of the kernel principal components to the relevant information in a nonlinear regression problem. A related study was presented by Braun, Buhmann, and Müller in 2008, where an upper bound on the contributions was given for a general supervised learning problem, but with 'uncentered' kernel PCA. Our analysis clarifies that the relevant information of a kernel regression under an explicit centering operation is contained in a finite number of leading kernel principal components, as in the 'uncentered' kernel PCA case, provided that the kernel matches the underlying nonlinear function so that the eigenvalues of the centered kernel matrix decay quickly. We compare, by simulations, the regression performance of least-squares-based methods with the centered and uncentered kernel PCAs.
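N1 - Illustrative sketch: the following minimal Python/NumPy example (not taken from the paper) mirrors the setting the abstract describes: form the kernel Gram matrix, center it explicitly, eigendecompose it, and measure how much of the regression target is captured by each centered kernel principal component. The RBF kernel, the toy target, and the number r of retained components are assumptions made purely for illustration.
    import numpy as np

    def rbf_kernel(X, gamma=1.0):
        # Gram matrix of the Gaussian (RBF) kernel; the kernel choice is an assumption.
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
        return np.exp(-gamma * d2)

    # Toy data: a nonlinear target observed in noise (illustrative, not from the paper).
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 1))
    y = np.sinc(3 * X[:, 0]) + 0.05 * rng.standard_normal(200)

    K = rbf_kernel(X, gamma=5.0)

    # Explicit centering of the kernel matrix: Kc = H K H with H = I - (1/n) 11^T.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H

    # Eigendecomposition of the centered kernel matrix, eigenvalues in decreasing order.
    evals, evecs = np.linalg.eigh(Kc)
    order = np.argsort(evals)[::-1]
    evals, evecs = evals[order], evecs[:, order]

    # Contribution of each centered kernel PC to the (centered) target:
    # the squared projection of y onto each orthonormal eigenvector.
    yc = y - y.mean()
    contrib = (evecs.T @ yc) ** 2

    # Least-squares fit restricted to the r leading kernel PCs; since the
    # eigenvectors are orthonormal, this is just a projection plus the mean.
    r = 10
    y_hat = evecs[:, :r] @ (evecs[:, :r].T @ yc) + y.mean()
    print("training MSE with", r, "components:", np.mean((y - y_hat) ** 2))
If the kernel matches the underlying function, contrib should concentrate in the first few components, which is the fast eigenvalue decay regime discussed in the abstract.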
KW - Kernel PCA
KW - Nonlinear regression
KW - Reproducing kernel Hilbert space
KW - Spectral decomposition
UR - http://www.scopus.com/inward/record.url?scp=85054209716&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85054209716&partnerID=8YFLogxK
U2 - 10.1109/ICASSP.2018.8462392
DO - 10.1109/ICASSP.2018.8462392
M3 - Conference contribution
AN - SCOPUS:85054209716
SN - 9781538646588
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 2841
EP - 2845
BT - 2018 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2018 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2018 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2018
Y2 - 15 April 2018 through 20 April 2018
ER -