TY - JOUR
T1 - Approximate Kernel Selection with Strong Approximate Consistency
AU - Ding, Lizhong
AU - Liu, Yong
AU - Liao, Shizhong
AU - Li, Yu
AU - Yang, Peng
AU - Pan, Yijie
AU - Huang, Chao
AU - Shao, Ling
AU - Gao, Xin
N1 - KAUST Repository Item: Exported on 2020-10-01
Acknowledged KAUST grant number(s): BAS/1/1624-01-01, URF/1/3007-01-01
Acknowledgements: This publication is based upon work supported by the King Abdullah University of Science and Technology (KAUST) Office of Sponsored Research (OSR) under Award Nos. URF/1/3007-01-01 and BAS/1/1624-01-01, the National Natural Science Foundation of China (Nos. 61703396 and 61673293), and the Shenzhen Government (GJHZ20180419190732022).
PY - 2019/9/7
Y1 - 2019/9/7
N2 - Kernel selection is fundamental to the generalization performance of kernel-based learning algorithms. Approximate kernel selection is an efficient kernel selection approach that exploits the convergence property of kernel selection criteria and the computational virtue of kernel matrix approximation. The convergence property is measured by the notion of approximate consistency. It is difficult to establish strong approximate consistency for the existing Nyström approximations, whose sampling distributions are independent of the specific learning task at hand: they focus mainly on the quality of the low-rank matrix approximation rather than on the performance of the kernel selection criterion used in conjunction with the approximate matrix. In this paper, we propose a novel Nyström approximate kernel selection algorithm that customizes a criterion-driven adaptive sampling distribution for the Nyström approximation, adaptively reducing the error between the approximate and accurate criteria. We theoretically derive the strong approximate consistency of the proposed Nyström approximate kernel selection algorithm. Finally, we empirically evaluate the approximate consistency of our algorithm in comparison with state-of-the-art methods.
AB - Kernel selection is fundamental to the generalization performance of kernel-based learning algorithms. Approximate kernel selection is an efficient kernel selection approach that exploits the convergence property of kernel selection criteria and the computational virtue of kernel matrix approximation. The convergence property is measured by the notion of approximate consistency. It is difficult to establish strong approximate consistency for the existing Nyström approximations, whose sampling distributions are independent of the specific learning task at hand: they focus mainly on the quality of the low-rank matrix approximation rather than on the performance of the kernel selection criterion used in conjunction with the approximate matrix. In this paper, we propose a novel Nyström approximate kernel selection algorithm that customizes a criterion-driven adaptive sampling distribution for the Nyström approximation, adaptively reducing the error between the approximate and accurate criteria. We theoretically derive the strong approximate consistency of the proposed Nyström approximate kernel selection algorithm. Finally, we empirically evaluate the approximate consistency of our algorithm in comparison with state-of-the-art methods.
UR - http://hdl.handle.net/10754/630863
UR - https://aaai.org/ojs/index.php/AAAI/article/view/4223
U2 - 10.1609/aaai.v33i01.33013462
DO - 10.1609/aaai.v33i01.33013462
M3 - Article
SN - 2374-3468
VL - 33
SP - 3462
EP - 3469
JO - Proceedings of the AAAI Conference on Artificial Intelligence
JF - Proceedings of the AAAI Conference on Artificial Intelligence
ER -