Approximate Kernel Selection via Matrix Approximation

Lizhong Ding, Shizhong Liao, Yong Liu, Li Liu, Fan Zhu, Yazhou Yao, Ling Shao, Xin Gao

Research output: Contribution to journal › Article › peer-review

17 Scopus citations

Abstract

Kernel selection is of fundamental importance for the generalization of kernel methods. This article proposes an approximate approach for kernel selection by exploiting the approximability of kernel selection and the computational virtue of kernel matrix approximation. We define approximate consistency to measure the approximability of the kernel selection problem. Based on the analysis of approximate consistency, we solve the theoretical problem of whether, under what conditions, and at what speed the approximate criterion is close to the accurate one, establishing the foundations of approximate kernel selection. We introduce two selection criteria based on error estimation and prove the approximate consistency of the multilevel circulant matrix (MCM) approximation and the Nyström approximation under these criteria. Under the theoretical guarantees of approximate consistency, we design approximate algorithms for kernel selection, which exploit the computational advantages of the MCM and Nyström approximations to conduct kernel selection in linear or quasi-linear complexity. We experimentally validate the theoretical results on approximate consistency and evaluate the effectiveness of the proposed kernel selection algorithms.
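The Nyström approximation referenced above is what enables working with a low-rank factorization of the kernel matrix instead of the full n x n matrix. As a point of reference only, the following is a minimal sketch of the standard Nyström construction with uniform column sampling and an RBF kernel; the kernel choice, the sampling scheme, and all function names here are illustrative assumptions, not the authors' specific algorithm or selection criteria.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # RBF kernel matrix between rows of X and rows of Y (assumed kernel choice).
    sq = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-gamma * sq)

def nystrom_approximation(X, m, gamma=1.0, seed=None):
    # Rank-m Nystrom factors of K: K is approximated by C @ pinv(W) @ C.T,
    # where C = K[:, S] and W = K[S, S] for a uniformly sampled index set S.
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    S = rng.choice(n, size=m, replace=False)   # landmark (column) indices
    C = rbf_kernel(X, X[S], gamma)             # n x m block of K
    W = C[S, :]                                # m x m intersection block K[S, S]
    return C, np.linalg.pinv(W)

# Usage: build the factors without ever forming the full n x n kernel matrix.
X = np.random.default_rng(0).normal(size=(1000, 5))
C, W_pinv = nystrom_approximation(X, m=50, gamma=0.5, seed=0)
K_approx = C @ W_pinv @ C.T                    # dense reconstruction, only for checking

Storing and operating on C and W (n x m and m x m) rather than the full kernel matrix is what makes linear or quasi-linear complexity attainable; the article's contribution is the approximate-consistency analysis showing when criteria evaluated on such approximations remain close to the accurate ones.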
Original language: English (US)
Pages (from-to): 1-11
Number of pages: 11
Journal: IEEE Transactions on Neural Networks and Learning Systems
DOIs
State: Published - Jan 14 2020
