Abstract
A fundamental drawback of kernel-based statistical models is their limited scalability to large data sets, which makes approximations necessary. In this work, we focus on the popular Gaussian kernel and on techniques for linearizing kernel-based models by means of random feature approximations. In particular, we study a less explored random feature approximation based on Maclaurin expansions and polynomial sketches. We show that such approaches yield poor results when modelling high-frequency data, and we propose a novel localization scheme that significantly improves kernel approximations and downstream performance in this regime. We demonstrate these gains in experiments applying Gaussian process regression to synthetic and real-world data sets of varying size and dimension.
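The paper itself is not reproduced here, but the baseline it builds on can be illustrated. The following is a minimal sketch of a generic random Maclaurin feature map for the Gaussian kernel (in the style of Kar and Karnick's random Maclaurin features, using Rademacher polynomial sketches); it is not the paper's localized scheme, and all function and parameter names are illustrative assumptions:

```python
import math
import numpy as np

def random_maclaurin_features(X, n_features, lengthscale=1.0, seed=0):
    """Generic random Maclaurin feature sketch for the Gaussian kernel
    (illustrative; NOT the paper's localized variant).

    Uses the factorization
        k(x, y) = e^{-|x|^2/(2 l^2)} e^{-|y|^2/(2 l^2)} exp(x.y / l^2)
    and sketches the Maclaurin series exp(t/l^2) = sum_n t^n / (l^{2n} n!)
    with products of Rademacher projections.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Z = np.empty((n, n_features))
    for i in range(n_features):
        # Pick a series order N with P(N = m) = 2^{-(m+1)} (importance sampling).
        N = rng.geometric(0.5) - 1
        # Maclaurin coefficient of exp(t / l^2): a_N = 1 / (l^{2N} N!).
        a_N = 1.0 / (lengthscale ** (2 * N) * math.factorial(N))
        # Degree-N polynomial sketch: product of N Rademacher projections.
        prod = np.ones(n)
        for _ in range(N):
            w = rng.choice([-1.0, 1.0], size=d)
            prod *= X @ w
        # Reweight by 1 / P(N) so the kernel estimator is unbiased.
        Z[:, i] = np.sqrt(a_N * 2.0 ** (N + 1)) * prod
    # Reattach the deterministic Gaussian prefactor e^{-|x|^2/(2 l^2)}.
    g = np.exp(-(X ** 2).sum(axis=1) / (2.0 * lengthscale ** 2))
    return (Z / np.sqrt(n_features)) * g[:, None]

# Sanity check against the exact Gaussian kernel on small toy data.
rng = np.random.default_rng(1)
X = rng.normal(scale=0.3, size=(10, 2))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-sq_dists / 2.0)
Z = random_maclaurin_features(X, n_features=20000)
err = np.abs(Z @ Z.T - K_exact).max()
print(f"max abs kernel error: {err:.3f}")
```

Note that for high-frequency data, i.e. a small lengthscale relative to the data scale, the series coefficients 1/(l^{2n} n!) decay slowly and the variance of such sketches grows, which is the failure mode the abstract describes and the regime the proposed localization scheme targets.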
| Original language | English (US) |
| --- | --- |
| Pages | 987-996 |
| Number of pages | 10 |
| State | Published - 2022 |
| Event | 26th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems, KES 2022 - Verona, Italy |
| Duration | Sep 7 2022 → Sep 9 2022 |
Conference
| Conference | 26th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems, KES 2022 |
| --- | --- |
| Country/Territory | Italy |
| City | Verona |
| Period | 09/7/22 → 09/9/22 |
Keywords
- Gaussian Kernel
- Gaussian Processes
- Polynomial Kernel
- Random Features
ASJC Scopus subject areas
- General Computer Science