Local Random Feature Approximations of the Gaussian Kernel

Jonas Wacker*, Maurizio Filippone

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

Abstract

A fundamental drawback of kernel-based statistical models is their limited scalability to large data sets, which requires resorting to approximations. In this work, we focus on the popular Gaussian kernel and on techniques to linearize kernel-based models by means of random feature approximations. In particular, we do so by studying a less explored random feature approximation based on Maclaurin expansions and polynomial sketches. We show that such approaches yield poor results when modelling high-frequency data, and we propose a novel localization scheme that improves kernel approximations and downstream performance significantly in this regime. We demonstrate these gains on a number of experiments involving the application of Gaussian process regression to synthetic and real-world data of different data sizes and dimensions.
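The paper itself is not reproduced here, but the ingredients the abstract names — a Maclaurin expansion of the Gaussian kernel combined with polynomial sketches — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation or their localization scheme; it follows the classical random Maclaurin construction (expanding exp(⟨x,y⟩/ℓ²) term by term and approximating each monomial with a Rademacher polynomial sketch), with an illustrative geometric distribution over degrees. Function and variable names are hypothetical.

```python
import math
import numpy as np

def maclaurin_gaussian_features(X, D=2000, lengthscale=1.0, seed=0):
    """Random Maclaurin feature map for the Gaussian kernel (illustrative sketch).

    Uses k(x, y) = f(x) f(y) exp(<x, y>/l^2) with f(x) = exp(-||x||^2/(2 l^2)),
    expands exp(<x, y>/l^2) = sum_n <x/l, y/l>^n / n!, and approximates each
    degree-n monomial with a product of n Rademacher projections.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Z = X / lengthscale                              # scaled inputs x / l
    prefac = np.exp(-np.sum(Z**2, axis=1) / 2.0)     # radial factor f(x)
    Phi = np.empty((n, D))
    for i in range(D):
        # Sample the Maclaurin degree from p_n = 2^{-(n+1)}, n = 0, 1, 2, ...
        deg = rng.geometric(0.5) - 1
        if deg > 0:
            # Rademacher polynomial sketch: product of deg random projections
            W = rng.choice([-1.0, 1.0], size=(deg, d))
            feat = np.prod(W @ Z.T, axis=0)
        else:
            feat = np.ones(n)
        # Importance weight sqrt(a_n / p_n) with Maclaurin coefficient a_n = 1/n!
        Phi[:, i] = np.sqrt(2.0 ** (deg + 1) / math.factorial(deg)) * feat
    return prefac[:, None] * Phi / np.sqrt(D)
```

With this map, `Phi @ Phi.T` is an unbiased estimate of the Gaussian kernel matrix; the abstract's point is that such global Maclaurin-based sketches degrade on high-frequency data, which motivates the localization scheme the paper proposes.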

Original language: English (US)
Pages: 987-996
Number of pages: 10
State: Published - 2022
Event: 26th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems, KES 2022 - Verona, Italy
Duration: Sep 7, 2022 - Sep 9, 2022

Conference

Conference: 26th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems, KES 2022
Country/Territory: Italy
City: Verona
Period: 09/07/22 - 09/09/22

Keywords

  • Gaussian Kernel
  • Gaussian Processes
  • Polynomial Kernel
  • Random Features

ASJC Scopus subject areas

  • General Computer Science
