Scalable Gaussian-process regression and variable selection using Vecchia approximations

Jian Cao, Joseph Guinness, Marc G. Genton, Matthias Katzfuss

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

Gaussian process (GP) regression is a flexible, nonparametric approach to regression that naturally quantifies uncertainty. In many applications, the numbers of responses and covariates are both large, and a goal is to select covariates that are related to the response. For this setting, we propose a novel, scalable algorithm, coined VGPR, which optimizes a penalized GP log-likelihood based on the Vecchia GP approximation, an ordered conditional approximation from spatial statistics that implies a sparse Cholesky factor of the precision matrix. We traverse the regularization path from strong to weak penalization, sequentially adding candidate covariates based on the gradient of the log-likelihood and deselecting irrelevant covariates via a new quadratic constrained coordinate descent algorithm. We propose Vecchia-based mini-batch subsampling, which provides unbiased gradient estimators. The resulting procedure is scalable to millions of responses and thousands of covariates. Theoretical analysis and numerical studies demonstrate the improved scalability and accuracy relative to existing methods.
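To make the ordered conditional approximation concrete, the sketch below evaluates a Vecchia-approximated log-likelihood, log p(y) ≈ Σᵢ log p(yᵢ | y_c(i)), where each conditioning set c(i) holds at most m previously ordered responses. This is a minimal illustration, not the authors' VGPR implementation: the function name, the anisotropic squared-exponential (ARD) kernel, and the nearest-neighbor conditioning sets are illustrative assumptions, chosen because per-covariate lengthscales encode the covariate relevance that the paper's penalization targets.

import numpy as np
from scipy.spatial.distance import cdist

def vecchia_loglik(y, X, lengthscales, sigma2, nugget, m=10):
    """Vecchia-approximated GP log-likelihood (illustrative sketch).

    Approximates log p(y) by sum_i log p(y_i | y_{c(i)}), where c(i)
    contains the (at most) m nearest previously ordered points.
    """
    n = y.size
    Xs = X / lengthscales                 # ARD scaling: large lengthscale
                                          # marks a covariate as irrelevant
    D = cdist(Xs, Xs)                     # full matrices only for brevity;
    K = sigma2 * np.exp(-0.5 * D**2)      # a scalable code would compute
    K[np.diag_indices(n)] += nugget       # only the rows it needs
    ll = 0.0
    for i in range(n):
        c = np.argsort(D[i, :i])[:m]      # conditioning set c(i)
        if c.size == 0:
            mu, var = 0.0, K[i, i]        # first point: marginal density
        else:
            w = np.linalg.solve(K[np.ix_(c, c)], K[c, i])
            mu = w @ y[c]                 # conditional mean
            var = K[i, i] - K[c, i] @ w   # conditional variance
        ll += -0.5 * (np.log(2 * np.pi * var) + (y[i] - mu) ** 2 / var)
    return ll

Each conditional requires only an m×m solve, so the likelihood terms cost O(nm³) in total, and the implied Cholesky factor of the precision matrix has at most m+1 nonzeros per column; this sparsity is what makes scalability to millions of responses plausible.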

Original language: English (US)
Article number: 348
Journal: Journal of Machine Learning Research
Volume: 23
State: Published - Sep 1 2022

Keywords

  • adaptive bridge penalty
  • gradient-based variable selection
  • mini-batch subsampling
  • ordered conditional approximation
  • penalized Gaussian regression

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
