Abstract
Carrying out Bayesian inference over the parameters of statistical models is intractable when the likelihood and the prior are non-conjugate. Variational bootstrap provides a way to obtain samples from the posterior distribution over model parameters, where each sample is the solution of an optimization problem in which the labels are perturbed. For Bayesian linear regression with a Gaussian likelihood, variational bootstrap yields samples from the exact posterior, whereas for nonlinear models with a Gaussian likelihood some guarantees of approaching the true posterior can be established. In this work, we extend variational bootstrap to the Bernoulli likelihood to tackle classification tasks. We use a transformation of the labels that turns the classification task into a regression one, and then apply variational bootstrap to obtain samples from an approximate posterior distribution over the model parameters. Variational bootstrap makes it possible to employ advanced gradient-based optimization techniques, which provide fast convergence. We provide experimental evidence that the proposed approach achieves classification accuracy and uncertainty estimation comparable with MCMC methods at a fraction of the cost.
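To illustrate the base case the abstract refers to (exact posterior sampling for Bayesian linear regression), here is a minimal NumPy sketch of the perturb-then-optimize idea: each posterior sample is the minimizer of a least-squares problem whose labels are perturbed with likelihood noise and whose regularizer is anchored at a prior draw. The toy data, variable names, and hyperparameters are illustrative assumptions, not taken from the paper, and the classification extension (label transformation and Bernoulli likelihood) is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem (illustrative, not from the paper)
n, d = 50, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
sigma_n, sigma_p = 0.3, 1.0            # likelihood noise and prior std
y = X @ w_true + sigma_n * rng.normal(size=n)

# Analytic Gaussian posterior over weights, for comparison
A = X.T @ X / sigma_n**2 + np.eye(d) / sigma_p**2
post_cov = np.linalg.inv(A)
post_mean = post_cov @ (X.T @ y) / sigma_n**2

def bootstrap_sample():
    """One posterior sample = solution of a perturbed regression task."""
    w0 = sigma_p * rng.normal(size=d)   # anchor drawn from the prior
    eps = sigma_n * rng.normal(size=n)  # perturb the labels
    # Minimize ||y + eps - X w||^2 / sigma_n^2 + ||w - w0||^2 / sigma_p^2;
    # for this quadratic objective the minimizer is available in closed form.
    b = X.T @ (y + eps) / sigma_n**2 + w0 / sigma_p**2
    return np.linalg.solve(A, b)

samples = np.array([bootstrap_sample() for _ in range(5000)])
# The empirical mean and covariance of the samples match the analytic posterior.
print(np.abs(samples.mean(axis=0) - post_mean).max())
```

In this Gaussian setting the perturbed minimizers are distributed exactly according to the posterior; for nonlinear models the closed-form solve would be replaced by gradient-based optimization of the same perturbed objective, which is where the fast-converging optimizers mentioned in the abstract come in.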
Original language | English (US) |
---|---|
Pages | 1222-1231 |
Number of pages | 10 |
State | Published - 2022 |
Event | 26th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems, KES 2022 - Verona, Italy |
Duration | Sep 7 2022 → Sep 9 2022 |
Conference
Conference | 26th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems, KES 2022 |
---|---|
Country/Territory | Italy |
City | Verona |
Period | 09/7/22 → 09/9/22 |
Keywords
- Bayesian classifier
- bootstrap
- Gaussian processes
- neural networks
ASJC Scopus subject areas
- General Computer Science