Sparse Polynomial Chaos expansions using variational relevance vector machines

Panagiotis Tsilifis, Iason Papaioannou, Daniel Straub, Fabio Nobile

Research output: Contribution to journal › Article › peer-review


Abstract

The challenges for non-intrusive Polynomial Chaos methods lie in achieving computational efficiency and accuracy under a limited number of model simulations. These challenges can be addressed by enforcing sparsity in the series representation, retaining only the most important basis terms. In this work, we present a novel sparse Bayesian learning technique for obtaining sparse Polynomial Chaos expansions, based on a Relevance Vector Machine model and trained using Variational Inference. The methodology shows great potential in high-dimensional, data-driven settings with relatively few data points and achieves user-controlled sparsity levels comparable to other methods such as compressive sensing. The proposed approach is illustrated on two numerical examples: a synthetic response function used for validation purposes, and a low-carbon steel plate with random Young's modulus and random loading, modeled with stochastic finite elements with 38 input random variables.
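
The abstract describes placing a sparsity-inducing Bayesian prior on the Polynomial Chaos coefficients so that unimportant basis terms are pruned automatically. As a rough illustration only, the Python sketch below fits a sparse Hermite PCE with scikit-learn's ARDRegression, a closely related automatic-relevance-determination model trained by evidence maximization rather than the paper's variational Relevance Vector Machine; the toy response function, problem sizes, and helper functions (multi_indices, pce_design_matrix) are assumptions made here for illustration and are not taken from the paper.

```python
# Minimal sketch: sparse PCE surrogate via a sparsity-inducing Bayesian
# linear model. scikit-learn's ARDRegression stands in for the paper's
# variational RVM; the test function and sizes below are illustrative.
import itertools
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from sklearn.linear_model import ARDRegression

def multi_indices(dim, max_degree):
    """All multi-indices with total degree <= max_degree."""
    return [a for a in itertools.product(range(max_degree + 1), repeat=dim)
            if sum(a) <= max_degree]

def pce_design_matrix(xi, indices):
    """Evaluate the multivariate probabilists' Hermite basis at samples xi."""
    n, dim = xi.shape
    Phi = np.ones((n, len(indices)))
    for j, alpha in enumerate(indices):
        for d, deg in enumerate(alpha):
            if deg > 0:
                coeffs = np.zeros(deg + 1)
                coeffs[deg] = 1.0          # selects He_deg
                Phi[:, j] *= hermeval(xi[:, d], coeffs)
    return Phi

rng = np.random.default_rng(0)
dim, degree, n_samples = 5, 3, 120           # assumed sizes for illustration
xi = rng.standard_normal((n_samples, dim))    # standard Gaussian germ
y = xi[:, 0]**2 + 0.5 * xi[:, 1] * xi[:, 2]   # assumed toy response

indices = multi_indices(dim, degree)
Phi = pce_design_matrix(xi, indices)

model = ARDRegression(fit_intercept=False)    # ARD prior prunes irrelevant terms
model.fit(Phi, y)

active = np.flatnonzero(np.abs(model.coef_) > 1e-6)
print(f"{len(active)} of {len(indices)} basis terms retained")
```

The ARD prior assigns each coefficient its own precision hyperparameter, so terms whose inferred precision grows large are effectively driven to zero; this is the same mechanism that produces sparsity in the Relevance Vector Machine.
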
Original language: English (US)
Article number: 109498
Journal: Journal of Computational Physics
Volume: 416
DOIs
State: Published - Sep 2020
Externally published: Yes
