Efficient low-rank approximation of the stochastic Galerkin matrix in tensor formats

Mike Espig, Wolfgang Hackbusch, Alexander Litvinenko, Hermann G. Matthies, Philipp Wähnert*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

33 Scopus citations

Abstract

In this article, we describe an efficient approximation of the stochastic Galerkin matrix arising from a stationary diffusion equation. The uncertain permeability coefficient is assumed to be a log-normal random field with given mean and covariance functions. The approximation is computed in the canonical tensor format and then compared numerically with the tensor train and hierarchical tensor formats. It is shown that, under additional assumptions, the approximation error depends only on the smoothness of the covariance function and depends neither on the number of random variables nor on the degree of the multivariate Hermite polynomials.
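As a purely illustrative aside (not taken from the paper), the sketch below shows the kind of structure being exploited: a stochastic Galerkin matrix can be written as a sum of Kronecker products, A = Σ_k K_k ⊗ Δ_k, with spatial factors K_k and stochastic (polynomial chaos) factors Δ_k. Keeping only a few terms of this sum is a low-rank approximation in the canonical format, and matrix-vector products can then be applied without ever assembling the full matrix. All sizes, helper names (sym, galerkin_matvec), and matrices here are stand-in assumptions for demonstration only.

```python
# Minimal sketch, under the assumptions stated above: low-rank Kronecker
# structure of a stochastic-Galerkin-type matrix, applied matrix-free.
import numpy as np

rng = np.random.default_rng(0)
n_space, n_stoch, rank = 40, 15, 4   # illustrative sizes only


def sym(n):
    """Random symmetric matrix as a stand-in for a Galerkin block."""
    M = rng.standard_normal((n, n))
    return M + M.T


K = [sym(n_space) for _ in range(rank)]       # spatial stiffness-type factors
Delta = [sym(n_stoch) for _ in range(rank)]   # stochastic (chaos) factors


def galerkin_matvec(x):
    """Apply A = sum_k kron(K_k, Delta_k) to x without assembling A."""
    X = x.reshape(n_space, n_stoch)           # unfold the coefficient vector
    Y = sum(Kk @ X @ Dk.T for Kk, Dk in zip(K, Delta))
    return Y.ravel()


# Consistency check against the explicitly assembled matrix (small sizes only).
A_full = sum(np.kron(Kk, Dk) for Kk, Dk in zip(K, Delta))
x = rng.standard_normal(n_space * n_stoch)
assert np.allclose(A_full @ x, galerkin_matvec(x))
```

The matrix-free product costs O(rank · (n_space² · n_stoch + n_space · n_stoch²)) operations instead of the O((n_space · n_stoch)²) needed by the assembled matrix, which is the practical motivation for working in low-rank tensor formats.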

Original language: English (US)
Pages (from-to): 818-829
Number of pages: 12
Journal: Computers and Mathematics with Applications
Volume: 67
Issue number: 4
DOIs
State: Published - Mar 2014

Keywords

  • Low-rank tensor formats
  • Stochastic Galerkin matrix
  • Stochastic PDEs
  • Tensor approximation
  • Uncertainty quantification

ASJC Scopus subject areas

  • Modeling and Simulation
  • Computational Theory and Mathematics
  • Computational Mathematics
