Spectral Shape Recovery and Analysis Via Data-driven Connections

Riccardo Marin, Arianna Rampini, Umberto Castellani, Emanuele Rodola, Maks Ovsjanikov, Simone Melzi

Research output: Contribution to journal › Article › peer-review

9 Scopus citations


Abstract

We introduce a novel learning-based method to recover shapes from their Laplacian spectra, based on establishing and exploring connections in a learned latent space. The core of our approach consists of a cycle-consistent module that maps between a learned latent space and sequences of eigenvalues. This module provides an efficient and effective link between the shape geometry, encoded in a latent vector, and its Laplacian spectrum. Our proposed data-driven approach replaces the need for ad-hoc regularizers required by prior methods, while providing more accurate results at a fraction of the computational cost. Moreover, these latent space connections enable novel applications for both analyzing and controlling the spectral properties of deformable shapes, especially in the context of a shape collection. Our learning model and the associated analysis apply without modifications across different dimensions (2D and 3D shapes alike), representations (meshes, contours and point clouds), nature of the latent space (generated by an auto-encoder or a parametric model), as well as across different shape classes, and admits arbitrary resolution of the input spectrum without affecting complexity. The increased flexibility allows us to address notoriously difficult tasks in 3D vision and geometry processing within a unified framework, including shape generation from spectrum, latent space exploration and analysis, mesh super-resolution, shape exploration, style transfer, spectrum estimation for point clouds, segmentation transfer and non-rigid shape matching.
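To make the paper's input concrete: the "Laplacian spectrum" of a shape is the sequence of eigenvalues of a discrete Laplace operator built on that shape. Below is a minimal, hedged sketch (not the authors' code) that computes such a spectrum for one of the representations the abstract mentions, a 2D contour, using a simple graph Laplacian with inverse-edge-length weights; the paper itself uses standard Laplace-Beltrami discretizations, and the function name `contour_laplacian_spectrum` is an assumption for illustration.

```python
import numpy as np

def contour_laplacian_spectrum(points, k=10):
    """Smallest k eigenvalues of a graph Laplacian on a closed 2D contour.

    points: (n, 2) array of contour vertices in cyclic order.
    Returns k eigenvalues in ascending order; the first is ~0.
    Illustrative discretization only (inverse-edge-length weights).
    """
    n = len(points)
    # Adjacency of the closed polyline: each vertex links to its successor,
    # weighted by the inverse length of the connecting edge.
    W = np.zeros((n, n))
    for i in range(n):
        j = (i + 1) % n
        w = 1.0 / np.linalg.norm(points[i] - points[j])
        W[i, j] = W[j, i] = w
    L = np.diag(W.sum(axis=1)) - W   # graph Laplacian: degree minus adjacency
    evals = np.linalg.eigvalsh(L)    # symmetric matrix -> real, sorted eigenvalues
    return evals[:k]

# Example: a unit circle sampled at 64 points.
theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
spectrum = contour_laplacian_spectrum(circle, k=6)
```

Such eigenvalue sequences are what the proposed cycle-consistent module maps to and from latent shape codes, which is why the method can accept an arbitrary number of input eigenvalues.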
Original language: English (US)
Journal: International Journal of Computer Vision
State: Published - 2021
Externally published: Yes
