TY - GEN
T1 - PanoVerse
T2 - 28th International Conference on Web3D Technology, Web3D 2023
AU - Pintore, Giovanni
AU - Jaspe-Villanueva, Alberto
AU - Hadwiger, Markus
AU - Gobbetti, Enrico
AU - Schneider, Jens
AU - Agus, Marco
N1 - Publisher Copyright:
© 2023 Owner/Author.
PY - 2023/10/9
Y1 - 2023/10/9
N2 - We present a novel framework, dubbed PanoVerse, for the automatic creation and presentation of immersive stereoscopic environments from a single indoor panoramic image. Once per 360° shot, a novel data-driven architecture generates a fixed set of panoramic stereo pairs distributed around the current central viewpoint. Once per frame, directly on the HMD, we rapidly fuse the precomputed views to seamlessly cover the exploration workspace. To realize this system, we introduce several novel techniques that combine and extend state-of-the-art data-driven techniques. In particular, we present a gated architecture for panoramic monocular depth estimation and, starting from the re-projection of visible pixels based on predicted depth, we exploit the same gated architecture for inpainting the occluded and disoccluded areas, introducing a mixed GAN with self-supervised loss to evaluate the stereoscopic consistency of the generated images. At interactive rates, we interpolate precomputed panoramas to produce photorealistic stereoscopic views in a lightweight WebXR viewer. The system works on a variety of available VR headsets and can serve as a base component for Metaverse applications. We demonstrate our technology on several indoor scenes from publicly available data.
AB - We present a novel framework, dubbed PanoVerse, for the automatic creation and presentation of immersive stereoscopic environments from a single indoor panoramic image. Once per 360° shot, a novel data-driven architecture generates a fixed set of panoramic stereo pairs distributed around the current central viewpoint. Once per frame, directly on the HMD, we rapidly fuse the precomputed views to seamlessly cover the exploration workspace. To realize this system, we introduce several novel techniques that combine and extend state-of-the-art data-driven techniques. In particular, we present a gated architecture for panoramic monocular depth estimation and, starting from the re-projection of visible pixels based on predicted depth, we exploit the same gated architecture for inpainting the occluded and disoccluded areas, introducing a mixed GAN with self-supervised loss to evaluate the stereoscopic consistency of the generated images. At interactive rates, we interpolate precomputed panoramas to produce photorealistic stereoscopic views in a lightweight WebXR viewer. The system works on a variety of available VR headsets and can serve as a base component for Metaverse applications. We demonstrate our technology on several indoor scenes from publicly available data.
KW - Data-driven Methods
KW - Immersive Stereoscopic Exploration
KW - Indoor Environments
KW - Metaverse Applications
KW - Omnidirectional Images
KW - WebXR
UR - http://www.scopus.com/inward/record.url?scp=85175421251&partnerID=8YFLogxK
U2 - 10.1145/3611314.3615914
DO - 10.1145/3611314.3615914
M3 - Conference contribution
AN - SCOPUS:85175421251
T3 - Proceedings - Web3D 2023: 28th International Conference on Web3D Technology
BT - Proceedings - Web3D 2023
A2 - Spencer, Stephen N.
PB - Association for Computing Machinery, Inc
Y2 - 9 October 2023 through 11 October 2023
ER -