On-the-fly generation and rendering of infinite cities on the GPU

Markus Steinberger, Michael Kenzel, Bernhard K. Kainz, Peter Wonka, Dieter Schmalstieg

Research output: Contribution to journal › Article › peer-review

29 Scopus citations

Abstract

In this paper, we present a new approach for shape-grammar-based generation and rendering of huge cities in real time on the graphics processing unit (GPU). Traditional approaches rely on evaluating a shape grammar and storing the geometry produced as a preprocessing step. During rendering, the pregenerated data is then streamed to the GPU. By interweaving generation and rendering, we overcome the problems and limitations of streaming pregenerated data. Using our methods of visibility pruning and adaptive level of detail, we are able to dynamically generate only the geometry needed to render the current view in real time, directly on the GPU. We also present a robust and efficient way to dynamically update a scene's derivation tree and geometry, enabling us to exploit frame-to-frame coherence. Our combined generation and rendering is significantly faster than all previous work. For detailed scenes, we are capable of generating geometry more rapidly than even just copying pregenerated data from main memory, enabling us to render cities with thousands of buildings at up to 100 frames per second, even with the camera moving at supersonic speed. © 2014 The Author(s). Computer Graphics Forum © 2014 The Eurographics Association and John Wiley & Sons Ltd. Published by John Wiley & Sons Ltd.
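The paper's actual implementation is not reproduced here; as a rough illustration of the idea described in the abstract, the following is a minimal CUDA sketch of a breadth-first shape-grammar expansion that applies visibility pruning and a distance-based level-of-detail cutoff before emitting terminal shapes. All names (Shape, expandPass, potentiallyVisible), the toy split rule, and the thresholds are illustrative assumptions, not taken from the paper, and the sketch omits the paper's derivation-tree updates and frame-to-frame coherence.

```cuda
// Hypothetical sketch: expand a queue of grammar symbols on the GPU,
// pruning shapes that cannot be seen and stopping refinement once a
// shape is small relative to its distance from the camera.
#include <cstdio>
#include <cuda_runtime.h>

// One shape (grammar symbol) awaiting derivation.
struct Shape {
    float3 center;  // world-space center of the bounding box
    float  extent;  // half-size of the (cubical) bounding box
    int    depth;   // derivation depth, standing in for a rule id
};

// Crude visibility test: keep shapes whose bounding sphere lies within a
// distance budget of the camera; a full version would test against all
// six frustum planes.
__device__ bool potentiallyVisible(const Shape& s, float3 cam, float maxDist)
{
    float dx = s.center.x - cam.x, dy = s.center.y - cam.y, dz = s.center.z - cam.z;
    return sqrtf(dx * dx + dy * dy + dz * dz) - s.extent < maxDist;
}

// One breadth-first derivation pass: each thread takes one input shape,
// prunes it if it cannot be seen, emits it as a terminal once it is small
// enough for the current view (adaptive LOD), and otherwise splits it.
__global__ void expandPass(const Shape* in, int inCount,
                           Shape* out, int* outCount,
                           Shape* terminals, int* termCount,
                           float3 cam, float maxDist, float lodRatio)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= inCount) return;
    Shape s = in[i];

    // Visibility pruning: discard the whole subtree rooted at this shape.
    if (!potentiallyVisible(s, cam, maxDist)) return;

    // Adaptive level of detail: extent / distance approximates projected size.
    float dx = s.center.x - cam.x, dy = s.center.y - cam.y, dz = s.center.z - cam.z;
    float dist = fmaxf(sqrtf(dx * dx + dy * dy + dz * dz), 1e-3f);
    if (s.extent / dist < lodRatio || s.depth >= 8) {
        terminals[atomicAdd(termCount, 1)] = s;   // geometry to rasterize
        return;
    }

    // Toy split rule: subdivide the block into four children on the ground plane.
    for (int cx = 0; cx < 2; ++cx)
        for (int cz = 0; cz < 2; ++cz) {
            Shape c;
            c.extent = 0.5f * s.extent;
            c.center = make_float3(s.center.x + (cx ? c.extent : -c.extent),
                                   s.center.y,
                                   s.center.z + (cz ? c.extent : -c.extent));
            c.depth  = s.depth + 1;
            out[atomicAdd(outCount, 1)] = c;
        }
}

int main()
{
    const int cap = 1 << 20;
    Shape *bufA, *bufB, *terms;
    int *outCount, *termCount;
    cudaMalloc(&bufA, cap * sizeof(Shape));
    cudaMalloc(&bufB, cap * sizeof(Shape));
    cudaMalloc(&terms, cap * sizeof(Shape));
    cudaMalloc(&outCount, sizeof(int));
    cudaMalloc(&termCount, sizeof(int));
    cudaMemset(termCount, 0, sizeof(int));

    // Axiom: one large block covering the whole city lot.
    Shape axiom = { make_float3(0.f, 0.f, 0.f), 512.f, 0 };
    cudaMemcpy(bufA, &axiom, sizeof(Shape), cudaMemcpyHostToDevice);
    int inCount = 1;
    float3 cam = make_float3(0.f, 50.f, 0.f);

    // Re-derive level by level until no shape needs further expansion.
    while (inCount > 0) {
        cudaMemset(outCount, 0, sizeof(int));
        int blocks = (inCount + 255) / 256;
        expandPass<<<blocks, 256>>>(bufA, inCount, bufB, outCount,
                                    terms, termCount, cam, 2000.f, 0.01f);
        cudaMemcpy(&inCount, outCount, sizeof(int), cudaMemcpyDeviceToHost);
        Shape* t = bufA; bufA = bufB; bufB = t;   // ping-pong the queues
    }

    int n = 0;
    cudaMemcpy(&n, termCount, sizeof(int), cudaMemcpyDeviceToHost);
    printf("terminal shapes generated for this view: %d\n", n);
    return 0;
}
```

In this sketch the derivation is redone from the axiom for a single view; the paper instead updates the existing derivation tree between frames to exploit frame-to-frame coherence.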
Original language: English (US)
Pages (from-to): 105-114
Number of pages: 10
Journal: Computer Graphics Forum
Volume: 33
Issue number: 2
DOIs
State: Published - Jun 1 2014

ASJC Scopus subject areas

  • Computer Networks and Communications
