Computational eyeglasses and near-eye displays with focus cues

Gordon Wetzstein

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In emerging VR/AR applications, the primary interface between the user and the digital world is the near-eye display. However, today’s VR/AR systems struggle to provide natural and comfortable viewing experiences, partly due to their fixed focal plane designs. In this paper, we discuss gaze-contingent varifocal display designs for next-generation computational near-eye displays. Moreover, we discuss how the same technology components, eye tracking and focus-tunable optics, can help presbyopes see the real world better. As humans age, they gradually lose the ability to accommodate, or refocus, to near distances because of the stiffening of the crystalline lens. This condition, known as presbyopia, affects nearly 20% of people worldwide. Here, we discuss the design and implementation of a new presbyopia correction, autofocals, to externally mimic the natural accommodation response, combining eye tracker and depth sensor data to automatically drive focus-tunable lenses. With extensive user studies, we demonstrate that autofocals improve visual acuity over conventional monovision eyeglasses and task performance over progressives.
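
To make the autofocals control idea concrete, the sketch below shows one plausible way to combine gaze and depth data to drive a focus-tunable lens: look up the scene depth around the tracked gaze point, convert that fixation distance to an optical power, and clamp it to the lens's tunable range. This is an illustrative assumption, not the authors' implementation; the function names, window size, and 0-3 diopter range are hypothetical.

```python
import numpy as np

# Illustrative autofocals control step (hypothetical names and parameters):
# estimate fixation distance from gaze + scene depth, then command the
# focus-tunable lens power (in diopters) to match it.

def fixation_depth_m(gaze_px, depth_map_m, window=5):
    """Median scene depth (meters) in a small window around the gaze point."""
    x, y = gaze_px
    h, w = depth_map_m.shape
    x0, x1 = max(0, x - window), min(w, x + window + 1)
    y0, y1 = max(0, y - window), min(h, y + window + 1)
    patch = depth_map_m[y0:y1, x0:x1]
    valid = patch[np.isfinite(patch) & (patch > 0)]
    return float(np.median(valid)) if valid.size else np.inf

def lens_power_diopters(distance_m, min_d=0.0, max_d=3.0):
    """Convert fixation distance to lens power, clamped to the tunable range."""
    power = 0.0 if np.isinf(distance_m) else 1.0 / distance_m
    return float(np.clip(power, min_d, max_d))

# Example: gaze lands on a near object at 0.5 m in an otherwise 2 m scene.
depth_map = np.full((480, 640), 2.0)       # background at 2 m
depth_map[200:280, 280:360] = 0.5          # near object under the gaze point
power = lens_power_diopters(fixation_depth_m((320, 240), depth_map))
print(f"Commanded lens power: {power:.2f} D")  # -> 2.00 D (focus at 0.5 m)
```

The same loop structure applies to the gaze-contingent varifocal display case, except the commanded power would set the virtual image distance of the near-eye display rather than correct real-world focus.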
Original language: English (US)
Title of host publication: SID Symposium Digest of Technical Papers
Publisher: Wiley
Pages: 41-44
Number of pages: 4
DOIs
State: Published - Sep 25 2020
Externally published: Yes
