In emerging VR/AR applications, the primary interface between the user and the digital world is the near-eye display. However, today’s VR/AR systems struggle to provide natural and comfortable viewing experiences, partly due to their fixed-focal-plane designs, which present all imagery at a single optical distance. In this paper, we discuss gaze-contingent varifocal display designs for next-generation computational near-eye displays.

Moreover, we discuss how the same technology components, eye tracking and focus-tunable optics, can help presbyopes see the real world better. As humans age, they gradually lose the ability to accommodate, or refocus, to near distances because of the stiffening of the crystalline lens. This condition, known as presbyopia, affects nearly 20% of people worldwide. Here, we discuss the design and implementation of a new presbyopia correction, autofocals, which externally mimics the eye's natural accommodation response by combining eye-tracker and depth-sensor data to automatically drive focus-tunable lenses. With extensive user studies, we demonstrate that autofocals improve visual acuity over conventional monovision eyeglasses and task performance over progressive lenses.
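To make the autofocal control principle concrete, the following is a minimal, hypothetical sketch of one control step: estimate the fixation distance from binocular vergence, blend it with a depth-sensor sample at the gaze point, and convert the result into a lens add power. All function names, the pinhole vergence model, and the simple weighted fusion rule are illustrative assumptions for exposition, not the implementation described in the paper.

```python
import math

def fixation_distance_from_vergence(ipd_m: float, vergence_rad: float) -> float:
    """Estimate fixation distance (m) from the binocular vergence angle.

    Simple pinhole model (an assumption): the two gaze rays converge at
    d = (ipd / 2) / tan(vergence / 2).
    """
    return (ipd_m / 2.0) / math.tan(vergence_rad / 2.0)

def fuse_distance(vergence_d_m: float, sensor_d_m: float,
                  sensor_weight: float = 0.7) -> float:
    """Blend the noisy vergence-based depth with the depth-sensor sample
    at the gaze point (the fixed weight is illustrative)."""
    return sensor_weight * sensor_d_m + (1.0 - sensor_weight) * vergence_d_m

def lens_power_diopters(fixation_d_m: float,
                        residual_accommodation_d: float = 0.0) -> float:
    """Add power for the focus-tunable lens, in diopters: the full
    accommodation demand 1/d minus whatever the wearer's eye can still
    supply, clamped at zero."""
    return max(0.0, 1.0 / fixation_d_m - residual_accommodation_d)
```

For example, a wearer with 1 D of residual accommodation reading at 0.4 m needs 1/0.4 - 1 = 1.5 D of add power from the tunable lens, while the same wearer looking at a target 2 m away needs none.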