TY - JOUR
T1 - Event-Based Near-Eye Gaze Tracking Beyond 10,000 Hz
AU - Angelopoulos, Anastasios N.
AU - Martel, Julien N.P.
AU - Kohli, Amit P.
AU  - Conradt, Jörg
AU - Wetzstein, Gordon
N1 - KAUST Repository Item: Exported on 2021-06-24
Acknowledgements: A.N.A. was supported by a National Science Foundation (NSF) Fellowship and a Berkeley Fellowship. J.N.P.M. was supported by a Swiss National Science Foundation (SNF) Fellowship (P2EZP2 181817). G.W. was supported by an NSF CAREER Award (IIS 1553333), a Sloan Fellowship, the KAUST Office of Sponsored Research through the Visual Computing Center CCF grant, and a PECASE from the ARL. Thanks to Stephen Boyd and Mert Pilanci for helpful conversations.
This publication acknowledges KAUST support, but has no KAUST-affiliated authors.
PY - 2021/5
Y1 - 2021/5
AB  - Fast and accurate eye tracking is crucial for many applications. Current camera-based eye tracking systems, however, are fundamentally limited by their bandwidth, forcing a tradeoff between image resolution and framerate, i.e., between latency and update rate. Here, we propose a hybrid frame-event-based near-eye gaze tracking system offering update rates beyond 10,000 Hz with an accuracy that matches that of high-end desktop-mounted commercial eye trackers when evaluated in the same conditions. Our system builds on emerging event cameras that simultaneously acquire regularly sampled frames and adaptively sampled events. We develop an online 2D pupil fitting method that updates a parametric model every one or few events. Moreover, we propose a polynomial regressor for estimating the gaze vector from the parametric pupil model in real time. Using the first hybrid frame-event gaze dataset, which will be made public, we demonstrate that our system achieves accuracies of 0.45–1.75 degrees for fields of view ranging from 45 degrees to 98 degrees.
UR - http://hdl.handle.net/10754/662526
UR - https://ieeexplore.ieee.org/document/9389490/
DO - 10.1109/tvcg.2021.3067784
M3 - Article
SN - 1077-2626
VL - 27
SP - 2577
EP - 2586
JO - IEEE Transactions on Visualization and Computer Graphics
JF - IEEE Transactions on Visualization and Computer Graphics
IS - 5
ER -