Abstract
The inspection of feature-rich information spaces often requires supportive tools that reduce visual clutter without sacrificing details. One common approach is to use focus+context lenses that provide multiple views of the data. While these lenses present local details together with global context, they require additional manual interaction. In this paper, we discuss the design space for gaze-adaptive lenses and present an approach that automatically displays additional details with respect to visual focus. We developed a prototype for a map application capable of displaying names and star-ratings of different restaurants. In a pilot study, we compared the gaze-adaptive lens to a mouse-only system in terms of efficiency, effectiveness, and usability. Our results revealed that participants were faster in locating the restaurants and more accurate in a map drawing task when using the gaze-adaptive lens. We discuss these results in relation to observed search strategies and inspected map areas.
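The abstract only sketches the mechanism at a high level. As a rough, hedged illustration of the core idea (not the authors' implementation), the Python sketch below shows one way a gaze-adaptive lens could reveal marker details only near the current gaze point; all restaurant names, coordinates, and the lens radius are hypothetical values chosen for this example.

```python
from dataclasses import dataclass
from math import hypot

# Hypothetical restaurant markers; names, ratings, and positions are invented
# for illustration and are not taken from the paper's prototype.
@dataclass
class Restaurant:
    name: str
    rating: float  # star rating on a 0-5 scale
    x: float       # marker position on the map, in screen pixels
    y: float

RESTAURANTS = [
    Restaurant("Trattoria Uno", 4.5, 320, 240),
    Restaurant("Sushi Corner", 4.0, 350, 260),
    Restaurant("Burger Barn", 3.5, 900, 500),
]

LENS_RADIUS_PX = 120  # assumed lens size; the paper does not state a value here


def labels_in_lens(gaze_x: float, gaze_y: float,
                   restaurants=RESTAURANTS, radius=LENS_RADIUS_PX):
    """Return detail labels (name + rating) for markers inside the gaze lens.

    Markers outside the lens stay as plain icons to keep the map uncluttered;
    only markers near the current gaze point have their details revealed.
    """
    visible = []
    for r in restaurants:
        if hypot(r.x - gaze_x, r.y - gaze_y) <= radius:
            visible.append(f"{r.name} ({r.rating:.1f} stars)")
    return visible


if __name__ == "__main__":
    # A gaze sample near the two neighboring markers reveals their details,
    # while the distant marker remains plain context.
    print(labels_in_lens(330, 250))
    # -> ['Trattoria Uno (4.5 stars)', 'Sushi Corner (4.0 stars)']
```

In a real system the gaze coordinates would come from an eye tracker's fixation stream rather than a direct function call, but the distance-threshold test captures the basic focus+context behavior described above.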
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings ETRA 2020 Full Papers - ACM Symposium on Eye Tracking Research and Applications |
| Editors | Andreas Bulling, Anke Huckauf, Eakta Jain, Ralph Radach, Daniel Weiskopf |
| Place of Publication | New York |
| Publisher | Association for Computing Machinery (ACM) |
| Number of pages | 9 |
| ISBN (Electronic) | 978-1-4503-7133-9 |
| DOIs | |
| Publication status | Published - Jun 2020 |
| Externally published | Yes |