Gaze-Adaptive Lenses for Feature-Rich Information Spaces

Fabian Göbel, Kuno Kurzhals, Victor R Schinazi, Peter Kiefer, Martin Raubal

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

2 Citations (Scopus)

Abstract

The inspection of feature-rich information spaces often requires supportive tools that reduce visual clutter without sacrificing details. One common approach is to use focus+context lenses that provide multiple views of the data. While these lenses present local details together with global context, they require additional manual interaction. In this paper, we discuss the design space for gaze-adaptive lenses and present an approach that automatically displays additional details with respect to visual focus. We developed a prototype for a map application capable of displaying names and star-ratings of different restaurants. In a pilot study, we compared the gaze-adaptive lens to a mouse-only system in terms of efficiency, effectiveness, and usability. Our results revealed that participants were faster in locating the restaurants and more accurate in a map drawing task when using the gaze-adaptive lens. We discuss these results in relation to observed search strategies and inspected map areas.
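To illustrate the general idea described in the abstract (revealing feature details only within a lens centred on the user's visual focus), the following is a minimal sketch, not the authors' implementation. It assumes a stream of gaze samples in screen coordinates and a set of map features with positions, names, and star ratings; the names Restaurant, visible_details, and LENS_RADIUS are hypothetical.

```python
# Minimal sketch of a gaze-adaptive lens: details are shown only for features
# that fall inside a fixed-radius region around the current gaze point.
# All identifiers here are illustrative assumptions, not the paper's API.
from dataclasses import dataclass
from math import hypot

@dataclass
class Restaurant:
    name: str
    rating: float  # star rating, e.g. 0.0-5.0
    x: float       # screen-space position of the map marker (px)
    y: float

LENS_RADIUS = 120.0  # px; assumed lens size

def visible_details(gaze_x: float, gaze_y: float,
                    features: list[Restaurant]) -> list[str]:
    """Return labels for features inside the lens around the gaze point."""
    return [
        f"{f.name} ({f.rating:.1f} stars)"
        for f in features
        if hypot(f.x - gaze_x, f.y - gaze_y) <= LENS_RADIUS
    ]

if __name__ == "__main__":
    restaurants = [
        Restaurant("Alpenblick", 4.5, 400, 300),
        Restaurant("Seegarten", 3.8, 900, 620),
    ]
    # A gaze fixation near the first marker reveals only that restaurant.
    print(visible_details(410, 320, restaurants))
```

In practice such a loop would run on fixation events from an eye tracker rather than raw samples, but the sketch conveys how gaze position can replace manual lens positioning.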
Original language: English
Title of host publication: Proceedings ETRA 2020 Full Papers - ACM Symposium on Eye Tracking Research and Applications
Editors: Andreas Bulling, Anke Huckauf, Eakta Jain, Ralph Radach, Daniel Weiskopf
Place of Publication: New York
Publisher: Association for Computing Machinery (ACM)
Number of pages: 9
ISBN (Electronic): 978-1-4503-7133-9
DOIs
Publication status: Published - Jun 2020
Externally published: Yes
