GazeLens: Guiding Attention to Improve Gaze Interpretation in Hub-Satellite Collaboration
Paper in proceedings, 2019

In hub-satellite collaboration using video, interpreting gaze direction is critical for communication between hub coworkers seated around a table and their remote satellite colleague. However, 2D video distorts images and makes this interpretation inaccurate. We present GazeLens, a video conferencing system that improves hub coworkers’ ability to interpret the satellite worker’s gaze. A 360° camera captures the hub coworkers and a ceiling camera captures artifacts on the hub table. The system combines these two video feeds in a single interface. Lens widgets strategically guide the satellite worker’s attention toward specific areas of her/his screen, allowing hub coworkers to clearly interpret her/his gaze direction. Our evaluation shows that GazeLens (1) increases hub coworkers’ overall gaze interpretation accuracy by 25.8% compared to a conventional video conferencing system, (2) is particularly effective for gazes toward physical artifacts on the hub table, and (3) improves hub coworkers’ ability to distinguish between gazes toward people and artifacts. We discuss how screen space can be leveraged to improve gaze interpretation.

Keywords

lens widgets

gaze

remote collaboration

telepresence

Authors

Khanh Duy Le

Chalmers, Computer Science and Engineering (Chalmers), Interaction design

Ignacio Avellino

Sorbonne University

Cédric Fleury

University of Paris-Sud

Morten Fjeld

Chalmers, Computer Science and Engineering (Chalmers), Interaction design

Andreas Kunz

Swiss Federal Institute of Technology in Zürich (ETH)

Human-Computer Interaction

0302-9743 (ISSN), 1611-3349 (eISSN)

Vol. 11747, pp. 282-303
978-3-030-29383-3 (ISBN)

17th IFIP TC 13 International Conference on Human-Computer Interaction, Paphos, Cyprus

Subject Categories

Human Computer Interaction

DOI

10.1007/978-3-030-29384-0_18

Latest update

10/16/2019