Gazed and confused: Understanding and designing shared gaze for remote collaboration

Sarah D'Angelo, Darren Gergle

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

83 Scopus citations

Abstract

People utilize eye gaze as an important cue for monitoring attention and coordinating awareness. This study investigates how remote pairs make use of a graphical representation of their partner's eye gaze during a tightly-coupled collaborative task. Our results suggest that reproducing shared gaze in a remote collaboration setting makes pairs more accurate when referring to linguistically complex objects by facilitating the production of efficient forms of deictic references. We discuss how the availability of gaze influences coordination strategies and implications for the design of shared gaze in remote collaboration systems.

Original language: English (US)
Title of host publication: CHI 2016 - Proceedings, 34th Annual CHI Conference on Human Factors in Computing Systems
Publisher: Association for Computing Machinery
Pages: 2492-2496
Number of pages: 5
ISBN (Electronic): 9781450333627
DOIs
State: Published - May 7, 2016
Event: 34th Annual Conference on Human Factors in Computing Systems, CHI 2016 - San Jose, United States
Duration: May 7, 2016 - May 12, 2016

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings

Other

Other: 34th Annual Conference on Human Factors in Computing Systems, CHI 2016
Country/Territory: United States
City: San Jose
Period: 5/7/16 - 5/12/16

Keywords

  • Computer supported collaborative work
  • Eye-tracking

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design
