The multiplayer: Multi-perspective social video navigation

Zihao Yu*, Nicholas Diakopoulos, Mor Naaman

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Scopus citations

Abstract

We present a multi-perspective video "multiplayer" designed to organize social video aggregated from online sites like YouTube. Our system automatically time-aligns videos using audio fingerprinting, thus bringing them into a unified temporal frame. The interface utilizes social metadata to visually aid navigation and cue users to more interesting portions of an event. We provide details about the visual and interaction design rationale of the multiplayer.
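
The abstract only names audio fingerprinting as the alignment mechanism; the sketch below is a simplified, illustrative stand-in (spectrogram peak landmarks with offset voting), not the authors' implementation, and all function names and parameters are assumptions.

```python
# Illustrative sketch only: a simplified landmark/offset-voting alignment,
# not the system described in the paper.
import numpy as np
from scipy import signal
from collections import Counter


def spectral_peaks(audio, sr, n_fft=2048, hop=512, peaks_per_frame=3):
    """Return (frame_index, freq_bin) landmarks from a magnitude spectrogram."""
    _, _, spec = signal.stft(audio, fs=sr, nperseg=n_fft, noverlap=n_fft - hop)
    mag = np.abs(spec)
    landmarks = []
    for frame in range(mag.shape[1]):
        # Keep the strongest frequency bins in each frame as landmarks.
        top_bins = np.argsort(mag[:, frame])[-peaks_per_frame:]
        landmarks.extend((frame, int(b)) for b in top_bins)
    return landmarks


def estimate_offset(landmarks_a, landmarks_b, sr, hop=512):
    """Estimate how many seconds clip B lags clip A by voting over frame offsets."""
    frames_by_bin = {}
    for frame_a, b in landmarks_a:
        frames_by_bin.setdefault(b, []).append(frame_a)

    votes = Counter()
    for frame_b, b in landmarks_b:
        for frame_a in frames_by_bin.get(b, ()):
            votes[frame_b - frame_a] += 1

    if not votes:
        return None  # no overlapping audio evidence found
    best_offset_frames, _ = votes.most_common(1)[0]
    return best_offset_frames * hop / sr
```

In a multi-video player of this kind, each clip's estimated offset against a reference clip would place it on a shared event timeline, so that seeking in one perspective can seek all the others.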

Original language: English (US)
Title of host publication: UIST 2010 - 23rd ACM Symposium on User Interface Software and Technology, Adjunct Proceedings
Pages: 413-414
Number of pages: 2
DOIs
State: Published - Dec 6 2010
Event: 23rd Annual ACM Symposium on User Interface Software and Technology, UIST 2010 - New York, NY, United States
Duration: Oct 3 2010 - Oct 6 2010

Publication series

Name: UIST 2010 - 23rd ACM Symposium on User Interface Software and Technology, Adjunct Proceedings

Other

Other: 23rd Annual ACM Symposium on User Interface Software and Technology, UIST 2010
Country: United States
City: New York, NY
Period: 10/3/10 - 10/6/10

Keywords

  • Multi-perspective
  • Social media
  • Video

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Software
