Seeing speech affects acoustic information processing in the human brainstem

Gabriella Musacchia*, Mikko Sams, Trent Nicol, Nina Kraus

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

79 Scopus citations

Abstract

Afferent auditory processing in the human brainstem is often assumed to be determined by acoustic stimulus features alone and immune to stimulation by other senses or cognitive factors. In contrast, we show that lipreading during speech perception influences early acoustic processing. Event-related brainstem potentials were recorded from ten healthy adults in response to concordant (acoustic-visual match), conflicting (acoustic-visual mismatch), and unimodal stimuli. Audiovisual (AV) interactions occurred as early as ∼11 ms post-acoustic stimulation and persisted for the first 30 ms of the response. Furthermore, the magnitude of interaction depended on the AV pairing. These findings indicate considerable plasticity in early auditory processing.

Original language: English (US)
Pages (from-to): 1-10
Number of pages: 10
Journal: Experimental Brain Research
Volume: 168
Issue number: 1-2
DOIs
State: Published - Jan 2006

Funding

Acknowledgements: NIH R01 DC01510 supported this work. The authors wish to thank their colleagues in the Auditory Neuroscience Laboratory at Northwestern University, as well as Dan Zellner and the staff at Northwestern’s Digital Media Studio, for their film editing expertise.

Keywords

  • Auditory
  • Brainstem
  • Multisensory
  • Speech
  • Visual

ASJC Scopus subject areas

  • General Neuroscience
