Bimodal bilinguals co-activate both languages during spoken comprehension

Anthony Shook*, Viorica Marian

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



Bilinguals have been shown to activate their two languages in parallel, and this process can often be attributed to overlap in input between the two languages. The present study examines whether two languages that do not overlap in input structure, and that have distinct phonological systems, such as American Sign Language (ASL) and English, are also activated in parallel. Hearing ASL-English bimodal bilinguals' and English monolinguals' eye movements were recorded during a visual world paradigm, in which participants were instructed, in English, to select objects from a display. In critical trials, the target item appeared with a competing item that overlapped with the target in ASL phonology. Bimodal bilinguals looked more at the competing item than at phonologically unrelated items and looked more at competing items relative to monolinguals, indicating activation of the sign language during spoken English comprehension. The findings suggest that language co-activation is not modality specific, and they provide insight into the mechanisms that may underlie cross-modal language co-activation in bimodal bilinguals, including the role that top-down and lateral connections between levels of processing may play in language comprehension.

Original language: English (US)
Pages (from-to): 314-324
Number of pages: 11
Issue number: 3
State: Published - Sep 2012


Keywords

  • American Sign Language
  • Bilingualism
  • Language processing

ASJC Scopus subject areas

  • Experimental and Cognitive Psychology
  • Developmental and Educational Psychology
  • Cognitive Neuroscience
  • Language and Linguistics
  • Linguistics and Language


