Tracing the Trajectory of Sensory Plasticity across Different Stages of Speech Learning in Adulthood

Rachel Reetzke, Zilong Xie, Fernando Llanos, Bharath Chandrasekaran*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

53 Scopus citations

Abstract

Although challenging, adults can learn non-native phonetic contrasts with extensive training [1, 2], indicative of perceptual learning beyond an early sensitivity period [3, 4]. Training can alter low-level sensory encoding of newly acquired speech sound patterns [5]; however, the time course, behavioral relevance, and long-term retention of such sensory plasticity are unclear. Some theories argue that sensory plasticity underlying signal enhancement is immediate and critical to perceptual learning [6, 7]. Others, like the reverse hierarchy theory (RHT), posit a slower time course for sensory plasticity [8]. RHT proposes that higher-level categorical representations guide immediate, novice learning, while lower-level sensory changes do not emerge until expert stages of learning [9]. We trained 20 English-speaking adults to categorize a non-native phonetic contrast (Mandarin lexical tones) using a criterion-dependent sound-to-category training paradigm. Sensory and perceptual indices were assayed across operationally defined learning phases (novice, experienced, over-trained, and 8-week retention) by measuring the frequency-following response, a neurophonic potential that reflects the fidelity of sensory encoding, and the perceptual identification of a tone continuum. Our results demonstrate that while robust changes in sensory encoding and perceptual identification of Mandarin tones emerged with training and were retained, these changes followed different timescales. Changes in sensory encoding emerged, and related to behavioral performance, only once participants were over-trained. In contrast, changes in perceptual identification reflecting an improvement in categorical perception emerged earlier in training. Individual differences in perceptual identification, but not in sensory encoding, related to faster learning. Our findings support the RHT: sensory plasticity accompanies, rather than drives, expert levels of non-native speech learning.

Reetzke et al. show that as adults are trained to categorize non-native speech sounds, sensory encoding of non-native speech sound patterns improves only after an expert level of behavioral performance is reached. Training-induced changes in sensory encoding relate to behavioral performance and endure beyond the period of training.
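
To make the categorical-perception measure concrete: identification responses along a stimulus continuum are commonly summarized by fitting a logistic psychometric function, whose slope indexes how categorical the percept is. The sketch below is illustrative only, assuming a 7-step continuum and made-up response proportions; it is not the paper's data or analysis code.

```python
# A minimal sketch of quantifying categorical perception along a tone
# continuum: fit a logistic function to the proportion of one category
# response at each continuum step. A steeper slope (k) indicates a more
# categorical percept. Steps and proportions below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """Two-parameter logistic: category boundary x0 and slope k."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

steps = np.arange(1, 8)                          # 7-step tone continuum
p_category = np.array([0.02, 0.05, 0.15, 0.55,   # hypothetical proportions
                       0.88, 0.96, 0.99])        # of "rising tone" responses

(x0, k), _ = curve_fit(logistic, steps, p_category, p0=[4.0, 1.0])
print(f"category boundary at step {x0:.2f}, slope {k:.2f}")
```

With training, the fitted slope would be expected to steepen as identification becomes more categorical, which is one way the kind of perceptual change described in the abstract can be tracked across learning phases.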

Original language: English (US)
Pages (from-to): 1419-1427.e4
Journal: Current Biology
Volume: 28
Issue number: 9
State: Published - May 7, 2018

Funding

This work was supported by the National Institute on Deafness and Other Communication Disorders of the National Institutes of Health under award numbers R01DC015504 and R01DC013315 (B.C.). Earlier stages of this project were presented as podium presentations at the 2016 and 2017 mid-winter meetings of the Association for Research in Otolaryngology, where we received helpful feedback from peers. The authors would like to thank Jessica Roeder for the development of the dual-task paradigm and Erika Skoe for providing the MATLAB code to create the autocorrelograms and implement the F0 tracking analysis. We also thank the members of the SoundBrain Laboratory for assistance with participant recruitment, data collection, and data preprocessing. Finally, the authors would like to thank three anonymous reviewers for their helpful comments and suggestions.
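
As a rough illustration of the F0 tracking analysis acknowledged above: short-time autocorrelation is a standard way to estimate the pitch contour that a frequency-following response tracks. The Python sketch below is an assumption-laden stand-in, not the MATLAB code referenced here; the window length, hop size, F0 search range, and synthetic test sweep are all made-up values.

```python
# Illustrative short-time autocorrelation F0 tracker: for each sliding
# window, pick the lag of the autocorrelation peak within a plausible
# F0 range and convert it to a frequency estimate.
import numpy as np

def track_f0(signal, fs, win_s=0.040, hop_s=0.010, f0_min=80.0, f0_max=400.0):
    """Estimate an F0 contour from the autocorrelation peak per window."""
    win = int(win_s * fs)
    hop = int(hop_s * fs)
    lag_min = int(fs / f0_max)          # shortest lag = highest F0
    lag_max = int(fs / f0_min)          # longest lag = lowest F0
    f0s = []
    for start in range(0, len(signal) - win + 1, hop):
        frame = signal[start:start + win]
        frame = frame - frame.mean()
        # Full autocorrelation; keep non-negative lags only.
        ac = np.correlate(frame, frame, mode="full")[win - 1:]
        best_lag = lag_min + int(np.argmax(ac[lag_min:lag_max + 1]))
        f0s.append(fs / best_lag)
    return np.array(f0s)

# Usage on a synthetic rising tone (a stand-in for an FFR waveform):
fs = 8000
t = np.arange(0, 0.25, 1 / fs)
sweep = np.sin(2 * np.pi * (120 * t + 0.5 * 200 * t**2))  # 120 -> 170 Hz
print(np.round(track_f0(sweep, fs), 1))
```

Comparing a contour estimated this way from the neural response against the same contour estimated from the stimulus is one common way to index the fidelity of sensory encoding that the abstract describes.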

Keywords

  • auditory
  • frequency-following response
  • perceptual identification
  • perceptual learning
  • plasticity
  • reverse hierarchy theory
  • sensory encoding

ASJC Scopus subject areas

  • General Neuroscience
  • General Biochemistry, Genetics and Molecular Biology
  • General Agricultural and Biological Sciences
