Abstract
How does the mind process linguistic and non-linguistic sounds? The current study assessed the different ways that spoken words (e.g., “dog”) and characteristic sounds (e.g., <barking>) provide access to phonological information (e.g., word-form of “dog”) and semantic information (e.g., knowledge that a dog is associated with a leash). Using an eye-tracking paradigm, we found that listening to words prompted rapid phonological activation, which was then followed by semantic access. The opposite pattern emerged for sounds, with early semantic access followed by later retrieval of phonological information. Despite differences in the time courses of conceptual access, both words and sounds elicited robust activation of phonological and semantic knowledge. These findings inform models of auditory processing by revealing the pathways between speech and non-speech input and their corresponding word forms and concepts, which influence the speed, magnitude, and duration of linguistic and nonlinguistic activation.
| Original language | English (US) |
|---|---|
| Pages (from-to) | 1135-1149 |
| Number of pages | 15 |
| Journal | Quarterly Journal of Experimental Psychology |
| Volume | 73 |
| Issue number | 8 |
| DOIs | |
| State | Published - Aug 1 2020 |
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Research reported in this publication was supported in part by the Eunice Kennedy Shriver National Institute of Child Health & Human Development of the National Institutes of Health under Award Number R01HD059858 to Viorica Marian. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. The authors thank the members of the Northwestern University Bilingualism and Psycholinguistics Research Group for helpful comments and input.
Keywords
- Speech comprehension
- eye-tracking
- phonology
- psycholinguistics
- semantic competition
- sound processing
ASJC Scopus subject areas
- Experimental and Cognitive Psychology
- Neuropsychology and Physiological Psychology
- General Psychology
- Physiology (medical)
- Physiology
Datasets
- Listening to speech and non-speech sounds activates phonological and semantic knowledge differently (Dataset). Bartolotti, J. (Creator), Schroeder, S. R. (Creator), Hayakawa, S. (Creator), Rochanavibhata, S. (Contributor), Chen, P. (Contributor) & Marian, V. (Creator), SAGE Journals, 2020. DOI: 10.25384/sage.c.5022380.v1, https://sage.figshare.com/collections/Listening_to_speech_and_non-speech_sounds_activates_phonological_and_semantic_knowledge_differently/5022380/1