TY - JOUR
T1 - Relational labeling unlocks inert knowledge
AU - Jamrozik, Anja
AU - Gentner, Dedre
N1 - Funding Information:
This research was funded by ONR Grant N00014-08-1-0040. We thank Laura Willig for her help in coding the data, Becky Bui, Lizabeth Huey, and Benjamin Dionysus for their administrative support, and the Cognition and Language Lab for their helpful suggestions.
Publisher Copyright:
© 2019 Elsevier B.V.
PY - 2020/3
Y1 - 2020/3
AB - Insightful solutions often come about by recalling a relevant prior situation—one that shares the same essential relational pattern as the current problem. Unfortunately, our memory retrievals often depend primarily on surface matches rather than relational matches. For example, a person who is familiar with the idea of positive feedback in sound systems may fail to think of it in the context of global warming. We suggest that one reason for the failure of cross-domain relational retrieval is that relational information is typically encoded variably, in a context-dependent way. In contrast, the surface features of that context—such as objects, animals, and characters—are encoded in a relatively stable way and are therefore easier to retrieve across contexts. We propose that the use of relational language can serve to make situations' relational representations more uniform, thereby facilitating relational retrieval. In two studies, we found that providing relational labels for situations at encoding or at retrieval increased the likelihood of relational retrieval. In contrast, domain labels—labels that highlight situations' contextual features—did not reliably improve domain retrieval. We suggest that relational language allows people to retrieve knowledge that would otherwise remain inert and contributes to domain experts' insight.
KW - Analogy
KW - Creativity
KW - Memory retrieval
KW - Relational reasoning
KW - Relational transfer
UR - http://www.scopus.com/inward/record.url?scp=85075723396&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85075723396&partnerID=8YFLogxK
U2 - 10.1016/j.cognition.2019.104146
DO - 10.1016/j.cognition.2019.104146
M3 - Article
C2 - 31794891
AN - SCOPUS:85075723396
SN - 0010-0277
VL - 196
JO - Cognition
JF - Cognition
M1 - 104146
ER -