Many psychologists have argued that language acquisition plays an important role in the development of Theory of Mind (ToM) reasoning in children. Several accounts of this interaction exist: some believe that language gives children the ability to express already formed ToM reasoning (e.g. He, Bolz, & Baillargeon, 2011), while others argue that learning specific grammatical structures engenders new reasoning abilities (e.g. de Villiers & Pyers, 1997). Questions remain about the mechanism by which this interaction occurs. In this paper, we show that the Analogical Theory of Mind (AToM; Rabkina et al., 2017) computational model can bootstrap aspects of ToM reasoning from sentential complement training, and that its performance matches improvement patterns of children who are trained using similar stimuli. This provides an implemented algorithmic account of bootstrapping ToM reasoning from language within a broader model of ToM development.
Original language: English (US)
Title of host publication: CogSci 2018 Proceedings
Subtitle of host publication: changing/minds
Editors: Chuck Kalish, Marina Rau, Jerry Zhu, Tim Rogers
Number of pages: 6
State: Published - 2018
Rabkina, I., McFate, C., & Forbus, K. D. (2018). Bootstrapping from language in the Analogical Theory of Mind model. In C. Kalish, M. Rau, J. Zhu, & T. Rogers (Eds.), CogSci 2018 Proceedings: changing/minds (pp. 924-929).