Abstract
Human attention biases toward moral and emotional information are as prevalent online as they are offline. When these biases interact with content algorithms that curate social media users’ news feeds to maximize attentional capture, moral and emotional information is privileged in the online information ecosystem. We review evidence for these human-algorithm interactions and argue that misinformation exploits this process to spread online. This framework suggests that interventions aimed at combating misinformation require a dual-pronged approach, combining person-centered and design-centered interventions, to be most effective. We suggest several avenues for research on the psychological study of misinformation sharing under a framework of human-algorithm interaction.
| Original language | English (US) |
| --- | --- |
| Article number | 101770 |
| Journal | Current Opinion in Psychology |
| Volume | 56 |
| DOIs | |
| State | Published - Apr 2024 |
Funding
The authors thank participants of the misinformation preconference at the annual meeting of the Society for Personality and Social Psychology (SPSP) for helpful feedback.
ASJC Scopus subject areas
- General Psychology