Emotional context at learning systematically biases memory for facial information

Donna J. Bridge, Joan Y. Chiao, Ken A. Paller

Research output: Contribution to journal › Article › peer-review



Emotion influences memory in many ways. For example, when a mood-dependent processing shift is operative, happy moods promote global processing and sad moods direct attention to local features of complex visual stimuli. We hypothesized that an emotional context associated with to-be-learned facial stimuli could preferentially promote global or local processing. At learning, faces with neutral expressions were paired with a narrative providing either a happy or a sad context. At test, faces were presented in an upright or inverted orientation, emphasizing configural or analytical processing, respectively. A recognition advantage was found for upright faces learned in happy contexts relative to those in sad contexts, whereas recognition was better for inverted faces learned in sad contexts than for those in happy contexts. We thus infer that a positive emotional context prompted more effective storage of holistic, configural, or global facial information, whereas a negative emotional context prompted relatively more effective storage of local or feature-based facial information.

Original language: English (US)
Pages (from-to): 125-133
Number of pages: 9
Journal: Memory and Cognition
Issue number: 2
State: Published - Mar 2010

ASJC Scopus subject areas

  • Neuropsychology and Physiological Psychology
  • Experimental and Cognitive Psychology
  • Arts and Humanities (miscellaneous)


