Predicting rhesus monkey eye movements during natural-image search

Mark A. Segraves*, Emory Kuo, Sara Caddigan, Emily A. Berthiaume, Konrad P. Kording

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

There are three prominent factors that can predict human visual-search behavior in natural scenes: the distinctiveness of a location (salience), similarity to the target (relevance), and features of the environment that predict where the object might be (context). We do not currently know how well these factors predict macaque visual search, which matters because the macaque is arguably the most popular animal model for asking how the brain controls eye movements. Here we trained monkeys to perform the pedestrian search task previously used for human subjects. Salience, relevance, and context models were all predictive of monkey eye fixations, and jointly they were about as precise as for humans. We attempted to disrupt the influence of scene context on search by testing the monkeys with an inverted set of the same images. Surprisingly, the monkeys were able to locate the pedestrian at a rate similar to that for upright images. The best predictions of monkey fixations on inverted images were obtained by rotating the model predictions computed for the original upright image. The fact that the same models can predict human and monkey search behavior suggests that the monkey is a good model for understanding how the human brain enables natural-scene search.
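The abstract does not include implementation details, but the modeling pipeline it describes can be sketched: combine salience, relevance, and context maps into a joint fixation-prediction map, score it against observed fixations, and model inverted-image search by rotating the upright prediction 180 degrees. The sketch below is a minimal illustration under stated assumptions, not the authors' code; the weighted-sum combination, the AUC-style score, and all function names (`combine_priority_maps`, `predict_inverted`, `fixation_auc`) are hypothetical.

```python
import numpy as np

def combine_priority_maps(salience, relevance, context, weights=(1.0, 1.0, 1.0)):
    """Weighted combination of normalized priority maps (hypothetical scheme).
    Each input is a 2-D array of priority values over image locations."""
    combined = np.zeros_like(salience, dtype=float)
    for m, w in zip((salience, relevance, context), weights):
        m = (m - m.min()) / (m.max() - m.min() + 1e-12)  # rescale to [0, 1]
        combined += w * m
    return combined / combined.max()

def predict_inverted(priority_upright):
    """Rotate the upright prediction map by 180 degrees, the strategy the
    abstract reports as most predictive for inverted images."""
    return np.rot90(priority_upright, k=2)

def fixation_auc(priority, fixations, n_samples=10_000, seed=None):
    """AUC-style score: probability that priority at a fixated location
    exceeds priority at a randomly sampled location (ties ignored)."""
    rng = np.random.default_rng(seed)
    h, w = priority.shape
    fix_vals = np.array([priority[y, x] for (y, x) in fixations])
    rand_vals = priority[rng.integers(0, h, n_samples),
                         rng.integers(0, w, n_samples)]
    return (fix_vals[:, None] > rand_vals[None, :]).mean()
```

With ground-truth fixations in hand, `fixation_auc(combine_priority_maps(s, r, c), fixations)` would score the joint model on upright images, and `fixation_auc(predict_inverted(priority), fixations_inverted)` would test the rotation account on inverted ones.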

Original language: English (US)
Pages (from-to): 1-17
Number of pages: 17
Journal: Journal of Vision
Volume: 17
Issue number: 3
State: Published - Mar 1, 2017

Keywords

  • Behavior
  • Contextual guidance
  • Eye movement
  • Real-world scene
  • Salience
  • Target features
  • Visual search

ASJC Scopus subject areas

  • Sensory Systems
  • Ophthalmology
