Self-calibrating smooth pursuit through active efficient coding

Céline Teulière*, Sébastien Forestier, Luca Lonini, Chong Zhang, Yu Zhao, Bertram Shi, Jochen Triesch

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

21 Scopus citations

Abstract

This paper presents a model for the autonomous learning of smooth pursuit eye movements based on an efficient coding criterion for active perception. The model accounts for the joint development of visual encoding and eye movement control. Sparse coding models encode the incoming data at two different spatial resolutions and capture the statistics of the input in spatio-temporal basis functions. A reinforcement learner controls eye velocity so as to maximize a reward signal based on the efficiency of the encoding. We embody the approach in both the iCub simulator and the physical iCub robot. Motion perception and smooth pursuit control are not explicitly specified as tasks for the robot to achieve; rather, they emerge from the system's active attempt to efficiently encode its sensory inputs. Experiments demonstrate that the proposed approach is self-calibrating and robust to strong perturbations of the perception-action link.
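The closed loop the abstract describes can be illustrated with a minimal, hypothetical sketch: a sparse coder encodes spatio-temporal patches, the reconstruction quality of that encoding serves as the reward, and a simple reinforcement learner picks the eye velocity that maximizes it. Everything below is an illustrative assumption, not the paper's implementation: the paper's model uses sparse coding at two spatial resolutions and a more elaborate learner, whereas this sketch is single-scale, uses greedy matching pursuit, a toy retina model in which retinal slip makes the input less compressible, and a bandit-style value update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a patch of 2 consecutive 8x8 frames.
PATCH_DIM = 2 * 8 * 8        # spatio-temporal input vector length
N_BASES = 64                 # dictionary size (spatio-temporal basis functions)
N_ACTIVE = 10                # sparsity: coefficients kept per patch
ACTIONS = np.linspace(-2.0, 2.0, 9)  # candidate eye velocities (arbitrary units)

# Hidden low-dimensional structure of the visual world (unknown to the agent).
W = rng.standard_normal((PATCH_DIM, 8))

# Random initial dictionary; columns are unit-norm basis functions.
D = rng.standard_normal((PATCH_DIM, N_BASES))
D /= np.linalg.norm(D, axis=0)

def sparse_encode(x, D, k=N_ACTIVE):
    """Greedy matching pursuit: keep the k best-matching basis functions."""
    residual = x.copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(k):
        scores = D.T @ residual
        j = np.argmax(np.abs(scores))
        coeffs[j] += scores[j]
        residual -= scores[j] * D[:, j]
    return coeffs, residual

def coding_reward(residual, x):
    """Reward = encoding efficiency, measured here as explained variance."""
    return 1.0 - np.sum(residual**2) / (np.sum(x**2) + 1e-9)

Q = np.zeros(len(ACTIONS))     # tabular action values (bandit-style learner)
alpha, epsilon, eta = 0.1, 0.1, 0.01
target_velocity = 1.0          # stimulus velocity; never observed directly

for step in range(5000):
    # Epsilon-greedy choice of eye velocity.
    a = rng.integers(len(ACTIONS)) if rng.random() < epsilon else int(np.argmax(Q))

    # Toy retina: the closer the eye velocity is to the stimulus velocity,
    # the smaller the retinal slip, and the more structured (compressible)
    # the patch; large slip buries the structure in isotropic noise.
    slip = abs(target_velocity - ACTIONS[a])
    signal = W @ rng.standard_normal(8)
    x = signal / (1.0 + slip) + rng.standard_normal(PATCH_DIM) * slip * 0.5
    x /= np.linalg.norm(x) + 1e-9

    coeffs, residual = sparse_encode(x, D)
    r = coding_reward(residual, x)

    # Joint development: the dictionary adapts to the observed statistics
    # (gradient step on the reconstruction error, then re-normalize)...
    active = np.nonzero(coeffs)[0]
    D[:, active] += eta * np.outer(residual, coeffs[active])
    D[:, active] /= np.linalg.norm(D[:, active], axis=0) + 1e-9

    # ...while the policy learns to maximize coding efficiency.
    Q[a] += alpha * (r - Q[a])

print("preferred velocity:", ACTIONS[np.argmax(Q)], "stimulus:", target_velocity)
```

Note that the reward never mentions a pursuit target: tracking behavior can emerge in such a loop only because matching the stimulus velocity reduces retinal slip, which makes the input stream easier to encode sparsely. This mirrors the abstract's claim that smooth pursuit is not specified as a task but falls out of the drive toward efficient coding.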

Original language: English (US)
Pages (from-to): 3-12
Number of pages: 10
Journal: Robotics and Autonomous Systems
Volume: 71
State: Published - Sep 1 2015

Keywords

  • Active perception
  • Autonomous learning
  • Efficient coding
  • Robotics
  • Smooth pursuit

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Mathematics (all)
  • Computer Science Applications
