Tracking nonstationary visual appearances by data-driven adaptation

Ming Yang*, Zhimin Fan, Jialue Fan, Ying Wu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

28 Scopus citations

Abstract

Without any prior knowledge about the target, appearance is usually the only cue available in visual tracking. In general, however, appearances are nonstationary, which may invalidate predefined visual measurements and often leads to tracking failure in practice. A natural solution is therefore to adapt the observation model to the nonstationary appearances. This idea is threatened, however, by the risk of adaptation drift, which originates in the ill-posed nature of the problem unless good data-driven constraints are imposed. Different from most existing adaptation schemes, we enforce three novel constraints for the optimal adaptation: 1) negative data, 2) bottom-up pair-wise data constraints, and 3) adaptation dynamics. Instantiating the general adaptation problem as a subspace adaptation problem, this paper presents a closed-form solution as well as a practical iterative algorithm for subspace tracking. Extensive experiments have demonstrated that the proposed approach can largely alleviate adaptation drift and achieve better tracking results for a large variety of nonstationary scenes.
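The paper's constrained closed-form solution is not reproduced in this record, but the underlying idea of a subspace appearance model can be sketched generically: fit a low-dimensional linear subspace to a window of recent appearance observations, and score a candidate region by its distance to that subspace. The sketch below (function names and the fixed `k` are illustrative assumptions, not the authors' API) uses plain PCA via SVD and omits the paper's negative-data, pair-wise, and dynamics constraints:

```python
import numpy as np

def adapt_subspace(frames, k=3):
    """Fit a k-dimensional appearance subspace (mean + top-k PCA basis)
    to a window of vectorized appearance observations.
    frames: (n, d) array, one row per observation.
    NOTE: illustrative sketch only; the paper adds data-driven
    constraints to avoid adaptation drift, which are omitted here."""
    mean = frames.mean(axis=0)
    centered = frames - mean
    # Thin SVD: the leading rows of Vt span the principal subspace.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return mean, Vt[:k]

def reconstruction_error(obs, mean, basis):
    """Distance from one observation to the adapted subspace; a tracker
    could use this as its visual measurement for a candidate region."""
    r = obs - mean
    proj = basis.T @ (basis @ r)  # project residual onto the subspace
    return float(np.linalg.norm(r - proj))
```

In a tracking loop, the subspace would be re-adapted as new confident observations arrive, so that the measurement follows the nonstationary appearance instead of a fixed template.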

Original language: English (US)
Pages (from-to): 1633-1644
Number of pages: 12
Journal: IEEE Transactions on Image Processing
Volume: 18
Issue number: 7
DOIs
State: Published - 2009

Keywords

  • Appearance model adaptation
  • Subspace tracking
  • Visual tracking

ASJC Scopus subject areas

  • Software
  • Computer Graphics and Computer-Aided Design
