Object tracking under nonuniform illumination with adaptive correlation filtering

Kenia Picos, Víctor H. Díaz-Ramírez, Vitaly Kober

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

A real-time system for illumination-invariant object tracking is proposed. The system estimates, at a high rate, the position of a moving target in an input scene corrupted by a highly cluttered background and nonuniform illumination. The position of the target is estimated with the help of a bank of space-variant correlation filters. The filters in the bank adapt their parameters to the local statistics of the observed scene within a small region centered at the predicted position of the target in each frame. The prediction is carried out by exploiting information from present and past frames and by using a dynamic motion model of the target in a two-dimensional plane. Computer simulation results obtained with the proposed system are presented and discussed in terms of tracking accuracy, computational complexity, and tolerance to nonuniform illumination.
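The abstract describes a predict-then-correlate loop: a motion model predicts the target position, and an adaptive correlation filter localizes the target within a small window around that prediction. The paper's space-variant filter bank is not reproduced here; as a rough illustration only, the sketch below substitutes a plain zero-mean normalized cross-correlation (whose local mean/energy normalization gives some tolerance to smooth illumination changes) and a constant-velocity prediction. All function names and parameters (`ncc_locate`, `track_step`, `search`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ncc_locate(window, template):
    """Return ((row, col), score) of the best zero-mean NCC match of
    template inside window. Each candidate patch is mean-subtracted and
    energy-normalized, which discounts a locally constant illumination
    offset/gain (a crude stand-in for the paper's adaptive filters)."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.linalg.norm(t) + 1e-12
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(window.shape[0] - th + 1):
        for c in range(window.shape[1] - tw + 1):
            p = window[r:r + th, c:c + tw]
            p = p - p.mean()
            score = float(np.sum(p * t) / (np.linalg.norm(p) * tnorm + 1e-12))
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

def track_step(frame, template, pos, vel, search=8):
    """One tracking step: predict the target's top-left corner with a
    constant-velocity model, correlate inside a small window around the
    prediction, then correct the position and velocity estimates."""
    th, tw = template.shape
    pred = (pos[0] + vel[0], pos[1] + vel[1])
    r0 = max(0, min(frame.shape[0] - th, pred[0] - search))
    c0 = max(0, min(frame.shape[1] - tw, pred[1] - search))
    window = frame[r0:r0 + th + 2 * search, c0:c0 + tw + 2 * search]
    (dr, dc), score = ncc_locate(window, template)
    new_pos = (r0 + dr, c0 + dc)
    new_vel = (new_pos[0] - pos[0], new_pos[1] - pos[1])
    return new_pos, new_vel, score

# Toy demo: plant a known target, apply a smooth multiplicative
# illumination gradient, and recover the target from a stale estimate.
rng = np.random.default_rng(0)
template = rng.random((8, 8))
frame = rng.random((64, 64)) * 0.1
frame[30:38, 40:48] = template                 # true top-left: (30, 40)
gain = np.linspace(0.5, 1.5, 64)[None, :]      # nonuniform illumination
lit = frame * gain
est_pos, est_vel, score = track_step(lit, template, pos=(28, 38), vel=(1, 1))
```

Restricting the correlation to a small predicted window is what makes the per-frame cost low enough for real-time use; the correlation itself would typically be done in the frequency domain via FFTs rather than the explicit loop above.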

Original language: English
Title of host publication: Optics and Photonics for Information Processing VII
DOIs
State: Published - 2013
Event: 7th Conference of Optics and Photonics for Information Processing - San Diego, CA, United States
Duration: 28 Aug 2013 → 29 Aug 2013

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 8855
ISSN (Print): 0277-786X
ISSN (Electronic): 1996-756X

Conference

Conference: 7th Conference of Optics and Photonics for Information Processing
Country/Territory: United States
City: San Diego, CA
Period: 28/08/13 → 29/08/13

Keywords

  • Image processing
  • Nonuniform illumination
  • Object tracking
  • Pattern recognition
  • Space-variant correlation filters
