Adaptive composite filters for pattern recognition in nonoverlapping scenes using noisy training images

Pablo Mario Aguilar-González, Vitaly Kober, Víctor Hugo Díaz-Ramírez

Research output: Contribution to journal › Article › peer-review

21 Scopus citations

Abstract

Correlation filters for target detection are usually designed by analytical optimization of performance criteria. The resulting expressions require explicit knowledge of the appearance and shape of the object of interest. As a result, the performance of correlation filters is significantly affected by changes in the appearance of the object in the input scene, caused by factors such as rotation and scaling. This has been addressed by composite correlation filters that take into account different views of the object. In this work, we propose an algorithm for the design of adaptive composite filters when the object to be recognized is given in noisy training images and its shape and intensity values are not explicitly known. The impulse responses of optimal correlation filters are used to synthesize composite filters for distortion-invariant object detection. Two techniques are used to improve detection performance: an adaptive procedure that achieves a prespecified performance level for a typical scene background, and multiple composite filters (a bank of filters) when numerous views are available for training. Computer simulation results obtained with the proposed filters are presented and compared with those of common composite filters in terms of detection capability and location accuracy.
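
For context, the sketch below illustrates the baseline idea the abstract builds on: a conventional composite (SDF-type) correlation filter formed as a linear combination of training views, with detection performed by FFT-based correlation. It is a minimal illustration under assumed NumPy conventions, not the adaptive filter design proposed in the paper; the names sdf_filter, correlation_plane, views, and scene are hypothetical.

    import numpy as np

    def sdf_filter(views, d=None):
        """Synthesize a classical SDF composite filter from training views."""
        # Columns of X are the vectorized training views.
        X = np.stack([v.ravel() for v in views], axis=1).astype(float)
        # Desired correlation-peak values: equal response to every view by default.
        d = np.ones(X.shape[1]) if d is None else np.asarray(d, dtype=float)
        # Classical SDF solution h = X (X^T X)^{-1} d: a linear combination of
        # the views that yields the prescribed value d_i at the correlation
        # origin for each training view.
        a = np.linalg.solve(X.T @ X, d)
        return (X @ a).reshape(views[0].shape)

    def correlation_plane(scene, h):
        """Circular cross-correlation of the scene with filter h via the FFT."""
        H = np.fft.fft2(h, s=scene.shape)  # zero-pad the filter to scene size
        S = np.fft.fft2(scene)
        # The highest correlation peak estimates the target location.
        return np.real(np.fft.ifft2(S * np.conj(H)))

    # Usage (hypothetical data):
    # views = [view_0, view_10, view_20]   # distorted training views of the target
    # c = correlation_plane(scene, sdf_filter(views))
    # i, j = np.unravel_index(np.argmax(c), c.shape)

The paper's contribution departs from this baseline in that the filter bank is synthesized from impulse responses of optimal filters estimated from noisy training images, with an adaptive procedure tuned to a typical scene background.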

Original language: English
Pages (from-to): 83–92
Number of pages: 10
Journal: Pattern Recognition Letters
Volume: 41
Issue number: 1
State: Published - 1 May 2014

Keywords

  • Composite filters
  • Correlation filters
  • Noisy training images
  • Pattern recognition
