Adaptive Particle Filters for Visual Object Tracking using Joint PCA Appearance Model and Consensus Point Correspondences
Paper in proceedings, 2009
This paper addresses the problem of tracking moving objects in videos. We propose a novel tracking scheme that jointly exploits local object features, via consensus point correspondences, and global object appearance and shape models, via adaptive particle filter-based eigen-tracking. The paper includes the following main novelties: (a) employing consensus feature point correspondences to estimate the motion vector of the shape model; (b) employing adaptive particle filters and a motion-corrected state vector for joint appearance- and shape-based eigen-tracking. The number of particles is chosen adaptively and automatically, based on an updated estimate of the covariance matrix. Further, online learning is made adaptive to avoid learning from partially occluded objects. The proposed scheme is realized by integrating SURF and RANSAC for estimating consensus point correspondences, and by modifying an existing particle filter-based eigen-tracker. Experimental results on tracking moving objects in videos show that the proposed scheme provides more accurate tracking, especially for objects with fast motion or long-term partial occlusions, while significantly reducing the average number of particles. Comparisons with an existing method show that the proposed scheme provides improved tracking accuracy at the cost of more computation.
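The covariance-driven choice of particle count mentioned above can be illustrated with a minimal NumPy sketch. This is an assumed heuristic for illustration only: the trace-based rule, the bounds `n_min`/`n_max`, and the `scale` constant are not specified in the abstract and stand in for the paper's actual adaptation formula.

```python
import numpy as np

def adapt_num_particles(particles, weights, n_min=50, n_max=600, scale=200.0):
    """Pick the particle count for the next filtering step from the
    weighted covariance of the current particle set.

    Illustrative heuristic: grow the particle set when the estimated
    state uncertainty (trace of the covariance) is large, shrink it
    when the particles have converged. Constants are assumptions.
    """
    particles = np.asarray(particles, dtype=float)   # (N, state_dim)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()                # normalize weights

    mean = np.average(particles, axis=0, weights=weights)
    centered = particles - mean
    cov = (weights[:, None] * centered).T @ centered # weighted covariance

    spread = np.sqrt(np.trace(cov))                  # overall uncertainty
    n = int(n_min + scale * spread)                  # more particles when uncertain
    return min(max(n, n_min), n_max)                 # clamp to [n_min, n_max]
```

A tightly clustered particle set (low covariance) then yields a count near `n_min`, while a widely spread set after fast motion or occlusion yields a count closer to `n_max`, matching the abstract's claim that the average number of particles is reduced when tracking is stable.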
Keywords: adaptive particle filters; consensus point correspondences; visual object tracking