Robust visual tracking via constrained correlation filter coding
Journal article, 2016
Trackers based on unconstrained correlation filters achieve superior performance at high speed in visual tracking. However, such unconstrained correlation filters do not impose any hard constraint forcing their responses to take specific values, which leads to classification ambiguity on intractable samples (i.e., similar samples from different classes). To tackle this issue, this paper introduces a constrained correlation filter into the visual tracking framework to alleviate classification ambiguity and locate the target more accurately. By imposing distinguishable hard constraints on the response maps of different classes, a supervised coding method is proposed that encodes candidate samples with a discriminative filter bank. The learned high-level feature vectors are then fed to a Naive Bayes classifier to separate the target from the background. In addition, update schemes for the parameters of the constrained filter and the classifier are introduced to adapt to appearance changes of the target while reducing the risk of drifting. Both qualitative and quantitative evaluations on the Object Tracking Benchmark (OTB) show that the proposed tracker achieves favorable performance compared with other state-of-the-art methods.
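The abstract does not give the exact filter formulation, but the core idea of a constrained correlation filter can be illustrated with a minimal sketch: learn a filter whose responses on training samples are forced toward class-specific target values (e.g., 1 for target patches, 0 for background), rather than left unconstrained. The function name, the regularization, and the toy data below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def constrained_correlation_filter(X, u, reg=1e-3):
    """Illustrative sketch (not the paper's exact method):
    find a minimum-norm filter w whose responses approximately satisfy
    the hard constraints X^T w = u, where each column of X is a
    vectorized sample and u holds the desired class-specific responses.
    Lagrange-multiplier solution: w = X (X^T X + reg*I)^(-1) u
    (reg > 0 only stabilizes the solve; reg = 0 enforces the
    constraints exactly when X^T X is invertible)."""
    n = X.shape[1]
    G = X.T @ X + reg * np.eye(n)   # Gram matrix of the samples
    return X @ np.linalg.solve(G, u)

# Toy usage: target patches constrained to respond 1, background to 0.
rng = np.random.default_rng(0)
X = rng.standard_normal((256, 20))               # 20 vectorized patches, dim 256
u = np.concatenate([np.ones(10), np.zeros(10)])  # distinguishable hard constraints
w = constrained_correlation_filter(X, u)
responses = X.T @ w   # close to u, so the two classes are separated by construction
```

In the paper's pipeline, a bank of such filters would produce response-based feature vectors for candidate samples, which are then classified (target vs. background) by a Naive Bayes classifier; the sketch above only shows the constrained-filter ingredient.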
Discriminative model
Supervised feature coding
Visual tracking
Constrained correlation filter