Robust Visual Tracking Via Consistent Low-Rank Sparse Learning

Tianzhu Zhang, Si Liu, Narendra Ahuja, Ming-Hsuan Yang, Bernard Ghanem

Research output: Contribution to journal › Article › peer-review

254 Scopus citations

Abstract

Object tracking is the process of determining the states of a target in consecutive video frames based on properties of motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. Using linear sparse combinations of dictionary templates, the proposed method learns the sparse representations of the image regions corresponding to the candidate particles jointly by exploiting the underlying low-rank constraints. In addition, the CLRST algorithm is computationally attractive: the temporal consistency property helps prune particles, and the low-rank minimization problem for learning joint sparse representations can be solved efficiently by a sequence of closed-form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.
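The abstract's core idea, coding all candidate particles jointly over a template dictionary with both a sparsity and a low-rank penalty, each handled by a closed-form proximal update, can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the proximal-gradient scheme, and the regularization weights below are assumptions, and applying the two proximal operators in sequence only approximates the joint proximal map of the combined penalty.

```python
import numpy as np

def svt(M, tau):
    # Singular value thresholding: closed-form proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    # Entrywise soft thresholding: closed-form proximal operator of the l1 norm.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def low_rank_sparse_codes(X, D, lam_sparse=0.1, lam_rank=0.1, iters=200):
    """Jointly represent all particle observations X (d x n columns) as sparse
    combinations of dictionary templates D (d x k), encouraging the coefficient
    matrix Z to be both sparse and low-rank via alternating proximal steps."""
    Z = np.zeros((D.shape[1], X.shape[1]))
    step = 1.0 / np.linalg.norm(D, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(iters):
        grad = D.T @ (D @ Z - X)            # gradient of the reconstruction term
        Z = svt(soft(Z - step * grad, step * lam_sparse), step * lam_rank)
    return Z
```

Because both thresholding steps have closed forms, each iteration costs only a matrix multiply and a small SVD, which is consistent with the abstract's claim that the joint representation can be learned efficiently.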
Original language: English (US)
Pages (from-to): 171-190
Number of pages: 20
Journal: International Journal of Computer Vision
Volume: 111
Issue number: 2
DOIs
State: Published - Jun 19 2014
