Fast Correction Visual Tracking via Feedback Mechanism

Bibliographic Details
Published in: Intelligence Science and Big Data Engineering. Image and Video Data Engineering, Vol. 9242, pp. 208-219
Main Authors: Xu, Tianyang; Wu, Xiaojun
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2015
Series: Lecture Notes in Computer Science
ISBN: 9783319239873; 3319239872
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-23989-7_22

Summary: Visual tracking is a fundamental problem in computer vision. Most online visual trackers rely on appearance information and inference theory to track frame by frame. However, insufficient attention has been paid to the correction ability of a tracking system, which leads to drift or outright tracking failures in previous works. This paper investigates the contribution of a feedback mechanism within a tracking-by-detection framework. Results indicate that the changing values of the target state's posterior distributions provide strong cues about how well the tracking result matches the ground truth. We further analyse the spatial appearance information and propose an adaptive feedback tracking method based on the Discrete Quaternion Fourier Transform (DQFT). Taking advantage of the stability of closed-loop control and the efficiency of the DQFT, the proposed tracker distinguishes easy-tracking frames from hard-tracking frames and then re-tracks the hard-tracking frames using additional temporal information, realizing the correction ability. Experiments on 50 challenging videos demonstrate the effectiveness and robustness of the proposed tracker, which outperforms existing state-of-the-art methods.
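The abstract describes the feedback loop only at a high level. As a rough illustration of the general idea (not the authors' implementation), the sketch below flags hard-tracking frames by thresholding a confidence score on the tracker's response map and re-tracks them; the peak-to-sidelobe ratio, the threshold value, and the track_frame / retrack_frame callables are all hypothetical stand-ins, whereas the paper itself monitors changes in the target state's posterior distribution.

```python
import numpy as np

def psr(response: np.ndarray, exclude: int = 5) -> float:
    """Peak-to-sidelobe ratio: a common confidence proxy for
    correlation-style trackers (a hypothetical choice here)."""
    r, c = np.unravel_index(np.argmax(response), response.shape)
    peak = response[r, c]
    # Mask out a small window around the peak; the rest is the sidelobe.
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, r - exclude):r + exclude + 1,
         max(0, c - exclude):c + exclude + 1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-8)

def track_with_feedback(frames, track_frame, retrack_frame, threshold=8.0):
    """Feedback-loop sketch: low-confidence ('hard-tracking') frames are
    re-tracked with extra temporal context instead of being accepted."""
    states = []
    for frame in frames:
        state, response = track_frame(frame, states)  # forward tracking step
        if psr(response) < threshold:                 # hard-tracking frame?
            state = retrack_frame(frame, states)      # corrective re-track
        states.append(state)
    return states
```

In this sketch the closed loop is the if-branch: the tracker's own output confidence feeds back into the decision of whether to accept the estimate or spend more computation correcting it, which is the distinction between easy- and hard-tracking frames that the abstract describes.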