Friday, December 6, 2019

Robust Object Tracker in Video via Discriminative Model

Mehrez ABDELLAOUI 1,2*, Ali DOUIK 2
1 Higher Institute of Applied Sciences and Technology, University of Kairouan,
Avenue Beit El Hikma, Kairouan, 3100, Tunisia
mehrez.abdellaoui@enim.rnu.tn (*Corresponding author)
2 NOCCS Laboratory, National Engineering School of Sousse, University of Sousse, Pôle technologique de Sousse,
Route de Ceinture Sahloul, Sousse, 4054, Tunisia
ali.douik@enim.rnu.tn

ABSTRACT: In this paper a new method for object tracking in complex video scenes is presented. Video object tracking is one of the most challenging tasks in the field of image and video processing: the complexity and variability of scenes and objects raise numerous difficulties. Recently, many tracking approaches have used discriminative models to build robust trackers of specific pixels in video frames, called interest points. Such trackers allow the target object to be extracted from the complex background of the scene in real time. However, most of the proposed robust trackers suffer from inaccurate predictions of the positions of the interest points associated with the target object, which greatly decreases tracker performance. To overcome this problem, interest point detection methods and an online boosting discriminative model were combined in order to obtain a robust tracker. The proposed approach was tested on video clips exhibiting several intrinsic and extrinsic factors representative of the real challenges that tracking algorithms may face. The results obtained with two evaluation metrics show that the proposed robust tracker outperforms recent state-of-the-art techniques in terms of Spatial Robustness Evaluation (SRE) and Temporal Robustness Evaluation (TRE).
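The discriminative model referred to in the abstract is an online boosting classifier that separates interest points on the target from the background. As a rough illustration of the idea (not the authors' implementation), the following is a minimal, stdlib-only sketch of Oza-Russell-style online boosting over decision stumps; the class names, the toy 2-D features standing in for interest-point descriptors, and all numeric settings are assumptions made for this example:

```python
import math

class Stump:
    """Weak learner: thresholds one feature dimension of a point descriptor.

    In a tracker, x would be a descriptor extracted around an interest
    point; here it is just a tuple of floats (illustrative assumption).
    """
    def __init__(self, dim, thresh):
        self.dim, self.thresh = dim, thresh

    def predict(self, x):
        return 1 if x[self.dim] > self.thresh else -1

class OnlineBooster:
    """Minimal online boosting sketch.

    Each new sample passes through the weak learners in sequence with an
    importance weight lam that grows when earlier learners misclassify it,
    so later learners focus on the hard samples (the core idea behind
    online AdaBoost).
    """
    def __init__(self, stumps):
        self.stumps = stumps
        self.sc = [1e-3] * len(stumps)  # running weight of correct samples
        self.sw = [1e-3] * len(stumps)  # running weight of wrong samples

    def update(self, x, y):
        """Online update with one labelled sample (y in {+1, -1})."""
        lam = 1.0
        for i, h in enumerate(self.stumps):
            if h.predict(x) == y:
                self.sc[i] += lam
                lam *= (self.sc[i] + self.sw[i]) / (2.0 * self.sc[i])
            else:
                self.sw[i] += lam
                lam *= (self.sc[i] + self.sw[i]) / (2.0 * self.sw[i])

    def predict(self, x):
        """Weighted vote of the weak learners (sign of the boosted score)."""
        score = 0.0
        for i, h in enumerate(self.stumps):
            err = self.sw[i] / (self.sc[i] + self.sw[i])
            err = min(max(err, 1e-6), 1.0 - 1e-6)  # avoid log(0)
            alpha = 0.5 * math.log((1.0 - err) / err)
            score += alpha * h.predict(x)
        return 1 if score > 0 else -1
```

In a tracking loop, `update` would be called each frame with descriptors of points inside the predicted target region labelled +1 and surrounding background points labelled -1, and `predict` would then score candidate interest points in the next frame.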

KEYWORDS: Object tracking, Discriminative model, Interest points detection, SRE, TRE.


CITE THIS PAPER AS:
Mehrez ABDELLAOUI, Ali DOUIK, Robust Object Tracker in Video via Discriminative Model, Studies in Informatics and Control, ISSN 1220-1766, vol. 28(3), pp. 337-346, 2019. https://doi.org/10.24846/v28i3y201910