Improved dual-mode compressive tracking integrating balanced colour and texture features

Discriminative tracking methods can achieve state-of-the-art performance by treating tracking as a classification problem that exploits both object and background information. As a highly efficient discriminative tracker, compressive tracking (CT) has recently attracted much attention. However, it can easily fail when the object undergoes long-term occlusion or severe appearance and illumination changes. To address these issues, the authors develop a robust tracking framework based on CT that combines a balanced feature representation with a dual-mode classifier. First, since the original measurement matrix of CT acts as a texture-dominated feature extractor, they propose a complementary measurement matrix that captures both texture and colour features, yielding a balanced representation. They then train two classifiers (dual mode) on the previous and the current sample sets, respectively, and combine them into one ensemble classifier to track the target, which helps avoid tracking failures caused by severe appearance changes and long-term occlusion. Moreover, they propose a classifier-updating scheme that uses the ensemble classifier to predict occlusions and thereby exclude unreliable positive samples. Extensive experiments demonstrate the superior performance of the tracking framework in a variety of situations.
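The compressed feature representation the abstract refers to is built by projecting a high-dimensional raw feature vector through a very sparse random measurement matrix. The sketch below, a minimal NumPy illustration and not the authors' implementation, uses the standard sparse-random-projection construction from the compressive tracking and compressed sensing literature: each entry is ±√s with probability 1/(2s) each, and 0 otherwise, so the matrix is sparse and cheap to apply. The function name and the sparsity choice s = n/4 are illustrative assumptions.

```python
import numpy as np

def sparse_measurement_matrix(n_compressed, n_raw, s=None, rng=None):
    # Sparse random projection in the style of compressive tracking:
    # each entry is +sqrt(s) or -sqrt(s) with probability 1/(2s) each,
    # and 0 otherwise, so most entries vanish and projection is cheap.
    rng = np.random.default_rng(rng)
    if s is None:
        s = max(n_raw // 4, 1)  # sparsity parameter (an illustrative choice)
    p = 1.0 / (2.0 * s)
    return rng.choice(
        [np.sqrt(s), 0.0, -np.sqrt(s)],
        size=(n_compressed, n_raw),
        p=[p, 1.0 - 2.0 * p, p],
    )

# Project a 10,000-dimensional raw feature vector down to 50 compressed
# features; the classifier is then trained on v rather than on x.
R = sparse_measurement_matrix(50, 10_000, rng=0)
x = np.random.default_rng(1).standard_normal(10_000)
v = R @ x  # low-dimensional compressed feature vector
```

In the paper's setting, two such matrices (one texture-oriented, one colour-oriented) would supply the balanced feature representation that feeds the dual-mode classifiers.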

