Multiscale spatially regularised correlation filters for visual tracking

Recently, discriminative correlation filter based trackers have achieved highly successful results in many competitions and benchmarks. These methods exploit a periodic assumption on the training samples to learn a classifier efficiently. However, this assumption produces unwanted boundary effects that severely degrade tracking performance. Correlation filters with limited boundaries and spatially regularised discriminative correlation filters were proposed to reduce boundary effects; however, these methods rely on a fixed-scale mask or a pre-designed weight function, respectively, which are unsuitable for large-scale variation. In this study, the authors propose multiscale spatially regularised correlation filters (MSRCF) for visual tracking. The augmented objective reduces the boundary effect even under large-scale variation, leading to a more discriminative model, and the proposed multiscale regularisation matrix allows MSRCF to converge quickly. The authors' online tracking algorithm performs favourably against state-of-the-art trackers on the OTB-2013 and OTB-2015 benchmarks in terms of efficiency, accuracy and robustness.
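For context, a minimal sketch of the spatially regularised correlation filter objective that MSRCF builds on, written in the general SRDCF form. Here $x_t^l$ are the feature channels of training sample $t$, $f^l$ the filter channels, $y_t$ the desired response, $\alpha_t$ the sample weights and $w$ the spatial regularisation weights; the exact multiscale regularisation matrix used by MSRCF is defined in the full paper and is not reproduced here.

```latex
% General spatially regularised DCF objective (SRDCF-style);
% MSRCF replaces the fixed weight function w with a multiscale
% regularisation matrix (see the paper for the exact augmentation).
\begin{equation}
  \varepsilon(f) \;=\; \sum_{t=1}^{T} \alpha_t
  \Bigl\lVert \sum_{l=1}^{d} x_t^{l} * f^{l} - y_t \Bigr\rVert^{2}
  \;+\; \sum_{l=1}^{d} \bigl\lVert w \cdot f^{l} \bigr\rVert^{2}
\end{equation}
```

The first term is the usual correlation-filter regression loss; the second penalises filter energy outside the target region, which is what suppresses the boundary effects caused by the periodic assumption.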

Inspec keywords: image filtering; image classification; object tracking; correlation methods; learning (artificial intelligence)

Other keywords: multiscale regularisation matrix; multiscale spatially regularised correlation filters; spatially regularised discriminative correlation filters; boundary effect reduction; large-scale variation; discriminative correlation filter based trackers; fixed scale mask; weight function; tracking performance degradation; classifier learning; visual tracking; MSRCF

Subjects: Image recognition; Filtering methods in signal processing; Knowledge engineering techniques; Computer vision and image processing techniques
