ACFT: adversarial correlation filter for robust tracking

Published in: IET Image Processing

Tracking based on correlation filters has demonstrated outstanding performance in recent visual object tracking studies and competitions. However, performance is limited by the boundary effects introduced by the intrinsic circulant structure. In this study, a tracker called the adversarial correlation filter tracker (ACFT) is proposed to address this problem using generative adversarial networks (GANs), which are particularly effective at producing realistic-looking data from noise. Specifically, a mask generated by the GAN assists the conventional correlation filter with spatial regularisation. By overcoming the feature independence of the regularisation used in existing trackers, the GAN mask can effectively identify robust features that represent target variations in the temporal domain. In the spatial domain, background features are substantially suppressed, yielding an optimised filter for more reliable matching and updating. For verification, the authors evaluate the proposed tracker on standard tracking benchmarks; the experimental results show that it performs favourably against other state-of-the-art trackers in terms of accuracy and robustness.
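For readers unfamiliar with the mechanics the abstract assumes, the core idea — a correlation filter trained by ridge regression in the Fourier domain, with a spatial mask suppressing background features before training and detection — can be sketched in a few lines. This is a minimal, single-channel illustration under our own assumptions, not the authors' implementation: the Hann window below merely stands in for the GAN-generated mask, and all function names are illustrative.

```python
import numpy as np

def gaussian_response(shape, sigma=2.0):
    """Ideal response map: a Gaussian peak centred on the target."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = h // 2, w // 2
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))

def train_masked_filter(patch, response, mask, lam=1e-2):
    """Closed-form correlation filter; the spatial mask is applied to the
    training patch so background features are suppressed before the
    Fourier-domain ridge regression (lam is the regularisation weight)."""
    F = np.fft.fft2(patch * mask)
    G = np.fft.fft2(response)
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def detect(H, patch, mask):
    """Correlate the filter with a masked search patch; the response peak
    gives the estimated target location."""
    F = np.fft.fft2(patch * mask)
    resp = np.real(np.fft.ifft2(H * F))
    return np.unravel_index(np.argmax(resp), resp.shape)

# --- toy usage on a synthetic frame ---
h = w = 32
ys, xs = np.mgrid[0:h, 0:w]
# A bright Gaussian blob plays the role of the target.
patch = np.exp(-((ys - 16) ** 2 + (xs - 16) ** 2) / (2 * 3.0 ** 2))
# Fixed Hann window as a stand-in for the GAN-generated mask.
mask = np.outer(np.hanning(h), np.hanning(w))
H = train_masked_filter(patch, gaussian_response((h, w)), mask)
peak = detect(H, patch, mask)  # peak of the correlation response
```

Replacing the fixed window with a learned, feature-aware mask is where the GAN enters in ACFT: the mask then adapts to the target's appearance rather than acting as a static spatial prior.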

http://iet.metastore.ingenta.com/content/journals/10.1049/iet-ipr.2018.6672