Tracking objects with co-occurrence matrix and particle filter in infrared video sequences

Tracking objects in infrared video sequences has become an important challenge for many current tracking algorithms owing to complex situations such as illumination variation, night vision, and occlusion. This study proposes a new tracker that uses a set of invariant parameters computed from co-occurrence moments to better describe the target object. The co-occurrence moments make it possible to exploit the texture information of the target and thus improve the robustness of tracking, which is performed without any learning or clustering phase. Qualitative and quantitative studies on challenging sequences demonstrate that the results obtained by the proposed algorithm are very competitive with several state-of-the-art methods.
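The full formulation is given in the article body; as a rough illustration of the kind of pipeline the abstract describes, the sketch below builds a grey-level co-occurrence matrix (GLCM) for an image patch, derives Haralick-style texture moments from it, and weights a set of particle hypotheses by how closely their patch descriptors match a reference target descriptor. The function names (glcm, texture_moments, particle_weights), the single pixel offset, the 16 grey levels, the fixed 24x24 candidate box, the Euclidean distance between moment vectors, and the Gaussian likelihood are all illustrative assumptions, not the authors' implementation.

    # Minimal sketch, not the paper's method: GLCM texture moments used to
    # weight particle-filter hypotheses. All parameters are illustrative.
    import numpy as np

    def glcm(patch, levels=16, offset=(0, 1)):
        """Symmetric, normalised co-occurrence matrix for one pixel offset
        (assumes an 8-bit grey-level patch)."""
        q = np.clip((patch.astype(np.float64) / 256.0 * levels).astype(int),
                    0, levels - 1)
        dy, dx = offset
        h, w = q.shape
        a = q[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
        b = q[max(0, dy):h - max(0, -dy), max(0, dx):w - max(0, -dx)]
        m = np.zeros((levels, levels))
        np.add.at(m, (a.ravel(), b.ravel()), 1.0)
        m += m.T                       # count both directions of the offset
        return m / m.sum()

    def texture_moments(patch):
        """Haralick-style moments: contrast, energy, homogeneity, correlation."""
        p = glcm(patch)
        i, j = np.indices(p.shape)
        mu_i, mu_j = (i * p).sum(), (j * p).sum()
        sd_i = np.sqrt(((i - mu_i) ** 2 * p).sum())
        sd_j = np.sqrt(((j - mu_j) ** 2 * p).sum())
        contrast = ((i - j) ** 2 * p).sum()
        energy = (p ** 2).sum()
        homogeneity = (p / (1.0 + np.abs(i - j))).sum()
        correlation = (((i - mu_i) * (j - mu_j) * p).sum()
                       / (sd_i * sd_j + 1e-12))
        return np.array([contrast, energy, homogeneity, correlation])

    def particle_weights(frame, particles, target_desc, box=(24, 24), sigma=0.5):
        """Weight each particle (cy, cx) by the similarity of its patch
        descriptor to the reference descriptor under a Gaussian likelihood."""
        h, w = box
        weights = np.zeros(len(particles))
        for k, (cy, cx) in enumerate(particles):
            y0, x0 = int(cy - h // 2), int(cx - w // 2)
            patch = frame[y0:y0 + h, x0:x0 + w]
            if y0 < 0 or x0 < 0 or patch.shape != (h, w):
                continue               # particle fell outside the frame
            d = np.linalg.norm(texture_moments(patch) - target_desc)
            weights[k] = np.exp(-d ** 2 / (2.0 * sigma ** 2))
        s = weights.sum()
        return weights / s if s > 0 else np.full(len(particles), 1.0 / len(particles))

In a complete tracker, the descriptor would typically pool several offsets and orientations, and this weighting step would be combined with particle propagation and resampling at every frame.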

Inspec keywords: image filtering; image sequences; video signal processing; object tracking

Other keywords: complex situations; co-occurrence moments; night vision; illumination variation; particle filter; infrared video sequences; co-occurrence matrix; object tracking; clustering phase; invariant parameters; target texture

Subjects: Optical, image and video signal processing; Filtering methods in signal processing; Computer vision and image processing techniques; Video signal processing

http://iet.metastore.ingenta.com/content/journals/10.1049/iet-cvi.2017.0359