K-means based multiple objects tracking with long-term occlusion handling

This study presents a novel multiple objects tracking (MOT) approach that models an object's appearance using K-means clustering, and introduces a new statistical measure for re-associating objects after occlusion. The proposed method is tested on several standard datasets covering complex situations in both indoor and outdoor environments. The experimental results show that the proposed model successfully tracks multiple objects in the presence of occlusion with high accuracy. Moreover, the presented work can handle long-term and complete occlusion without any prior training of the objects' shape and motion models. The accuracy of the proposed method is comparable to that of existing state-of-the-art techniques, as it successfully handles all MOT cases in the standard datasets. Most importantly, the proposed method is cost-effective in terms of memory and/or computation compared with existing state-of-the-art techniques. These traits make the proposed system well suited to real-time embedded video surveillance platforms, especially those with low memory and compute resources.
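The abstract's core idea can be illustrated with a minimal sketch: summarise each tracked object's appearance by the K-means centres of its pixel colours, then re-associate a reappearing blob after occlusion with the stored model of lowest appearance distance. This is an illustrative assumption about the general technique, not the paper's exact algorithm or its statistical association measure; all function names below are hypothetical.

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    # Plain NumPy Lloyd's algorithm: alternate assignment and centre update.
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):           # skip empty clusters
                centers[j] = points[labels == j].mean(axis=0)
    return centers

def appearance_model(patch_pixels, k=3):
    # Cluster an object's RGB pixels; the k centres summarise its appearance.
    return kmeans(patch_pixels.reshape(-1, 3).astype(float), k)

def association_cost(model_a, model_b):
    # Nearest-centre matching between two models; lower cost = more similar.
    dists = np.linalg.norm(model_a[:, None] - model_b[None], axis=2)
    return dists.min(axis=1).mean()
```

After an occluded object reappears, its fresh appearance model is compared against every stored model, and the track with the minimum `association_cost` reclaims the blob. Storing only k colour centres per object is what makes this style of model attractive on low-memory embedded platforms.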

Inspec keywords: object tracking; statistical analysis; video surveillance

Other keywords: K-means based multiple objects tracking; statistical measure; long-term occlusion handling; MOT approach; real-time embedded video surveillance platform; standard dataset; motion model; state-of-the-art technique

Subjects: Optical, image and video signal processing; Computer vision and image processing techniques; Other topics in statistics

http://iet.metastore.ingenta.com/content/journals/10.1049/iet-cvi.2016.0156