K-means based multiple objects tracking with long-term occlusion handling

IET Computer Vision

This study presents a novel multiple object tracking (MOT) approach that models each object's appearance with K-means clustering and introduces a new statistical measure for associating objects after occlusion. The proposed method is tested on several standard datasets covering complex situations in both indoor and outdoor environments. The experimental results show that the proposed model tracks multiple objects with high accuracy in the presence of occlusion. Moreover, the presented work can handle long-term and complete occlusion without any prior training of the objects' shape and motion models. The accuracy of the proposed method is comparable with that of existing state-of-the-art techniques, as it successfully handles all MOT cases in the standard datasets. Most importantly, the proposed method is more cost-effective in terms of memory and/or computation than existing state-of-the-art techniques. These traits make the proposed system well suited to real-time embedded video surveillance platforms, especially those with limited memory and compute resources.
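The abstract leaves the concrete design open: it does not state the feature space clustered by K-means, the number of clusters, or the form of the statistical association measure. What follows is therefore only a minimal Python sketch of the general idea under assumed choices: raw RGB foreground pixels as the feature space, a fixed k per object, and a symmetric nearest-centroid distance standing in for the paper's (unspecified) measure. The names kmeans, appearance_distance and reassociate, and the matching threshold, are hypothetical illustrations, not the authors' implementation.

import numpy as np

def kmeans(pixels, k=3, iters=10, seed=0):
    """Plain Lloyd's k-means over an (N, 3) array of RGB pixels.
    The k centroids act as a compact appearance model of one object."""
    pixels = np.asarray(pixels, dtype=float)
    rng = np.random.default_rng(seed)
    centroids = pixels[rng.choice(len(pixels), size=k, replace=False)]
    for _ in range(iters):
        # Assign every pixel to its nearest centroid.
        dists = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centroids; keep the old one if a cluster empties.
        for j in range(k):
            members = pixels[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return centroids

def appearance_distance(model_a, model_b):
    """Symmetric (Chamfer-style) distance between two centroid sets: an
    assumed stand-in for the paper's statistical association measure."""
    d = np.linalg.norm(model_a[:, None, :] - model_b[None, :, :], axis=2)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

def reassociate(occluded_models, blob_pixels, k=3, threshold=40.0):
    """Match a re-emerging foreground blob against the stored models of
    occluded tracks; return (track_id, distance) or (None, distance)."""
    blob_model = kmeans(blob_pixels, k=k)
    best_id, best = None, np.inf
    for track_id, model in occluded_models.items():
        dist = appearance_distance(model, blob_model)
        if dist < best:
            best_id, best = track_id, dist
    return (best_id, best) if best < threshold else (None, best)

if __name__ == "__main__":
    # Synthetic check: a red-clad target (track 7) is occluded, then a
    # similar red blob and an unrelated blue blob reappear.
    rng = np.random.default_rng(1)
    red = rng.normal([200, 40, 40], 15, size=(500, 3))
    blue = rng.normal([40, 40, 200], 15, size=(500, 3))
    occluded = {7: kmeans(red)}
    print(reassociate(occluded, red + rng.normal(0, 5, red.shape)))  # matches 7
    print(reassociate(occluded, blue))                               # no match

Under this reading, each track stores only k centroids (a handful of floats) rather than a template or full histogram, which is consistent with the abstract's claim of low memory and computation cost: re-identification after occlusion reduces to a few centroid comparisons.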

http://iet.metastore.ingenta.com/content/journals/10.1049/iet-cvi.2016.0156