Survey on hardware implementations of visual object trackers

Visual object tracking is an active topic in the computer vision domain, with applications spanning numerous fields. The main sub-tasks required to build an object tracker (e.g. object detection, feature extraction and object tracking) are computationally intensive, and real-time operation is indispensable for almost all tracking applications. Therefore, complete hardware or hardware/software co-design approaches are pursued for more efficient tracker implementations. This study presents a literature survey of the hardware implementations of object trackers over the last two decades. Although several tracking surveys exist in the literature, a survey addressing the hardware implementations of the different trackers is missing. The authors believe this survey fills that gap, complements the existing surveys on how to design an efficient tracker, and points out future directions researchers can follow in this field. They highlight the lack of hardware implementations for state-of-the-art tracking algorithms as well as for enhanced classical algorithms. They also stress the need to measure the tracking performance of hardware-based trackers and to report sufficient implementation details to allow a fair comparison between the different implementations.
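To make the pipeline referred to above concrete, the sketch below shows a minimal detect / feature-extraction / track loop. It is a hypothetical illustration assembled for this page, not code from any of the surveyed implementations; the helper names (extract_features, detect, track), the grey-level histogram feature and the histogram-intersection score are all assumptions chosen only to show where the computationally intensive stages lie.

```python
# Illustrative sketch of a generic detect / feature-extraction / track loop.
# Hypothetical example; not taken from any surveyed hardware tracker.
import numpy as np

def extract_features(patch, bins=16):
    """Feature extraction: a normalised grey-level histogram of the patch."""
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    return hist / max(hist.sum(), 1)

def detect(frame, template, box_hw, search_window):
    """Detection: exhaustive histogram matching inside a search window --
    the per-pixel work that makes hardware acceleration attractive."""
    h, w = box_hw
    (y0, x0), (y1, x1) = search_window
    best_score, best_pos = -1.0, (y0, x0)
    for y in range(y0, y1 - h + 1):
        for x in range(x0, x1 - w + 1):
            cand = extract_features(frame[y:y + h, x:x + w])
            score = np.minimum(cand, template).sum()  # histogram intersection
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos

def track(frames, init_box, margin=16):
    """Tracking: re-detect the target in a small window around its last position."""
    y, x, h, w = init_box
    template = extract_features(frames[0][y:y + h, x:x + w])
    positions = [(y, x)]
    for frame in frames[1:]:
        py, px = positions[-1]
        window = ((max(py - margin, 0), max(px - margin, 0)),
                  (min(py + h + margin, frame.shape[0]),
                   min(px + w + margin, frame.shape[1])))
        positions.append(detect(frame, template, (h, w), window))
    return positions

# Example usage on synthetic frames (a bright square drifting to the right):
if __name__ == "__main__":
    frames = []
    for t in range(5):
        f = np.zeros((120, 160), dtype=np.uint8)
        f[40:72, 50 + 2 * t:82 + 2 * t] = 200
        frames.append(f)
    print(track(frames, (40, 50, 32, 32)))
```

In such a loop, the nested matching search inside detect dominates the per-frame cost, which is why hardware and hardware/software co-design implementations typically pipeline or parallelise exactly this stage.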

Inspec keywords: object detection; object tracking; computer vision

Other keywords: object detection; computer vision domain; visual object tracking; hardware-based trackers; object tracker; visual object trackers; feature extraction

Subjects: Optical, image and video signal processing; Computer vision and image processing techniques

DOI: 10.1049/iet-ipr.2018.5952