Sparsity-aware dynamic gesture recognition using radar sensors with angular diversity


In this study, two radar sensors with angular diversity are used to recognise dynamic hand gestures by analysing their sparse micro-Doppler signatures. At each radar sensor, the radar echoes are first mapped into the time–frequency domain through a Gaussian-windowed Fourier dictionary. The sparse time–frequency features are then extracted via the orthogonal matching pursuit algorithm. Finally, the sparse time–frequency features from the two radar sensors are fused and fed into a nearest-neighbour classifier based on the modified Hausdorff distance to recognise the dynamic hand gesture. Experimental results on measured data from three different experimental scenes demonstrate that (i) the recognition accuracy can be improved by fusing the features extracted at the two radar sensors when each sensor works well on its own; (ii) the recognition accuracy produced by feature fusion remains satisfactory even if one of the radar sensors performs poorly, meaning that feature fusion improves the robustness of the recognition system; and (iii) it is most beneficial to set the lines of sight of the two radar sensors orthogonal to each other.
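The pipeline described above can be sketched in a few dozen lines. The following is an illustrative toy, not the authors' implementation: the gesture names, the synthetic single-tone echo model, the dictionary grid, and the concatenation-based fusion of the two sensors' feature sets are all assumptions made for demonstration.

```python
import numpy as np

def gabor_dictionary(n, times, freqs, sigma=8.0):
    """Unit-norm Gaussian-windowed Fourier (Gabor) atoms, one per (time, freq)."""
    t = np.arange(n)
    atoms, params = [], []
    for t0 in times:
        win = np.exp(-0.5 * ((t - t0) / sigma) ** 2)
        for f in freqs:
            a = win * np.exp(2j * np.pi * f * t)
            atoms.append(a / np.linalg.norm(a))
            params.append((t0, f))
    return np.array(atoms).T, np.array(params, dtype=float)

def omp_features(x, D, params, k=1):
    """Orthogonal matching pursuit: greedily select k atoms and return the
    (time, frequency) centres of the chosen atoms as a sparse feature set."""
    x = x.astype(complex)
    residual, chosen = x.copy(), []
    for _ in range(k):
        chosen.append(int(np.argmax(np.abs(D.conj().T @ residual))))
        sub = D[:, chosen]
        coef, *_ = np.linalg.lstsq(sub, x, rcond=None)
        residual = x - sub @ coef
    return params[chosen]

def fused_features(echo_a, echo_b, D, params, k=1):
    """Fuse the two sensors' sparse features. Plain concatenation of the two
    point sets is an assumption; the abstract does not give the fusion rule."""
    return np.vstack([omp_features(echo_a, D, params, k),
                      omp_features(echo_b, D, params, k)])

def modified_hausdorff(A, B):
    """Dubuisson-Jain modified Hausdorff distance between two point sets."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    return max(d.min(axis=1).mean(), d.min(axis=0).mean())

# Toy demonstration with synthetic "echoes", each one micro-Doppler component.
rng = np.random.default_rng(0)
n = 64
D, params = gabor_dictionary(n, times=[16, 32, 48], freqs=[0.1, 0.2, 0.3])

def echo(t0, f):
    t = np.arange(n)
    win = np.exp(-0.5 * ((t - t0) / 8.0) ** 2)
    return win * np.exp(2j * np.pi * f * t) + 0.05 * rng.standard_normal(n)

# One training template per hypothetical gesture class; the two sensors see
# different Doppler shifts because of their angular diversity.
train = {"wave": fused_features(echo(16, 0.1), echo(16, 0.2), D, params),
         "push": fused_features(echo(48, 0.3), echo(48, 0.2), D, params)}

# Nearest-neighbour decision under the modified Hausdorff distance.
query = fused_features(echo(16, 0.1), echo(16, 0.2), D, params)
pred = min(train, key=lambda g: modified_hausdorff(query, train[g]))
print(pred)  # the query matches the "wave" template
```

In this sketch each gesture is represented as a set of (time, frequency) points rather than a fixed-length vector, which is why a set-to-set distance such as the modified Hausdorff distance, rather than a Euclidean distance, drives the nearest-neighbour classifier.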


    1. 1)
      • 1. John, V., Umetsu, M., Boyali, A., et al: ‘Real-time hand posture and gesture-based touchless automotive user interface using deep learning’. 2017 IEEE Intelligent Vehicles Symp. (IV), Los Angeles, USA, June 2017, pp. 869874.
    2. 2)
      • 2. Gupta, S., Molchanov, P., Yang, X., et al: ‘Towards selecting robust hand gestures for automotive interfaces’. 2016 IEEE Intelligent Vehicles Symp. (IV), Gothenburg, Sweden, June 2016, pp. 13501357.
    3. 3)
      • 3. Molchanov, P., Gupta, S., Kim, K., et al: ‘Short-range FMCW monopulse radar for hand-gesture sensing’. 2015 IEEE Radar Conf. (RadarCon), Arlington, USA, May 2015, pp. 14911496.
    4. 4)
      • 4. Mitra, S., Acharya, T.: ‘Gesture recognition: a survey’, IEEE Trans. Syst. Man Cybern. C, Appl. Rev., 2007, 37, (3), pp. 311324.
    5. 5)
      • 5. Rautaray, S. S., Agrawal, A.: ‘Vision based hand gesture recognition for human computer interaction: a survey’, Artif. Intell. Rev., 2015, 43, (1), pp. 154.
    6. 6)
      • 6. Kim, H. K., Lee, J., Yu, C., et al: ‘3D hand gestures calibration method for multi-display by using a depth camera’. 2017 Int. Conf. on Information and Communication Technology Convergence (ICTC), Jeju Island, Korea (South), October 2017, pp. 10441046.
    7. 7)
      • 7. Parashar, K. N., Oveneke, M. C., Rykunov, M., et al: ‘Micro-Doppler feature extraction using convolutional auto-encoders for low latency target classification’. 2017 IEEE Radar Conf. (RadarConf), Seattle, USA, May 2017, pp. 17391744.
    8. 8)
      • 8. Wang, F. K., Tang, M. C., Chiu, Y. C., et al: ‘Gesture sensing using retransmitted wireless communication signals based on Doppler radar technology’, IEEE Trans. Microw. Theory Tech., 2015, 63, (12), pp. 45924602.
    9. 9)
      • 9. Wan, Q., Li, Y., Li, C., et al: ‘Gesture recognition for smart home applications using portable radar sensors’. Proc. of 36th Annual Int. Conf. of the IEEE Engineering in Medicine and Biology Society, Chicago, USA, August 2014, pp. 64146417.
    10. 10)
      • 10. Li, G., Zhang, R., Ritchie, M., et al: ‘Sparsity-driven micro-Doppler feature extraction for dynamic hand gesture recognition’, IEEE Trans. Aerosp. Electron. Syst., 2018, 54, (2), pp. 655665.
    11. 11)
      • 11. Kim, Y., Toomajian, B.: ‘Hand gesture recognition using micro-Doppler signatures with convolutional neural network’, IEEE Access, 2016, 4, pp. 71257130.
    12. 12)
      • 12. Zhang, S., Li, G., Ritchie, M., et al: ‘Dynamic hand gesture classification based on radar micro-Doppler signatures’. Proc. of 2016 CIE Int. Conf. on Radar, Guangzhou, China, October 2016, pp. 19771980.
    13. 13)
      • 13. Thayaparan, T., Abrol, S., Riseborough, E., et al: ‘Analysis of radar micro-Doppler signatures from experimental helicopter and human data’, IET Radar Sonar Navig., 2007, 1, (4), pp. 289299.
    14. 14)
      • 14. Kim, Y., Toomajian, B.: ‘Application of Doppler radar for the recognition of hand gestures using optimized deep convolutional neural networks’. 2017 11th European Conf. on Antennas and Propagation (EUCAP), Paris, France, March 2017, pp. 12581260.
    15. 15)
      • 15. Wu, Q., Zhang, Y.D., Tao, W., et al: ‘Radar-based fall detection based on Doppler time–frequency signatures for assisted living’, IET Radar Sonar Navig., 2015, 9, (2), pp. 164172.
    16. 16)
      • 16. Kim, Y., Ling, H.: ‘Human activity classification based on micro-Doppler signatures using a support vector machine’, IEEE Trans. Geosci. Remote Sens., 2009, 47, (5), pp. 13281337.
    17. 17)
      • 17. Balleri, A., Chetty, K., Woodbridge, K.: ‘Classification of personnel targets by acoustic micro-Doppler signatures’, IET Radar Sonar Navig., 2011, 5, (9), pp. 943951.
    18. 18)
      • 18. Fairchild, D. P., Narayanan, R. M.: ‘Classification of human motions using empirical mode decomposition of human micro-Doppler signatures’, IET Radar Sonar Navig., 2014, 8, (5), pp. 425434.
    19. 19)
      • 19. Fioranelli, F., Ritchie, M., Gürbüz, S., et al: ‘Feature diversity for optimized human micro-Doppler classification using multistatic radar’, IEEE Trans. Aerosp. Electron. Syst., 2017, 53, (2), pp. 640654.
    20. 20)
      • 20. Yang, L., Chen, G., Li, G.: ‘Classification of personnel targets with baggage using dual-band radar’, Remote Sens., 2017, 9, (6), pp. 594693.
    21. 21)
      • 21. Zhang, R., Li, G., Clemente, C., et al: ‘Multi-aspect micro-Doppler signatures for attitude-independent L/N quotient estimation and its application to helicopter classification’, IET Radar Sonar Navig., 2017, 11, (4), pp. 701708.
    22. 22)
      • 22. Fioranelli, F., Ritchie, M., Griffiths, H.: ‘Multistatic human micro-Doppler classification of armed/unarmed personnel’, IET Radar Sonar Navig., 2015, 9, (7), pp. 857865.
    23. 23)
      • 23. Fairchild, D. P., Narayanan, R. M.: ‘Multistatic micro-Doppler radar for determining target orientation and activity classification’, IEEE Trans. Aerosp. Electron. Syst., 2016, 52, (1), pp. 512521.
    24. 24)
      • 24. Kanungo, T., Mount, D. M., Netanyahu, N. S., et al: ‘An efficient k-means clustering algorithm: analysis and implementation’, IEEE Trans. Pattern Anal. Mach. Intell., 2002, 24, (7), pp. 881892.
    25. 25)
      • 25. Dubuisson, M. P., Jain, A. K.: ‘A modified Hausdorff distance for object matching’. Proc. of the Int. Conf. on Pattern Recognition (ICPR'94), Jerusalem, Israel, October 1994, pp. 566568.
    26. 26)
      • 26. Harrington, P.: ‘Machine learning in action’ (Manning Company, Shelter Island, 2012).
    27. 27)
      • 27. Jupri, M., Sarno, R.: ‘Taxpayer compliance classification using C4.5, SVM, KNN, Naive Bayes and MLP’. 2018 Int. Conf. on Information and Communications Technology (ICOIACT), Yogyakarta, Indonesia, March 2018, pp. 297303.
