Self co-articulation detection and trajectory guided recognition for dynamic hand gestures

Hand gestures are a natural way of communication among humans in everyday life. The presence of spatiotemporal variations and unwanted movements within a gesture, called self co-articulation, makes segmentation a challenging task. The study reveals that self co-articulation may be used as one of the features to enhance the performance of a hand gesture recognition system. It was detected from the gesture trajectory by adding speed information along with the pause in the gesture spotting phase. Moreover, a new set of novel features was used in the feature extraction stage, such as the position of the hand, self co-articulated features, and ratio and distance features. An ANN and an SVM were used to develop two independent models with the new set of features as input. Models based on CRF and HCRF were used as the baseline systems for the present study. The experimental results suggest that the proposed new set of features improves accuracy with the ANN (7.48%) and SVM (9.38%) based models compared with the baseline CRF-based model. There are also significant improvements in the performance of both the ANN (2.08%) and SVM (3.98%) based models compared with the HCRF-based model.
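The abstract describes detecting self co-articulation from the gesture trajectory by combining speed information with pauses in the gesture spotting phase. A minimal sketch of that idea is shown below: compute the frame-to-frame speed of the hand centroid and flag sustained low-speed (pause) segments as candidate self co-articulated frames. The function name, thresholds, and frame rate here are illustrative assumptions, not the paper's actual values or implementation.

```python
import numpy as np

def detect_self_coarticulation(trajectory, fps=30.0,
                               speed_thresh=20.0, min_pause_frames=3):
    """Flag low-speed ('pause') segments along a 2D hand trajectory.

    trajectory: (N, 2) array of hand-centroid (x, y) positions per frame.
    Returns a boolean mask marking frames treated as self co-articulated.
    speed_thresh (pixels/second) and min_pause_frames are placeholder
    values chosen for illustration only.
    """
    traj = np.asarray(trajectory, dtype=float)
    # Frame-to-frame displacement -> instantaneous speed in pixels/second.
    disp = np.linalg.norm(np.diff(traj, axis=0), axis=1)
    speed = np.concatenate([[0.0], disp]) * fps
    slow = speed < speed_thresh

    # Keep only pauses lasting at least min_pause_frames frames.
    mask = np.zeros(len(traj), dtype=bool)
    start = None
    for i, s in enumerate(np.append(slow, False)):  # sentinel closes a run
        if s and start is None:
            start = i
        elif not s and start is not None:
            if i - start >= min_pause_frames:
                mask[start:i] = True
            start = None
    return mask
```

In a full pipeline along the lines the abstract suggests, such a mask would feed the gesture spotting phase, with the flagged frames also contributing self co-articulated features alongside hand position, ratio, and distance features for the ANN/SVM classifiers.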

Inspec keywords: random processes; spatiotemporal phenomena; feature extraction; neural nets; gesture recognition; support vector machines

Other keywords: HCRF; gesture spotting phase; spatiotemporal variation; ANN; gesture trajectory; self-co-articulation detection; artificial neural network; feature extraction; dynamic hand gesture recognition system; multiclass support vector machine; trajectory guided recognition; SVM; hidden conditional random field

Subjects: Other topics in statistics; Image recognition; Neural computing techniques; User interfaces; Computer vision and image processing techniques; Knowledge engineering techniques

http://iet.metastore.ingenta.com/content/journals/10.1049/iet-cvi.2014.0432