Gait-based human age classification using a silhouette model

Age estimation at a distance has potential applications in visual surveillance and the monitoring of public places. Far from the camera, image resolution degrades significantly, so classical cues such as the face become unreliable for age estimation. Because gait is highly sensitive to ageing, gait analysis is a suitable alternative for estimating age at a large distance from the camera. Medical and biomechanical studies show that older adults adapt their walking towards a safer, more stable gait that preserves balance. In this study, the authors propose a gait-based descriptor for age classification built on a silhouette projection model. The model encapsulates both the spatiotemporal longitudinal projection (SLP) and the spatiotemporal transverse projection (STP) of the silhouette over a gait cycle. It is designed to capture arm swing, head pitch, hunched posture and stride length, which are among the most salient ageing characteristics visible in elderly gait. Although age classification from gait is a challenging task, analysis of the SLP and STP curves shows considerable discrimination between young and elderly subjects. Experiments conducted on the OU-ISIR database show that the proposed descriptor outperforms existing ones, achieving a high recognition rate.
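The abstract does not give implementation details, so the Python sketch below is only one plausible reading of the SLP/STP descriptor: for each frame of a gait cycle, the silhouette is projected onto the longitudinal (row) and transverse (column) axes, and the per-frame profiles are stacked over time. The function name, the (T, H, W) input layout, the normalisation step and the SVM suggestion are illustrative assumptions, not the authors' exact method.

import numpy as np

def spatiotemporal_projections(frames):
    # frames: binary silhouette sequence of shape (T, H, W) covering one gait cycle
    # (hypothetical input format; values are 0 for background, 1 for silhouette).
    frames = np.asarray(frames, dtype=np.float32)
    slp = frames.sum(axis=2)   # (T, H): pixel count per row  -> longitudinal profile over time
    stp = frames.sum(axis=1)   # (T, W): pixel count per column -> transverse profile over time
    # Normalise each frame's profile so the descriptor is less sensitive to silhouette size.
    slp /= np.maximum(slp.sum(axis=1, keepdims=True), 1e-6)
    stp /= np.maximum(stp.sum(axis=1, keepdims=True), 1e-6)
    return slp, stp

# Toy usage: a 30-frame cycle of 128x88 silhouettes, flattened into a feature vector
# that could be fed to a young/elderly classifier such as an SVM.
cycle = np.random.randint(0, 2, size=(30, 128, 88))
slp, stp = spatiotemporal_projections(cycle)
feature = np.concatenate([slp.ravel(), stp.ravel()])

Stacking the per-frame projections preserves how the profiles evolve across the cycle, which is what lets such a descriptor reflect temporal cues like arm swing and stride length rather than a single static shape.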
