
Event-driven system for fall detection using body-worn accelerometer and depth sensor

The authors present efficient and effective algorithms for fall detection based on sequences of depth maps and data from a wireless inertial sensor worn by the monitored person. A set of descriptors is discussed that permits distinguishing accidental falls from activities of daily living. Experimental validation is carried out on a freely available dataset consisting of synchronised depth and accelerometric data. Extensive experiments are conducted in two scenarios: a static camera facing the scene and an active camera observing the same scene from above. Several real-time experiments covering person detection, tracking and fall detection demonstrate the efficiency and reliability of the proposed solutions. The experimental results show that the developed fall-detection algorithms achieve high sensitivity and specificity.
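The event-driven idea described above can be sketched as a two-stage detector: an accelerometer spike triggers the analysis, and a depth-based cue (here, the person's centroid height above the floor) confirms or rejects the fall hypothesis. The function names and threshold values below are illustrative assumptions, not taken from the paper:

```python
import math

# Hypothetical thresholds -- illustrative values only, not from the paper.
IMPACT_THRESHOLD_G = 2.5   # acceleration magnitude (in g) that triggers analysis
HEIGHT_THRESHOLD_M = 0.40  # centroid height (metres) below which a lying pose is assumed

def accel_magnitude(ax, ay, az):
    """Total acceleration magnitude in g from a tri-axial accelerometer."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def is_impact(ax, ay, az, threshold=IMPACT_THRESHOLD_G):
    """Stage 1: event trigger -- fire only when the acceleration spikes."""
    return accel_magnitude(ax, ay, az) > threshold

def verify_fall(centroid_height_m, threshold=HEIGHT_THRESHOLD_M):
    """Stage 2: depth-map check -- a low centroid suggests lying on the floor."""
    return centroid_height_m < threshold

def detect_fall(ax, ay, az, centroid_height_m):
    """Signal a fall only when the motion event and the depth cue agree."""
    return is_impact(ax, ay, az) and verify_fall(centroid_height_m)

# Sitting down gently: no impact spike, so no alarm even at low height.
print(detect_fall(0.3, 0.9, 0.2, 0.5))   # False
# Hard impact followed by a low centroid: alarm.
print(detect_fall(2.0, 2.0, 1.5, 0.2))   # True
```

Running the depth analysis only after an accelerometer event keeps the system lightweight, which matches the event-driven design the abstract emphasises.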

http://iet.metastore.ingenta.com/content/journals/10.1049/iet-cvi.2017.0119