3D motion capture system for assessing patient motion during Fugl-Meyer stroke rehabilitation testing


The authors introduce a novel marker-less multi-camera setup that allows easy synchronisation between 3D cameras, together with a novel pose-estimation method computed on the fly from the human body being tracked, which therefore requires neither a calibration session nor special calibration equipment. They show that both the calibration and the data merging achieve high accuracy, on par with equipment-based calibration. They derive several insights and practical guidelines for the camera setup and for the preferred data-merging methods. Finally, they present a test case that computerises the Fugl-Meyer stroke-rehabilitation protocol using the authors' multi-sensor capture system. They conducted a Helsinki-committee-approved study in a hospital, collecting data from stroke patients and healthy subjects with the multi-camera system. Spatio-temporal features were extracted from the acquired data and machine-learning-based evaluations were applied. The results show that patients and healthy subjects can be correctly classified at a rate above 90%. Furthermore, the most significant features in the classification are strongly correlated with the Fugl-Meyer guidelines. This demonstrates the feasibility of a low-cost, flexible and non-invasive motion capture system that could potentially be operated in a home setting.
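The equipment-free calibration described above aligns the cameras using the tracked body itself: matched 3D joint positions seen by two sensors determine the rigid transform between them. A standard way to estimate such a transform is the Kabsch algorithm (rotation from an SVD of the cross-covariance of the centred point sets). The sketch below is illustrative only, not the authors' exact pipeline; the function name `rigid_align` and the synthetic data are assumptions for demonstration.

```python
import numpy as np

def rigid_align(src, dst):
    """Estimate rotation R and translation t such that R @ src[i] + t ~= dst[i],
    via the Kabsch algorithm on matched 3D points (e.g. skeleton joints
    observed simultaneously by two depth cameras)."""
    src_c = src - src.mean(axis=0)          # centre both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Demo with synthetic joints: rotate/translate a point cloud, then recover the motion.
rng = np.random.default_rng(0)
src = rng.normal(size=(25, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -1.0, 2.0])
dst = src @ R_true.T + t_true
R_est, t_est = rigid_align(src, dst)
```

In a live system, the matched points would come from per-camera skeleton estimates of the same subject at synchronised timestamps, typically made robust to tracking outliers with a RANSAC-style loop around this closed-form solve.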
