Concepts for augmented reality visualisation to support needle guidance inside the MRI

During MRI-guided interventions, navigation support is often shown on displays separated from the operating field, which impedes both the interpretation of instrument positions and orientations inside the patient's body and hand–eye coordination. To overcome these issues, projector-based augmented reality can be used to support needle guidance directly in the operating field inside the MRI bore. The authors present two visualisation concepts for needle navigation aids, which were compared in an accuracy and usability study with eight participants, four of whom were experienced radiologists. The results show that both concepts are equally accurate, useful, and easy to use, providing clear visual feedback about the state and success of the needle puncture. For easier clinical applicability, dynamic projection onto moving surfaces and organ motion tracking are still needed; for now, tests with patients under respiratory arrest are feasible.
