Concepts for augmented reality visualisation to support needle guidance inside the MRI
- Author(s): André Mewes¹,²; Florian Heinrich¹,²; Bennet Hensen²,³; Frank Wacker²,³; Kai Lawonn⁴; Christian Hansen¹,²
- Affiliations:
1: Faculty of Computer Science, Otto-von-Guericke University Magdeburg, Germany
2: Research Campus STIMULATE, Otto-von-Guericke University Magdeburg, Germany
3: Institute of Diagnostic and Interventional Radiology, Hanover Medical School, Germany
4: Faculty of Computer Science, University of Koblenz-Landau, Germany
- Source: Healthcare Technology Letters, Volume 5, Issue 5, October 2018, pp. 172–176
- DOI: 10.1049/htl.2018.5076, Online ISSN 2053-3713
During MRI-guided interventions, navigation support is often presented on displays separated from the operating field, which impedes the interpretation of the positions and orientations of instruments inside the patient's body as well as hand–eye coordination. To overcome these issues, projector-based augmented reality can be used to support needle guidance inside the MRI bore directly in the operating field. The authors present two visualisation concepts for needle navigation aids, which were compared in an accuracy and usability study with eight participants, four of whom were experienced radiologists. The results show that both concepts are equally accurate, useful and easy to use, and provide clear visual feedback about the state and success of the needle puncture. For easier clinical applicability, dynamic projection onto moving surfaces and organ movement tracking are needed; for now, tests with patients under respiratory arrest are feasible.
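Navigation aids of this kind encode how far the tracked needle deviates from a planned trajectory. As a rough illustration of the quantities such a projected aid could visualise, the sketch below computes a tip-to-target distance and an angular deviation from two tracked needle points and a planned entry–target path. This is not the authors' implementation; the function name, the tolerance values, and the traffic-light encoding are illustrative assumptions.

```python
import numpy as np

def needle_guidance_state(tip, hub, entry, target, dist_tol=5.0, angle_tol=5.0):
    """Compute deviations a projected navigation aid could encode.

    All points are 3-D positions in a common (e.g. scanner) coordinate
    frame, in millimetres. `tip`/`hub` define the tracked needle axis,
    `entry`/`target` the planned trajectory. Tolerances are illustrative.
    """
    tip, hub, entry, target = map(np.asarray, (tip, hub, entry, target))

    # Distance from the needle tip to the planned target point.
    tip_to_target = float(np.linalg.norm(target - tip))

    # Angle between the needle axis and the planned path, in degrees.
    needle_dir = (tip - hub) / np.linalg.norm(tip - hub)
    plan_dir = (target - entry) / np.linalg.norm(target - entry)
    angle = float(np.degrees(np.arccos(np.clip(needle_dir @ plan_dir, -1.0, 1.0))))

    # A simple traffic-light state that a projected ring/crosshair could show.
    on_course = tip_to_target <= dist_tol and angle <= angle_tol
    return tip_to_target, angle, "green" if on_course else "red"

# Example: needle ~3 mm short of the target, tilted ~3 degrees off the plan.
print(needle_guidance_state(tip=[10, 0, 47], hub=[10, 0, 150],
                            entry=[10, 0, 80], target=[10, 2, 45]))
```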
Inspec keywords: data visualisation; medical image processing; biomedical MRI; needles; augmented reality
Other keywords: projector-based augmented reality; operating field; MRI-guided interventions; visual feedback; hand–eye coordination; positions; needle guidance; needle puncture; displays; needle navigation aids; augmented reality visualisation; patient
Subjects: Biomedical magnetic resonance imaging and spectroscopy; Biology and medical computing; Optical, image and video signal processing; Virtual reality; Computer vision and image processing techniques; Patient diagnostic methods and instrumentation; Medical magnetic resonance imaging and spectroscopy