ARssist: augmented reality on a head-mounted display for the first assistant in robotic surgery
Author(s): Long Qian 1; Anton Deguet 1; Peter Kazanzides 1
Affiliations:
1: Johns Hopkins University, Baltimore, Maryland, USA
Source: Healthcare Technology Letters, Volume 5, Issue 5, October 2018, pp. 194–200
DOI: 10.1049/htl.2018.5065, Online ISSN 2053-3713

Inspec keywords: endoscopes; surgery; helmet mounted displays; augmented reality; rendering (computer graphics); medical robotics; stereo image processing
Other keywords: robotic instruments; endoscopy feedback; hand-held instruments; real-time three-dimensional rendering; FA's hand-eye coordination; real-time stereo endoscopy; augmented reality application; ARssist; first assistant; trocar planning; robot docking; hybrid tracking scheme; robot-assisted laparoscopic surgery; optical see-through head-mounted display
Subjects: Virtual reality; Computer vision and image processing techniques; Biology and medical computing; Optical and laser radiation (biomedical imaging/measurement); Patient care and treatment; Graphics techniques; Robotics; Biological and medical control systems; Optical and laser radiation (medical uses); Display equipment and systems; Patient diagnostic methods and instrumentation