ARssist: augmented reality on a head-mounted display for the first assistant in robotic surgery

In robot-assisted laparoscopic surgery, the first assistant (FA) is responsible for tasks such as robot docking, passing necessary materials, manipulating hand-held instruments, and helping with trocar planning and placement. The performance of the FA is critical to the outcome of the surgery. The authors introduce ARssist, an augmented reality application based on an optical see-through head-mounted display, to help the FA perform these tasks. ARssist offers (i) real-time three-dimensional rendering of the robotic instruments, hand-held instruments, and endoscope based on a hybrid tracking scheme and (ii) real-time stereo endoscopy that is configurable to suit the FA's hand–eye coordination when operating with endoscopic feedback. ARssist has the potential to help the FA perform these tasks more efficiently, and hence improve the outcome of robot-assisted laparoscopic surgeries.
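The abstract does not detail the hybrid tracking scheme, but rendering instruments on a head-mounted display generally comes down to chaining rigid transforms from several sources (e.g. the headset's self-localization, a fiducial-marker tracker, and instrument kinematics or calibration) into one display frame. The following is a minimal sketch of that transform chaining only; the frame names (T_world_hmd, T_hmd_marker, T_marker_tip) and numeric values are hypothetical and not taken from the paper.

import numpy as np

def make_transform(R, t):
    # Build a 4x4 homogeneous transform from a 3x3 rotation R and a translation t.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def instrument_in_world(T_world_hmd, T_hmd_marker, T_marker_tip):
    # Chain: world <- HMD <- marker <- instrument tip, so the tip can be
    # rendered in the same frame the display uses for its overlays.
    return T_world_hmd @ T_hmd_marker @ T_marker_tip

if __name__ == "__main__":
    I = np.eye(3)
    T_world_hmd = make_transform(I, np.array([0.0, 1.6, 0.0]))    # headset pose from self-localization (assumed)
    T_hmd_marker = make_transform(I, np.array([0.0, -0.4, 0.6]))  # marker pose from fiducial tracking (assumed)
    T_marker_tip = make_transform(I, np.array([0.0, 0.0, 0.25]))  # tip offset from calibration/kinematics (assumed)
    print(instrument_in_world(T_world_hmd, T_hmd_marker, T_marker_tip))

In practice each of these transforms would be updated at its own rate and composed every rendering frame; the sketch only shows the composition itself, not the tracking sources.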

DOI: 10.1049/htl.2018.5065