Real-time geometry-aware augmented reality in minimally invasive surgery (Open Access)

The potential of augmented reality (AR) technology to assist minimally invasive surgery (MIS) lies in its computational performance and accuracy in dealing with challenging MIS scenes. Even with the latest hardware and software technologies, achieving both real-time and accurate augmented information overlay in MIS remains a formidable task. In this Letter, the authors present a novel real-time AR framework for MIS that achieves interactive geometry-aware AR in endoscopic surgery with stereo views. The framework tracks the movement of the endoscopic camera and simultaneously reconstructs a dense geometric mesh of the MIS scene. Camera movement is predicted by minimising the re-projection error to achieve fast tracking performance, while the three-dimensional mesh is built incrementally by a dense zero-mean normalised cross-correlation (ZNCC) stereo-matching method to improve the accuracy of the surface reconstruction. The proposed system requires no prior template or pre-operative scan and can infer the geometric information intra-operatively in real time. With this geometric information available, the proposed AR framework can interactively add annotations, localise tumours and vessels, and apply measurement labels with greater precision and accuracy than state-of-the-art approaches.
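The camera-tracking step above rests on minimising the re-projection error between the predicted projections of reconstructed 3D points and their observed image locations. A minimal NumPy sketch of that cost term (the pinhole model, intrinsics, and point set here are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def project(K, R, t, X):
    """Project Nx3 world points into pixels with camera pose (R, t) and intrinsics K."""
    Xc = X @ R.T + t              # world frame -> camera frame
    uv_h = Xc @ K.T               # homogeneous pixel coordinates
    return uv_h[:, :2] / uv_h[:, 2:3]

def reprojection_error(K, R, t, X, observed_uv):
    """Mean squared pixel distance between projected and observed points.
    Tracking amounts to minimising this quantity over the pose (R, t)."""
    diff = project(K, R, t, X) - observed_uv
    return float((diff ** 2).sum(axis=1).mean())
```

In a full tracking system this residual would be minimised over the six pose parameters (e.g. by Gauss–Newton iteration); the sketch only evaluates the cost for a given pose.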

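The dense reconstruction relies on zero-mean normalised cross-correlation matching between rectified stereo patches; ZNCC is well suited to endoscopy because it is invariant to local gain and offset changes in illumination. A minimal sketch of the score and a winner-takes-all disparity search for one pixel (patch size, search range, and image layout are illustrative assumptions):

```python
import numpy as np

def zncc(a, b):
    """Zero-mean normalised cross-correlation of two equal-size patches, in [-1, 1]."""
    a = a.astype(np.float64) - a.mean()
    b = b.astype(np.float64) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_disparity(left, right, row, col, half=3, max_disp=16):
    """Winner-takes-all disparity for one pixel of a rectified stereo pair:
    slide the reference patch along the scanline, keep the best ZNCC score."""
    ref = left[row - half:row + half + 1, col - half:col + half + 1]
    scores = []
    for d in range(max_disp + 1):
        c = col - d                       # matching column in the right image
        if c - half < 0:
            break
        cand = right[row - half:row + half + 1, c - half:c + half + 1]
        scores.append(zncc(ref, cand))
    return int(np.argmax(scores))
```

Because both patches are zero-meaned and normalised, the score is unchanged under an affine intensity transform (`b = g*a + o`), which is what makes the matcher robust to specular-lighting variation across the two endoscope views.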