This is an open access article published by the IET under the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0/)
The potential of augmented reality (AR) technology to assist minimally invasive surgery (MIS) lies in its computational performance and accuracy in dealing with challenging MIS scenes. Even with the latest hardware and software technologies, achieving both real-time and accurate augmented information overlay in MIS remains a formidable task. In this Letter, the authors present a novel real-time AR framework for MIS that achieves interactive, geometry-aware AR in endoscopic surgery with stereo views. The framework tracks the movement of the endoscopic camera and simultaneously reconstructs a dense geometric mesh of the MIS scene. The camera motion is estimated by minimising the re-projection error to achieve fast tracking performance, while the three-dimensional mesh is built incrementally by a dense zero-mean normalised cross-correlation (ZNCC) stereo-matching method to improve the accuracy of the surface reconstruction. The proposed system requires no prior template or pre-operative scan and can infer the geometric information intra-operatively in real time. With this geometric information available, the proposed AR framework can interactively add annotations, tumour and vessel localisation, and measurement labels with greater precision and accuracy than state-of-the-art approaches.
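To illustrate the kind of dense stereo matching the abstract refers to, the sketch below implements zero-mean normalised cross-correlation (ZNCC) with a brute-force disparity search along rectified scanlines. This is a minimal illustrative example, not the authors' implementation: the window size, disparity range, and per-pixel loop are assumptions chosen for clarity rather than the real-time performance the Letter reports.

```python
import numpy as np

def zncc(patch_a, patch_b):
    """Zero-mean normalised cross-correlation of two equally sized patches.

    Returns a score in [-1, 1]; 1 indicates a perfect match. Subtracting the
    patch means makes the score invariant to local brightness offsets, which
    is why ZNCC is favoured for tissue surfaces with uneven illumination.
    """
    a = patch_a.astype(np.float64) - patch_a.mean()
    b = patch_b.astype(np.float64) - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        return 0.0  # flat, textureless patch: no reliable correlation
    return float((a * b).sum() / denom)

def disparity_zncc(left, right, window=5, max_disp=32):
    """Per-pixel disparity map for a rectified stereo pair.

    For each left-image pixel, slides a window along the same scanline in
    the right image and keeps the disparity with the highest ZNCC score.
    """
    h, w = left.shape
    r = window // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(r, h - r):
        for x in range(r, w - r):
            ref = left[y - r:y + r + 1, x - r:x + r + 1]
            best_score, best_d = -2.0, 0
            for d in range(0, min(max_disp, x - r) + 1):
                cand = right[y - r:y + r + 1, x - d - r:x - d + r + 1]
                score = zncc(ref, cand)
                if score > best_score:
                    best_score, best_d = score, d
            disp[y, x] = best_d
    return disp
```

Given the camera baseline and focal length, each disparity `d` converts to depth via `Z = f * B / d`, which is how a dense disparity map becomes the 3D surface mesh the framework builds incrementally.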