Multi-modal imaging, model-based tracking, and mixed reality visualisation for orthopaedic surgery

Orthopaedic surgeons still follow a decades-old workflow, acquiring dozens of two-dimensional fluoroscopic images to drill through complex 3D structures such as the pelvis. This Letter presents a mixed reality support system that incorporates multi-modal data fusion and model-based surgical tool tracking to create a mixed reality environment supporting screw placement in orthopaedic surgery. A red–green–blue–depth (RGBD) camera is rigidly attached to a mobile C-arm and calibrated to the cone-beam computed tomography (CBCT) imaging space via the iterative closest point (ICP) algorithm. This calibration allows real-time automatic fusion of reconstructed surfaces and/or 3D point clouds with synthetic fluoroscopic images obtained through CBCT imaging. An adapted 3D model-based tracking algorithm with automatic tool segmentation tracks surgical tools even when they are partially occluded by the surgeon's hand. The proposed interactive 3D mixed reality environment provides an intuitive understanding of the surgical site and helps surgeons quickly localise the entry point and orient the surgical tool during screw placement. The authors validate the augmentation by measuring target registration error and evaluate the tracking accuracy in the presence of partial occlusion.
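The calibration step described above registers the RGBD point cloud to the CBCT imaging space with ICP. The following is an illustrative sketch of point-to-point ICP with a Kabsch least-squares fit, not the authors' implementation; the brute-force nearest-neighbour search and the NumPy-based helper names are assumptions made for clarity.

```python
import numpy as np

def best_fit_transform(A, B):
    """Least-squares rigid transform (R, t) mapping points A onto B (Kabsch)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

def icp(source, target, iters=30, tol=1e-8):
    """Point-to-point ICP: alternate nearest-neighbour matching and re-fitting."""
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        # brute-force nearest-neighbour correspondences (for clarity only;
        # a k-d tree would be used in practice)
        d = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        idx = d.argmin(axis=1)
        err = d[np.arange(len(src)), idx].mean()
        R, t = best_fit_transform(src, target[idx])
        src = src @ R.T + t
        # accumulate the total transform from the original source frame
        R_total, t_total = R @ R_total, R @ t_total + t
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total
```

ICP only converges to the correct alignment from a reasonable initial pose; in a system like the one described, the rigid C-arm mounting provides that initialisation, and ICP refines it.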

Inspec keywords: diagnostic radiography; image registration; bone; image reconstruction; image fusion; image segmentation; orthopaedics; surgery; iterative methods; computerised tomography; medical image processing

Other keywords: real-time automatic fusion; iterative closest point algorithm; workflow; CBCT imaging; automatic tool segmentation; complex 3D structures; 3D point clouds; mobile C-arm; cone-beam computed tomography imaging space; orthopaedic surgery; pelvis; mixed reality environment; model-based surgical tool tracking; multimodal imaging; two-dimensional fluoroscopic images; adapted 3D model-based tracking algorithm; synthetic fluoroscopic images; target registration error; red-green-blue-depth camera; entry point; tracking accuracy; surgical tools; partial occlusion; reconstructed surface; multimodal data fusion; screw placement; mixed reality visualisation; interactive 3D mixed reality environment

Subjects: Patient care and treatment; Interpolation and function approximation (numerical analysis); Biology and medical computing; X-rays and particle beams (medical uses); Numerical approximation and analysis; Computer vision and image processing techniques; Patient diagnostic methods and instrumentation; X-ray techniques: radiography and computed tomography (biomedical imaging/measurement); Optical, image and video signal processing

http://iet.metastore.ingenta.com/content/journals/10.1049/htl.2017.0066