This is an open access article published by the IET under the Creative Commons Attribution-NonCommercial-NoDerivs License (http://creativecommons.org/licenses/by-nc-nd/3.0/)
Image-guided neurosurgery, or neuronavigation, visualises the location of a surgical probe by mapping it onto pre-operative models of a patient's anatomy. A common limitation of this approach is that it requires the surgeon to divert their attention from the patient to the neuronavigation system. To address this limitation, the authors designed a system that sonifies (i.e. provides audible feedback of) the distance between a surgical probe and the anatomy of interest. A user study (n = 15) was conducted to assess the utility of sonified distance information within an existing neuronavigation platform, the Intraoperative Brain Imaging System (IBIS) Neuronav. The results were consistent with the idea that combining auditory distance cues with the visual information already provided by image-guided surgery systems may improve accuracy when locating specified points on a pre-operative scan, potentially reducing the size of the required surgical openings and increasing the precision of individual surgical tasks. The results were also consistent with the hypothesis that combining auditory and visual information reduces the perceived difficulty of locating a target within a three-dimensional volume.
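To make the idea of sonified distance concrete, the mapping below sketches one common convention: probe-to-target distance is converted to a tone frequency, with closer distances producing higher pitches. This is a minimal illustrative sketch only; the parameter values (`d_max`, `f_min`, `f_max`) and the choice of a logarithmic pitch scale are assumptions for the example, not details taken from the authors' system.

```python
def distance_to_pitch(distance_mm, d_max=50.0, f_min=220.0, f_max=880.0):
    """Map probe-to-target distance (mm) to a tone frequency (Hz).

    Closer distances yield higher pitches; the mapping saturates once
    the probe is d_max or further from the target. All parameter
    values here are illustrative defaults.
    """
    # Normalise distance to [0, 1]: 0 = at target, 1 = at or beyond d_max.
    t = min(max(distance_mm / d_max, 0.0), 1.0)
    # Interpolate on a logarithmic frequency scale so equal distance
    # steps are heard as equal musical intervals.
    return f_min * (f_max / f_min) ** (1.0 - t)
```

With these defaults, a probe at the target sounds the 880 Hz ceiling, a probe 50 mm or further away sounds the 220 Hz floor, and the pitch rises smoothly as the probe approaches. A synthesis engine such as Pure Data would then render the frequency as an audible tone.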