Retrieval of striated toolmarks using convolutional neural networks

The authors propose TripNet as a method for calculating similarities between striated toolmark images. The objective of the system is to detect and compare characteristics of the tools while remaining invariant to varying parameters such as angle of attack, substrate material, and lighting conditions. Instead of designing a handcrafted feature extractor customised for this task, the authors propose the use of a convolutional neural network. With the proposed system, one-dimensional profiles extracted from images of striated toolmarks are mapped into an embedding space. The system is trained by minimising a triplet loss function, so that a similarity measure is defined by the distance in this embedding. The performance is evaluated on the NFI Toolmark database, published by the Netherlands Forensic Institute, which contains 300 striated toolmarks made with screwdrivers. The proposed system is able to adapt to a large range of angles of attack, achieving a mean average precision of 0.95 for toolmark comparisons with differences in angle of attack of . Furthermore, four different triplet selection approaches are proposed, and their effect on the retrieval of toolmarks from a database of unseen tools is evaluated in detail.
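The core mechanism summarised above — a convolutional network that maps one-dimensional striation profiles into an embedding trained with a triplet loss, with the distance between embeddings serving as the similarity measure — can be illustrated with the minimal Python (PyTorch) sketch below. The layer layout, profile length (1024 samples), embedding dimension, and margin value are illustrative assumptions and do not reproduce the published TripNet architecture.

# Minimal sketch of triplet-loss training on 1D toolmark profiles (PyTorch).
# The network below is NOT the published TripNet architecture; layer sizes,
# the profile length of 1024 samples, and the margin are assumed values.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProfileEmbeddingNet(nn.Module):
    """Maps a 1D striation profile to an L2-normalised embedding vector."""
    def __init__(self, embedding_dim: int = 64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.BatchNorm1d(16), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.BatchNorm1d(32), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.BatchNorm1d(64), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),             # global pooling -> fixed-size descriptor
        )
        self.fc = nn.Linear(64, embedding_dim)

    def forward(self, x):                        # x: (batch, 1, profile_length)
        z = self.features(x).squeeze(-1)         # (batch, 64)
        z = self.fc(z)
        return F.normalize(z, p=2, dim=1)        # unit-length embedding

net = ProfileEmbeddingNet()
triplet_loss = nn.TripletMarginLoss(margin=0.2)  # margin is an assumed value
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

# One training step on a toy batch of (anchor, positive, negative) profiles:
# anchor and positive come from the same tool, negative from a different tool.
anchor   = torch.randn(8, 1, 1024)
positive = torch.randn(8, 1, 1024)
negative = torch.randn(8, 1, 1024)

loss = triplet_loss(net(anchor), net(positive), net(negative))
optimizer.zero_grad()
loss.backward()
optimizer.step()

# At retrieval time, the similarity between two profiles is given by the
# Euclidean distance between their embeddings (smaller = more similar).

Under this sketch, retrieval quality measures such as the reported mean average precision would be computed by ranking all database profiles by embedding distance for each query profile; the choice of which triplets are presented during training (the triplet selection strategy) then directly shapes the resulting embedding.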

Inspec keywords: image retrieval; neural nets

Other keywords: triplet loss function; TripNet; similarity measure; convolutional neural networks; one-dimensional profiles; NFI toolmark database; triplet selection approaches; striated toolmark image retrieval; angle of attack

Subjects: Computer vision and image processing techniques; Neural computing techniques; Information retrieval techniques; Optical, image and video signal processing
