Deep hashing network for material defect image classification

Published in IET Computer Vision

Thank you

Your recommendation has been sent to your librarian.

Common non-destructive material testing technology has well-known problems, including slow detection, low detection accuracy, and the limited information it yields. To address these problems, this study applies recent advances in convolutional neural networks to propose an effective deep learning network trained on a casting dataset, achieving non-destructive material testing with automatic, intelligent detection. In most existing deep learning networks, an image is ultimately transformed into a multidimensional real-valued feature vector for comparison and classification. However, such vectors may not optimally improve detection precision and speed, and can incur significant storage costs. A deep hashing network is therefore proposed in which images are mapped into compact binary codes. It has three key components: (i) a sub-network with multiple convolution-pooling layers to capture image representations; (ii) a hashing layer to generate compact binary hash codes; and (iii) an encoder module that divides the image feature vector output by the sub-network into multiple branches, each encoded into one hash bit. Extensive experiments on a casting dataset show promising performance compared with state-of-the-art approaches.
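The core efficiency argument above is that compact binary codes can replace real-valued feature vectors, so images are compared by cheap Hamming distance rather than floating-point distance. The paper's hash bits are learned end-to-end; as a rough stand-in, the sketch below uses classical random sign projections (an LSH-style baseline, not the learned hash of this work) to show how a feature vector becomes a binary code and how codes are compared. All names and dimensions here are illustrative assumptions.

```python
import random

def random_projection_hash(feature, planes):
    """Map a real-valued feature vector to a binary code:
    one bit per hyperplane, set by the sign of the dot product.
    (Stand-in for a learned hashing layer, which would instead
    threshold the outputs of trained encoder branches.)"""
    bits = []
    for plane in planes:
        dot = sum(f * p for f, p in zip(feature, plane))
        bits.append(1 if dot >= 0 else 0)
    return bits

def hamming_distance(a, b):
    """Count differing bits between two binary codes."""
    return sum(x != y for x, y in zip(a, b))

random.seed(0)
dim, n_bits = 8, 16  # hypothetical feature size and code length
planes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]

feat_a = [0.5] * dim   # two identical features -> identical codes
feat_b = [0.5] * dim
feat_c = [-0.5] * dim  # sign-flipped feature -> every bit flips

code_a = random_projection_hash(feat_a, planes)
code_b = random_projection_hash(feat_b, planes)
code_c = random_projection_hash(feat_c, planes)

print(hamming_distance(code_a, code_b))  # 0
print(hamming_distance(code_a, code_c))  # 16 (all bits differ)
```

Once codes are computed, classification or retrieval reduces to comparing a few bytes per image, which is what makes the hashing approach attractive for detection speed and storage.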
