Domain adaptation for ear recognition using deep convolutional neural networks


Published in: IET Biometrics

Here, the authors extensively investigate the unconstrained ear recognition problem. They first show the importance of domain adaptation when deep convolutional neural network (CNN) models are used for ear recognition. To enable domain adaptation, they collected a new ear data set from the Multi-PIE face data set, which they named the Multi-PIE ear data set. They analyse in depth the effect of ear image quality, for example illumination and aspect ratio, on classification performance. Finally, they address the problem of data set bias in the ear recognition field. Experiments on the UERC data set show that domain adaptation leads to a significant performance improvement: for example, when the VGG-16 model is used with domain adaptation, an absolute increase of around 10% is achieved. Combining different deep CNN models improves accuracy by a further 4%. In the experiments conducted to examine data set bias, given an ear image, the authors were able to classify the data set it came from with 99.71% accuracy, which indicates a strong bias among ear recognition data sets.
