
Trimmed categorical cross-entropy for deep learning with label noise

Electronics Letters
Deep learning methods are nowadays considered the state-of-the-art approach to many sophisticated problems, such as computer vision, speech understanding, or natural language processing. However, their performance relies on the quality of large annotated datasets. If the data are not well annotated and label noise occurs, such data-driven models become less reliable. In this Letter, the authors present a very simple way to make the training process robust to noisy labels. Without changing the network architecture or the learning algorithm, the authors apply a modified error measure that improves network generalisation when training with label noise. Preliminary results obtained for deep convolutional neural networks trained with the novel trimmed categorical cross-entropy loss function reveal improved robustness at several levels of label noise.
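The Letter's exact formulation of the trimmed loss is not reproduced on this page. A plausible sketch, by analogy with the trimmed mean: compute the standard per-sample categorical cross-entropy over a batch, then discard the largest per-sample losses (which mislabelled examples tend to produce) before averaging. The `trim_fraction` parameter and this specific trimming rule are illustrative assumptions, not taken from the Letter.

```python
import numpy as np

def trimmed_categorical_cross_entropy(y_true, y_pred, trim_fraction=0.1, eps=1e-12):
    """Batch cross-entropy with the largest per-sample losses trimmed.

    Hypothetical sketch: mislabelled samples tend to yield the largest
    losses, so dropping the top `trim_fraction` of per-sample losses
    before averaging reduces their influence on the gradient.
    """
    y_pred = np.clip(y_pred, eps, 1.0)
    # standard per-sample categorical cross-entropy
    per_sample = -np.sum(y_true * np.log(y_pred), axis=1)
    # keep the smallest (1 - trim_fraction) share of losses, drop the rest
    n = len(per_sample)
    keep = n - int(np.floor(trim_fraction * n))
    trimmed = np.sort(per_sample)[:keep]
    return float(np.mean(trimmed))

# A sample whose label disagrees with a confident prediction incurs a
# large loss; trimming one sample per five removes it from the average.
y_true = np.array([[1, 0], [1, 0], [1, 0], [1, 0], [0, 1]])  # last label noisy
y_pred = np.array([[0.9, 0.1]] * 5)
loss_full = trimmed_categorical_cross_entropy(y_true, y_pred, trim_fraction=0.0)
loss_trim = trimmed_categorical_cross_entropy(y_true, y_pred, trim_fraction=0.2)
```

Under this sketch `loss_trim` falls below `loss_full`, since the single noisy sample's large loss is excluded from the trimmed average.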

http://iet.metastore.ingenta.com/content/journals/10.1049/el.2018.7980