Classification of Mouth Gestures in German Sign Language using 3D Convolutional Neural Networks
- Author(s): N. Wilson; M. Brumm; R.-R. Grigat
- DOI: 10.1049/cp.2019.0248
- Source: 10th International Conference on Pattern Recognition Systems, 2019, pp. 52–57
- Conference: 10th International Conference on Pattern Recognition Systems
- DOI: 10.1049/cp.2019.0248
- ISBN: 978-1-83953-108-8
- Location: Tours, France
- Conference date: 8-10 July 2019
- Format: PDF
Automatic recognition of sign language gestures is becoming necessary with increasing interest in human-computer interaction in sign language as well as automatic translation from sign language. Most research on sign language recognition focuses on hand gesture recognition. However, sign languages also contain non-manual signals. Mouth gestures are mouth shapes that add information to the hand gestures and are not related to spoken-language visemes. For German Sign Language, mouth gesture recognition would therefore be an important addition to manual gesture recognition. This work evaluates 3D convolutional neural networks for recognising mouth gestures in German Sign Language. For recognising certain mouth gestures, temporal information is mandatory, and the extraction of both spatial and temporal features by 3D convolutional networks makes the classification of all gestures easier. We compare how different initialisations affect learning and classification by the network, and achieve an accuracy of around 68% when testing on 10 classes of mouth gestures in German Sign Language.
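The key point of the abstract is that a 3D convolution kernel spans time as well as space, so a single kernel can respond to frame-to-frame change that no purely spatial 2D kernel could detect. The sketch below illustrates only that core operation in plain Python; the paper's actual networks would be built with a deep-learning framework, and all sizes, the kernel values, and the toy clip here are illustrative assumptions, not the authors' configuration.

```python
# Illustrative sketch: one 3D convolution over a tiny video volume
# (time x height x width), showing joint spatio-temporal feature
# extraction. All shapes and values are toy assumptions.

def conv3d(volume, kernel):
    """Valid (no padding), stride-1 3D convolution of one channel."""
    T, H, W = len(volume), len(volume[0]), len(volume[0][0])
    kt, kh, kw = len(kernel), len(kernel[0]), len(kernel[0][0])
    out = []
    for t in range(T - kt + 1):
        frame = []
        for i in range(H - kh + 1):
            row = []
            for j in range(W - kw + 1):
                s = 0.0
                for dt in range(kt):
                    for di in range(kh):
                        for dj in range(kw):
                            s += volume[t + dt][i + di][j + dj] * kernel[dt][di][dj]
                row.append(s)
            frame.append(row)
        out.append(frame)
    return out

# Toy clip: 4 frames of 5x5 pixels whose intensity rises by 1 per frame.
clip = [[[float(t) for _ in range(5)] for _ in range(5)] for t in range(4)]

# A 2x3x3 temporal-difference kernel: -1/9 over frame t, +1/9 over
# frame t+1, so it fires on change between consecutive frames.
temporal_diff = [[[-1.0 / 9] * 3] * 3, [[1.0 / 9] * 3] * 3]

feat = conv3d(clip, temporal_diff)
print(len(feat), len(feat[0]), len(feat[0][0]))  # output volume: 3 3 3
print(round(feat[0][0][0], 6))                   # 1.0: per-frame change detected
```

A 2D convolution applied to each frame independently would output zero for this kernel's spatial part on any single uniform frame; only because the kernel extends across two frames does it recover the temporal change, which is why the paper treats 3D convolutions as mandatory for gestures defined by motion.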
Inspec keywords: learning (artificial intelligence); sign language recognition; feature extraction; neural nets; gesture recognition
Subjects: User interfaces; Neural computing techniques; Computer vision and image processing techniques; Knowledge engineering techniques; Image recognition; Other topics in statistics