Exploiting Recurrent Neural Networks and Leap Motion Controller for the Recognition of Sign Language and Semaphoric Hand Gestures

D. Avola; M. Bernardi; L. Cinque; G. L. Foresti; C. Massaroni
2019

Abstract

Hand gesture recognition remains a topic of great interest for the computer vision community. In particular, sign language and semaphoric hand gestures are two foremost areas of interest due to their importance in Human-Human Communication (HHC) and Human-Computer Interaction (HCI), respectively. Any hand gesture can be represented by a set of feature vectors that changes over time. Recurrent Neural Networks (RNNs) are well suited to analysing such sequences thanks to their ability to model the long-term contextual information of temporal data. In this paper, an RNN is trained using as features the angles formed by the finger bones of the human hands. These features, acquired by a Leap Motion Controller (LMC) sensor, are chosen because the majority of human hand gestures produce joint movements that generate highly characteristic angles. The proposed method, including the effectiveness of the selected angles, was initially tested on a newly created and very challenging dataset composed of a large number of gestures defined by the American Sign Language (ASL), on which an accuracy of over 96% was achieved. Afterwards, on the SHREC dataset, a wide collection of semaphoric hand gestures, the method was also shown to outperform competing approaches from the current literature in accuracy.
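
To make the feature idea described in the abstract concrete, the following is a minimal sketch, not taken from the paper, of how per-frame bone-angle features could be computed and assembled into the temporal sequence fed to an RNN. The function names (bone_angle, frame_features), the toy hand data, and the bone ordering are illustrative assumptions; the Leap Motion SDK provides per-finger bone direction vectors that code like this could consume.

```python
# Minimal sketch (not the authors' code) of the idea in the abstract:
# each frame of a hand gesture is represented by the angles between
# consecutive finger bones, and the per-frame feature vectors form the
# temporal sequence fed to an RNN. Names and toy data are assumptions.

import numpy as np


def bone_angle(u, v):
    """Angle (radians) between two bone direction vectors."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))


def frame_features(bone_directions):
    """One feature vector per frame: angles between consecutive bones of each
    finger (bone_directions: list of fingers, each a list of 3-D direction
    vectors ordered from the proximal to the distal bone)."""
    feats = []
    for finger in bone_directions:
        for a, b in zip(finger[:-1], finger[1:]):
            feats.append(bone_angle(a, b))
    return np.array(feats, dtype=np.float32)


# Example: a toy "hand" with two fingers of three bones each -> 4 angle features.
hand = [
    [(0.0, 1.0, 0.0), (0.0, 1.0, 0.2), (0.0, 0.8, 0.6)],
    [(0.1, 1.0, 0.0), (0.1, 0.9, 0.3), (0.1, 0.5, 0.8)],
]
print(frame_features(hand))  # one frame of the sequence given to the RNN
```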
Keywords: Gesture recognition; assistive technology; feature extraction; thumb; three-dimensional displays; hand gesture recognition; sign language
01 Journal publication::01a Journal article
Exploiting Recurrent Neural Networks and Leap Motion Controller for the Recognition of Sign Language and Semaphoric Hand Gestures / Avola, D.; Bernardi, M.; Cinque, L.; Foresti, G. L.; Massaroni, C.. - In: IEEE TRANSACTIONS ON MULTIMEDIA. - ISSN 1520-9210. - ELETTRONICO. - 21:1(2019), pp. 234-245. [10.1109/TMM.2018.2856094]
Files attached to this product
Avola_Exploiting_2019.pdf (available to archive administrators only)
Type: Editorial version (published version with the publisher's layout)
License: All rights reserved
Size: 3.18 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1139418
Citations
  • PMC: not available
  • Scopus: 144
  • Web of Science (ISI): 104