
Hand Gesture Recognition Exploiting Handcrafted Features and LSTM / Avola, D.; Cinque, L.; Emam, E.; Fontana, F.; Foresti, G. L.; Marini, M. R.; Pannone, D. - 14233:(2023), pp. 500-511. (Paper presented at the 22nd International Conference on Image Analysis and Processing, ICIAP 2023, held in Italy) [10.1007/978-3-031-43148-7_42].

Hand Gesture Recognition Exploiting Handcrafted Features and LSTM

Avola D.; Cinque L.; Emam E.; Fontana F.; Foresti G. L.; Marini M. R.; Pannone D.
2023

Abstract

Hand gesture recognition finds application in several heterogeneous fields, such as Human-Computer Interaction, serious games, sign language interpretation, and more. Modern recognition approaches rely on Deep Learning methods because of their ability to extract features without human intervention. The drawback of this approach is the need for huge datasets, which, depending on the task, are not always available. In some cases, handcrafted features increase a model's ability to perform the proposed task and usually require less data than Deep Learning approaches. In this paper, we propose a method that synergistically combines handcrafted features and Deep Learning to perform hand gesture recognition. The features are engineered from hand joints, while for the Deep Learning component a simple LSTM followed by a multilayer perceptron is used. The tests were performed on the DHG dataset, comparing the proposed method with state-of-the-art methods based on both handcrafted and learned features. Our approach outperforms the state-of-the-art handcrafted-feature methods on both the 14- and 28-gesture recognition tests, and the state-of-the-art learned-feature methods on the 14-gesture recognition test, proving that a simpler model with well-engineered features can be used.
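The abstract states that the features are engineered from hand joints before being fed to the LSTM and multilayer perceptron. As a hedged illustration of one such handcrafted feature (the paper does not specify this exact choice, and the toy joint values below are assumptions), the sketch computes per-frame pairwise distances between 3D joint positions:

```python
# Illustrative sketch of a handcrafted hand-joint feature:
# per-frame pairwise Euclidean distances between joints.
# This is NOT the paper's exact feature set, only an example
# of engineering features from joint coordinates.
import math

def pairwise_distances(joints):
    """joints: list of (x, y, z) tuples for one frame.
    Returns the flattened upper-triangular distance matrix,
    i.e. n*(n-1)/2 values for n joints."""
    feats = []
    for i in range(len(joints)):
        for j in range(i + 1, len(joints)):
            feats.append(math.dist(joints[i], joints[j]))
    return feats

# A toy frame with 4 joints yields 4*3/2 = 6 distance features;
# such per-frame vectors would form the LSTM input sequence.
frame = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(len(pairwise_distances(frame)))  # → 6
```

Distance-based features of this kind are invariant to hand translation, which is one common reason to prefer engineered joint features over raw coordinates when training data is scarce.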
2023
Proceedings of the 22nd International Conference on Image Analysis and Processing, ICIAP 2023
Deep Learning; Handcrafted Feature; LSTM
04 Conference proceedings publication::04b Conference paper in volume
Files attached to this item
No files are associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1696518
Warning! The displayed data have not been validated by the university.

Citations
  • PMC: ND
  • Scopus: 1
  • ISI: ND