

LieToMe: An LSTM-Based Method for Deception Detection by Hand Movements

Avola D.; Cinque L.; De Marsico M.; Di Mambro A.; Foresti G. L.
2023

Abstract

The ability to detect lies is a crucial skill in critical situations such as police interrogations and court trials. At present, several devices, such as polygraphs and magnetic resonance imaging, can ease the deception detection task. However, the effectiveness of these tools can be compromised by intentional behavioral changes caused by the subject's awareness of such devices, suggesting that alternative ways of detecting lies, without physical devices, must be explored. In this context, this paper presents an approach focused on the extraction of meaningful features from hand gestures. The latter provide cues about a person's behavior and are used to address the deception detection task in RGB videos of trials. Specifically, the proposed system extracts hand skeletons from an RGB video sequence and generates novel handcrafted features from the extracted keypoints to reflect the subject's behavior through hand movements. Then, a long short-term memory (LSTM) neural network is used to classify these features and estimate whether the person is lying. Extensive experiments were performed to assess the quality of the derived features on a public collection of famous real-life trials. On this dataset, the proposed system sets new state-of-the-art performance on the unimodal hand-gesture deception detection task, demonstrating the effectiveness of the proposed approach and its handcrafted features.
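The abstract outlines a three-stage pipeline: hand keypoints are extracted from each video frame, handcrafted motion features are computed from those keypoints, and an LSTM classifies the resulting sequence as truthful or deceptive. The following is a minimal sketch of such a pipeline in PyTorch; it is not the authors' implementation, and the feature definition (frame-to-frame keypoint displacement), the class names, layer sizes, and the number of keypoints are illustrative assumptions.

# Minimal sketch of the pipeline described in the abstract, not the paper's
# exact method: hand-keypoint sequences -> handcrafted motion features ->
# LSTM -> binary truthful/deceptive prediction.
import torch
import torch.nn as nn

def displacement_features(keypoints: torch.Tensor) -> torch.Tensor:
    """Turn a (T, K, 2) sequence of 2D hand keypoints into per-frame features.

    Here the feature is simply the frame-to-frame displacement of each
    keypoint, an assumed stand-in for the paper's handcrafted descriptors.
    Returns a (T-1, K*2) tensor.
    """
    deltas = keypoints[1:] - keypoints[:-1]           # (T-1, K, 2) motion vectors
    return deltas.flatten(start_dim=1)                # (T-1, K*2)

class HandGestureLSTM(nn.Module):
    """LSTM classifier over per-frame hand-movement features."""

    def __init__(self, feature_dim: int, hidden_dim: int = 128):
        super().__init__()
        self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 2)          # truthful vs. deceptive

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # features: (batch, T, feature_dim)
        _, (h_n, _) = self.lstm(features)             # last hidden state summarizes the clip
        return self.head(h_n[-1])                     # (batch, 2) class logits

if __name__ == "__main__":
    # Toy example: one clip of 60 frames with 21 keypoints per hand (42 total),
    # which in practice would come from a hand-pose estimator run on the video.
    keypoints = torch.randn(60, 42, 2)
    feats = displacement_features(keypoints).unsqueeze(0)   # (1, 59, 84)
    model = HandGestureLSTM(feature_dim=feats.shape[-1])
    logits = model(feats)
    print(logits.softmax(dim=-1))                     # probabilities for the two classes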
2023
Proceedings of the 22nd International Conference on Image Analysis and Processing, ICIAP 2023
Deception detection; Hand gestures; LSTM
04 Publication in conference proceedings::04b Conference paper in a volume
LieToMe: An LSTM-Based Method for Deception Detection by Hand Movements / Avola, D.; Cinque, L.; De Marsico, M.; Di Mambro, A.; Fagioli, A.; Foresti, G. L.; Lanzino, R.; Scarcello, F. - 14233:(2023), pp. 387-398. (Paper presented at the 22nd International Conference on Image Analysis and Processing, ICIAP 2023, held in Italy) [10.1007/978-3-031-43148-7_33].
Files attached to this item
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1696511
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science: n/a