Automated Laughter Detection from Full-Body Movements / Niewiadomski, Radoslaw; Mancini, Maurizio; Varni, Giovanna; Volpe, Gualtiero; Camurri, Antonio. - In: IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS. - ISSN 2168-2291. - 46:(2016), pp. 113-123. [10.1109/THMS.2015.2480843]

Automated Laughter Detection from Full-Body Movements

Mancini, Maurizio; Volpe, Gualtiero
2016

Abstract

In this paper, we investigate the detection of laughter from the user's nonverbal full-body movement in social and ecological contexts. Eight hundred and one laughter and nonlaughter segments of full-body movement were examined from a corpus of motion capture data of subjects participating in social activities that stimulated laughter. A set of 13 full-body movement features was identified, and corresponding automated extraction algorithms were developed. These features were extracted from the laughter and nonlaughter segments, and the resulting dataset was provided as input to supervised machine learning techniques. Both discriminative (radial basis function-support vector machines, k-nearest neighbor, and random forest) and probabilistic (naive Bayes and logistic regression) classifiers were trained and evaluated. A comparison of automated classification with the ratings of human observers for the same laughter and nonlaughter segments showed that the performance of our approach for automated laughter detection is comparable with that of humans. The highest F-score (0.74) was obtained by the random forest classifier, whereas the F-score obtained by human observers was 0.70. Based on the analysis techniques introduced in the paper, a vision-based system prototype for automated laughter detection was designed and evaluated. Support vector machines (SVMs) and Kohonen's self-organizing maps were used for training, and the highest F-score was obtained with SVM (0.73).
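The abstract describes a supervised classification pipeline: 13 movement features per segment, five classifier families, and F-score as the evaluation metric. A minimal illustrative sketch of that comparison in scikit-learn is below; it is not the authors' code, and the synthetic data stands in for the real 13-feature motion-capture corpus of 801 segments.

```python
# Illustrative sketch only: the five classifier families named in the
# abstract, trained on SYNTHETIC stand-in data (13 features, 801 samples,
# binary laughter/non-laughter labels) and compared by F-score.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

# Synthetic placeholder for the motion-capture feature set
X, y = make_classification(n_samples=801, n_features=13, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

classifiers = {
    "RBF-SVM": SVC(kernel="rbf"),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Random forest": RandomForestClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "Logistic regression": LogisticRegression(max_iter=1000),
}

scores = {}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    scores[name] = f1_score(y_test, clf.predict(X_test))
    print(f"{name}: F1 = {scores[name]:.2f}")
```

On the real corpus, the paper reports random forest as the best offline classifier (F-score 0.74), with human observers at 0.70.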
Automated analysis of full-body movement; body expressivity; detection; laughter; motion capture; multimodal interaction; Artificial Intelligence; Signal Processing; Human Factors and Ergonomics; Computer Networks and Communications; Computer Science Applications; Computer Vision and Pattern Recognition; Human-Computer Interaction; Control and Systems Engineering
01 Journal publication::01a Journal article
Files attached to this item
There are no files associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1528122
Warning: the data displayed here have not been validated by the university.

Citations
  • PMC: not available
  • Scopus: 25
  • Web of Science (ISI): 17