Rhythmic Body Movements of Laughter / Niewiadomski, Radoslaw; Mancini, Maurizio; Ding, Yu; Pelachaud, Catherine; Volpe, Gualtiero. - (2014), pp. 299-306. (Paper presented at the 16th International Conference on Multimodal Interaction, held in Istanbul, Turkey) [10.1145/2663204.2663240].

Rhythmic Body Movements of Laughter

MANCINI, MAURIZIO; VOLPE, GUALTIERO
2014

Abstract

In this paper we focus on three aspects of multimodal expressions of laughter. First, we propose a procedural method to synthesize rhythmic body movements of laughter based on spectral analysis of laughter episodes. For this purpose, we analyze laughter body motions from motion capture data and reconstruct them with appropriate harmonics. We then reduce the parameter space to two dimensions, which serve as the inputs of the actual model to generate a continuum of rhythmic body movements of laughter. In the paper, we also propose a method to integrate the rhythmic body movements generated by our model with other synthesized expressive cues of laughter, such as facial expressions and additional body movements. Finally, we present a real-time human-virtual character interaction scenario in which the virtual character applies our model to respond to a human's laughter in real time.
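
The abstract mentions spectral analysis of motion capture data and reconstruction "with appropriate harmonics". The sketch below is only a minimal illustration of that general idea, not the authors' implementation: it keeps the strongest frequency components of a one-dimensional body-motion signal and resynthesizes it from them. The signal name torso_z, the 120 Hz sampling rate, and the number of retained harmonics are assumptions made for the example.

```python
# Minimal sketch: reconstruct a motion signal from its dominant harmonics.
# Not the paper's model; signal, sampling rate, and harmonic count are assumed.
import numpy as np

def reconstruct_from_harmonics(signal: np.ndarray, n_harmonics: int = 3) -> np.ndarray:
    """Keep only the n_harmonics strongest frequency components of `signal`."""
    spectrum = np.fft.rfft(signal)                    # one-sided spectrum
    magnitudes = np.abs(spectrum)
    magnitudes[0] = 0.0                               # ignore the DC offset
    keep = np.argsort(magnitudes)[-n_harmonics:]      # indices of dominant harmonics
    filtered = np.zeros_like(spectrum)
    filtered[keep] = spectrum[keep]
    return np.fft.irfft(filtered, n=len(signal))      # time-domain reconstruction

# Hypothetical usage: a vertical torso trajectory sampled at 120 Hz.
fs = 120.0
t = np.arange(0, 2.0, 1.0 / fs)
torso_z = 0.02 * np.sin(2 * np.pi * 5.0 * t) + 0.005 * np.random.randn(len(t))
approx = reconstruct_from_harmonics(torso_z, n_harmonics=2)
```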
2014
16th International Conference on Multimodal Interaction
laughter; nonverbal behaviors; realtime interaction; virtual character
04 Publication in conference proceedings::04b Conference paper in a volume

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1528275

Citations
  • Scopus: 19