Speaking with Emotions / Bevacqua, E.; Mancini, M.; Pelachaud, C. - (2004). (Paper presented at the AISB 2004 Symposium on Language, Speech and Gesture for Expressive Characters, held in Leeds).
Speaking with Emotions
Mancini, M.
2004
Abstract
We aim to realize an Embodied Conversational Agent able to interact naturally and emotionally with users. In previous work, we elaborated a model that computes the nonverbal behaviors associated with a given set of communicative functions. Specifying, for a given emotion, its corresponding facial expression will not by itself produce the sensation of expressivity; to do so, one needs to specify parameters such as intensity, tension, and movement properties. Moreover, emotion also affects lip shapes during speech, and simply adding the facial expression of an emotion to the lip shape does not produce lip-readable movement. In this paper we present a model that adds expressivity to the animation of an agent at the level of facial expressions as well as lip shapes.
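To illustrate the kind of combination the abstract argues for, the following is a minimal, hypothetical sketch (not the authors' implementation): it blends an emotional facial expression with speech visemes so that the emotional display does not overwrite lip articulation. The parameter names (intensity, tension), the lip-parameter set, and the blending weights are all assumptions made for illustration.

# Hypothetical sketch: viseme-dominant blending on the lip region so that
# speech stays lip-readable, while the rest of the face follows the emotion.
# All parameter names and weights below are illustrative assumptions.

from dataclasses import dataclass
from typing import Dict

# A face target maps abstract animation parameters to displacements.
FaceTarget = Dict[str, float]

@dataclass
class Expressivity:
    intensity: float  # 0..1, overall strength of the emotional display
    tension: float    # 0..1, damps the articulation range on the lips

LIP_PARAMS = {"lip_open", "lip_width", "lip_protrusion"}  # assumed lip parameters

def blend(emotion: FaceTarget, viseme: FaceTarget, e: Expressivity) -> FaceTarget:
    """Combine an emotional expression with a viseme target.

    Non-lip parameters follow the emotion scaled by intensity; lip parameters
    stay dominated by the viseme, with the emotion contributing only a small,
    tension-damped bias, instead of being simply added on top.
    """
    out: FaceTarget = {}
    for key in set(emotion) | set(viseme):
        emo = emotion.get(key, 0.0) * e.intensity
        vis = viseme.get(key, 0.0)
        if key in LIP_PARAMS:
            # Viseme-dominant blend on the lips; tension shrinks the range.
            out[key] = (1.0 - e.tension) * vis + 0.3 * emo
        else:
            # Elsewhere the emotional expression is added directly.
            out[key] = vis + emo
    return out

# Minimal usage example with made-up targets.
smile = {"brow_raise": 0.4, "cheek_raise": 0.6, "lip_width": 0.5}
viseme_o = {"lip_open": 0.7, "lip_protrusion": 0.6, "lip_width": -0.2}
print(blend(smile, viseme_o, Expressivity(intensity=0.8, tension=0.2)))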