A Computational Muscoloskeletal Model for Animating Virtual Faces / Fratarcangeli, Marco. - Print. - (2009).

A Computational Muscoloskeletal Model for Animating Virtual Faces

FRATARCANGELI, Marco
01/01/2009

Abstract

Automatic synthesis of facial animation in computer graphics is a challenging task. Although the problem is three decades old by now, there is still no unified method to solve it, mainly because of the complex mathematical models required to reproduce the visual meaning of facial expressions, coupled with the computational speed needed to run interactive applications. This thesis proposes two different methods to animate realistic 3D faces at interactive rates. The first is an integrated physically-based method that mimics facial movements by reproducing the anatomical structure of a human head and the interaction among the bony structure, the facial muscles and the skin. Unlike previously proposed approaches in the literature, the muscles are organized in a layered, interweaving structure lying on the skull; their shape can be affected both by the simulation of active contraction and by the motion of the underlying anatomical parts. A design tool has been developed to assist the user in defining the muscles in a natural manner, by sketching their shape directly on the already existing bones and other muscles. The dynamics of the face motion is computed through a position-based scheme that ensures real-time performance, control and robustness. Experiments demonstrate that this model can effectively synthesize realistic, expressive facial animation for different input face models in real time on consumer-class platforms. The second method consists of a novel facial motion cloning technique: a purely geometric algorithm that transfers the motion from an animated source face to a different, initially static, target face mesh, allowing facial motion to be reused from already animated virtual heads. Its robustness and flexibility are assessed over several input data sets.
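
The position-based scheme mentioned in the abstract belongs to the family of position-based dynamics (PBD) solvers. The following is a minimal, self-contained sketch of such a solver for a particle system with distance constraints (for example, springs connecting skin vertices, with zero inverse mass used for skull attachments). It is illustrative only and does not reproduce the thesis implementation; the function name `pbd_step` and the data layout are hypothetical.

```python
# Minimal position-based dynamics (PBD) sketch: predict positions with an
# explicit step, then iteratively project distance constraints onto the
# predictions. Illustrative only; not the thesis implementation.
import numpy as np

def pbd_step(x, v, edges, rest_len, inv_mass, dt=1.0 / 60.0,
             iterations=10, gravity=np.array([0.0, -9.81, 0.0])):
    """One PBD step.

    x, v: (N, 3) positions and velocities.
    edges: list of (i, j) particle index pairs; rest_len: their rest lengths.
    inv_mass: (N,) inverse masses (0 pins a particle, e.g. skull attachments).
    """
    # 1. Predict positions with an explicit Euler step (pinned particles stay).
    v = v + dt * gravity * (inv_mass[:, None] > 0)
    p = x + dt * v

    # 2. Iteratively project the distance constraints onto the predictions.
    for _ in range(iterations):
        for (i, j), d0 in zip(edges, rest_len):
            delta = p[i] - p[j]
            dist = np.linalg.norm(delta)
            w = inv_mass[i] + inv_mass[j]
            if dist < 1e-9 or w == 0.0:
                continue
            corr = (dist - d0) / (dist * w) * delta
            p[i] -= inv_mass[i] * corr
            p[j] += inv_mass[j] * corr

    # 3. Derive velocities from the corrected positions and commit.
    v = (p - x) / dt
    return p, v
```

PBD is a natural fit for the goals stated in the abstract: it is unconditionally stable for the step sizes used in interactive applications, and constraints act directly on positions, which gives the direct control and robustness the abstract refers to.

The facial motion cloning method is described as a purely geometric transfer of motion from a source face to a target face. A drastically simplified sketch of such a transfer is shown below, assuming a precomputed dense correspondence that maps every target vertex to a triangle of the source neutral mesh with barycentric coordinates; the function `clone_expression` and its parameters are hypothetical and do not reflect the actual algorithm in the thesis.

```python
# Simplified motion-cloning sketch: copy per-vertex displacements of an
# animated source expression onto a static target mesh through a
# precomputed barycentric correspondence. Illustrative only.
import numpy as np

def clone_expression(src_neutral, src_expression, tgt_neutral,
                     corr_tris, corr_bary):
    """Transfer one source expression to the target mesh.

    src_neutral, src_expression: (Ns, 3) source vertex positions.
    tgt_neutral: (Nt, 3) target vertex positions.
    corr_tris: (Nt, 3) source-vertex indices of the triangle mapped
               to each target vertex.
    corr_bary: (Nt, 3) barycentric coordinates of that mapping.
    Returns the deformed (Nt, 3) target vertices.
    """
    # Per-vertex displacement field of the source expression.
    src_disp = src_expression - src_neutral                  # (Ns, 3)

    # Interpolate the displacement at each target vertex through the
    # barycentric correspondence and add it to the neutral target.
    tri_disp = src_disp[corr_tris]                           # (Nt, 3, 3)
    tgt_disp = np.einsum('nk,nkd->nd', corr_bary, tri_disp)  # (Nt, 3)
    return tgt_neutral + tgt_disp
```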
Use this identifier to cite or link to this item: https://hdl.handle.net/11573/516181