Hierarchical Retargetting of Fine Facial Motions

Bibliographic Details
Published in: Computer Graphics Forum, Vol. 23, No. 3, pp. 687-695
Main Authors: Na, Kyunggun; Jung, Moonryul
Format: Journal Article
Language: English
Published: Blackwell Publishing (Oxford, UK and Boston, USA), 01.09.2004
ISSN: 0167-7055 (print), 1467-8659 (online)
DOI: 10.1111/j.1467-8659.2004.00801.x


More Information
Summary: We present a novel technique for retargetting captured facial animation to new facial models. We use dense motion data that can express fine motions such as wrinkles. We use a normal mesh, which is a special multi-resolution mesh, to represent source and target models. A normal mesh is defined by the base mesh and a sequence of normal offsets from it. Our retargetting consists of two steps: base mesh and detail mesh retargetting. For base mesh retargetting, we use an example-based technique to take advantage of the intuition of the user in specifying the similarity between source and target expressions. In detail mesh retargetting, the variations of normal offsets in the source mesh are hierarchically computed and transferred to the target mesh. Our retargetting scheme is able to produce robust and delicate results for unusual target models from highly detailed motion data.

Categories and Subject Descriptors (according to ACM CCS): I.3.7 [Computer Graphics]: Three Dimensional Graphics and Realism - Animation; I.3.5 [Computer Graphics]: Computational Geometry and Object Modelling - hierarchy and geometric transformation, object hierarchy
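The normal-mesh representation described in the summary stores detail as a single scalar offset per vertex along the base surface's normal, rather than a full 3D displacement. A minimal sketch of this idea (not the authors' implementation; the function names and the uniform `scale` factor used for transfer are assumptions for illustration):

```python
import numpy as np

def refine_vertex(base_point, base_normal, offset):
    """Displace a point on the base surface along its unit normal by a
    scalar offset -- the defining property of a normal mesh, where each
    level of detail is one scalar per vertex instead of a 3D vector."""
    n = np.asarray(base_normal, dtype=float)
    n = n / np.linalg.norm(n)
    return np.asarray(base_point, dtype=float) + offset * n

def transfer_offsets(source_offsets, scale):
    """Hypothetical detail transfer: rescale the source's normal-offset
    variations before applying them on the target's base surface (the
    paper computes these variations hierarchically; a single uniform
    scale is used here only to keep the sketch short)."""
    return [scale * d for d in source_offsets]

# Reconstruct one refined vertex: base point at the origin, normal along z,
# offset 1.5 units outward.
v = refine_vertex([0.0, 0.0, 0.0], [0.0, 0.0, 2.0], 1.5)
```

Because detail is a scalar stream, transferring wrinkles between models reduces to mapping offset variations between corresponding base-mesh regions instead of solving for full 3D correspondences at every resolution level.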
Bibliography:
ark: ark:/67375/WNG-CR6ZHCR3-1
istex: 8AEA7733CD9A91C8BDEDA8AE0746CD60826CFC70
ArticleID: CGF801