Dense 3D face alignment from 2D videos in real-time

Bibliographic Details
Published in: 2015 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), Vol. 1, pp. 1-8
Main Authors: Jeni, Laszlo A.; Cohn, Jeffrey F.; Kanade, Takeo
Format: Conference Proceeding; Journal Article
Language: English
Published: United States: IEEE, 01.05.2015
ISSN: 2326-5396
DOI: 10.1109/FG.2015.7163142

Summary: To enable real-time, person-independent 3D registration from 2D video, we developed a 3D cascade regression approach in which facial landmarks remain invariant across pose over a range of approximately 60 degrees. From a single 2D image of a person's face, a dense 3D shape is registered in real time for each frame. The algorithm uses a fast cascade regression framework trained on high-resolution 3D face scans of posed and spontaneous emotion expressions. It first estimates the locations of a dense set of markers and their visibility, then reconstructs face shapes by fitting a part-based 3D model. Because no assumptions about illumination or surface properties are required, the method can be applied to a wide range of imaging conditions, including 2D video and uncalibrated multi-view video. The method has been validated in a battery of experiments that evaluate the precision of its 3D reconstruction and its extension to multi-view reconstruction. Experimental findings strongly support the validity of real-time 3D registration and reconstruction from 2D video. The software is available online at http://zface.org.
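
The summary outlines a concrete per-frame pipeline: project the current 3D shape estimate into the image, read local appearance features around the projected landmarks, apply a learned cascade-regression update, and finally estimate landmark visibility. A minimal Python sketch of that loop follows. It assumes linear (A, b) stage regressors, raw patch intensities as features, and a caller-supplied projection; extract_features, estimate_visibility, and the stage representation are illustrative placeholders, not the released zface.org implementation.

import numpy as np

def extract_features(image, landmarks_2d):
    # Flatten a small intensity patch around each projected landmark.
    # Real trackers use stronger local descriptors (e.g., HOG-like features).
    feats = []
    for x, y in landmarks_2d.astype(int):
        patch = image[max(y - 8, 0):y + 8, max(x - 8, 0):x + 8].astype(float)
        feats.append(np.resize(patch, 16 * 16))  # pad/repeat to a fixed length
    return np.concatenate(feats)

def estimate_visibility(shape_3d):
    # Crude self-occlusion proxy: flag points far behind the rest of the
    # face as invisible. The paper reasons about visibility from the fitted
    # part-based model; this stand-in only illustrates the interface.
    z = shape_3d[:, 2]
    return z > z.min() + 0.25 * (z.max() - z.min())

def cascade_align(image, mean_shape_3d, stages, project):
    # One pass of cascade regression: each stage projects the current 3D
    # estimate, samples features at the projected landmarks, and applies
    # its learned additive update. 'stages' holds (A, b) pairs assumed to
    # have been trained offline on annotated 3D face scans.
    shape_3d = mean_shape_3d.copy()
    for A, b in stages:
        pts_2d = project(shape_3d)
        feats = extract_features(image, pts_2d)
        shape_3d = shape_3d + (A @ feats + b).reshape(shape_3d.shape)
    return shape_3d, estimate_visibility(shape_3d)

if __name__ == "__main__":
    # Synthetic smoke test: random image, 5 landmarks, 3 near-zero stages,
    # and an orthographic projection that drops the z coordinate.
    rng = np.random.default_rng(0)
    image = rng.random((128, 128))
    mean_shape = rng.random((5, 3)) * 32 + 48
    dim = 5 * 16 * 16  # feature vector length
    stages = [(rng.random((15, dim)) * 1e-4, np.zeros(15)) for _ in range(3)]
    shape, visible = cascade_align(image, mean_shape, stages, lambda S: S[:, :2])
    print(shape.round(1), visible)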