Face attribute analysis from structured light: an end-to-end approach

Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 82, No. 7, pp. 10471-10490
Main Authors: Thamizharasan, Vikas; Das, Abhijit; Battaglino, Daniele; Bremond, Francois; Dantcheva, Antitza
Format: Journal Article
Language: English
Published: New York: Springer US, 01.03.2023 (Springer Nature B.V.; Springer Verlag)
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-022-13224-0

Summary: In this work we explore the use of structured-light imaging for face analysis. Towards this, and due to the lack of a publicly available structured-light face dataset, we (a) first generate a synthetic structured-light face dataset based on the RGB dataset London Face and the RGB-D dataset Bosphorus 3D Face. We then (b) propose a conditional adversarial network for depth map estimation from the generated synthetic data. Associated quantitative and qualitative results suggest the efficiency of the proposed depth estimation technique. Further, we (c) study the estimation of gender and age directly from (i) structured-light images, (ii) binarized structured-light images, as well as (iii) depth maps estimated from structured light. In this context we (d) study the impact of different subject-to-camera distances, as well as pose variations. Finally, we (e) validate the proposed gender and age models, trained on synthetic data, on a small set of real data that we acquire. While these are early results, our findings clearly indicate the suitability of structured-light-based approaches for facial analysis.
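The abstract mentions a conditional adversarial network for depth-map estimation from structured-light images but does not give architectural details here. The following is a minimal, hypothetical PyTorch sketch of what such a setup could look like (a pix2pix-style conditional GAN with an encoder-decoder generator, a conditioned PatchGAN critic, and an adversarial-plus-L1 objective); the layer sizes, loss weight, and training loop are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: pix2pix-style conditional GAN mapping single-channel
# structured-light images to depth maps. Details are assumptions, not the paper's method.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch, down=True):
    """Strided conv (encoder) or transposed conv (decoder) with norm + activation."""
    conv = (nn.Conv2d(in_ch, out_ch, 4, stride=2, padding=1) if down
            else nn.ConvTranspose2d(in_ch, out_ch, 4, stride=2, padding=1))
    return nn.Sequential(conv, nn.InstanceNorm2d(out_ch), nn.LeakyReLU(0.2))

class Generator(nn.Module):
    """Encoder-decoder that maps a structured-light image to a depth map in [-1, 1]."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            conv_block(1, 64), conv_block(64, 128), conv_block(128, 256),        # encoder
            conv_block(256, 128, down=False), conv_block(128, 64, down=False),   # decoder
            nn.ConvTranspose2d(64, 1, 4, stride=2, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """PatchGAN critic conditioned on the structured-light input image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            conv_block(2, 64), conv_block(64, 128),
            nn.Conv2d(128, 1, 4, stride=1, padding=1),  # patch-wise real/fake logits
        )
    def forward(self, sl_img, depth):
        return self.net(torch.cat([sl_img, depth], dim=1))

def train_step(G, D, opt_g, opt_d, sl_img, depth_gt, lambda_l1=100.0):
    """One conditional-GAN update: discriminator first, then generator (adv + L1)."""
    bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()
    fake = G(sl_img)

    # Discriminator: real (input, depth) pairs -> 1, generated pairs -> 0.
    d_real, d_fake = D(sl_img, depth_gt), D(sl_img, fake.detach())
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator: fool the critic while staying close to ground-truth depth (L1).
    d_fake = D(sl_img, fake)
    loss_g = bce(d_fake, torch.ones_like(d_fake)) + lambda_l1 * l1(fake, depth_gt)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()

if __name__ == "__main__":
    G, D = Generator(), Discriminator()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
    # Dummy batch standing in for (structured-light image, ground-truth depth) pairs.
    sl, gt = torch.rand(2, 1, 64, 64), torch.rand(2, 1, 64, 64) * 2 - 1
    print(train_step(G, D, opt_g, opt_d, sl, gt))
```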