Age Estimation by Super-Resolution Reconstruction Based on Adversarial Networks

Bibliographic Details
Published in: IEEE Access, Vol. 8, pp. 17103-17120
Main Authors: Nam, Se Hyun; Kim, Yu Hwan; Truong, Noi Quang; Choi, Jiho; Park, Kang Ryoung
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.2967800

Summary: Age estimation from facial images is applicable in various fields, such as age-targeted marketing, analysis of demand and preference for goods, skin care, remote medical services, and age statistics for a specific place. However, if a low-resolution camera is used or the facial images are captured from subjects standing far away, the resolution of the images is degraded. In such cases, information about wrinkles and facial texture is lost, and features that are crucial for age estimation cannot be obtained. Existing studies on age estimation have not considered resolution degradation and have used only high-resolution facial images. To overcome this limitation, this paper proposes a deep convolutional neural network (CNN)-based age estimation method that reconstructs low-resolution facial images as high-resolution images using a conditional generative adversarial network (GAN) and then uses the reconstructed images as inputs. Experiments are conducted on two open databases (the PAL and MORPH databases). The results demonstrate that the proposed method achieves higher accuracy in both high-resolution reconstruction and age estimation than state-of-the-art methods.
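
The summary describes a two-stage pipeline: a conditional-GAN generator first reconstructs a high-resolution face from a low-resolution input, and a CNN then estimates age from the reconstructed image. The following is a minimal PyTorch sketch of that pipeline, not the authors' actual architecture; the 4x upscaling factor, layer sizes, and regression head are illustrative assumptions, and the adversarial discriminator and training loop are omitted.

    # Minimal sketch of the two-stage pipeline described in the abstract.
    # All layer sizes and the 4x scale factor are illustrative assumptions,
    # not the paper's architecture.
    import torch
    import torch.nn as nn

    class SRGenerator(nn.Module):
        """GAN-generator-style network: upscales a 32x32 face to 128x128 (4x)."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 64, kernel_size=9, padding=4),
                nn.PReLU(),
                # Two pixel-shuffle upsampling stages, 2x each -> 4x total.
                nn.Conv2d(64, 256, kernel_size=3, padding=1),
                nn.PixelShuffle(2),
                nn.PReLU(),
                nn.Conv2d(64, 256, kernel_size=3, padding=1),
                nn.PixelShuffle(2),
                nn.PReLU(),
                nn.Conv2d(64, 3, kernel_size=9, padding=4),
                nn.Tanh(),
            )

        def forward(self, lr_image):
            return self.net(lr_image)

    class AgeEstimator(nn.Module):
        """Plain CNN regressor mapping a reconstructed face to a scalar age."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(128, 1)  # regression output: predicted age

        def forward(self, hr_image):
            x = self.features(hr_image).flatten(1)
            return self.head(x)

    # Usage: low-resolution input -> reconstructed high-resolution face -> age.
    lr_face = torch.randn(1, 3, 32, 32)   # dummy low-resolution input
    hr_face = SRGenerator()(lr_face)      # shape (1, 3, 128, 128)
    age = AgeEstimator()(hr_face)         # shape (1, 1), predicted age
    print(hr_face.shape, age.shape)

In the paper, the generator is trained adversarially against a discriminator before the reconstructed images are passed to the age estimator; this sketch shows only the inference path.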