深層学習手法による顔魅力要因の解釈可能性と産業応用への展望 (Interpretability of Facial Attractiveness Factors via Deep Learning Methods and Prospects for Industrial Applications)

Bibliographic Details
Published in 応用数理 Vol. 33, No. 2, pp. 89-93
Main Author 佐野, 貴紀
Format Journal Article
Language Japanese
Published 一般社団法人 日本応用数理学会 23.06.2023
Online Access Get full text
ISSN 2432-1982
DOI 10.11540/bjsiam.33.2_89

Author 佐野, 貴紀
Author_xml – sequence: 1
  fullname: 佐野, 貴紀
  organization: 慶應義塾大学
ContentType Journal Article
Copyright 2023 日本応用数理学会
Copyright_xml – notice: 2023 日本応用数理学会
DOI 10.11540/bjsiam.33.2_89
EISSN 2432-1982
EndPage 93
ExternalDocumentID article_bjsiam_33_2_33_89_article_char_ja
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed false
IsScholarly false
Issue 2
Language Japanese
OpenAccessLink https://www.jstage.jst.go.jp/article/bjsiam/33/2/33_89/_article/-char/ja
PageCount 5
PublicationCentury 2000
PublicationDate 2023/06/23
PublicationDateYYYYMMDD 2023-06-23
PublicationDate_xml – month: 06
  year: 2023
  text: 2023/06/23
  day: 23
PublicationDecade 2020
PublicationTitle 応用数理
PublicationYear 2023
Publisher 一般社団法人 日本応用数理学会
Publisher_xml – name: 一般社団法人 日本応用数理学会
References
[1] D. Gray, K. Yu, W. Xu, and Y. Gong. Predicting facial beauty without landmarks. In Computer Vision: ECCV 2010, pp. 434–447. Springer, 2010.
[2] J. He, C. Wang, Y. Zhang, J. Guo, and Y. Guo. FA-GANs: Facial attractiveness enhancement with generative adversarial networks on frontal faces. arXiv preprint, arXiv:2005.08168, 2020.
[3] V. S. Johnston. Mate choice decisions: The role of facial beauty. Trends in Cognitive Sciences, Vol. 10, No. 1, pp. 9–13, 2006.
[4] A. L. Jones, R. Russell, and R. Ward. Cosmetics alter biologically-based factors of beauty: Evidence from facial contrast. Evolutionary Psychology, Vol. 13, No. 1, 147470491501300113, 2015.
[5] J. H. Langlois, L. Kalakanis, A. J. Rubenstein, A. Larson, M. Hallam, and M. Smoot. Maxims or myths of beauty? A meta-analytic and theoretical review. Psychological Bulletin, Vol. 126, No. 3, pp. 390–423, 2000.
[6] L. Liang, L. Jin, and X. Li. Facial skin beautification using adaptive region-aware masks. IEEE Transactions on Cybernetics, Vol. 44, No. 12, pp. 2600–2612, 2014.
[7] L. Liang, L. Jin, X. Zhang, and Y. Xu. Multiple facial image editing using edge-aware PDE learning. Computer Graphics Forum, Vol. 34, pp. 203–212, 2015.
[8] L. Liang, L. Lin, L. Jin, D. Xie, and M. Li. SCUT-FBP5500: A diverse benchmark dataset for multi-paradigm facial beauty prediction. In 2018 24th International Conference on Pattern Recognition (ICPR), pp. 1598–1603. IEEE, 2018.
[9] L. Lin, L. Liang, and L. Jin. Regression guided by relative ranking using convolutional neural network (R3CNN) for facial beauty prediction. IEEE Transactions on Affective Computing, Vol. 13, No. 1, pp. 122–134, 2019.
[10] L. Liu, J. Xing, S. Liu, H. Xu, X. Zhou, and S. Yan. Wow! You are so beautiful today! ACM Transactions on Multimedia Computing, Communications, and Applications (TOMM), Vol. 11, No. 1s, Article No. 20, 2014.
[11] X. Liu, T. Li, H. Peng, I. Chuoying Ouyang, T. Kim, and R. Wang. Understanding beauty via deep facial features. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, pp. 246–256, 2019.
[12] N. Murray, L. Marchesotti, and F. Perronnin. AVA: A large-scale database for aesthetic visual analysis. In Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition, pp. 2408–2415. IEEE, 2012.
[13] T. V. Nguyen and L. Liu. Smart mirror: Intelligent makeup recommendation and synthesis. In Proceedings of the 25th ACM International Conference on Multimedia, pp. 1253–1254, 2017.
[14] T. V. Nguyen, S. Liu, B. Ni, J. Tan, Y. Rui, and S. Yan. Sense beauty via face, dressing, and/or voice. In Proceedings of the 20th ACM International Conference on Multimedia, pp. 239–248, 2012.
[15] D. I. Perrett, K. J. Lee, I. Penton-Voak, D. Rowland, S. Yoshikawa, D. M. Burt, S. P. Henzi, D. L. Castles, and S. Akamatsu. Effects of sexual dimorphism on facial attractiveness. Nature, Vol. 394, No. 6696, pp. 884–887, 1998.
[16] D. I. Perrett, K. A. May, and S. Yoshikawa. Facial shape and judgements of female attractiveness. Nature, Vol. 368, No. 6468, pp. 239–242, 1994.
[17] J. C. Peterson, S. Uddenberg, T. L. Griffiths, A. Todorov, and J. W. Suchow. Deep models of superficial face judgments. Proceedings of the National Academy of Sciences, Vol. 119, No. 17, e2115228119, 2022.
[18] R. Russell. A sex difference in facial contrast and its exaggeration by cosmetics. Perception, Vol. 38, No. 8, pp. 1211–1219, 2009.
[19] J. Saeed and A. M. Abdulazeez. Facial beauty prediction and analysis based on deep convolutional neural network: A review. Journal of Soft Computing and Data Mining, Vol. 2, No. 1, pp. 1–12, 2021.
[20] T. Sano. Visualization of facial attractiveness factors using gradient-weighted class activation mapping to understand the connection between facial features and perception of attractiveness. International Journal of Affective Engineering, IJAE-D, 2022.
[21] R. R. Selvaraju, M. Cogswell, A. Das, R. Vedantam, D. Parikh, and D. Batra. Grad-CAM: Visual explanations from deep networks via gradient-based localization. In Proceedings of the 2017 IEEE International Conference on Computer Vision, pp. 618–626, 2017.
[22] D. Xie, L. Liang, L. Jin, J. Xu, and M. Li. SCUT-FBP: A benchmark dataset for facial beauty perception. In 2015 IEEE International Conference on Systems, Man, and Cybernetics, pp. 1821–1826. IEEE, 2015.
[23] J. Xu, L. Jin, L. Liang, Z. Feng, and D. Xie. A new humanlike facial attractiveness predictor with cascaded fine-tuning deep learning model. arXiv preprint, arXiv:1511.02465, 2015.
[24] J. Xu, L. Jin, L. Liang, Z. Feng, D. Xie, and H. Mao. Facial attractiveness prediction using psychologically inspired convolutional neural network (PI-CNN). In 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1657–1661. IEEE, 2017.
[25] L. Xu, J. Xiang, and X. Yuan. Transferring rich deep features for facial beauty prediction. arXiv preprint, arXiv:1803.07253, 2018.
[26] Y. Zhai, Y. Huang, Y. Xu, J. Gan, H. Cao, W. Deng, R. D. Labati, V. Piuri, and F. Scotti. Asian female facial beauty prediction using deep neural networks via transfer learning and multi-channel feature fusion. IEEE Access, Vol. 8, pp. 56892–56907, 2020.
[27] L. Zhang, D. Zhang, M.-M. Sun, and F.-M. Chen. Facial beauty analysis based on geometric feature: Toward attractiveness assessment application. Expert Systems with Applications, Vol. 82, pp. 252–265, 2017.
SourceID jstage
SourceType Publisher
StartPage 89
Title 深層学習手法による顔魅力要因の解釈可能性と産業応用への展望
URI https://www.jstage.jst.go.jp/article/bjsiam/33/2/33_89/_article/-char/ja
Volume 33
IsPartOf 応用数理, 2023/06/23, Vol. 33(2), pp. 89-93