Generic Sparse Graph Based Convolutional Networks for Face Recognition

Bibliographic Details
Published in: Proceedings - International Conference on Image Processing, pp. 1589-1593
Main Authors: Wu, Renjie; Kamata, Sei-ichiro
Format: Conference Proceeding
Language: English
Published: IEEE, 01.01.2021
ISSN: 2381-8549
DOI: 10.1109/ICIP42928.2021.9506083

Summary: Several graph-based methods, such as elastic graph matching, have been proposed for face recognition. These methods exploit the fact that a face has an inherent graph structure, but they are generally weaker than CNNs. With the development of graph convolutional neural networks (GCNNs), it is worth reconsidering the benefits of explicitly identifying this graph structure. In this paper, a face image is modeled as a sparse graph. The major challenge is how to estimate the sparse graph: it is usually built from prior clustering methods such as k-nn, which bias the learned graph toward the prior graph, and the associated regularization parameters are difficult to estimate accurately. This paper presents generic sparse graph based convolutional networks (GSgCNs), which offer three advantages: 1) no regularization parameters need to be estimated in the generic sparse graph modeling, 2) no prior graph is required, and 3) each sparse subgraph is represented as a connected graph of the most adjacent-relevant vertices. Because the generic sparse graph representation is non-convex, a projected gradient descent algorithm with structured sparse representation is used. Experimental results demonstrate that the GSgCNs perform well compared with some state-of-the-art methods.
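
The summary describes estimating a sparse graph without regularization parameters by running projected gradient descent under a non-convex structured sparsity constraint. The sketch below is only a rough illustration of that general idea, not the authors' formulation: it assumes a self-representation objective ||X - WX||_F^2 over per-vertex features X and uses a hard top-k projection per row as the sparsity constraint; the function name learn_sparse_graph, the objective, and all parameter choices are hypothetical.

    import numpy as np

    def learn_sparse_graph(X, k=8, lr=1e-3, n_iter=200):
        # Hypothetical sketch: estimate a row-wise k-sparse graph W by projected
        # gradient descent on the self-representation loss 0.5 * ||X - W X||_F^2.
        # X: (n, d) array of per-vertex features (e.g., face patch descriptors).
        # The hard top-k constraint stands in for the structured sparse,
        # non-convex representation; no regularization weight is required.
        n = X.shape[0]
        W = np.zeros((n, n))
        for _ in range(n_iter):
            grad = (W @ X - X) @ X.T              # gradient of the smooth loss
            W = W - lr * grad                     # gradient step
            np.fill_diagonal(W, 0.0)              # no self-loops
            # Projection: keep only the k largest-magnitude entries per row
            drop = np.argsort(np.abs(W), axis=1)[:, :-k]
            np.put_along_axis(W, drop, 0.0, axis=1)
        return W

    # Example usage with random features (100 vertices, 64-dim descriptors)
    X = np.random.randn(100, 64)
    W = learn_sparse_graph(X, k=8)

In a GCNN pipeline, a learned sparse adjacency of this kind would then define the graph over which the convolution layers aggregate vertex features.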