Finger Vein Pulsation-Based Biometric Recognition

Bibliographic Details
Published in: IEEE Transactions on Information Forensics and Security, Vol. 16, pp. 5034-5044
Main Authors: Krishnan, Arya; Thomas, Tony; Mishra, Deepak
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
ISSN: 1556-6013, 1556-6021
DOI: 10.1109/TIFS.2021.3122073

Summary: The finger vein has become an appealing biometric trait because of its intrinsic nature, contactless acquisition, and anti-spoofing capability compared with other dominant biometric traits. State-of-the-art intrinsic recognition derives vein patterns from curvature values, line tracking, or deep neural networks. However, these methods extract artifacts such as noise, breaks, and texture along with the veins, owing to irregular shading, poor contrast, and blurriness in NIR images, which degrades recognition accuracy. To address these issues, we propose a novel acquisition mechanism for vein patterns based on the pulsation of the veins: pulsations are captured from finger vein videos to accurately isolate the vein patterns. In addition, the proposed framework incorporates an inherent liveness detection step alongside finger vein recognition. To the best of our knowledge, this is the first work that utilizes finger vein pulsations for biometric recognition. We acquired a finger vein video dataset from 320 subjects to evaluate the proposed method. The experimental results indicate that the proposed approach outperforms existing image-based approaches, achieving an EER of 0.8% and a recognition accuracy of 96.35%.
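
To make the core idea concrete, the following is a minimal, hypothetical sketch (Python with NumPy and SciPy) of how a per-pixel pulsation map could be derived from an NIR finger vein video: each pixel's temporal signal is band-pass filtered around typical heart-rate frequencies, and strongly pulsating pixels are retained as candidate vein locations. This only illustrates the general principle of pulsation-based vein isolation described in the abstract; the frame rate, frequency band, filter order, and threshold percentile are assumed values, and this is not the authors' published pipeline.

```python
# Hypothetical sketch: derive a per-pixel "pulsation map" from an NIR finger
# vein video by measuring temporal signal energy in the cardiac frequency band.
# Parameters (fps, band, percentile) are illustrative assumptions, not values
# taken from the paper.
import numpy as np
from scipy.signal import butter, filtfilt


def pulsation_map(frames: np.ndarray, fps: float = 30.0,
                  band: tuple = (0.8, 2.0)) -> np.ndarray:
    """frames: (T, H, W) grayscale NIR video; returns (H, W) pulsation energy."""
    # Remove the per-pixel temporal mean so only fluctuations remain.
    signal = frames.astype(np.float64) - frames.mean(axis=0, keepdims=True)
    # Band-pass each pixel's time series around typical heart-rate frequencies.
    b, a = butter(2, [band[0] / (fps / 2), band[1] / (fps / 2)], btype="band")
    filtered = filtfilt(b, a, signal, axis=0)
    # Pulsation energy per pixel: variance of the band-passed signal.
    return filtered.var(axis=0)


def vein_mask(frames: np.ndarray, fps: float = 30.0,
              percentile: float = 85.0) -> np.ndarray:
    """Threshold the pulsation map to keep strongly pulsating (vein) pixels."""
    energy = pulsation_map(frames, fps)
    return energy >= np.percentile(energy, percentile)


if __name__ == "__main__":
    # Synthetic stand-in for a captured NIR video clip (T frames of H x W).
    rng = np.random.default_rng(0)
    video = rng.normal(128.0, 5.0, size=(150, 64, 64))
    # Inject a 1.2 Hz pulsation along one "vein" column to show the idea.
    t = np.arange(150) / 30.0
    video[:, :, 30] += 10.0 * np.sin(2 * np.pi * 1.2 * t)[:, None]
    mask = vein_mask(video)
    print("pulsating pixels:", int(mask.sum()))
```

In a real system, the frequency band would presumably track the subject's heart rate and the pulsation map would feed a matching stage rather than a fixed percentile threshold; those design choices are outside the scope of this sketch.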