Cross-view gait recognition by fusion of multiple transformation consistency measures

Bibliographic Details
Published in: IET Biometrics, Vol. 4, No. 2, pp. 62-73
Main Authors: Muramatsu, Daigo; Makihara, Yasushi; Yagi, Yasushi
Format: Journal Article
Language: English
Published: Stevenage: The Institution of Engineering and Technology; John Wiley & Sons, Inc., 01.06.2015
ISSN: 2047-4938, 2047-4946
DOI: 10.1049/iet-bmt.2014.0042

More Information
Summary: Gait is a promising modality for forensic science because it retains discrimination ability even when gait features are extracted from low-quality image sequences captured at a distance. However, in forensic cases the observation views of the compared sequences often differ, which degrades accuracy. Therefore, the authors propose a gait recognition algorithm that achieves high accuracy in cases where observation views are different. They used a view transformation technique and generated multiple joint gait features by changing the source gait features. Their hypothesis was that the multiple transformed features and the original features should be similar to each other if the target subjects are the same. They calculated multiple scores measuring the consistency of the features, and derived a likelihood ratio from those scores. To evaluate the accuracy of the proposed method, they drew Tippett plots and empirical cross-entropy plots, together with cumulative match characteristic curves and receiver operating characteristic curves, and assessed both discrimination ability and calibration quality. The results showed that the proposed method performs well in terms of both discrimination and calibration.
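
The summary outlines the scoring pipeline only at a high level; the sketch below illustrates the general idea of fusing transformation consistency measures into a likelihood ratio, not the authors' actual implementation. The transformation functions, the negative-distance consistency score, and the logistic-style calibration (weights, bias) are all assumptions introduced for illustration; the paper's learned view transformation models and likelihood-ratio computation are specified in the full text.

import numpy as np

def consistency_scores(probe, gallery, transforms):
    """Compute consistency measures between a probe gait feature observed
    at one view and a gallery gait feature observed at another.

    `transforms` is a list of hypothetical view-transformation functions,
    each mapping a feature vector toward a common view (stand-ins for the
    paper's view transformation models). Each score is the negative
    Euclidean distance between the transformed pair, so larger values
    indicate greater consistency (supporting the same-subject hypothesis).
    """
    scores = []
    for transform in transforms:
        p_t = transform(probe)
        g_t = transform(gallery)
        scores.append(-np.linalg.norm(p_t - g_t))
    return np.asarray(scores)

def likelihood_ratio(scores, weights, bias):
    """Map the vector of consistency scores to a likelihood ratio.

    This sketch assumes a simple calibrated linear fusion (logistic-
    regression style), with `weights` and `bias` trained on same-subject
    and different-subject score pairs; exp(w . s + b) then serves as the
    likelihood ratio in favour of the same-subject hypothesis.
    """
    return float(np.exp(weights @ scores + bias))

# Example (hypothetical): two transformation models, equal fusion weights.
# transforms = [model_a.transform, model_b.transform]
# lr = likelihood_ratio(consistency_scores(probe, gallery, transforms),
#                       weights=np.ones(2), bias=0.0)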