Improved gait recognition by gait dynamics normalization

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 28, No. 6, pp. 863-876
Main Authors: Zongyi Liu; Sarkar, S.
Format: Journal Article
Language: English
Published: Los Alamitos, CA: IEEE / IEEE Computer Society, 01.06.2006
ISSN: 0162-8828
EISSN: 1939-3539
DOI: 10.1109/TPAMI.2006.122

Abstract Potential sources for gait biometrics can be seen to derive from two aspects: gait shape and gait dynamics. We show that improved gait recognition can be achieved after normalization of dynamics and focusing on the shape information. We normalize for gait dynamics using a generic walking model, as captured by a population hidden Markov model (pHMM) defined for a set of individuals. The states of this pHMM represent gait stances over one gait cycle and the observations are the silhouettes of the corresponding gait stances. For each sequence, we first use Viterbi decoding of the gait dynamics to arrive at one dynamics-normalized, averaged, gait cycle of fixed length. The distance between two sequences is the distance between the two corresponding dynamics-normalized gait cycles, which we quantify by the sum of the distances between the corresponding gait stances. Distances between two silhouettes from the same generic gait stance are computed in the linear discriminant analysis space so as to maximize the discrimination between persons, while minimizing the variations of the same subject under different conditions. The distance computation is constructed so that it is invariant to dilations and erosions of the silhouettes. This helps us handle variations in silhouette shape that can occur with changing imaging conditions. We present results on three different, publicly available, data sets. First, we consider the HumanID gait challenge data set, which is the largest gait benchmarking data set that is available (122 subjects), exercising five different factors, i.e., viewpoint, shoe, surface, carrying condition, and time. We significantly improve the performance across the hard experiments involving surface change and briefcase carrying conditions. Second, we also show improved performance on the UMD gait data set that exercises time variations for 55 subjects. Third, on the CMU Mobo data set, we show results for matching across different walking speeds. It is worth noting that there was no separate training for the UMD and CMU data sets.
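The normalization step described in the abstract can be illustrated with a short sketch. The following Python fragment is a minimal, hypothetical illustration (not the authors' code) of how Viterbi decoding against a population HMM could map an arbitrary-length silhouette sequence onto a fixed set of generic gait stances, whose per-stance averages form the dynamics-normalized gait cycle; the stance transition matrix, initial probabilities, and per-frame observation log-likelihoods are assumed inputs produced by a trained pHMM.

```python
# Hypothetical sketch (not the authors' code) of the dynamics-normalization
# step: Viterbi-decode a silhouette sequence against a population HMM whose
# N states are generic gait stances, then average the frames assigned to each
# stance.  log_pi, log_A, and the per-frame observation log-likelihoods log_B
# are assumed to come from a pHMM trained on a population of walkers.
import numpy as np

def viterbi(log_pi, log_A, log_B):
    """Most likely stance label per frame.
    log_pi: (N,) initial log-probs, log_A: (N, N) transition log-probs,
    log_B: (T, N) log-likelihood of each frame under each stance."""
    T, N = log_B.shape
    delta = np.full((T, N), -np.inf)      # best log-score ending in each state
    back = np.zeros((T, N), dtype=int)    # backpointers
    delta[0] = log_pi + log_B[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A       # scores[i, j]: state i -> j
        back[t] = np.argmax(scores, axis=0)
        delta[t] = scores[back[t], np.arange(N)] + log_B[t]
    path = np.empty(T, dtype=int)
    path[-1] = np.argmax(delta[-1])
    for t in range(T - 2, -1, -1):        # backtrack the optimal stance path
        path[t] = back[t + 1, path[t + 1]]
    return path

def dynamics_normalized_cycle(silhouettes, stance_labels, n_stances):
    """Average all frames assigned to each stance: one fixed-length cycle,
    independent of how fast (or how unevenly) the subject walked."""
    sil = np.asarray(silhouettes, dtype=float)       # (T, H, W) silhouettes
    labels = np.asarray(stance_labels)
    cycle = np.zeros((n_stances,) + sil.shape[1:])
    for s in range(n_stances):
        frames = sil[labels == s]
        if len(frames):
            cycle[s] = frames.mean(axis=0)           # averaged stance image
    return cycle
```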
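A second hypothetical sketch covers the matching stage described in the abstract: the distance between two dynamics-normalized cycles is the sum of stance-wise distances computed in a discriminant space. Here `lda_models` is an assumed list of per-stance projectors exposing a scikit-learn-style `transform` method, and the plain Euclidean norm stands in for the paper's dilation/erosion-tolerant silhouette distance.

```python
# Hypothetical sketch of the matching stage: the distance between two
# dynamics-normalized cycles is the sum of stance-wise distances in a
# discriminant space.  lda_models[s] is assumed to be a fitted per-stance
# projector with a transform() method (e.g. sklearn LinearDiscriminantAnalysis);
# the Euclidean norm below is a stand-in for the paper's dilation/erosion-
# tolerant silhouette distance.
import numpy as np

def cycle_distance(cycle_a, cycle_b, lda_models):
    """cycle_a, cycle_b: (n_stances, H, W) dynamics-normalized gait cycles."""
    total = 0.0
    for s, lda in enumerate(lda_models):
        fa = lda.transform(cycle_a[s].reshape(1, -1))    # project stance s
        fb = lda.transform(cycle_b[s].reshape(1, -1))
        total += float(np.linalg.norm(fa - fb))          # stance-wise distance
    return total

def identify(probe_cycle, gallery, lda_models):
    """Rank gallery identities (a dict id -> cycle) by distance to the probe."""
    scores = {gid: cycle_distance(probe_cycle, g, lda_models)
              for gid, g in gallery.items()}
    return sorted(scores, key=scores.get)
```

Ranking a probe against a gallery then reduces to sorting gallery identities by this summed stance distance, as in `identify` above.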
Author Zongyi Liu
Sarkar, S.
Author_xml – sequence: 1
  givenname: Zongyi
  surname: Liu
  fullname: Zongyi Liu
  organization: Dept. of Comput. Sci. & Eng., Univ. of South Florida, Tampa, FL, USA
– sequence: 2
  givenname: S.
  surname: Sarkar
  fullname: Sarkar, S.
  organization: Dept. of Comput. Sci. & Eng., Univ. of South Florida, Tampa, FL, USA
BackLink http://pascal-francis.inist.fr/vibad/index.php?action=getRecordDetail&idt=17748008 (View record in Pascal Francis)
https://www.ncbi.nlm.nih.gov/pubmed/16724582 (View this record in MEDLINE/PubMed)
CODEN ITPIDJ
ContentType Journal Article
Copyright 2006 INIST-CNRS
Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2006
DOI 10.1109/TPAMI.2006.122
Discipline Engineering
Computer Science
Applied Sciences
EISSN 1939-3539
EndPage 876
ExternalDocumentID 2343239311
16724582
17748008
10_1109_TPAMI_2006_122
1624352
Genre orig-research
Evaluation Studies
Research Support, U.S. Gov't, Non-P.H.S
Journal Article
ISSN 0162-8828
IsPeerReviewed true
IsScholarly true
Issue 6
Keywords Biometrics
Discriminant analysis
Gait
Imaging
Form defect
Hidden Markov model
LDA
population HMM
Pattern analysis
Viterbi decoding
Gait recognition
gait shape
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
CC BY 4.0
PMID 16724582
PQID 865733935
PQPubID 23500
PageCount 14
PublicationDate 2006-06-01
PublicationPlace Los Alamitos, CA
PublicationTitle IEEE transactions on pattern analysis and machine intelligence
PublicationTitleAbbrev TPAMI
PublicationTitleAlternate IEEE Trans Pattern Anal Mach Intell
PublicationYear 2006
Publisher IEEE
IEEE Computer Society
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 863
SubjectTerms Algorithms
Applied sciences
Artificial Intelligence
Biometrics
Biometry - methods
Cluster Analysis
Computation
Computer displays
Computer science; control theory; systems
Computer Simulation
Computer vision
Decoding
Diagnosis, Computer-Assisted - methods
Discriminant analysis
Dynamics
Exact sciences and technology
Footwear
Gait
Gait - physiology
Gait recognition
gait shape
Hidden Markov models
Image Enhancement - methods
Image Interpretation, Computer-Assisted - methods
Information Storage and Retrieval - methods
LDA
Legged locomotion
Linear discriminant analysis
Markov Chains
Mathematical models
Models, Biological
Models, Statistical
Pattern Recognition, Automated - methods
Pattern recognition. Digital image processing. Computational geometry
Performance enhancement
Photography - methods
population HMM
Reproducibility of Results
Sensitivity and Specificity
Shape
Studies
Viterbi algorithm
Walking
Title Improved gait recognition by gait dynamics normalization
URI https://ieeexplore.ieee.org/document/1624352
https://www.ncbi.nlm.nih.gov/pubmed/16724582
https://www.proquest.com/docview/865733935
https://www.proquest.com/docview/28048401
https://www.proquest.com/docview/68016859
https://www.proquest.com/docview/896224713
Volume 28