A Coding System with Independent Annotations of Gesture Forms and Functions During Verbal Communication: Development of a Database of Speech and GEsture (DoSaGE)

Bibliographic Details
Published in: Journal of Nonverbal Behavior, Vol. 39, No. 1, pp. 93–111
Main Authors: Kong, Anthony Pak-Hin; Law, Sam-Po; Kwan, Connie Ching-Yin; Lai, Christy; Lam, Vivian
Format: Journal Article
Language: English
Published: Boston: Springer US (Springer Nature B.V.), 01.03.2015
ISSN: 0191-5886
EISSN: 1573-3653
DOI: 10.1007/s10919-014-0200-6


Abstract: Gestures are commonly used together with spoken language in human communication. One major limitation of gesture investigations in the existing literature lies in the fact that the coding of forms and functions of gestures has not been clearly differentiated. This paper first described a recently developed Database of Speech and GEsture (DoSaGE) based on independent annotation of gesture forms and functions among 119 neurologically unimpaired right-handed native speakers of Cantonese (divided into three age and two education levels), and presented findings of an investigation examining how gesture use was related to age and linguistic performance. Consideration of these two factors, for which normative data are currently very limited or lacking in the literature, is relevant and necessary when one evaluates gesture employment among individuals with and without language impairment. Three speech tasks, including monologue of a personally important event, sequential description, and story-telling, were used for elicitation. The EUDICO Linguistic ANnotator (ELAN) software was used to independently annotate each participant’s linguistic information of the transcript, forms of gestures used, and the function for each gesture. About one-third of the subjects did not use any co-verbal gestures. While the majority of gestures were non-content-carrying, which functioned mainly for reinforcing speech intonation or controlling speech flow, the content-carrying ones were used to enhance speech content. Furthermore, individuals who are younger or linguistically more proficient tended to use fewer gestures, suggesting that normal speakers gesture differently as a function of age and linguistic performance.
AuthorAffiliation: 1. Department of Communication Sciences and Disorders, University of Central Florida, Orlando, FL, USA; 2. Division of Speech and Hearing Sciences, The University of Hong Kong, Hong Kong SAR, China
Author Email: Kong, Anthony Pak-Hin (antkong@ucf.edu)
CODEN JNVBDV
Copyright: Springer Science+Business Media New York 2014, 2015
Discipline: Psychology; Education
GrantInformation: NIDCD NIH HHS, grant R01 DC010398
Open Access: Yes
Peer Reviewed: Yes
Keywords: Gesture form; Gesture function; Database; Nonverbal communication; Cantonese
OpenAccessLink https://www.ncbi.nlm.nih.gov/pmc/articles/4319117
PMID 25667563
PageCount 19
PublicationDate: 2015-03-01
PublicationPlace: Boston
PublicationTitle: Journal of Nonverbal Behavior
PublicationTitleAbbrev: J Nonverbal Behav
PublicationYear: 2015
Publisher: Springer US; Springer Nature B.V.
References ButterworthBHadarUGesture, speech, and computational stages: A reply to McNeillPsychological Review1989961168174246731910.1037/0033-295X.96.1.168
BeattieGShoveltonHMapping the range of information contained in the iconic hand gestures that accompany spontaneous speechJournal of Language and Social Psychology199918443846210.1177/0261927X99018004005
FieldAPDiscovering statistics using SPSS: (and sex and drugs and rock ‘n’ roll)20093LondonSAGE
CohenRLBorsoiDThe role of gestures in description-communication: A cross-sectional study of agingJournal of Nonverbal Behavior1996201456310.1007/BF02248714
Hwang, B. W., Kim, S. M., & Lee, S. W. (2006). A full-body gesture database for automatic gesture recognition. In Proceeding of the 7th international conference on automatic face and gesture recognition, Vol. 7, pp. 243–248.
WilsonFRThe hand: How its use shapes the brain, language, and human culture1998New YorkPantheon Books
OsborneJWImproving your data transformations: Applying the Box-Cox transformationPractical Assessment, Research & Evaluation2010151219
Le MayADavidRThomasAPThe use of spontaneous gesture by aphasic patientsAphasiology19882213714510.1080/02687038808248899
McNeillDHand and mind: What gestures reveal about thought1992ChicagoUniversity of Chicago Press
Max Planck Institute. (2001, February 27). Gesture database (GDB). Retrieved from http://www.mpi.nl/ISLE/overview/Overview_GDB.html/.
MontepareJKoffEZaitchikDAlbertMThe use of body movements and gestures as cues to emotions in younger and older adultsJournal of Nonverbal Behavior19992313315210.1023/A:1021435526134
Triesch, J., & von der Malsburg, C. (1996). Robust classification of hand postures against complex backgrounds. In Proceedings of the second international conference on automatic face and gesture recognition, Vol. 2, pp. 170–175.
Just, A., Rodriguez, Y., & Marcel, S. (1996). Hand posture classification and recognition using the modified census transform. In Proceeding of the IEEE international conference on automatic face and gesture recognition, Vol. 2, pp. 351–356.
Bressem, J. (2008). Notating gestures: Proposal for a form based notation system of coverbal gestures. Unpublished manuscript.
JacobsNGarnhamAThe role of conversational hand gestures in a narrative taskJournal of Memory and Language200756229130310.1016/j.jml.2006.07.011
Hayamizu, S., Nagaya, S., Watanuki, K., Nakazawa, M., Nobe, S., & Yoshimura, T. (1999). A multimodal database of gestures and speech. In Proceeding of the sixth European conference on speech communication and technology, Vol. 6, pp, 2247–2250.
KraussRMHadarUCampbellRMessingLThe role of speech-related arm/hand gestures in word retrievalGesture, speech, and sign1999OxfordOxford University Press9311610.1093/acprof:oso/9780198524519.003.0006
MacWhinneyBFrommDForbesMHollandAAphasiaBank: Methods for studying discourseAphasiology2011251286130734246152292387910.1080/02687038.2011.589893
MatherSMMetzgerMFleetwoodEEthnographic research on the use of visually based regulators for teachers and interpretersAttitudes, innuendo, and regulators2005Washington, DCGallaudet University Press136161
GoochCMSternYRakitinBCEvidence for age-related changes to temporal attention and memory from the choice time production taskAging, Neuropsychology & Cognition200916328531010.1080/13825580802592771
DadgostarFBarczakALCSarrafzadehAA color hand gesture database for evaluating and improving algorithms on hand gesture and posture recognitionResearch Letters in the Information and Mathematical Sciences20057127134
Lausberg, H., & Sloetjes, H. (2008). Gesture coding with the NGCS–ELAN system. In. A. J. Spink, M. R. Ballintijn, N. D. Bogers, F. Grieco, L. W. S. Loijens, L. P. J. J. Noldus, G. Smit, & P. H. Zimmerman (Eds.), Proceedings of measuring behavior 2008 (pp. 176–177). Netherlands: Noldus Information Technology.
MacWhinney, B. (2003). Child language analyses (CLAN) (version 23 September 2003) [Computer software]. Pittsburgh, PA: Author.
CrowderEMGestures at work in sense-making science talkThe Journal of the Learning Sciences19965317320810.1207/s15327809jls0503_2
HerrmannMReichleTLucius-HoeneGWalleschCWJohannsen-HorbachHNonverbal communication as a compensation strategy for severely nonfluent aphasic? A quantitative approachBrain and Language198813415410.1016/0093-934X(88)90053-3
LausbergHSloetjesHCoding gestural behavior with the NEUROGES-ELAN systemBehavior Research Methods20094138418491958720010.3758/BRM.41.3.841
QuekFMcNeillDBryllRDuncanSMaXFKirbasCMcCulloughKEAnsariRMultimodal human discourse: Gesture and speechACM Transactions on Computer-Human Interaction20029317119310.1145/568513.568514
BeattieGShoveltonHIconic hand gestures and the predictability of words in context in spontaneous speechBritish Journal of Psychology20009144731110417410.1348/000712600161943
Goldin-MeadowSThe role of gesture in communication and thinkingTrends in Cognitive Sciences19993114194291052979710.1016/S1364-6613(99)01397-2
SmithLNonverbal competency in aphasic stroke patients’ conversationAphasiology19871212713910.1080/02687038708248824
XuJGannonPJEmmoreyKJasonFSBraunARSymbolic gestures and spoken language are processed by a common neural systemProceedings of the National Academy of Sciences of the United States of America200910649206642066927792031992343610.1073/pnas.0909197106
Brugman, H., & Russel, A. (2004). Annotating multi-media/multi-modal resources with ELAN. LREC.
HostetterABAlibaliMWRaise your hand if you’re spatial: Relations between verbal and spatial skills and gesture productionGesture20077739510.1075/gest.7.1.05hos
BoxGEPCoxDRAn analysis of transformationJournal of Royal Statistical Society (Series B)196426211246
RauscherFHKraussRMChenYGesture, speech, and lexical access: The role of lexical movements in speech productionPsychological Science19967422623110.1111/j.1467-9280.1996.tb00364.x
OldSRNaveh-BenjaminMDifferential effects of age on item and associative measures of memory: A meta-analysisPsychology and Aging20082311041181836166010.1037/0882-7974.23.1.104
EkmanPFriesenWVThe repertoire of nonverbal behavior: Categories, origins, usage, and codingSemiotica1969114998
KongAPHLawSPLeeASYAn investigation of use of non-verbal behaviors among individuals with aphasia in Hong Kong: Preliminary dataProcedia Social and Behavioral Sciences20106575810.1016/j.sbspro.2010.08.029
Marcel, S., Bernier, O., Viallet, J-E., & Collobert, D. (2000). Hand gesture recognition using input/ouput hidden markov models. In Proceedings of the 4th international conference on automatic face and gesture recognition, Vol. 4, pp. 398–402.
CollettaJMPellenqCGuidettiMAge-related changes in co-speech gesture and narrative: Evidence from French children and adultsSpeech Communication201052656557610.1016/j.specom.2010.02.009
MontepareJMTuckerJSAging and nonverbal behavior: Current perspectives and future directionsJournal of Nonverbal Behavior19992310510910.1023/A:1021431425225
WuYCCoulsonSHow iconic gestures enhance communication: An ERP studyBrain and Language20071012342451722289710.1016/j.bandl.2006.12.003
Anastasiou, D. (2012). A speech and gesture spatial corpus in assisted living. In Proceedings of the eight international conference on language resources and evaluation (LREC’12), Vol. 8, pp. 2351–2354.
Goldin-Meadow, S. (2003). Hearing gesture: How our hands help us think. Cambridge, MA: Belknap Press of Harvard University Press.
Krauss, R. M., Chen, Y., & Gottesman, R. F. (2000). Lexical gestures and lexical access: A process model. In D. McNeill (Ed.), Language and gesture (pp. 261–283). Cambridge: Cambridge University Press. doi:10.1017/CBO9780511620850.017
Alibali, M. W., & DiRusso, A. A. (1999). The function of gesture in learning to count: More than keeping track. Cognitive Development, 14(1), 37–56. doi:10.1016/S0885-2014(99)80017-3
Lücking, A., Bergmann, K., Hahn, F., Kopp, S., & Rieser, H. (2010). The Bielefeld speech and gesture alignment corpus (SaGA). In M. Kipp, J. P. Martin, P. Paggio, & D. Heylen, (Eds). LREC 2010 workshop: Multimodal corpora–advances in capturing, coding and analyzing multimodality (pp. 92–98). Republic of Malta.
Feyereisen, P., & Havard, I. (1999). Mental imagery and production of hand gestures while speaking in younger and older adults. Journal of Nonverbal Behavior, 23(2), 153–171. doi:10.1023/A:1021487510204
German, D. J. (2001). It's on the tip of my tongue: Word-finding strategies to remember names and words you often forget. Chicago, IL: Word Finding Materials Inc.
Ska, B., & Nespoulous, J. (1987). Pantomimes and aging. Journal of Clinical and Experimental Neuropsychology, 9, 754–766. doi:10.1080/01688638708405214
Kong, A. P. H., Law, S. P., Kwan, C., Lai, C., Lam, V., & Lee, A. (2012, November). A comprehensive framework to analyze co-verbal gestures during discourse production. Poster presented at the 2012 American Speech-Language-Hearing Association (ASHA) Convention, Atlanta, GA, USA.
Lanyon, L., & Rose, M. L. (2009). Do the hands have it? The facilitation effects of arm and hand gesture on word retrieval in aphasia. Aphasiology, 23(7–8), 809–822. doi:10.1080/02687030802642044
Alibali, M. W., Kita, S., & Young, A. J. (2000). Gesture and the process of speech production: We think, therefore we gesture. Language & Cognitive Processes, 15(6), 593–613. doi:10.1080/016909600750040571
Mayberry, R. I., & Jaques, J. (2000). Gesture production during stuttered speech: Insights into the nature of gesture-speech integration. In D. McNeill (Ed.), Language and gesture (pp. 199–214). New York: Cambridge University Press. doi:10.1017/CBO9780511620850.013
Max Planck Institute for Psycholinguistics. (2002). ELAN [Computer software]. Retrieved from http://www.lat-mpi.eu/tools/elan/.
Ahlsén, E. (2011). Towards an integrated view of gestures related to speech. In Proceedings of the 3rd Nordic symposium on multimodal communication, Vol. 15, pp. 72–77.