Visual interpretation of natural pointing gestures in 3D space for human-robot interaction

Bibliographic Details
Published in: 2010 11th International Conference on Control Automation Robotics and Vision, pp. 2513-2518
Main Authors: Zhi Li; Jarvis, R
Format: Conference Proceeding
Language: English, Japanese
Published: IEEE, 01.12.2010
ISBN: 1424478146; 9781424478149
DOI: 10.1109/ICARCV.2010.5707377

Abstract: Visual interpretation of natural pointing gestures is essential in a human-robot interaction scenario. Both the hands and the head are involved in pointing behaviors. Given the color images acquired by a web camera and the depth data acquired by a TOF range camera, we perform visual tracking of the head and hands in 3D space. We investigate both the Head-Finger Line (HFL) and the forearm orientation as estimates of the pointing direction. The HFL is determined by the 3D positions of the face and the fingertip. The forearm direction is calculated using the PCA method within a RANSAC framework. Their performances are evaluated and compared in the experiments. Face direction and eye gaze orientation provide important cues about where the person's attention is directed during a pointing operation, which proves helpful for eliminating some false estimations in our experiments.
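The abstract names two geometric estimators of the pointing direction: the Head-Finger Line (HFL) through the 3D positions of the face and fingertip, and a forearm axis computed with PCA inside a RANSAC framework. The sketch below is only an illustration of those two ideas under assumed conventions (NumPy arrays of 3D points in metres); it is not the authors' implementation, and the function names, inlier threshold, and iteration count are assumptions.

```python
# Illustrative sketch of the two pointing-direction estimates described in the
# abstract; not the paper's code. All names and parameter values are assumed.
import numpy as np


def head_finger_line(face_xyz, fingertip_xyz):
    """Unit direction of the line from the 3D face position to the fingertip (HFL)."""
    d = np.asarray(fingertip_xyz, dtype=float) - np.asarray(face_xyz, dtype=float)
    return d / np.linalg.norm(d)


def pca_direction(points):
    """Dominant axis (first principal component) of an (N, 3) point set."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]  # right-singular vector with the largest singular value


def forearm_direction_ransac(forearm_points, n_iters=100, inlier_dist=0.02, seed=None):
    """Forearm axis: RANSAC line hypotheses, then a PCA refit on the best inlier set.

    forearm_points: (N, 3) array of 3D forearm points (e.g. segmented from the
    TOF depth data); inlier_dist is a point-to-line threshold in metres (assumed).
    """
    rng = np.random.default_rng(seed)
    pts = np.asarray(forearm_points, dtype=float)
    best_inliers = None
    for _ in range(n_iters):
        # Hypothesise a line through two randomly chosen points.
        i, j = rng.choice(len(pts), size=2, replace=False)
        axis = pts[j] - pts[i]
        norm = np.linalg.norm(axis)
        if norm == 0.0:  # degenerate sample (duplicate points); draw again
            continue
        axis /= norm
        # Distance of every point to the hypothesised line.
        diffs = pts - pts[i]
        dists = np.linalg.norm(diffs - np.outer(diffs @ axis, axis), axis=1)
        inliers = pts[dists < inlier_dist]
        if best_inliers is None or len(inliers) > len(best_inliers):
            best_inliers = inliers
    # Refine: the principal axis of the consensus set is the forearm direction.
    return pca_direction(best_inliers)
```

With a tracked 3D face and fingertip position, head_finger_line gives the HFL estimate; with a segmented forearm point cloud from the depth camera, forearm_direction_ransac gives the forearm estimate, and the two can then be compared as in the paper's experiments.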
Authors:
1. Zhi Li (Zhi.Li@eng.monash.edu.au), Intelligent Robotics Research Centre, Monash University, Melbourne, VIC, Australia
2. Jarvis, R (Ray.Jarvis@eng.monash.edu.au), Intelligent Robotics Research Centre, Monash University, Melbourne, VIC, Australia
EISBN: 1424478154; 9781424478156; 9781424478132; 1424478138
External Document ID: 5707377
Genre: orig-research
Page count: 6
Publication title abbreviation: ICARCV
Subjects:
3D head and hands tracking
Cameras
Estimation
Eye gaze direction
Face
Face direction
Forearm orientation
Head-Finger Line
Image color analysis
Pointing Gestures
Principal component analysis
Three dimensional displays
Online access: https://ieeexplore.ieee.org/document/5707377