Visual interpretation of natural pointing gestures in 3D space for human-robot interaction

Bibliographic Details
Published in: 2010 11th International Conference on Control Automation Robotics and Vision (ICARCV), pp. 2513-2518
Main Authors: Zhi Li, R. Jarvis
Format: Conference Proceeding
Language: English, Japanese
Published: IEEE, 01.12.2010
ISBN: 1424478146, 9781424478149
DOI: 10.1109/ICARCV.2010.5707377

More Information
Summary: Visual interpretation of natural pointing gestures is essential in human-robot interaction scenarios. Both the hands and the head are involved in pointing behaviors. Given color images acquired by a web camera and depth data from a time-of-flight (TOF) range camera, we perform visual tracking of the head and hands in 3D space. We investigate both the Head-Finger Line (HFL) and the forearm orientation as estimates of the pointing direction. The HFL is determined by the 3D positions of the face and fingertip; the forearm direction is calculated using principal component analysis (PCA) within a RANSAC framework. Their performance is evaluated and compared in our experiments. Face direction and eye-gaze orientation provide important cues about where the person's attention lies during a pointing operation, which proves helpful for eliminating some false estimations.
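The two pointing-direction estimators named in the summary are simple enough to sketch. Below is a minimal illustration in Python/NumPy, assuming the tracker already yields a 3D face position, a 3D fingertip position, and a set of 3D forearm points extracted from the TOF depth data; the function names, iteration count, and inlier threshold are hypothetical and not taken from the paper.

```python
import numpy as np

def head_finger_line(face_pos, fingertip_pos):
    """Head-Finger Line (HFL): a pointing ray with its origin at the face
    and its direction toward the fingertip, given both 3D positions."""
    origin = np.asarray(face_pos, dtype=float)
    direction = np.asarray(fingertip_pos, dtype=float) - origin
    return origin, direction / np.linalg.norm(direction)

def pca_direction(points):
    """Principal axis of a 3D point set: the eigenvector of the covariance
    matrix with the largest eigenvalue."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    _, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    return eigvecs[:, -1]                     # last column = dominant axis

def ransac_forearm_direction(points, n_iters=200, inlier_thresh=0.02, seed=None):
    """Forearm orientation via RANSAC line fitting over forearm depth points,
    refined by a PCA refit on the best inlier set. `inlier_thresh` is a
    point-to-line distance in metres (an assumed, tunable value)."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iters):
        # Sample two distinct points to hypothesize a line.
        p, q = points[rng.choice(len(points), size=2, replace=False)]
        d = q - p
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue
        d /= norm
        # Distance of every point to the candidate line through p along d.
        diff = points - p
        dist = np.linalg.norm(diff - np.outer(diff @ d, d), axis=1)
        inliers = dist < inlier_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return pca_direction(points[best_inliers])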