Visual interpretation of natural pointing gestures in 3D space for human-robot interaction
| Published in | 2010 11th International Conference on Control, Automation, Robotics and Vision, pp. 2513-2518 |
|---|---|
| Main Authors | |
| Format | Conference Proceeding |
| Language | English; Japanese |
| Published | IEEE, 01.12.2010 |
| ISBN | 1424478146; 9781424478149 |
| DOI | 10.1109/ICARCV.2010.5707377 |
| Summary: | Visual interpretation of natural pointing gestures is essential in human-robot interaction scenarios. Both the hands and the head are involved in pointing behaviors. Given color images acquired by a web camera and depth data acquired by a time-of-flight (TOF) range camera, we perform visual tracking of the head and hands in 3D space. We investigate both the Head-Finger Line (HFL) and the forearm orientation as estimates of the pointing direction. The HFL is determined by the 3D positions of the face and the fingertip; the forearm direction is calculated using PCA within a RANSAC framework. Their performances are evaluated and compared in experiments. Face direction and eye gaze orientation provide important cues about where the person's attention lies during a pointing operation, which proved helpful for eliminating some false estimations in our experiments. |
|---|---|
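To make the Head-Finger Line estimate concrete, here is a minimal sketch: the pointing direction is taken as the ray from the tracked 3D face position through the 3D fingertip position. The `intersect_ground` helper, its ground-plane assumption, and all function names are illustrative additions, not the paper's implementation.

```python
import numpy as np

def hfl_direction(face_pos, fingertip_pos):
    """Head-Finger Line: unit vector from the 3D face position
    through the 3D fingertip position."""
    d = np.asarray(fingertip_pos, float) - np.asarray(face_pos, float)
    norm = np.linalg.norm(d)
    if norm < 1e-6:
        raise ValueError("face and fingertip positions coincide")
    return d / norm

def intersect_ground(face_pos, direction, ground_z=0.0):
    """Hypothetical helper: intersect the pointing ray with a
    horizontal plane z = ground_z to locate the pointed-at spot."""
    face_pos = np.asarray(face_pos, float)
    if abs(direction[2]) < 1e-6:
        return None  # ray runs parallel to the ground plane
    t = (ground_z - face_pos[2]) / direction[2]
    return face_pos + t * direction if t > 0 else None

# Example: head at 1.6 m, fingertip slightly lower and forward.
d = hfl_direction([0.0, 0.0, 1.6], [0.3, 0.1, 1.4])
print(intersect_ground([0.0, 0.0, 1.6], d))
```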
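The forearm estimate described in the summary (PCA within a RANSAC framework) could be sketched as below: repeatedly sample a small subset of the forearm depth points, fit its principal axis by PCA, count points close to that line, and refit on the best inlier set. The iteration count, inlier threshold, and sample size are assumed values for illustration, not the settings reported in the paper.

```python
import numpy as np

def principal_axis(points):
    """First principal component of a 3D point set (PCA).
    Note the sign of the axis is ambiguous; in practice it would be
    disambiguated, e.g. to point from elbow toward wrist."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]  # unit vector of greatest variance

def forearm_direction_ransac(points, iters=200, inlier_dist=0.02,
                             min_sample=5, seed=None):
    """RANSAC loop: fit candidate axes by PCA on random subsets,
    score each by the number of points within inlier_dist of the
    candidate line, then refit PCA on the best inlier set."""
    rng = np.random.default_rng(seed)
    points = np.asarray(points, float)
    best_inliers = None
    for _ in range(iters):
        sample = points[rng.choice(len(points), size=min_sample,
                                   replace=False)]
        axis = principal_axis(sample)
        centroid = sample.mean(axis=0)
        # Perpendicular distance of every point to the candidate line.
        diffs = points - centroid
        perp = diffs - np.outer(diffs @ axis, axis)
        inliers = np.linalg.norm(perp, axis=1) < inlier_dist
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return principal_axis(points[best_inliers])
```

The RANSAC wrapper is what makes PCA robust here: depth points leaking in from the hand, torso, or background would otherwise skew the principal axis, whereas the inlier-counting step fits the line only to the dominant elongated structure.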