Towards a humanoid museum guide robot that interacts with multiple persons
The purpose of our research is to develop a humanoid museum guide robot that performs intuitive, multimodal interaction with multiple persons. In this paper, we present a robotic system that makes use of visual perception, sound source localization, and speech recognition to detect, track, and involve multiple persons in interaction. Depending on the audio-visual input, our robot shifts its attention between different persons. In order to direct the attention of its communication partners towards exhibits, our robot performs gestures with its eyes and arms. As we demonstrate in practical experiments, our robot is able to interact with multiple persons in a multimodal way and to shift its attention between different people. Furthermore, we discuss experience gained during a two-day public demonstration of our robot.
| Published in | 5th IEEE-RAS International Conference on Humanoid Robots, 2005, pp. 418 - 423 |
|---|---|
| Main Authors | , , , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 2005 |
| Subjects | |
| Online Access | Get full text |
| ISBN | 0780393201 9780780393202 |
| ISSN | 2164-0572 |
| DOI | 10.1109/ICHR.2005.1573603 |