Animatronic Shader Lamps Avatars

Bibliographic Details
Published in: 2009 8th IEEE International Symposium on Mixed and Augmented Reality, pp. 27-33
Main Authors: Lincoln, Peter; Welch, Greg; Nashel, Andrew; Ilie, Adrian; State, Andrei; Fuchs, Henry
Format: Conference Proceeding
Language: English
Published: Washington, DC, USA: IEEE Computer Society, 19.10.2009
Series: ACM Other Conferences
ISBN: 9781424453900, 1424453909
DOI: 10.1109/ISMAR.2009.5336503


More Information
Summary: Applications such as telepresence and training involve the display of real or synthetic humans to multiple viewers. When attempting to render the humans with conventional displays, non-verbal cues such as head pose, gaze direction, body posture, and facial expression are difficult to convey correctly to all viewers. In addition, a framed image of a human conveys only a limited physical sense of presence, primarily through the display's location. While progress continues on articulated robots that mimic humans, the focus has been on the motion and behavior of the robots. We introduce a new approach for robotic avatars of real people: the use of cameras and projectors to capture and map the dynamic motion and appearance of a real person onto a humanoid animatronic model. We call these devices animatronic Shader Lamps Avatars (SLA). We present a proof-of-concept prototype comprising a camera, a tracking system, a digital projector, and a life-sized Styrofoam head mounted on a pan-tilt unit. The system captures imagery of a moving, talking user and maps the appearance and motion onto the animatronic SLA, delivering a dynamic, real-time representation of the user to multiple viewers.