Auditory augmented reality: Object sonification for the visually impaired

Bibliographic Details
Published in: 2012 IEEE 14th International Workshop on Multimedia Signal Processing, pp. 319-324
Main Authors: Ribeiro, Flavio; Florencio, Dinei; Chou, Philip A.; Zhang, Zhengyou
Format: Conference Proceeding
Language: English
Published: IEEE, 01.09.2012
ISBN: 9781467345705, 1467345709
DOI: 10.1109/MMSP.2012.6343462

Summary: Augmented reality applications have focused on visually integrating virtual objects into real environments. In this paper, we propose an auditory augmented reality, where we integrate acoustic virtual objects into the real world. We sonify objects that do not intrinsically produce sound, with the purpose of revealing additional information about them. Using spatialized (3D) audio synthesis, acoustic virtual objects are placed at specific real-world coordinates, obviating the need to explicitly tell the user where they are. Thus, by leveraging the innate human capacity for 3D sound source localization and source separation, we create an audio natural user interface. In contrast with previous work, we do not create acoustic scenes by transducing low-level (for instance, pixel-based) visual information. Instead, we use computer vision methods to identify high-level features of interest in an RGB-D stream, which are then sonified as virtual objects at their respective real-world coordinates. Since our visual and auditory senses are inherently spatial, this technique naturally maps between these two modalities, creating intuitive representations. We evaluate this concept with a head-mounted device, featuring modes that sonify flat surfaces, navigable paths, and human faces.
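
The pipeline the abstract describes (detect a high-level feature in the RGB-D stream, then render a sound source at its real-world coordinates) can be illustrated with a minimal sketch. The Python fragment below is not from the paper: it approximates spatialization with simple interaural time and level differences rather than the full 3D audio synthesis the authors use, it ignores elevation, and all function and constant names are hypothetical.

    import numpy as np

    SPEED_OF_SOUND = 343.0   # m/s
    HEAD_RADIUS = 0.0875     # m, rough average human head radius

    def spatialize(mono, sample_rate, source_pos, head_pos, head_yaw):
        """Render a mono signal as stereo so it is perceived at
        source_pos (world coordinates, meters), given the listener's
        head position and yaw (radians). Uses interaural time and
        level differences only; a real system would use HRTF filters.
        Elevation is ignored for brevity."""
        # Vector from the head to the virtual source, rotated into
        # head coordinates (x: forward, y: left).
        d = np.asarray(source_pos, dtype=float) - np.asarray(head_pos, dtype=float)
        c, s = np.cos(-head_yaw), np.sin(-head_yaw)
        x, y = c * d[0] - s * d[1], s * d[0] + c * d[1]
        azimuth = np.arctan2(y, x)      # > 0 means source is to the left
        distance = max(float(np.hypot(x, y)), 0.1)

        # Interaural time difference (Woodworth's approximation).
        itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth + np.sin(azimuth))
        shift = int(round(abs(itd) * sample_rate))

        # Distance attenuation plus a crude interaural level difference.
        near = 1.0 / distance
        far = near * (1.0 - 0.3 * abs(np.sin(azimuth)))

        direct = np.concatenate([mono, np.zeros(shift)]) * near
        delayed = np.concatenate([np.zeros(shift), mono]) * far
        if azimuth >= 0:                # source on the left: delay right ear
            left, right = direct, delayed
        else:                           # source on the right: delay left ear
            left, right = delayed, direct
        return np.stack([left, right], axis=1)

    # Example: a 1 kHz beep rendered 2 m ahead and 1 m to the left,
    # standing in for an earcon attached to a detected object.
    sr = 44100
    t = np.arange(sr // 2) / sr
    beep = 0.5 * np.sin(2 * np.pi * 1000 * t)
    stereo = spatialize(beep, sr, source_pos=[2.0, 1.0, 0.0],
                        head_pos=[0.0, 0.0, 0.0], head_yaw=0.0)

In the modes the paper evaluates, the mono signal would correspond to a detected flat surface, navigable path, or human face, re-rendered continuously as the head-mounted device's pose changes.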