KIMA: The Wheel – Voice Turned into Vision: A Participatory, Immersive Visual Soundscape Installation

Bibliographic Details
Published in: Leonardo (Oxford), Vol. 53, No. 5, pp. 479–484
Main Authors: Gingrich, Oliver; Emets, Evgenia; Renaud, Alain; Soraghan, Sean; Ablanedo, Dario Villanueva
Format: Journal Article
Language: English
Published: One Rogers Street, Cambridge, MA 02142-1209, USA: The MIT Press, 01.10.2020
ISSN: 0024-094X, 1530-9282
DOI: 10.1162/leon_a_01698

Summary: Over the last five years, KIMA, an art and research project on sound and vision, has investigated the visual properties of sound. Previous iterations of KIMA focused on digital representations of cymatics—physical sound patterns—as media for performance. The most recent development incorporated neural networks and machine learning strategies to explore visual expressions of sound in participatory music creation. The project, displayed on a 360-degree canvas at the London Roundhouse, prompted the audience to explore their own voice as an intelligent, real-time visual representation. Machine learning algorithms played a key role in the meaningful interpretation of sound as visual form. The resulting immersive performance turned the audience into cocreators of the piece.
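
The abstract does not detail the sound-to-image pipeline. The sketch below is only a minimal, hypothetical illustration (not the authors' method) of how a live voice frame could be reduced to visual parameters in real time; the function voice_to_visual, the RMS-to-brightness gain, and the centroid-to-hue mapping are assumptions introduced here for clarity.

```python
import numpy as np

def voice_to_visual(frame: np.ndarray, sample_rate: int = 44100) -> dict:
    """Map one mono audio frame to simple visual parameters.

    Hypothetical example: loudness (RMS) drives brightness,
    spectral centroid drives hue. Not the KIMA implementation.
    """
    # Root-mean-square energy of the frame -> brightness in [0, 1]
    rms = float(np.sqrt(np.mean(frame ** 2)))
    brightness = min(1.0, rms * 10.0)  # arbitrary gain, for illustration only

    # Magnitude spectrum and its centroid (rough measure of timbral brightness)
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))

    # Map the centroid (0 .. Nyquist) onto a hue angle in degrees
    hue = 360.0 * centroid / (sample_rate / 2.0)

    return {"brightness": brightness, "hue": hue}

if __name__ == "__main__":
    # 1024-sample test frame: a 440 Hz sine as a stand-in for live voice input
    t = np.arange(1024) / 44100.0
    frame = 0.3 * np.sin(2 * np.pi * 440.0 * t)
    print(voice_to_visual(frame))
```

In a participatory setting such as the one described, parameters like these would typically be streamed to the rendering layer every frame, so that each voice is reflected immediately in the projected visuals.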
Bibliography: October 2020