KIMA: The Wheel—Voice Turned into Vision: A Participatory, Immersive Visual Soundscape Installation
Authors: Gingrich, O., Emets, E., Renaud, A., Soraghan, S. and Ablanedo, D.V.
Journal: Leonardo
Volume: 53
Issue: 5
Pages: 479-484
ISSN: 0024-094X
DOI: 10.1162/leon_a_01698
Abstract: Over the last five years, KIMA, an art and research project on sound and vision, has investigated visual properties of sound. Previous iterations of KIMA focused on digital representations of cymatics—physical sound patterns—as media for performance. The most recent development incorporated neural networks and machine learning strategies to explore visual expressions of sound in participatory music creation. The project, displayed on a 360-degree canvas at the London Roundhouse, prompted the audience to explore their own voice as intelligent, real-time visual representation. Machine learning algorithms played a key role in meaningful interpretation of sound as visual form. The resulting immersive performance turned the audience into cocreators of the piece.
Source: Scopus
KIMA: The Wheel—Voice Turned into Vision: A Participatory, Immersive Visual Soundscape Installation
Authors: Gingrich, O., Emets, E., Renaud, A., Soraghan, S. and Ablanedo, D.V.
Journal: LEONARDO
Volume: 53
Issue: 5
Pages: 479-484
eISSN: 1530-9282
ISSN: 0024-094X
DOI: 10.1162/leon_a_01698
Source: Web of Science (Lite)
KIMA: The Wheel—Voice Turned into Vision: A participatory, immersive visual soundscape installation
Authors: Gingrich, O., Renaud, A., Emets, E., Soraghan, S. and Villanueva-Ablanedo, D.
Journal: Leonardo
Pages: 1-13
Publisher: MIT Press
ISSN: 0024-094X
DOI: 10.1162/leon_a_01698
Abstract: Over the last five years, KIMA, an art and research project on sound and vision, has investigated visual properties of sound. Previous iterations of KIMA focused on digital representations of cymatics—physical sound patterns—as a medium for performance. The current development incorporates neural networks and machine learning strategies to explore visual expressions of sound in participatory music creation. The project, displayed on a 360-degree canvas at the London Roundhouse, prompted the audience to explore their own voice as intelligent, real-time visual representation. Machine learning algorithms played a key role in meaningful interpretation of sound as visual form. The resulting immersive performance turned the audience into co-creators of the piece.
https://www.mitpressjournals.org/doi/abs/10.1162/leon_a_01698
Source: Manual