KIMA: The Wheel—Voice Turned into Vision: A Participatory, Immersive Visual Soundscape Installation

Authors: Gingrich, O., Renaud, A., Emets, E., Soraghan, S. and Villanueva-Ablanedo, D.

http://eprints.bournemouth.ac.uk/31795/

https://www.mitpressjournals.org/doi/abs/10.1162/leon_a_01698

Journal: Leonardo

Volume: 53

Issue: 5

Pages: 479-484

Publisher: MIT Press

ISSN: 0024-094X

eISSN: 1530-9282

DOI: 10.1162/leon_a_01698

Over the last five years, KIMA, an art and research project on sound and vision, has investigated visual properties of sound. Previous iterations of KIMA focused on digital representations of cymatics—physical sound patterns—as media for performance. The most recent development incorporated neural networks and machine learning strategies to explore visual expressions of sound in participatory music creation. The project, displayed on a 360-degree canvas at the London Roundhouse, prompted the audience to explore their own voice as an intelligent, real-time visual representation. Machine learning algorithms played a key role in the meaningful interpretation of sound as visual form. The resulting immersive performance turned the audience into co-creators of the piece.
