Transmission: A telepresence interface for neural and kinetic interaction

This source preferred by Zhidong Xiao

Authors: Gingrich, O., Renaud, A., Emets, E. and Xiao, Z.

http://eprints.bournemouth.ac.uk/31796/

https://mitpress.mit.edu/

Journal: Leonardo

Volume: 47

Issue: 4

Pages: 375-385

Publisher: MIT Press

ISSN: 0024-094X

DOI: 10.1162/LEON_a_00843

Transmission is both a telepresence performance and a research project. As a real-time visualization tool, Transmission creates alternate representations of neural activity through sound and vision, investigating the effect of interaction on human consciousness. As a sonification project, it creates an immersive experience for two users: a soundscape created by the human mind and the influence of kinetic interaction. An electroencephalographic (EEG) headset interprets a user's neural activity. An Open Sound Control (OSC) script then translates this data into a real-time particle stream and sound environment at one end. A second user in a remote location modifies this stream in real time through body movement. Together they become a telematic musical interface--communicating through visual and sonic representation of their interactions.
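The abstract names Open Sound Control (OSC) as the transport that carries the EEG data into the particle and sound environment, but gives no implementation details. As a minimal, hypothetical sketch of that step, the following stdlib-only Python encodes one EEG band-power reading as an OSC 1.0 message (null-padded address, type-tag string, big-endian float32 argument); the address `/eeg/alpha` and the single-float payload are illustrative assumptions, not the project's actual message schema.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def encode_osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    msg = osc_pad(address.encode("ascii"))               # address pattern
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))  # type tags
    for f in floats:
        msg += struct.pack(">f", f)                      # big-endian float32
    return msg

# Hypothetical example: one alpha-band power value for the particle stream.
packet = encode_osc_message("/eeg/alpha", 0.75)
```

Such a packet would typically be sent over UDP to the visualization host; a real deployment would more likely use an OSC library (e.g. python-osc) rather than hand-encoding.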

eISSN: 1530-9282

© 2014 Oliver Gingrich, Alain Renaud, Eugenia Emets, Zhidong Xiao.

The data on this page was last updated at 05:31 on November 27, 2020.