An accelerometer-based gestural capture system for performer-based music composition.
Authors: Cobb, J.E.
Editors: Howard, D.M.
Previous studies by research groups in the field of music technology have produced a variety of gestural capture systems that enable various levels of interaction and control in the generation of sound. The majority of approaches developed to date have focused on direct control of synthesis parameters. More recently there has been a move toward a higher level of abstraction in which performers can interact in the music composition process. This is motivated by the desire to realise the full creative expression achievable through the fusion of physical performance and music. Such a goal poses a number of technical challenges, several of which have been explored in the current project.
The development of a small, low-cost, low-power, accelerometer-based gestural capture system is described. Using this device it is shown that significant subject-to-subject variation occurred (30% for a study group of n=36) when different test subjects executed the same movement. To compensate for this measurement uncertainty, the development of a neural-network-based pattern recognition system is described, which reduces the variation to <5%. Refinement of the design of the gestural capture system is then described, enabling it to be implemented as a wireless network of body-worn sensors that capture movement data simultaneously from all four limbs.
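The kind of subject-to-subject variation quantified above can be sketched as follows. This is an illustrative example only, not the thesis implementation: the data are synthetic, and in place of the neural-network recogniser it uses a much simpler per-subject normalisation step to show how a compensation stage collapses the spread between subjects.

```python
# Illustrative sketch (not the thesis system): quantifying subject-to-subject
# variation in a gesture feature, then reducing it with a per-subject
# calibration step. All values below are synthetic placeholders.
import math

def coefficient_of_variation(values):
    """Standard deviation divided by the mean, as a fraction."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return math.sqrt(var) / mean

# Hypothetical peak accelerations (m/s^2) recorded when different
# subjects execute the same arm gesture.
raw_peaks = [9.1, 12.4, 7.8, 11.0, 8.3, 13.2]

# Hypothetical per-subject reference readings from a calibration gesture;
# dividing by these removes most of the between-subject spread.
calibration = [9.0, 12.1, 8.0, 10.8, 8.5, 12.9]
normalised = [p / c for p, c in zip(raw_peaks, calibration)]

print(f"raw variation: {coefficient_of_variation(raw_peaks):.1%}")
print(f"normalised variation: {coefficient_of_variation(normalised):.1%}")
```

In the thesis the compensation stage is a trained pattern recognition network rather than a fixed normalisation, but the evaluation idea is the same: measure the spread of a gesture feature across subjects before and after compensation.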
Theoretical consideration is then given to scaling the system so that multiple performers can cooperate physically in the music composition process. This leads to the concept of a ‘performance orchestra’, in which traditional musicians are replaced by instrumented dancers with direct control of the accompanying sound.