concept

Audiobody is an early working prototype of a concept I have been pondering for a while.

Interested in gesture control and frustrated with the current standards for electronic music performance, I sought to create a system that would give electronic musicians a more intimate and performative experience during live sets.

Currently, during live shows, many electronic musicians are merely inputting controls on their laptops, creating a disconnected experience for their audience. Watching musicians hide behind the glow of their screens is neither immersive nor exciting. I wanted to take steps to change that.

 

Video - live demonstration


how it works

Audiobody allows the user to simultaneously control both the sound and the stage lighting with the nuances of their hand and finger movements. Because gesture-control technology has not yet reached a high degree of precision, I aimed to create a balance between automation and control to compensate for this. Users can input a pre-selection of notes to play from, enabling them to avoid hitting "incorrect" notes during their gestural motions.
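
For readers curious about the underlying logic, here is a minimal Python sketch of the note pre-selection idea (the actual system was patched in MAX/MSP; the note list and function name here are hypothetical):

```python
# A continuous hand position is snapped onto a fixed set of allowed
# pitches, so a sweeping gesture can never land on a "wrong" note.

ALLOWED_NOTES = [60, 62, 64, 67, 69, 72]  # hypothetical pre-selection (MIDI note numbers)

def note_for_position(x_normalized: float) -> int:
    """Map a hand X position in [0.0, 1.0] to one of the pre-selected notes."""
    x = min(max(x_normalized, 0.0), 1.0)                              # clamp to the tracked range
    index = min(int(x * len(ALLOWED_NOTES)), len(ALLOWED_NOTES) - 1)  # divide the range into zones
    return ALLOWED_NOTES[index]

if __name__ == "__main__":
    for x in (0.0, 0.3, 0.55, 0.99):
        print(f"x={x:.2f} -> note {note_for_position(x)}")
```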



The motion sensor tracks individual finger and hand positions. The progression of notes is controlled along the X-axis, while effects (such as cutoff) are controlled by the Y- and Z-axes. These effects can also be pre-set to control different modules, giving musicians further flexibility. Chords can be played by bringing multiple hands into view.
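
A rough Python sketch of this mapping, assuming each hand arrives as normalized (x, y, z) coordinates and that effect values go to a 0–127 controller range (the real routing lives in the MAX/MSP patch):

```python
# Per-frame gesture-to-sound mapping: X selects a note from the
# pre-selection, Y and Z become effect amounts, and more than one
# tracked hand yields a chord.

from typing import List, Tuple

ALLOWED_NOTES = [60, 62, 64, 67, 69, 72]  # hypothetical pre-selection (MIDI note numbers)

def map_hands(hands: List[Tuple[float, float, float]]):
    """Return (notes, cutoff, aux_effect) for the current frame."""
    notes = []
    cutoff = 0
    aux_effect = 0
    for x, y, z in hands:
        index = min(int(x * len(ALLOWED_NOTES)), len(ALLOWED_NOTES) - 1)
        notes.append(ALLOWED_NOTES[index])   # X axis -> which note in the progression
        cutoff = int(y * 127)                # Y axis -> e.g. filter cutoff
        aux_effect = int(z * 127)            # Z axis -> whichever module it is pre-set to
    return notes, cutoff, aux_effect

if __name__ == "__main__":
    print(map_hands([(0.2, 0.8, 0.4)]))                   # one hand: a single note
    print(map_hands([(0.1, 0.5, 0.5), (0.7, 0.5, 0.5)]))  # two hands: a two-note chord
```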

The lighting system is controlled by the same X, Y, and Z values. Movement along the X-axis controls the red value, movement along the Y-axis controls the blue value, and movement along the Z-axis controls the green value. A strobe-like effect can be achieved when two hands are present.
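
And a similar sketch for the lighting mapping, again with hypothetical names and timing rather than the actual MAX/MSP-to-Arduino routing:

```python
# The first hand's X, Y, and Z positions (normalized to [0.0, 1.0]) set the
# red, blue, and green channels; a second hand switches on a strobe-like
# flicker. color_for_frame() and its blink rate are illustrative only.

import time

def color_for_frame(hands, now=None):
    """Return an (r, g, b) tuple, each 0-255, for the current frame."""
    if not hands:
        return (0, 0, 0)                      # no hands tracked: lights off
    x, y, z = hands[0]
    r = int(x * 255)                          # X axis -> red
    b = int(y * 255)                          # Y axis -> blue
    g = int(z * 255)                          # Z axis -> green
    if len(hands) >= 2:                       # two hands: strobe-like effect
        now = time.time() if now is None else now
        if int(now * 10) % 2:                 # blink roughly five times a second
            return (0, 0, 0)
    return (r, g, b)

if __name__ == "__main__":
    print(color_for_frame([(1.0, 0.0, 0.5)]))                            # mostly red, some green
    print(color_for_frame([(1.0, 0.0, 0.5), (0.3, 0.3, 0.3)], now=0.1))  # strobe "off" phase
```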

process

I first researched existing examples of gesture-controlled systems, as well as techniques for realizing my concept. I then sketched initial ideas of how the parts of the system would work together and what input methods it would use.

Once I had the concept worked out, I programmed the software in MAX/MSP and Arduino. For the hardware, I used LED lights, PVC piping, and distorted plastic to construct a portable setup for the system.
 

 

specs

One quarter, Fall 2014
Solo project
 

instructor

James Coupe
 

class

DXARTS 470
Sensing and Control Systems
 

involved

Research, ideation, sketching, MAX/MSP, Arduino, Processing, construction