project: Kinect Reflections

t3kt/luminator

Kinect Reflections will allow users to interactively control an audio/visual system through movement. It will consist of several subsystems: an Input and Control Handler, an Interaction Feedback Display, an Audio subsystem, and a Render Display. To provide an evolving and intriguing experience, the system as a whole will switch periodically between several modes, changing the generated imagery, the sound, and how user interaction is interpreted.
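
As a rough sketch of that mode switching, a small coordinator could cycle through modes on a timer and broadcast the active mode to the other subsystems over OSC. The mode names, duration, address, and port below are placeholders for illustration, not decided configuration:

```python
# Minimal sketch of a mode scheduler: cycles through interaction modes on a
# timer and broadcasts the active mode to the other subsystems over OSC.
# Mode names, duration, port, and address are placeholder assumptions.
import time
from pythonosc.udp_client import SimpleUDPClient

MODES = ["particles", "silhouette", "ripples"]   # hypothetical mode names
MODE_DURATION = 120                              # seconds per mode (assumed)

client = SimpleUDPClient("127.0.0.1", 9000)      # OSC endpoint the subsystems listen on (assumed)

def run():
    index = 0
    while True:
        mode = MODES[index % len(MODES)]
        client.send_message("/system/mode", mode)  # tell all listeners which mode is active
        time.sleep(MODE_DURATION)
        index += 1

if __name__ == "__main__":
    run()
```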

The Input and Control Handler will take input from one or more Kinect devices, and interpret the data in one of several different ways depending on the current mode. It will be implemented with TouchDesigner, and will communicate with the other components of the system through OSC and MIDI messages.
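
For illustration, a CHOP Execute callback inside TouchDesigner might forward selected Kinect CHOP channels as OSC messages roughly like this; the operator name ('oscout1') and channel naming are placeholders for whatever the actual network uses:

```python
# Sketch of a CHOP Execute DAT callback inside TouchDesigner that forwards
# Kinect CHOP channel changes as OSC messages via an OSC Out DAT.
# Operator and channel names are placeholders; op() is provided by TouchDesigner.

def onValueChange(channel, sampleIndex, val, prev):
    # channel.name might be e.g. 'p1/hand_l:tx' (player 1, left hand, x position)
    address = '/kinect/' + channel.name.replace(':', '/')
    op('oscout1').sendOSC(address, [float(val)])
    return
```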

The Interaction Feedback Display will give users immediate visual feedback, making it readily apparent how the system is interpreting their actions. It will use a projection on a nearby wall or screen to show an easily understandable representation of how the system perceives and engages with the users.
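
One possible way for the feedback display to consume the handler's interpretation is a small OSC listener that tracks each user's interpreted state for rendering; the addresses, port, and per-user message scheme here are assumptions for illustration only:

```python
# Sketch of an OSC listener the feedback display could use to track how the
# control handler is currently interpreting each user. Addresses, port, and
# message layout are assumptions, not the actual protocol.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

users = {}  # user id -> latest interpreted position

def on_user_position(address, x, y):
    # address is e.g. '/user/1/position'; store it so the renderer can draw it
    user_id = address.split('/')[2]
    users[user_id] = (x, y)

dispatcher = Dispatcher()
dispatcher.map("/user/*/position", on_user_position)

server = BlockingOSCUDPServer(("127.0.0.1", 9001), dispatcher)
server.serve_forever()  # a draw loop would read `users` from another thread
```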

The Audio subsystem will take information from the Input and Control Handler and use it to trigger playback of audio clips and modulate effect parameters. It will be implemented with Ableton Live.
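
A sketch of how control data might be translated into MIDI for Live, using the mido library purely as an illustration: note-on messages trigger MIDI-mapped clips, and CC messages modulate mapped effect parameters. The port name, gesture-to-note map, and CC number are assumptions that would have to match the actual mappings in the Live set:

```python
# Sketch of mapping interpreted gestures to MIDI for Ableton Live.
# Port name, note numbers, and CC number are placeholder assumptions.
import mido

CLIP_NOTES = {"wave": 60, "jump": 62, "spin": 64}   # hypothetical gesture -> note map
FILTER_CC = 20                                       # hypothetical CC mapped to an effect parameter

port = mido.open_output("To Ableton Live")           # virtual/loopback MIDI port name (assumed)

def trigger_clip(gesture):
    # Fire a MIDI-mapped clip in Live's Session View
    note = CLIP_NOTES[gesture]
    port.send(mido.Message('note_on', note=note, velocity=100))
    port.send(mido.Message('note_off', note=note, velocity=0))

def set_effect_amount(amount):
    # amount in 0.0..1.0 from the control handler, scaled to 0..127
    port.send(mido.Message('control_change', control=FILTER_CC, value=int(amount * 127)))
```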

The Render Display will take information from the Input and Control Handler and use it to generate and manipulate a visual stream, which will be projected onto either a screen or an architectural surface in the installation environment. It will focus less on helping users understand the system and more on visual aesthetics. It will be implemented using a combination of video clip triggering (with Resolume) and real-time generative visualizations (with TouchDesigner).
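
For the clip-triggering half, Resolume can be driven over OSC. The sketch below uses an address of the form used by recent Resolume versions; the port and the layer/clip indices are assumptions and would need to match the actual composition and Resolume's OSC preferences:

```python
# Sketch of triggering Resolume clips over OSC from the control side.
# The '/composition/layers/N/clips/M/connect' address follows the scheme used
# by recent Resolume versions; port and indices are assumptions.
from pythonosc.udp_client import SimpleUDPClient

resolume = SimpleUDPClient("127.0.0.1", 7000)  # Resolume's OSC input port (assumed default)

def play_clip(layer, clip):
    # Connecting a clip launches it on the given layer
    resolume.send_message(f"/composition/layers/{layer}/clips/{clip}/connect", 1)

# Example: when the mode changes, launch the first clip on layer 2
play_clip(2, 1)
```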