Sound+Vision

Sound+Vision was a collaboration between VICE and the Sydney Opera House, with support from Samsung.

The event brought music and visual artists together to create immersive live experiences.

sound_vision_banner.png

With Code on Canvas, I developed the native Android application that artists used to control the sound and visuals during the performances. The event ran over three performances, each controlled using two tablets.

sound_vision_screens.jpg

An Adaptive Interface

A single application reads a layout file stored on each tablet and draws its interface based on that file's contents. This drastically sped up development and deployment, since there was no need to track which build belonged to which tablet or to maintain separate applications.

I used openFrameworks to develop the application.
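As a rough illustration of the layout-driven approach, the sketch below assumes a hypothetical layout.json on each tablet that lists the controls it should display. The file name, JSON fields, and Control struct are illustrative, not the production format, and it is shown as a plain desktop-style openFrameworks app rather than the full Android project structure.

```cpp
#include "ofMain.h"

struct Control {
    std::string type;   // "toggle" or "slider"
    std::string name;   // identifier used when messaging the server
    ofRectangle bounds; // where the control is drawn on this tablet's screen
};

class ofApp : public ofBaseApp {
public:
    std::vector<Control> controls;

    void setup() override {
        // Each tablet carries its own layout.json; the same binary adapts to it.
        ofJson layout = ofLoadJson("layout.json");
        for (auto& item : layout["controls"]) {
            Control c;
            c.type   = item["type"].get<std::string>();
            c.name   = item["name"].get<std::string>();
            c.bounds = ofRectangle(item["x"].get<float>(), item["y"].get<float>(),
                                   item["w"].get<float>(), item["h"].get<float>());
            controls.push_back(c);
        }
    }

    void draw() override {
        // Placeholder rendering: the real app drew proper widgets and
        // handled touch input per control type.
        for (auto& c : controls) {
            ofNoFill();
            ofDrawRectangle(c.bounds);
            ofDrawBitmapString(c.name, c.bounds.x + 4, c.bounds.y + 14);
        }
    }
};

int main() {
    ofSetupOpenGL(1024, 768, OF_WINDOW);
    ofRunApp(new ofApp());
}
```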

Messaging with OSC

The tablets sent messages to a media server using the Open Sound Control (OSC) protocol. OSC provided a flexible, accurate messaging mechanism that is well suited to real-time control of media processing.
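A minimal sketch of the sending side using openFrameworks' ofxOsc addon is shown below. The host, port, and example address are assumptions, not the actual production values.

```cpp
#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
public:
    ofxOscSender sender;

    void setup() override {
        // Hypothetical media server address and port; the real values
        // depended on the venue network.
        sender.setup("192.168.1.10", 9000);
    }

    // Example of a raw OSC message carrying a single float value.
    void sendValue(const std::string& address, float value) {
        ofxOscMessage m;
        m.setAddress(address); // e.g. "/slider/reverb" (illustrative)
        m.addFloatArg(value);
        sender.sendMessage(m);
    }
};
```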

The server was notified whenever a toggle was switched on or off, and received the values of the sliders in real time.
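Illustratively, the two control types might map onto OSC messages as in the sketch below; the "/toggle/..." and "/slider/..." address scheme is an assumption made for the example, not the scheme used in production.

```cpp
#include "ofxOsc.h"
#include <string>

// Notify the server once whenever a toggle changes state.
void notifyToggle(ofxOscSender& sender, const std::string& name, bool on) {
    ofxOscMessage m;
    m.setAddress("/toggle/" + name); // hypothetical address pattern
    m.addIntArg(on ? 1 : 0);
    sender.sendMessage(m);
}

// Stream a slider's value to the server while it is being dragged.
void notifySlider(ofxOscSender& sender, const std::string& name, float value) {
    ofxOscMessage m;
    m.setAddress("/slider/" + name); // hypothetical address pattern
    m.addFloatArg(value);
    sender.sendMessage(m);
}
```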