This is a small but share-worthy experiment I created with friends from KISD during an interaction design summer school in Tampere, Finland, where we explored ways to interact with content on a 360° screen. Within four days we built an interactive sound experience in which a user manipulates a real-time generated soundscape with their hands.
Year: 2023
Tools used: Unity, Processing, Kinect, Open Sound Control (OSC), Ableton Live
Made in collaboration with: Konstantin Krais, Anne Feldhof, Anastasia König, Leander Leyendecker
Particles float around the surrounding space. When two of them collide, a sound is triggered. The collisions, together with other sounds, form an ambient sound layer that users manipulate with their "mighty hands". A Kinect motion capture camera mounted to the ceiling of the 360° cave tracks the arm gestures of the user in the center. As soon as they raise an arm away from their body, an inverted gravitational field appears where the hand points, deflecting particles away. The particles' sudden increase in velocity results in more tumultuous movement, rapidly increasing the number of collision-triggered sounds. The black screen - suddenly revealing itself as a wall separating the user from what lies beyond it - cracks open visually and audibly, letting light into the room for a brief moment.
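The deflection itself is computed in Unity, but the idea translates to a few lines. Purely as an illustration, here is a minimal Processing sketch in which a stand-in hand point (the mouse) repels particles with a force that falls off with the square of the distance - all constants here are made up for the demo, not the project's values.

    // Illustration only: particles drift, and a stand-in "hand point"
    // (here the mouse) repels them with a force that falls off with
    // the square of the distance. The real project computes this in
    // Unity, driven by the Kinect-derived arm vector.
    int NUM = 200;
    PVector[] pos = new PVector[NUM];
    PVector[] vel = new PVector[NUM];

    void setup() {
      size(800, 400);
      for (int i = 0; i < NUM; i++) {
        pos[i] = new PVector(random(width), random(height));
        vel[i] = PVector.random2D().mult(0.5);
      }
    }

    void draw() {
      background(0);
      fill(255);
      PVector hand = new PVector(mouseX, mouseY);
      for (int i = 0; i < NUM; i++) {
        PVector away = PVector.sub(pos[i], hand);
        float d = max(away.mag(), 5);            // avoid division by zero
        away.normalize();
        away.mult(2000.0 / (d * d));             // inverted gravity: push away, ~1/d^2
        vel[i].add(away);
        vel[i].mult(0.98);                       // mild damping
        pos[i].add(vel[i]);
        pos[i].x = (pos[i].x + width) % width;   // wrap around screen edges
        pos[i].y = (pos[i].y + height) % height;
        ellipse(pos[i].x, pos[i].y, 4, 4);
      }
    }

The inverse-square falloff is what makes the interaction feel physical: particles near the hand scatter violently while distant ones barely react.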
Three applications run simultaneously and communicate with each other via the Open Sound Control (OSC) protocol. Unity renders the visual geometry, calculates the physics, and sends OSC events. A second computer on the same local network runs a Processing sketch that receives those signals and forwards them to an instance of Ableton Live, where they trigger synthesizers and manipulate audio effects. The Processing sketch also connects to the Kinect camera, gathers its motion capture data, calculates the direction vector of the raised arm, and sends it to the Unity application.
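A minimal sketch of the relay step, assuming the oscP5/netP5 libraries for Processing; the address patterns ("/collision", "/armVector"), ports, and IP addresses are placeholders rather than the values we actually used, and the Ableton side is assumed to listen through some OSC-capable bridge:

    import oscP5.*;
    import netP5.*;

    OscP5 osc;
    NetAddress ableton;  // machine where the Ableton-side OSC receiver listens
    NetAddress unity;    // machine running the Unity application

    void setup() {
      osc = new OscP5(this, 9000);                  // listen for Unity's events
      ableton = new NetAddress("127.0.0.1", 9001);
      unity = new NetAddress("192.168.0.10", 9002);
    }

    void oscEvent(OscMessage msg) {
      // Forward collision events from Unity to Ableton Live unchanged
      if (msg.checkAddrPattern("/collision")) {
        osc.send(msg, ableton);
      }
    }

    void draw() {
      // In the real sketch, Kinect data is processed here and the arm
      // direction is sent to Unity once per frame (placeholder values):
      OscMessage arm = new OscMessage("/armVector");
      float x = 0, y = 1, z = 0;
      arm.add(x); arm.add(y); arm.add(z);
      osc.send(arm, unity);
    }

Keeping the relay in Processing meant the Kinect handling and all OSC routing lived in a single sketch, with Unity and Ableton only needing to agree on the message addresses.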
Technical setup
In real time, a Processing sketch calculates the direction vector of the pointing arm, receives sound events from the Unity application, and forwards everything to Ableton Live.
Sound event data manipulates synthesizers and audio effects in real time.
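The arm-vector calculation boils down to a normalized shoulder-to-hand vector. A hedged sketch of that step, with hypothetical joint positions and threshold - in the real sketch these come from the skeleton data of the ceiling-mounted Kinect:

    import oscP5.*;
    import netP5.*;

    // The pointing direction is the normalized shoulder-to-hand vector.
    PVector armDirection(PVector shoulder, PVector hand) {
      PVector dir = PVector.sub(hand, shoulder);
      dir.normalize();
      return dir;
    }

    // "Arm raised away from the body" as a simple distance check;
    // the 0.4 m threshold is a placeholder, not the project's value.
    boolean armRaised(PVector shoulder, PVector hand) {
      return PVector.dist(shoulder, hand) > 0.4;
    }

    // Pack the direction into an OSC message for the Unity application
    // ("/armVector" is an illustrative address pattern).
    void sendArmVector(OscP5 osc, NetAddress unity, PVector dir) {
      OscMessage msg = new OscMessage("/armVector");
      msg.add(dir.x); msg.add(dir.y); msg.add(dir.z);
      osc.send(msg, unity);
    }

On the Unity side, this vector places the inverted gravitational field along the user's pointing direction.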