Sound Collider is an openFrameworks project that combines visual effects and sound, depicting the connection between a piece of music and its visualization. My goal was to mimic the collisions in the video and to replicate, in my own way, how the audio responded to them.
Two sets of particles interact with each other: a spiraling wave made of turquoise spheres, and a set of white spheres floating around the wave. The positions of the white spheres are randomized so that they touch the turquoise line at different points and moments. Each time the two sets touch, a random note is triggered, so together they create a kind of music. There are 8 different sounds, which play in a random order whenever the floating spheres collide with the main wave.
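The collision-to-note logic above can be sketched as a simple distance test between the two sets of spheres. This is a minimal, hedged illustration in plain C++ (no openFrameworks dependency); the struct and function names are my own, not from the actual project, which would use oF types like `glm::vec3` and play a sound with `ofSoundPlayer`:

```cpp
#include <cmath>
#include <cstdlib>
#include <vector>

// Minimal 3D point stand-in (the real project would use glm::vec3).
struct Vec3 { float x, y, z; };

// Euclidean distance between two sphere centers.
float distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Returns the index (0..7) of one of the 8 sounds to trigger when any
// floating sphere touches the wave, or -1 if there is no collision
// this frame. "Touching" means the centers are closer than the sum of
// the two sphere radii.
int checkCollisions(const std::vector<Vec3>& waveSpheres,
                    const std::vector<Vec3>& floatingSpheres,
                    float collisionRadius) {
    for (const Vec3& w : waveSpheres) {
        for (const Vec3& f : floatingSpheres) {
            if (distance(w, f) < collisionRadius) {
                return std::rand() % 8;  // pick one of the 8 sound types
            }
        }
    }
    return -1;
}
```

In the actual app this check would run once per frame in `update()`, with the returned index used to play the corresponding loaded sound.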
By dragging the mouse, the user can view the whole scene from different perspectives. The mouse's Y position also sets the size of the main (long) wave. By pressing certain keys, we can add more waves and later connect them with thin lines. The project focuses on creating an effect that is both visually and sonically pleasing. For the show, I'd love to drive the effects with motion tracking or sensors. I think it could look really nice displayed on a screen; even without sound, the piece could work as an attractive interactive background.
Slightly less blurred versions: