Intersections

Is there a way to make sense of the dynamics and patterns of movement that we can observe around us in everyday life?
What is the rhythm of a pedestrian crossing? What kind of music would the streams of cars speeding down a highway play?

These are the questions that guided the development of Intersections, a sound machine that relies on computer vision to analyze the patterns and rhythms of moving objects and generate semi-random music.

Intersections is based on the concept of a piano roll: a roll of perforated paper used to play a pianola, where every hole encodes note data. It analyzes the frequency, speed, and presence of moving objects within a region of interest, and uses this information to control the behaviour of two instruments and one effect.
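The project's source code isn't reproduced here, but a minimal Processing sketch along these lines, using Greg Borenstein's OpenCV for Processing library with background subtraction, illustrates how moving objects inside a region of interest could be detected. The region, minimum blob size, and background-subtraction parameters are illustrative assumptions, not the project's actual settings.

    import gab.opencv.*;
    import processing.video.*;
    import java.awt.Rectangle;

    Capture video;
    OpenCV opencv;
    // Hypothetical region of interest: a horizontal band across the frame,
    // analogous to one track of the piano roll.
    Rectangle roi = new Rectangle(0, 180, 640, 120);

    void setup() {
      size(640, 480);
      video = new Capture(this, 640, 480);
      opencv = new OpenCV(this, 640, 480);
      // MOG background subtraction: history, number of gaussian mixtures,
      // background ratio (illustrative default values).
      opencv.startBackgroundSubtraction(5, 3, 0.5);
      video.start();
    }

    void draw() {
      if (video.available()) video.read();
      image(video, 0, 0);
      opencv.loadImage(video);
      opencv.updateBackground();
      opencv.dilate();
      opencv.erode();

      // Each contour of the foreground mask is a candidate moving object.
      for (Contour contour : opencv.findContours()) {
        Rectangle box = contour.getBoundingBox();
        if (box.intersects(roi) && box.width * box.height > 400) {
          // A moving object is inside the region of interest: this is
          // the "hole in the piano roll" event described above.
          noFill();
          stroke(255, 0, 0);
          rect(box.x, box.y, box.width, box.height);
        }
      }
      stroke(0, 255, 0);
      noFill();
      rect(roi.x, roi.y, roi.width, roi.height);
    }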

The interval in milliseconds between passing objects determines whether a bass synth moves up or down a pentatonic scale. The presence of a moving object triggers, in turn, a random pattern of notes, loosely based on Pachelbel's Canon, played by a drum kit. And lastly, the intensity of movement is mapped to the room-size parameter of a reverb effect.
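A sketch of how these three mappings could be wired together in Processing follows. The scale, the interval threshold, and the helper functions standing in for the sound engine are all assumptions made for illustration.

    int[] pentatonic = {0, 3, 5, 7, 10};  // minor pentatonic degrees in semitones
    int scaleIndex = 0;                   // bass synth's current position in the scale
    int lastEventMs = 0;                  // timestamp of the previous passing object
    int intervalThresholdMs = 800;        // assumed cutoff between fast and slow traffic

    // Called once per object that crosses the region of interest.
    void onObjectPassed(float movementIntensity) {
      int interval = millis() - lastEventMs;
      lastEventMs = millis();

      // Short intervals push the bass line up the pentatonic scale,
      // long intervals pull it back down.
      if (interval < intervalThresholdMs) {
        scaleIndex = min(scaleIndex + 1, pentatonic.length - 1);
      } else {
        scaleIndex = max(scaleIndex - 1, 0);
      }
      playBassNote(36 + pentatonic[scaleIndex]);

      // The presence of a moving object also triggers the drum pattern.
      triggerDrumPattern();

      // Movement intensity (e.g. normalized foreground pixel count, 0..1)
      // drives the reverb's room-size parameter.
      setReverbRoomSize(constrain(movementIntensity, 0, 1));
    }

    // Stubs standing in for messages to the sound engine (hypothetical names).
    void playBassNote(int midiPitch)   { println("bass note " + midiPitch); }
    void triggerDrumPattern()          { println("drum pattern triggered"); }
    void setReverbRoomSize(float size) { println("room size " + size); }

    void draw() { }

    // Stand-in trigger for testing: a mouse click simulates a passing object.
    void mousePressed() {
      onObjectPassed(random(1));
    }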

Developed with Processing, Intersections uses LNX Studio to produce the sounds and is a first attempt at using OpenCV in the context of pattern and data sonification.
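The text doesn't say how Processing and LNX Studio communicate. Since LNX Studio is built on SuperCollider, one plausible route is OSC over UDP using Processing's oscP5 library, as sketched below; the address pattern and port are assumptions, not the project's documented setup.

    import oscP5.*;
    import netP5.*;

    OscP5 osc;
    NetAddress soundEngine;

    void setup() {
      osc = new OscP5(this, 12000);
      // Host and port are assumptions; SuperCollider's language runtime
      // conventionally listens on UDP port 57120.
      soundEngine = new NetAddress("127.0.0.1", 57120);
    }

    // Hypothetical helper: forward a note event to the sound engine.
    void sendNote(int pitch, int velocity) {
      OscMessage msg = new OscMessage("/intersections/note"); // assumed address
      msg.add(pitch);
      msg.add(velocity);
      osc.send(msg, soundEngine);
    }

    void draw() { }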
