Getting HANDSy

The HANDS project used Computer Vision (CV) and Fast Fourier Transforms (FFT) to produce visuals accompanying Ravel’s Mother Goose suite.


Originally inspired by digits, a collaboration between Luke Dubois and Kathleen Supove, I wanted to use CV to let the gestures of the performer’s hands drive the visuals. The final form of the piece, shown in the center, used optical flow, which indicates the velocities and directions of moving parts of the image, for two purposes: 1) adding forces to a slowly fading vector field, and 2) spawning particles at those positions with initial velocities based on the movement, which then flow through said vector field. The appearance of the particles is altered throughout the performance: they begin so small that they are not visible, slowly grow into white shimmering dust, and gradually evolve into large forms which become colored, taking their hue from the direction they move.

CV frame differencing is also used to detect moving parts of the image, rendered in a different visual form. Moving pixels are detected at the edges of the performer’s silhouette and used to fill a CV image, which is then blurred to soften the outline and faded over time, producing a nice ghosting effect. The source video itself was also modified: a combination of brightness reduction and blurring was applied before drawing it underneath the previously mentioned effects. All the pieces worked in tandem to produce a gentle accompaniment for the music.

On the side sails, a separate visual system was used, with particles slowly radiating out from a center point. FFT was used to shift the colors between blue and red based on low and high frequencies, and the brightness was tied to the volume of the audio stream. This system was repeated five times to form what began as a ring and evolved into a more nebulous cloud of organic shapes.


Several other looks were developed but ultimately not used during the performance. These included a particle system which began as a firework burst but quickly morphed into trails of particles as forces based on Perlin noise were added to each particle to derive its path. Another look, using images of butterflies rather than simple circles, was also explored.


All code is available in different folders on the NWS 2013 github repo.

Wieniawski Violin Etudes


The Wieniawski Violin Etudes are pieces for students meant to represent the core virtuosic abilities of an accomplished violinist, with each one focusing on one of these fundamentals. Our task was to create a visual system for these Etudes to tell the audience a story beyond the music alone.

We were interested in a strong narrative component, one that works more on a human emotional level than as a purely mechanical translation of sound to light. The first thing we did to prepare was, of course, listen to the tracks quite a bit, then ask ourselves, “What is the story the musicians are trying to tell? Not just the sequence of notes, but the violence of the strokes, the volume, the timbre. All of it combines to make us feel something, but what do we feel?” What we heard and worked with was a story of tension and release, corruption and redemption.

The Etudes we were scripting for were Op. 18 movements II, III, and IV. The second movement relied heavily on a regular geometric rotational system that gets corrupted and re-skinned over and over again. The third movement plays with the idea of corruption by introducing different particle systems to play over the notes of the piece. And finally, the fourth movement explores the motion and space between players, with the similarities and differences of the two violinists’ music and motions determining the spacing and speed of the objects on the screen.

We chose to interact with the performers in two ways: through physical motion, using accelerometers, and through Fast Fourier Transform (FFT) code that translates pitch, attack, and other aural qualities of the music into visual output, or programmatically manipulable variables. For the physical motion, we felt it would be advantageous to attach accelerometers directly to the musicians’ wrists. This gave a noise-free, one-to-one connection to the movement of the violin bow, yielding orientation and movement data in three axes.

To wirelessly receive this data from the musicians, we used an Arduino Fio v3 connected directly to a Roving Networks WiFly module to stream the data to our applications over WiFi. The actual accelerometer readings were collected with an LSM303 accelerometer+magnetometer module. The data from the accelerometer was read by the Arduino and sent over WiFi via UDP to an openFrameworks receiver application.

All the code we used for both the Arduino and the openFrameworks sketches is available on the project repository.

GitHub Repo


Many more images

Group Null – Adiel Fernandez, Owen Herterich, Yi Ning Huang, K Anthony Marefat, Joseph Moore, Jorge Proano, Tiam Taheri

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
