Getting HANDSy

The HANDS project used Computer Vision (CV) and Fast Fourier Transforms (FFT) to produce visuals accompanying Ravel’s Mother Goose suite.


Originally inspired by Digits, a collaboration between Luke DuBois and Kathleen Supové, I wanted to use CV to let the performer’s hand gestures drive the visuals. The final form of the piece, shown on the center screen, used optical flow, which indicates the velocities and directions of moving parts of the image, for two purposes: 1) adding forces to a slowly fading vector field and 2) spawning particles at those positions with initial velocities based on the movement, which then flow through said vector field. The appearance of the particles is altered throughout the performance: they begin too small to be visible, slowly grow into white shimmering dust, and gradually evolve into large forms which become colored, taking their hue from the direction they move.
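A minimal sketch of that two-part system, in openFrameworks style: getFlowAt() is a hypothetical stand-in for an optical-flow lookup (e.g. from an addon like ofxCv), and every constant here is illustrative, not the project’s actual value.

```cpp
// Hypothetical sketch of the optical-flow -> vector-field -> particle chain.
#include "ofMain.h"

struct Particle {
    ofVec2f pos, vel;
};

const int COLS = 64, ROWS = 36;        // coarse vector field over the screen
ofVec2f field[COLS][ROWS];
std::vector<Particle> particles;

// Stand-in for a real optical-flow lookup; returns zero motion so the
// sketch compiles on its own.
ofVec2f getFlowAt(int col, int row) { return ofVec2f(0, 0); }

void update(float dt) {
    float cellW = ofGetWidth() / float(COLS);
    float cellH = ofGetHeight() / float(ROWS);
    for (int x = 0; x < COLS; ++x) {
        for (int y = 0; y < ROWS; ++y) {
            ofVec2f flow = getFlowAt(x, y);
            field[x][y] += flow * 0.1f;    // 1) motion adds force to the field...
            field[x][y] *= 0.97f;          //    ...which slowly fades
            if (flow.length() > 2.0f) {    // 2) strong motion spawns a particle
                ofVec2f spawn((x + 0.5f) * cellW, (y + 0.5f) * cellH);
                particles.push_back({ spawn, flow });
            }
        }
    }
    for (auto& p : particles) {
        int cx = ofClamp(p.pos.x / cellW, 0, COLS - 1);
        int cy = ofClamp(p.pos.y / cellH, 0, ROWS - 1);
        p.vel += field[cx][cy] * dt;       // particles flow through the field
        p.pos += p.vel * dt;
    }
}

void draw() {
    for (auto& p : particles) {
        // Hue taken from the direction of motion, as in the final look.
        float hue = ofMap(atan2f(p.vel.y, p.vel.x), -PI, PI, 0, 255);
        ofSetColor(ofColor::fromHsb(hue, 200, 255));
        ofDrawCircle(p.pos.x, p.pos.y, 2);
    }
}
```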

CV frame differencing is also used to detect moving parts of the image, rendered in a different visual form. Moving pixels are detected at the edges of the performer’s silhouette and used to fill a CV image, which is then blurred to soften the outline and faded over time, producing a nice ghosting effect. The actual video was also modified: a combination of brightness reduction and blurring was applied before drawing it underneath the previously mentioned effects. All the pieces worked in tandem to produce a gentle accompaniment for the music.
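For reference, the ghosting pass could be structured roughly like this with openFrameworks’ ofxOpenCv addon – a guess at the shape of the code, not the project’s actual implementation; the buffer names, threshold, blur kernel, and fade amount are all mine.

```cpp
// Rough sketch of the frame-differencing ghost effect (illustrative only).
#include "ofMain.h"
#include "ofxOpenCv.h"

ofVideoGrabber grabber;
ofxCvColorImage color;
ofxCvGrayscaleImage gray, prevGray, diff, ghost;

void setup() {
    grabber.setup(640, 480);
    color.allocate(640, 480);
    gray.allocate(640, 480);
    prevGray.allocate(640, 480);
    diff.allocate(640, 480);
    ghost.allocate(640, 480);
}

void update() {
    grabber.update();
    if (!grabber.isFrameNew()) return;
    color.setFromPixels(grabber.getPixels());
    gray = color;                   // convert to grayscale
    diff.absDiff(prevGray, gray);   // moving pixels light up at the silhouette edges
    diff.threshold(30);
    prevGray = gray;
    ghost += diff;                  // stamp the motion into an accumulation buffer
    ghost.blur(11);                 // soften the outline
    ghost -= 8;                     // fade over time -> the ghosting effect
}

void draw() {
    ghost.draw(0, 0);
}
```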

On the side sails, a separate visual system was used: particles slowly radiating out from a center point. FFT was used to shift the colors between blue and red based on low- and high-frequency content, while brightness followed the volume of the audio stream. This system was repeated five times to form what began as a ring and evolved into a more nebulous cloud of organic shapes.
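The color logic on the sails reduces to a small mapping. Here is a hedged sketch using openFrameworks’ built-in ofSoundGetSpectrum(); the band count, the low/high split, and the scaling are guesses, not the values used in the show.

```cpp
// Illustrative FFT-to-color mapping: blue for low-heavy spectra, red for
// high-heavy, with brightness following the overall volume.
#include "ofMain.h"

const int NUM_BANDS = 32;   // assumed band count

ofColor spectrumColor() {
    float* spectrum = ofSoundGetSpectrum(NUM_BANDS);
    float low = 0, high = 0, volume = 0;
    for (int i = 0; i < NUM_BANDS; ++i) {
        volume += spectrum[i];
        if (i < NUM_BANDS / 2) low  += spectrum[i];
        else                   high += spectrum[i];
    }
    // Shift between blue (low frequencies dominate) and red (highs dominate)...
    float mix = ofClamp(high / (low + high + 0.0001f), 0, 1);
    ofColor c = ofColor::blue.getLerped(ofColor::red, mix);
    // ...and let the brightness track the volume of the audio stream.
    c.setBrightness(ofClamp(volume * 64.0f, 0, 255));
    return c;
}
```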


Several other looks were developed but ultimately not used during the performance. These included a particle system which began as a firework burst but quickly morphed into trails of particles as forces, based on Perlin noise, were added to each particle to steer its path. Another look, using images of butterflies rather than simple circles, was also explored.
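That per-particle noise steering is a standard trick; a minimal, assumption-laden version with openFrameworks’ ofSignedNoise() might look like this (all scaling constants invented):

```cpp
// Each particle samples a smoothly varying angle field from Perlin noise,
// bending its firework trajectory into a meandering trail.
#include "ofMain.h"

struct Particle { ofVec2f pos, vel; };

void applyNoiseForce(Particle& p, float time, float dt) {
    float angle = ofSignedNoise(p.pos.x * 0.005f, p.pos.y * 0.005f, time * 0.2f) * TWO_PI;
    ofVec2f force(cosf(angle), sinf(angle));
    p.vel += force * 60.0f * dt;  // noise-derived force bends the path
    p.vel *= 0.98f;               // mild damping keeps the trails readable
    p.pos += p.vel * dt;
}
```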


All code is available in different folders on the NWS 2013 GitHub repo.

Breath and Brass


“Breath and Brass” visualizes the act of musical performance – both the physicality of the musician’s breathing and the technical nature of the generated music. It illuminates the interplay between the instruments and brings a more conscious awareness of the musician’s exertion into the audience’s experience.

Working with Poulenc’s Sonata for Horn, Trumpet and Trombone, we visualize inhales and exhales together with the notes, volume, and speed of the music to reveal the connection between the physicality of music-making and the relationships of the sounds and instruments to each other.


The project relies upon audio analysis and breathing measurement.
The audio analysis uses a Fast Fourier Transform (FFT), via an existing C++ code library. The library breaks the audio signal into 17 channels of different frequencies from low to high and also provides additional data such as amplitude.
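The post doesn’t name the library, but for a sense of the interface, an equivalent 17-channel breakdown can be read with openFrameworks’ built-in ofSoundGetSpectrum(); the smoothing here is my addition.

```cpp
// Stand-in for the unnamed FFT library: 17 channels from low to high,
// plus a summed amplitude, updated once per frame.
#include "ofMain.h"

const int NUM_CHANNELS = 17;
float bands[NUM_CHANNELS];
float amplitude = 0;

void updateAnalysis() {
    float* spectrum = ofSoundGetSpectrum(NUM_CHANNELS);
    amplitude = 0;
    for (int i = 0; i < NUM_CHANNELS; ++i) {
        // Light smoothing so the visuals don't flicker frame to frame.
        bands[i] = ofLerp(bands[i], spectrum[i], 0.3f);
        amplitude += spectrum[i];
    }
}
```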



To measure the breathing we built custom hardware: a velcro and elastic band that wraps around the musician’s torso. The front of the band is made of conductive rubber, whose resistance changes as it stretches, measuring the expansion and contraction as he or she breathes. A WiFly unit attached to the band sends the data wirelessly to the computer running the software.
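The linked firmware below is the real thing; purely as a sketch of the idea, assuming the conductive rubber sits in a voltage divider on an analog pin and the WiFly is configured in transparent mode (so it relays anything written to the serial port over Wi-Fi), the reading loop could be as small as:

```cpp
// Hypothetical breathing-sensor firmware sketch (pin, baud rate, and sample
// rate are assumptions, not the project's actual values).
const int SENSOR_PIN = A0;   // conductive-rubber voltage divider

void setup() {
    Serial.begin(9600);      // WiFly is wired to the hardware serial port
}

void loop() {
    int stretch = analogRead(SENSOR_PIN); // resistance changes as the band stretches
    Serial.println(stretch);              // WiFly relays this line over Wi-Fi
    delay(20);                            // ~50 samples per second
}
```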


[Fritzing diagram of the breathing sensor]

We integrated all the data into two narratives, dubbed “Orbits” and “Lines.” “Orbits” consists of a series of concentric circles of increasing radius. The difference between radii maps to the breathing data, so the circles expand and contract. There is one circle for each of the 17 bands of frequency data, and activity on each frequency populates the respective orbit with orbiting “planetoids.”
“Lines” consists of three creatures made up of lines, one creature for each musician. The breathing data controls the length of the lines themselves, causing the creatures to grow and shrink, while the amplitude of the audio maps to the opacity, causing the creatures to brighten and fade.
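As an illustration of the “Orbits” mapping (the production code is linked below), breathing could scale the spacing between rings while band activity drops planetoids onto the matching orbit; everything here is a sketch under assumed ranges, not the final software.

```cpp
// Illustrative "Orbits" renderer: 17 concentric circles whose spacing
// breathes, each populated by a planetoid when its band is active.
#include "ofMain.h"

const int NUM_BANDS = 17;

// breath in [0,1] from the band sensor; bands[] from the FFT analysis
void drawOrbits(float breath, const float* bands) {
    float cx = ofGetWidth() * 0.5f, cy = ofGetHeight() * 0.5f;
    float spacing = ofMap(breath, 0, 1, 10, 30); // inhales push the rings apart
    ofNoFill();
    for (int i = 0; i < NUM_BANDS; ++i) {
        float radius = (i + 1) * spacing;
        ofDrawCircle(cx, cy, radius);
        if (bands[i] > 0.05f) {                  // band activity -> planetoid
            float a = ofGetElapsedTimef() * (0.5f + i * 0.1f);
            ofFill();
            ofDrawCircle(cx + cosf(a) * radius, cy + sinf(a) * radius, 3 + bands[i] * 20);
            ofNoFill();
        }
    }
}
```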

Screen captures of the two movements can be found in the Google Drive folder.


Final Software Movement 1
Final Software Movement 2
Final Firmware
Full code repository

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Wieniawski Violin Etudes


The Wieniawski Violin Etudes are pieces for students meant to represent the core virtuosic abilities of an accomplished violinist, with each one focusing on one of these fundamentals. Our task was to create a visual system for these Etudes to tell the audience a story beyond the music alone.

We were interested in a strong narrative component, one that works more on a human emotional level than a purely mechanical translation of sound to light. The first thing we did to prepare was, of course, listen to the tracks quite a bit, then ask ourselves, “What is the story the musicians are trying to tell? Not just the sequence of notes, but the violence of the strokes, the volume, the timbre. All of it combines to make us feel something, but what do we feel?” What we heard and worked with was a story of tension and release, corruption and redemption.

The Etudes we were scripting for were Op. 18 movements II, III, and IV. The second movement relied heavily on a regular geometric rotational system that gets corrupted and re-skinned over and over again. The third movement plays with the idea of corruption by introducing different particle systems to play over the notes of the piece. Finally, the fourth movement explores the motion and space between players: the similarities and differences of the two violinists’ music and motions determine the spacing and speed of the objects on the screen.

We chose to interact with the performers through physical motion, using accelerometers, and through Fast Fourier Transform (FFT) code that translates pitch, attack, and other aural qualities of the music into visual output and programmatically manipulable variables. For the physical motion, we felt it would be advantageous to attach accelerometers directly to the musicians’ wrists. This allows a noise-free, one-to-one connection to the movement of the violin bow, yielding orientation and movement data in three axes.

In order to wirelessly receive this data from the musicians, we decided to use an Arduino Fio v3 connected directly to a Roving Networks WiFly module to stream the data to our applications via Wi-Fi. The actual accelerometer readings were collected with an LSM303 accelerometer + magnetometer module. The data from the accelerometer was read by the Arduino and sent over Wi-Fi using UDP to an openFrameworks receiver application.
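On the receiving side, a minimal openFrameworks sketch with the ofxNetwork addon might look like the following; the port and the comma-separated “x,y,z” packet format are assumptions – the actual protocol is in the repository linked below.

```cpp
// Hypothetical UDP receiver for the wrist accelerometer data.
#include "ofMain.h"
#include "ofxNetwork.h"

ofxUDPManager udp;
ofVec3f accel;   // latest 3-axis reading

void setup() {
    udp.Create();
    udp.Bind(12345);            // port is an assumption
    udp.SetNonBlocking(true);
}

void update() {
    char buf[256];
    int bytes;
    while ((bytes = udp.Receive(buf, sizeof(buf) - 1)) > 0) {
        buf[bytes] = '\0';                    // terminate the packet
        auto parts = ofSplitString(buf, ","); // assumed "x,y,z" from the Fio
        if (parts.size() == 3) {
            accel.set(ofToFloat(parts[0]), ofToFloat(parts[1]), ofToFloat(parts[2]));
        }
    }
}
```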

All the code we used for both the Arduino and the openFrameworks sketches is available on the project repository.

Github Repo

and…

Many more images

Group Null – Adiel Fernandez, Owen Herterich, Yi Ning Huang, K Anthony Marefat, Joseph Moore, Jorge Proano, Tiam Taheri

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.


Somewhat Inspiration

I just came across these posters and thought they might have something to do with our project. Since we are talking about camera movement and control, one thing we could do is show shots of the performers on the screens from shifting angles and distances – different from the views the audience has – with embedded motion graphics, creating a kind of augmented reality that establishes a complete, all-around scene for the viewers. Random premature thoughts.
