RMFTM Interactive visuals

I built a new piece of software for the awesome live band Radar Men From The Moon (RMFTM). The challenge was to generate real-time visuals live, driven by the band's musical output.
To receive the audio I simply put an XLR splitter on the following channels: kick drum, snare drum, bass guitar, guitar 1, guitar 2 and synthesizer.

These guys are a real live band, so there was no click track, and for some songs no agreed length of time either. This meant that some visual sequences had to be triggered by hand (an iPad running MIRA with a custom-built user interface), but most of the time the software made its own choices based on the input it received.
For example: on track 1 the software keeps count of how many times the kick drum is triggered. Let's say we count 16 hits: we then trigger a change in the visuals and make a new choice. Will we change again after 8, 16 or 32 kicks? By layering many of these simple rules, complex behaviour emerges in the output.
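The counting rule above can be sketched roughly like this. This is a hypothetical Python sketch, not the actual show software, and the class and method names are my own invention:

```python
import random


class KickCounter:
    """Counts kick-drum triggers and signals a scene change at a threshold.

    Hypothetical sketch of the counting rule described above; the real
    system was a different piece of software, and the thresholds here
    (8, 16, 32) are taken from the text.
    """

    def __init__(self, first_threshold=16):
        self.count = 0
        self.threshold = first_threshold

    def on_kick(self):
        """Call once per detected kick; returns True when visuals should change."""
        self.count += 1
        if self.count >= self.threshold:
            # Reset and pick the next rule: change again on 8, 16 or 32 kicks.
            self.count = 0
            self.threshold = random.choice([8, 16, 32])
            return True
        return False


counter = KickCounter()
fired = [counter.on_kick() for _ in range(16)]  # fires on the 16th kick
```

Each rule on its own is trivial; the complexity comes from running many such counters on different channels at once.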

The main challenges were environmental sound and latency. A kick drum mic next to a bass amp picks up bleed constantly, which could cause false triggers. Gates could have helped, but then a lot of dynamic information would be lost. The solution was to analyse each channel for frequencies unique to that channel.
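The idea of triggering only on frequencies unique to a channel can be sketched as a band-limited energy detector. This is a hypothetical Python sketch, not the show software; the band edges (40–100 Hz for the kick) and the threshold are placeholder values that would be calibrated per channel at soundcheck:

```python
import numpy as np
from scipy.signal import butter, lfilter

SR = 44100  # assumed sample rate


def band_energy(block, low_hz, high_hz, sr=SR):
    """RMS energy of an audio block inside one frequency band."""
    b, a = butter(2, [low_hz, high_hz], btype="band", fs=sr)
    filtered = lfilter(b, a, block)
    return float(np.sqrt(np.mean(filtered ** 2)))


def kick_triggered(block, threshold=0.1):
    # The kick mic also hears the bass amp, but by measuring energy only
    # in a band where the kick dominates (here assumed 40-100 Hz) and
    # comparing it to a calibrated threshold, bleed at other frequencies
    # is far less likely to cause a false trigger.
    return band_energy(block, 40, 100) > threshold
```

Compared with a simple gate, the band-limited detector still passes the full, ungated channel downstream, so the dynamic information the gate would destroy stays available.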
If a snare drum is struck and the visuals react 10 ms later, the audience will notice the delay. This meant heavy optimization was needed to keep the output under 5 ms. At SD resolution this wasn't a problem: average latency came in at 3–5 ms.
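To see why the sub-5 ms target is tight, note that the audio buffer size alone puts a floor under the trigger latency: a full buffer has to arrive before analysis can even begin. A quick back-of-the-envelope sketch, assuming a 44.1 kHz sample rate (the post doesn't state the actual rate or buffer size):

```python
SR = 44100  # assumed sample rate

# Minimum latency contributed by the audio buffer alone, before any
# analysis or rendering has happened.
for buf_size in (64, 128, 256, 512):
    floor_ms = buf_size / SR * 1000
    print(f"{buf_size:4d}-sample buffer -> {floor_ms:.1f} ms minimum latency")
```

At 44.1 kHz a 256-sample buffer already costs about 5.8 ms before any processing, so a 3–5 ms total budget implies buffers of roughly 128 samples or fewer, with the analysis and rendering squeezed into what remains.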

The setup also had to be built, plugged in and ready to work within 5 minutes, as line checks are often a rushed affair. Most of the time the whole process was calibrated in 2–3 minutes.

All in all I was very pleased with the outcome of the set, as was the band. Below is a selection of photos from the show at the Roadburn pre-party. Thanks for the collaboration, guys.