Generative Visual Synthesiser

Standard VJ software just doesn’t cut it for me. Very few people make their own content, and even fewer do something unique with it. There is only so much you can do with a video, whether it’s content you made yourself or downloaded from the internet.
Everyone can be a VJ now, and most of the time it shows…
So I decided to make my own VJ software with a few simple rules in mind:
– No video playback. The basic starting visuals had to be 3D forms generated in OpenGL.
– No fixed effect pipeline. Standard software will not let you change the order of the effects once it is running. This had to be dynamic, so a user could change it on the fly.
– It had to synchronize with any kind of musical setting. This required real-time manipulation of everything the software handled.
– One huge render output with multiple renders inside it, so 3D mapping software could then layer everything on top of each other.

After 12 months the first workable version was complete. The OpenGL part wasn’t too hard to get running; the dynamic effect system, on the other hand, was a real challenge.
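The dynamic-ordering idea can be sketched roughly like this. This is a minimal Python stand-in, not the actual Jitter patch; the `Effect` and `EffectChain` classes and the toy effects are invented for illustration:

```python
# Minimal sketch of a reorderable, toggleable post-effect chain.
# The effect functions are stand-ins; the real system ran
# OpenGL/Jitter shaders, not Python list transforms.

class Effect:
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn
        self.enabled = True

class EffectChain:
    def __init__(self, effects):
        self.effects = list(effects)

    def reorder(self, names):
        """Re-sort the chain on the fly to match a new order."""
        by_name = {e.name: e for e in self.effects}
        self.effects = [by_name[n] for n in names]

    def render(self, frame):
        # Each enabled effect transforms the output of the previous one.
        for e in self.effects:
            if e.enabled:
                frame = e.fn(frame)
        return frame

chain = EffectChain([
    Effect("invert", lambda f: [255 - px for px in f]),
    Effect("halve",  lambda f: [px // 2 for px in f]),
])
print(chain.render([0, 100, 255]))   # invert, then halve
chain.reorder(["halve", "invert"])
print(chain.render([0, 100, 255]))   # halve, then invert: different result
```

The point of the sketch is that reordering changes the output even with identical effects, which is exactly why a fixed pipeline is so limiting.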
Using 9 unique post effects meant that there were 9*9*9*9*9*9*9*9*9 = 387,420,489 combinations.
The permutation rule was simple:
– Each effect can only be used once.
– It can be anywhere in the effect chain.
– It can be enabled or disabled. (Now it gets complicated)

Let’s say we pick 6 effects out of the 9:
– 1 3 4 5 7 9
This gives 46,656 results.
We could also pick:
– 2 4 5 6 8 9
This also gives 46,656 results.
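The figures above count a chain of k active effects as k^k variations. A quick arithmetic check of those numbers; note, as an aside, that if each effect truly appears only once, the number of distinct orderings of 6 chosen effects would be 6! instead:

```python
import math

# The figures quoted above: k active effects, counted as k^k.
print(9 ** 9)   # 387420489
print(6 ** 6)   # 46656

# Under the stricter "each effect used once" rule, 6 chosen effects
# have 6! distinct orderings.
print(math.factorial(6))   # 720
```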
All these huge numbers can sound very impressive; the truth is, they are not. It is incredibly difficult to make 9 completely different effects that always produce a “pleasing” visual output.

That was the 1st level of complexity; here is the 2nd.
Each of those 9 effects had between 5 and 12 parameters, and each effect responds differently depending on the input it receives from the previous effect. On each iteration all those parameters would be different.
Let’s just say it sometimes looked amazing; most of the time the screen just flashed one color or was completely black. Not what you’d typically want from VJ software.
It then took another month to build a system which randomized those parameters with a certain set of rules. This improved the output considerably.
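A rule-constrained randomizer of that kind might look something like this. This is a hedged sketch: the parameter names and their “safe” ranges are invented, not the ones the software actually used.

```python
import random

# Hypothetical per-parameter rules: each parameter gets a safe range,
# so random values stay inside limits that still produce visible output.
RULES = {
    "feedback": (0.0, 0.85),   # e.g. above this the screen tends to white out
    "blur":     (0.0, 0.5),
    "zoom":     (0.8, 1.2),
}

def randomize(rules, rng=random):
    """Draw one value per parameter, constrained to its allowed range."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in rules.items()}

params = randomize(RULES)
for name, value in params.items():
    lo, hi = RULES[name]
    assert lo <= value <= hi   # every value respects its rule
print(params)
```

Constraining the ranges per parameter, rather than rolling fully random values, is what keeps the output away from the all-black and all-white failure modes.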

The next challenge was building a preset system which could be loaded and unloaded in a live environment. I took the easy way out and had one of the biggest text files known to man, filled with every single permutation. I then used a lookup system which extracted the necessary effects based on user input.
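The lookup itself can be as simple as indexing lines of that file. A sketch under assumptions: one permutation per line, fetched by preset number; the file format shown here is invented, not the one the software used.

```python
# Sketch: every permutation lives on one line of a big text file,
# e.g. "1 3 4 5 7 9" meaning "effect 1, then 3, then 4, ...".
# The format is invented for illustration.

def load_permutations(path):
    """Read the permutation file into a list of effect chains."""
    with open(path) as f:
        return [line.split() for line in f if line.strip()]

def lookup(perms, index):
    """Return the effect chain for a given preset number."""
    return perms[index]

# Demo with an in-memory stand-in for the file:
lines = ["1 3 4 5 7 9", "2 4 5 6 8 9"]
perms = [line.split() for line in lines]
print(lookup(perms, 1))  # ['2', '4', '5', '6', '8', '9']
```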

The system was almost there, but it needed a master clock; I decided the whole thing had to be controlled by Ableton Live. Why? No VJ software has true musical sequencer capabilities built in. Using Live allowed me to stack thousands of clips so that 3D geometry could move at different speeds than certain effects. I then built custom Max for Live devices to bridge the gap between 48 MIDI tracks and Jitter.
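The “different speeds” trick boils down to running several phases off one master clock. A rough Python sketch of the idea, not the actual Max for Live devices; the rates chosen are arbitrary examples:

```python
# One master beat counter from the sequencer; each visual element
# advances at its own clip rate relative to it.

def phase(beat, rate):
    """Phase in [0, 1) for an element running at `rate` cycles per beat."""
    return (beat * rate) % 1.0

beat = 7.5                          # position from the master clock
geometry_phase = phase(beat, 0.25)  # 3D geometry: one cycle every 4 beats
effect_phase   = phase(beat, 2.0)   # an effect: two cycles per beat
print(geometry_phase, effect_phase)
```

Because every phase is derived from the same beat counter, everything stays locked to the music no matter how many independent rates are running.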
Last but not least: a custom controller setup built on the iPad with MIRA from Cycling ’74. This allowed me to build a custom GUI specifically for the software.

Jitter then sent one huge screen of visuals to Madmapper via Syphon. I believe the final resolution was 2400 x 1560. This one output had 6 windows which could all be accessed by Madmapper.
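Carving one big canvas into sub-windows the mapper can grab individually might look like this. The 3 x 2 grid is purely an assumption for illustration; the post doesn’t say how the 6 windows were actually laid out.

```python
# Sketch: split one big Syphon canvas into sub-windows that a mapping
# tool can address individually. A 3 x 2 grid is assumed, giving six
# 800 x 780 windows inside the 2400 x 1560 canvas.
CANVAS_W, CANVAS_H = 2400, 1560
COLS, ROWS = 3, 2

def windows(w, h, cols, rows):
    """Return (x, y, width, height) for each sub-window, row by row."""
    cw, ch = w // cols, h // rows
    return [(c * cw, r * ch, cw, ch) for r in range(rows) for c in range(cols)]

for rect in windows(CANVAS_W, CANVAS_H, COLS, ROWS):
    print(rect)
```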
FPS was a solid 60 🙂

The screenshots below are a small selection of the possible outputs from the system.