My Video Synth Setup: Part 3 – The Eyesy
As with most of the gear I’ve shown so far in this video synth series, I’ve tried to stay away from a dedicated ‘computer’ for my visuals. Whether a Raspberry Pi counts as a computer is debatable, but in this case I just didn’t want to use a laptop or other expensive device to generate the visuals. That’s a whole other rabbit hole for another time.
This instalment features the Eyesy video synth from Critter & Guitari. Previously I had a Rhythm VideoScope, which I liked a lot, and when the Eyesy was announced I quickly upgraded. You’ll see why in this video:
At its core, the Eyesy is a Raspberry Pi compute module in a dedicated metal case. It’s built like a tank and meant to withstand live gigging. It has USB and MIDI ports plus HDMI and composite video outputs. You can choose between SD composite output and HDMI simply by plugging in the appropriate cable before booting. An SD card stores all your patches and the main operating system.
The USB port handles either MIDI over USB or, with the included USB wifi module, a connection to your local network so you can manage the modes the Eyesy runs. This is more for pre-show management than something you’d want to do mid-performance.
At present, my Eyesy has about 88 different modes. I forget how many it ships with…I’ve added a bunch from Patchstorage and made or modified a few of my own.
Here are demos of a few of my favourite modes:
When you connect to the Eyesy over wifi, you can add or edit modes right from your browser.
Fairly recently, someone programmed an A.I. bot that understands how to code for the Eyesy in Pygame (the Python library the Eyesy uses to draw its graphics). To say I was impressed is an understatement!
I had been trying to make a few of my own modes by modifying the provided example modes, with limited success. Attempting more complicated (for me, at least) effects without a complete understanding of Pygame resulted in broken modes. With the EyesyBot, I was able to describe what I wanted, then keep talking with it to troubleshoot error messages, add functionality and generally make the mode much more useful to me.
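For anyone who hasn’t peeked inside one: every Eyesy mode is a small Python file that defines a setup() and a draw() function, and gets handed an etc object carrying the knob values, audio input and screen resolution. Here’s a minimal sketch of a scope-style mode following that documented structure (the drawing itself is just my illustration, not one of the stock modes):

```python
import pygame

def setup(screen, etc):
    # Called once when the mode loads; nothing to initialise here.
    pass

def draw(screen, etc):
    # Knob 5 picks the background colour (all knobs come in as floats, 0.0-1.0).
    etc.color_picker_bg(etc.knob5)
    # Knob 4 picks the line colour via the built-in colour picker.
    color = etc.color_picker(etc.knob4)
    # etc.audio_in holds 100 recent signed 16-bit audio samples;
    # draw one vertical line per sample, scaled by knob 1.
    for i in range(100):
        x = int(i * etc.xres / 100)
        y = int(etc.yres / 2 + (etc.audio_in[i] / 32768.0) * etc.yres * etc.knob1)
        pygame.draw.line(screen, color, (x, etc.yres // 2), (x, y), 2)
```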
I did encounter an issue when assigning the knob control to change the background color: the bot seemed to think the knob values were only ever 0 or 1, when the knobs actually report a float anywhere from 0.0 to 1.0. But someone in the Eyesy forum pointed this out and provided fixed code.
For the curious: background_color = etc.color_picker_bg(etc.knob5)
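In other words, the broken version treated the knob like a switch, while the fix sweeps the knob’s whole range through the built-in color picker. Roughly (the commented-out line is my reconstruction of the bot’s mistake, not its literal output):

```python
def draw(screen, etc):
    # Broken assumption: treat knob 5 as a binary switch, so the
    # background only ever flips between two colors:
    # background_color = (255, 255, 255) if etc.knob5 > 0.5 else (0, 0, 0)

    # Forum fix: knob5 is a float from 0.0 to 1.0, and color_picker_bg()
    # maps that whole range onto a color sweep while also setting the
    # background:
    background_color = etc.color_picker_bg(etc.knob5)
```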
Now this mode can animate any image to the music or MIDI input, and I can change the background color. Super cool! It’s perfect when you have a logo or image you want to keep persistent during an event, to switch back to or overlay on top of any other visuals.
You can see how the knobs control the image (in this case a green PNG file I made for Plastic Cactus). Knob 1 is the scale/size of the image. Knobs 2 and 3 control the X and Y position. Knob 4 controls the amount of ‘reaction’ to the audio/MIDI input, with knob 5 controlling the background color as mentioned above.
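For the curious, here’s roughly how that knob mapping translates into code, in the same setup()/draw() shape. This is a simplified sketch, not the exact code the bot and I ended up with; logo.png is a stand-in filename, and etc.mode_root is the documented path to the current mode’s folder:

```python
import pygame

image = None

def setup(screen, etc):
    global image
    # Load the PNG that lives alongside the mode's main.py.
    image = pygame.image.load(etc.mode_root + '/logo.png').convert_alpha()

def draw(screen, etc):
    etc.color_picker_bg(etc.knob5)              # knob 5: background color
    # Knob 4: how hard the audio input pushes the size around.
    level = max(abs(s) for s in etc.audio_in) / 32768.0
    bump = level * etc.knob4
    # Knob 1: base scale, nudged by the audio 'reaction'.
    scale = 0.1 + (etc.knob1 + bump) * 2.0
    w = max(int(image.get_width() * scale), 1)
    h = max(int(image.get_height() * scale), 1)
    scaled = pygame.transform.scale(image, (w, h))
    # Knobs 2 and 3: X and Y position, centred on the knob's point.
    x = int(etc.knob2 * etc.xres) - w // 2
    y = int(etc.knob3 * etc.yres) - h // 2
    screen.blit(scaled, (x, y))
```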
Around the 0:51 mark I press the on-screen display button to show the Eyesy’s settings and inputs. You can clearly see the knob values change as I adjust them, as well as the incoming audio (the audio isn’t in the video clip as I ran out of inputs when capturing it). This display also shows you the wifi details.
As with the other video synth tools I use, the Eyesy can easily be connected to other devices to further process and mangle the output. I’m going to endeavour to make some more videos that showcase the stacking of these synths and what’s possible.
If you’re looking for a dead simple option to get some reactive visuals quickly, the Eyesy really can’t be beat.