Audio-visual effects play an important role in my work. Audio-visual effects (AVFX) manipulate the image and sound of a video clip or video stream simultaneously. AVFX can change the character of a video, but they can also create completely new images and sounds. Keeping the link between image and sound intact while they are distorted makes the effect much more impactful. It is a great source of artistic expression.

Before 1996 I used the Commodore Amiga to make music and visuals. The Amiga was a great, creative computer, but its capabilities to manipulate sound and video were very limited. ProTracker offered some options to manipulate sounds. For example, I used the sample up and down functions of the sample editor to create the metallic, robot-sounding vocal samples in PRC intro and What goes through your mind?. With DPaint it was easy to paint over digitized video frames and to manipulate them, but this was a lot of manual labor and the number of video frames that could be used was very limited. Software to combine audio and video effects didn’t exist on the Amiga.

HyperPrism and Premiere

When I got a Macintosh computer in 1996, this all changed. Adobe Premiere and After Effects had a nice collection of visual effects, and HyperPrism was a very flexible tool to manipulate sound in creative ways. Since then, audio-visual effects have been an important part of my art: painting with image and sound.

This made producing tracks considerably more labour-intensive, however, principally because the visual part and the audio part of a sample had to be processed in different programs, and computers at the time were not powerful enough to perform these processes quickly. Even simple compositions took many minutes or even hours to preview or render. That limited my art. I didn’t finish the GarbiTch FilterFunk remix in 1998, for example, because visualizing all the audio effects after I made the music took too much time.

skrtZz - (semi) real-time AVFX

In 1998 I developed the skrtZz technique to work with video in a more creative, flexible and musical way, and to use audio-visual effects in (semi) real time. This changed the way I used audio-visual effects: with the skrtZz technique, only a short sample is needed to create an extensive audio-visual effect composition.

In 1999 I developed the SenSorSuit to control my live shows with my body motion. Audio-visual effects and the skrtZz concept played an important role in these performances. I was able to play with complex audio-visual effects in real-time this way!

3D audio-visual effects

In 2001, Adobe After Effects added the ability to work with video in 3D. This gave my audio-visual effects a new dimension (and made render times even longer :/). In the DVJ 2.0 project I experimented a lot with 3DAVFX. Aside from 3DAVFX compositions and skrtZz solos (see the example below for an acid-style solo), I made rhythmic re-edits to create 3DAVFX grooves (see the Kylie Minogue example below).

Visual music

Audio-visual effects and visual music (the art of translating sounds or music into an analogous visual presentation) are very closely related. In my work these two art forms merge into each other. I am interested in the way concrete images and sounds, which convey a clear message or information, and abstract images and sounds, which work on an emotional level, can amplify each other’s effect. Keeping the link between image and sound intact while they change from concrete to abstract and vice versa takes the viewer/listener fluently from one artistic world into another.

Real-time and Non-real-time effects combo

In 1995 I started developing my own instruments to work with music and visuals at the same time, in real time, in one unified creative process. This led to the development of SenS in 2004. In 2005 the ability to work with 3D audio-visual effects in real time was added to SenS. Because of hardware limitations these effects had a limited resolution, which meant that their shapes were rough and unpolished. To create more interesting and diverse shapes I combined these effects with pre-rendered, non-real-time effects. This way I could create advanced audio-visual effects and still play with them in real time (the next level of the skrtZz concept).

Real-time audio-visual effects

To work with visuals in a musical way, it is important to be able to use effects in real time: to jam with new ideas and to work in sync with music live. Working with pre-rendered effects interrupts this creative process, because designing and rendering them is time-consuming. Therefore I stopped using pre-rendered effects in 2006 and invested a lot in the development of real-time effects in 2007-2011.

In 2007 AVblock was added to SenS, a very flexible way of designing audio-visual effects. AVblock had 125 3D video effects (vertex shaders), 132 audio effects and 19 2D video effects, and it made it easy to combine effects to create even more shapes and sounds.

With Nesa I created many 3D visual effects for AVblock, so-called ‘vertex shaders’. In 2007 computer hardware was not yet capable of processing complex effects while playing and mixing videos and running music software at the same time. A great aspect of vertex shaders is that they are not very demanding on the computer. Because vertex shaders are light, it was possible to use many effects without the computer slowing down too much. An important downside, though, was that these effects tend to look unpolished and raw. I like that aesthetic, but for commercial projects that was a problem. That is one of the reasons I invested a lot in the real-time motion graphics software of SenS in 2007-2011.
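To give an idea of what such a vertex shader does, here is a minimal illustrative sketch (my own simplified example, not actual AVblock code). It ripples the video plane with a sine wave; the uniforms ‘audioLevel’ and ‘time’ are hypothetical values that a host like SenS could supply every frame. Because the code runs once per vertex instead of once per pixel, effects like this are cheap for the graphics hardware.

```glsl
// Illustrative sketch only, not actual AVblock code.
// "audioLevel" and "time" are hypothetical uniforms updated by the host.
uniform float audioLevel; // assumed range 0.0 - 1.0 (current loudness)
uniform float time;       // seconds since the effect started

void main()
{
    vec4 pos = gl_Vertex;

    // Ripple the video plane: each vertex is pushed forwards or backwards
    // along a sine wave, and the louder the audio, the deeper the ripple.
    pos.z += sin(pos.x * 10.0 + time * 4.0) * audioLevel;

    // Standard transform and pass-through of the video texture coordinates.
    gl_Position = gl_ModelViewProjectionMatrix * pos;
    gl_TexCoord[0] = gl_MultiTexCoord0;
}
```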

Real-time audio-visual effects and generators

I loved AVblock, but when Max for Live (a flexible platform to create plug-ins for Ableton Live) was released it was clear that this concept was outdated. Seamless integration into music software is crucial for an AV instrument, and Max for Live offered many new ways to do that much better. Besides that, developing audio effects and visual effects is very time-consuming and therefore expensive. With Max for Live the audio effects of Ableton Live can be used to create audio-visual effects as well.

In 2017 EboSuite was launched. Finally, the AV instrument I had been working on with EboStudio became available to everyone. With EboSuite it is very easy to combine visual effects with audio effects in Ableton Live. All audio effects and instruments available in Ableton Live can be used.

EboSuite supports ISF shaders. ISF (Interactive Shader Format) is an open format for creating visual effects and visual generators. Many people create these ISF shaders and share them on the internet, and it is very easy to use them in EboSuite. This makes the visual design possibilities of EboSuite virtually endless. And not just to manipulate visuals, but also to generate completely new ones. And since computer hardware is very powerful nowadays, these visual effects and shapes look very slick and cool. Try it out for yourself!
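To give an impression of the format, here is a minimal illustrative ISF effect, a sketch rather than one of the shaders shipped with EboSuite. The JSON comment block at the top declares the inputs (the ‘amount’ parameter is just an example name), and the GLSL code below it bends the incoming video frame with an animated sine wave.

```glsl
/*{
    "DESCRIPTION": "Minimal wave-distortion example (illustrative sketch)",
    "ISFVSN": "2",
    "CATEGORIES": ["Distortion"],
    "INPUTS": [
        { "NAME": "inputImage", "TYPE": "image" },
        { "NAME": "amount", "TYPE": "float", "DEFAULT": 0.05, "MIN": 0.0, "MAX": 0.25 }
    ]
}*/

void main()
{
    // Normalised coordinate of the current pixel, provided by the ISF host.
    vec2 uv = isf_FragNormCoord;

    // Shift the sampling position sideways with an animated sine wave;
    // "amount" controls how strong the distortion is.
    uv.x += sin(uv.y * 20.0 + TIME * 2.0) * amount;

    // Sample the incoming video frame at the displaced coordinate.
    gl_FragColor = IMG_NORM_PIXEL(inputImage, uv);
}
```

In an ISF host, the parameters declared in the INPUTS list are typically exposed as controls, so a shader like this can be tweaked while it runs.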

Bonus