Audio-visual effects play an important role in my work. Audio-visual effects (AVFX) manipulate the image and sound of a video clip or video stream simultaneously. AVFX can change the character of a video, but can also create completely new images and sounds. Keeping the link between image and sound intact while they are distorted makes the effect much more impactful. It is a great source of artistic expression.
Before 1996 I used the Commodore Amiga to make music and visuals. The Amiga was a great, creative computer, but its capabilities to manipulate sound and video were very limited. With DPaint it was easy to paint over digitized video frames and manipulate them, but this required a lot of manual labour, and only a small number of frames could be used, in low resolution and with few colours. Software to combine audio and video effects didn’t exist on the Amiga.
With the C++ programming language it was possible to create your own animated visual effects on the Commodore Amiga, but that was a slow process and it was hard to create complex effects. During my studies at the Image and Media Technology course at the School of Arts, I created a few animated visual effects this way in 1992.
HyperPrism and After Effects
When I got a Macintosh computer in 1996, all this changed. Adobe Premiere and After Effects had a nice collection of visual effects, and HyperPrism was a very flexible tool to manipulate sound in creative ways. Since then, audio-visual effects have been an important part of my art: painting with image and sound.
This made producing tracks considerably more labour intensive, however, principally because the visual and audio parts of a sample had to be processed in different programs, and computers at the time were not yet powerful enough to perform these processes quickly. Even simple compositions took many minutes or even hours to preview or render. That limited my art. I didn’t finish the GarbiTch FilterFunk remix in 1998, for example, because it took too much time to visualize all the audio effects after I made the music.
skrtZz & (semi) real-time AVFX
From 1997 I used Xpose/SampleCell and from 1999 Image/ine to trigger videos live. In the 1990s computer hardware was too slow to process audio-visual effects in real-time. To solve this, I used audio-visual samples with pre-rendered audio-visual effects. Jumping through the video created the illusion that I was using audio-visual effects live. In a way I actually was; for me it felt like that anyway. SkrtZzing with pre-rendered effects was a lot of fun and offered a lot of creative freedom. It played an important role in my SenSorSuit performances and in the DVJ 2.0 project. Even the timing of complex effects was easy to control live this way, without latency!
Usually I made sure the pre-rendered AVFX in the sample had a simple, logical development over time, starting with no effect and then adding more effects as time progresses. This way, I could easily move from a ‘clean’ visual/sound to the distorted visual/sound, resulting in smooth, natural sounding/looking skrtZz compositions.
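The idea behind this can be sketched in a few lines of code. This is a hypothetical illustration, not actual software I used: it assumes a sample whose effect intensity ramps up linearly from clean (start) to fully distorted (end), so picking a playback position is equivalent to dialing in an effect amount live.

```python
# Hypothetical sketch of the skrtZz idea: the sample is pre-rendered so the
# effect intensity grows linearly from 0 (clean) at the start to 1 (fully
# distorted) at the end. Live, we jump to a playback position based on the
# intensity we want right now, which feels like controlling the effect in
# real-time, with zero latency.

def skrtzz_position(intensity, sample_length_s):
    """Map a desired effect intensity (0.0-1.0) to a time offset in seconds
    inside a pre-rendered sample whose effect ramps up linearly."""
    intensity = max(0.0, min(1.0, intensity))  # clamp to the valid range
    return intensity * sample_length_s

# A 4-second pre-rendered AVFX sample:
print(skrtzz_position(0.0, 4.0))  # clean start of the sample -> 0.0
print(skrtzz_position(0.5, 4.0))  # halfway distorted -> 2.0
print(skrtzz_position(1.0, 4.0))  # fully distorted end -> 4.0
```

Because the image and sound were rendered together, jumping to any position keeps the audio-visual link intact by construction.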
In 1998 I developed the skrtZz technique to work with video in a more creative, flexible and musical way, and to be able to work with audio-visual effects in (semi) real-time. This changed the way I used audio-visual effects: with the skrtZz technique, only a short sample is needed to create an extensive audio-visual effect composition.
3D audio-visual effects
In 2001, Adobe After Effects added the ability to work with video in 3D. This gave my audio-visual effects a new dimension (and made render times even longer :/). In the DVJ 2.0 project I experimented a lot with 3DAVFX. Aside from 3DAVFX compositions and skrtZz solos (see example below for an acid style solo), I created rhythmic re-edits to make 3DAVFX grooves (see the Kylie Minogue example below).
Audio-visual effects and visual music (the art of translating sounds or music into an analogous visual presentation) are very closely related. In my work these two art forms merge into each other. I am interested in the way concrete images/sounds that convey a clear message and abstract images/sounds that work on an emotional level can amplify each other’s effect. Keeping the link between image and sound intact while they change from concrete to abstract and vice versa takes the viewer/listener fluently from one artistic world into another.
Real-time & Non-real-time AVFX
In 1995 I started developing my own instruments to work with music and visuals simultaneously, in real-time, in one unified creative process. This led to the development of SenS in 2004. In 2005 the ability to work with 3D audio-visual effects in real-time was added to SenS. Because of hardware limitations these effects had a limited resolution, which meant their shapes were rough and unpolished. To create more interesting and diverse shapes I combined these effects with pre-rendered non-real-time effects. This way I could create advanced audio-visual effects and still play with them in real-time (the next level of the skrtZz concept).
To work with visuals in a musical way, it is important to be able to use effects in real-time, so you can jam with new ideas and stay in sync with the music live. Working with pre-rendered effects interrupts this creative process: it is time consuming to design and render them. Therefore I stopped using pre-rendered effects in 2006 and invested a lot in the development of real-time effects in 2007-2011.
In 2007 AVblock, a very flexible way of designing audio-visual effects, was added to SenS. AVblock had 125 3D video effects (vertex shaders), 132 audio effects and 19 2D video effects, and made it easy to combine effects to create even more shapes and sounds.
With Nenad Popov I created many 3D visual effects for AVblock, so-called ‘vertex shaders’. In 2007 computer hardware was not yet capable of processing complex effects while playing and mixing videos and running music software at the same time. A great aspect of vertex shaders is that they are not very demanding for the computer’s processor. Because vertex shaders are light, it was possible to use many effects without slowing the computer down too much. An important downside, though, was that these effects tend to look unpolished and raw. I like that aesthetic, but for commercial projects it was a problem. That is one of the reasons I invested a lot in the real-time motion graphics software of SenS in 2007-2011.
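Why vertex effects are so light can be shown with a small sketch. This is an illustration of the general principle, not SenS/AVblock code: a vertex-shader style effect runs once per vertex of the mesh the video is mapped onto, not once per pixel, and a coarse mesh has far fewer vertices than a frame has pixels.

```python
import math

# Illustrative sketch: the classic 'wobbly video plane' vertex effect.
# A 16x16 grid has only 256 vertices to displace per frame, while a
# 640x480 video frame has 307200 pixels, so per-vertex work is orders
# of magnitude cheaper than per-pixel work.

def wave_displace(grid_w, grid_h, time, amplitude=0.05, freq=4.0):
    """Displace each vertex of a (grid_w x grid_h) unit grid with a
    travelling sine wave; the video texture stretches along with it."""
    verts = []
    for j in range(grid_h):
        for i in range(grid_w):
            x = i / (grid_w - 1)
            y = j / (grid_h - 1)
            # vertical offset depends on x and time -> travelling wave
            y += amplitude * math.sin(2 * math.pi * (freq * x + time))
            verts.append((x, y))
    return verts

verts = wave_displace(16, 16, time=0.0)
print(len(verts))  # 256 vertices to process, vs 307200 pixels per frame
```

The coarse grid is also why such effects look raw: the shape is only as smooth as the mesh, which is exactly the unpolished aesthetic described above.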
With Timo Rozendal I created many audio effects for AVblock, because audio effects that morph sounds into completely new sounds are rare. We made effects to generate bass sounds from voice samples, drum sounds from random sounds, and many other creative effects.
Real-time AVFX & Generators
I loved AVblock, but when Max for Live (a flexible platform to create plug-ins for Ableton Live) was released, it was clear that this concept was outdated. Seamless integration into music software is crucial for an AV instrument, and Max for Live offered many new ways to do that much better. Besides that, developing audio effects and visual effects is very time consuming and therefore expensive. With Max for Live, the audio effects of Ableton Live can be used to create audio-visual effects as well.
In 2017 EboSuite was launched. Finally the AV instrument I had been working on with EboStudio became available to anyone. With EboSuite it is very easy to combine visual effects with audio effects in Ableton Live. All audio effects and instruments available in Ableton Live can be used.
EboSuite supports ISF shaders. ISF is an open-source format for creating visual effects and visual generators. Many people create these ISF shaders and share them on the internet, and it is very easy to use them in EboSuite. This makes the visual design possibilities of EboSuite virtually endless. And not just to manipulate visuals, but also to generate completely new visuals. And since computer hardware is very powerful nowadays, these visual effects and shapes look very slick and cool. Try it out for yourself!
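To give an idea of what an ISF shader looks like: an ISF file is GLSL fragment code with a JSON metadata block in a comment at the top, describing the effect and its inputs. The shader below is a toy example for illustration (a simple colour invert), not a real EboSuite effect; the small parser is a sketch showing how a host can read that metadata.

```python
import json

# An ISF shader embeds a JSON metadata dict in the first /*{ ... }*/
# comment of the GLSL source. The toy shader below inverts the incoming
# video; IMG_THIS_PIXEL is an ISF built-in for sampling the input image.

isf_source = """/*{
  "DESCRIPTION": "invert the incoming video",
  "CATEGORIES": ["Color Effect"],
  "INPUTS": [{"NAME": "inputImage", "TYPE": "image"}]
}*/
void main() {
  vec4 c = IMG_THIS_PIXEL(inputImage);
  gl_FragColor = vec4(1.0 - c.rgb, c.a);
}"""

def parse_isf_header(source):
    """Extract the JSON metadata dict from an ISF shader string."""
    start = source.index("/*") + 2  # skip past the comment opener
    end = source.index("*/")        # up to the comment closer
    return json.loads(source[start:end])

meta = parse_isf_header(isf_source)
print(meta["INPUTS"][0]["NAME"])  # -> inputImage
```

Because the metadata declares the inputs, a host like EboSuite can automatically expose them as controls without knowing anything about the shader's internals.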