
Dot Product & Panther Panther!

Collaboration (2016)

I started working with Dot Product at the beginning of 2015. Adam Wedge asked whether I’d be interested in doing visuals for their debut album, due out in early 2016 through Osiris Records. They were interested in having a visual side to their live performance and had a couple of gigs booked. This fit well with my interest in performing Visual Music, and with my preference for firm deadlines to work towards. The visuals would be launched alongside Dot Product’s release: in effect a long visual counterpart to the album, a short film.

Dot Product’s music is very textural, bass-heavy and atmospheric. It is in no way light or melodic, yet its dissonance and use of sculpted sound are harmonious with the world the compositions create. It’s like a dark dream through unknown spaces between particles and atoms. Whilst there is no beat to ground these sounds, there are grooves made up of accidental field-recording rhythms that repeat at regular intervals, emphasizing a sense of journeying through the otherworldly abstract: a space where sounds act upon the listener, forcing new experiences in a quasi-aggressive fashion.

Speaking to Adam and Christopher Jarman about the processes behind their music, I became interested in their approach to sound and field recordings. They explained how they would go around urban environments looking for spaces and objects to sample; using digital recorders and microphones, they would gather libraries of sound that they would then tear apart in their studios. Through this manipulation they intended to get at the fabric of what makes up a sound and break it open from within to create new, exciting sounds.


I was inspired by the idea of zooming into a sound as if it were a static, physical object and looking at the particles that make it up. The space between the microscopic would be the setting for these sounds to come to life, their constituent parts nothing but dust waiting to be shaped by air and frequency. It was the atmosphere within the music that I wanted to capture through my visuals. I didn’t want to mark every change in the music, but rather to project outwards to the audience the worlds I felt the music inhabited.

The name of the act is Dot Product, and so I began sketching ideas based around dots and particles. The energy of the music would create different shapes in my mind, but I wasn’t certain I could reproduce these accurately, so I preferred to create digital environments that I could afterwards manipulate through live performance. I was also interested in portraying the microscopic, as I’d seen at the Punto y Raya Festival in Spain. I began researching USB microscopes as visual inputs for live VJing, but felt the subject matter would detract from the abstract nature of the music. I wanted to go smaller, to the world of particles, and so began experimenting with virtual representations of cells, blood streams and other biological objects in After Effects.

Slowly the aesthetic for the work began to emerge. I opted for a minimal use of colour, as the vibe of the music was heavier and more serious than the work I had been doing for the Tropical Bass scene. I decided early on that I would use black and white for most of the visuals, but that flashes of red would help me accentuate specific moments in the music during live performances.

 

For our first performance in April 2015 I only just managed to create enough material for the duration of the set. Particles moving through a black space towards the viewer, or in random directions across the screen, would be engulfed in red flashes from abstract shapes in the background. Visual atmospheres created through the digital representation of light and dust worked well with the atmospheres created through the sound. To these images I was able to add effects and other tools I had been developing as a VJ. Dot screens, strobe flashing and live triggering of clips allowed me to interact with the music in a very physical fashion, which helped me communicate the musical ideas more succinctly.
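For the technically curious, here is a minimal sketch, in Python with NumPy and Pillow rather than the After Effects setup actually used, of the kind of particle field described above: white dots drifting towards the viewer through black space, rendered frame by frame. All counts, sizes and speeds are illustrative guesses, not values from the project.

```python
import numpy as np
from PIL import Image

W, H, N, FRAMES = 1280, 720, 400, 48
rng = np.random.default_rng(7)

# Particles live in a loose cube in front of the camera; the third column is depth.
pts = rng.uniform(-1.0, 1.0, size=(N, 3))
pts[:, 2] = rng.uniform(0.2, 1.0, size=N)

for f in range(FRAMES):
    frame = np.zeros((H, W), dtype=np.uint8)
    pts[:, 2] -= 0.01                        # drift towards the viewer
    pts[pts[:, 2] < 0.05, 2] = 1.0           # recycle dots that pass the camera
    # Simple perspective projection onto the image plane.
    x = (pts[:, 0] / pts[:, 2] * 0.5 + 0.5) * (W - 1)
    y = (pts[:, 1] / pts[:, 2] * 0.5 + 0.5) * (H - 1)
    ok = (x >= 0) & (x < W) & (y >= 0) & (y < H)
    # Closer dots (smaller depth) render brighter.
    b = np.clip((1.0 - pts[ok, 2]) * 255, 40, 255).astype(np.uint8)
    frame[y[ok].astype(int), x[ok].astype(int)] = b
    Image.fromarray(frame, mode="L").save(f"dots_{f:03d}.png")
```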


There were problems, of course, with these first performances. Mainly, I did not have quite enough material for all the compositional changes in a live set, so I sometimes had to improvise quite a lot with the same images. It was also difficult to guess when rhythmical changes would take place, as the pieces were so irregular. Sometimes I would trigger a clip just after or before a big sound occurred, making the visual representation awkward.

The project went on the back burner for almost nine months, but as the launch deadline approached I started working again on creating a 45-minute video for the launch.

 

I looked at the visuals I had done for them before and decided to improve the quality of the animations. I wanted the images to look beautiful in their dark atmosphere, textured and vibrant, just as the sounds were textured and vibrant themselves. Dot Product’s compositions don’t sound as digitally clean as some electroacoustic music; there is an analogue feel to them, owing to their recording and mixing processes, which I wanted to reflect in the visuals.

 

For this next phase of the project I created many more motifs to use. In practice, this means experimenting in After Effects until I find a visual composition that excites me, then animating different parameters within it to create short animations of around 8-16 seconds, forming a specific visual motif. These motifs could be a sphere of dots, simulated light through dust particles, snow blowing in the dark, tentacle-like strands dancing over red backgrounds, and more. In this way I can generate between 10 and 20 clips to represent one of the audio compositions in a set, along the lines of the sketch below.
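To give a feel for the batch aspect of this process, here is a hedged sketch: render_motif() is a hypothetical stand-in for rendering one composition (like the particle sketch earlier), and the loop produces a dozen 8-16 second variations by sampling a few of its parameters, at an assumed 25 fps. Every name and range here is illustrative.

```python
import random

def render_motif(name, frames, density, drift, red_flash):
    """Hypothetical stand-in: render `frames` images of one motif
    variation, along the lines of the particle sketch earlier."""
    ...

random.seed(1)
for i in range(12):                          # 10-20 clips per composition
    seconds = random.uniform(8, 16)          # clip lengths described above
    render_motif(
        name=f"motif_{i:02d}",
        frames=int(seconds * 25),            # assumed 25 fps
        density=random.randint(200, 800),    # particle count
        drift=random.uniform(0.002, 0.02),   # motion speed
        red_flash=random.random() < 0.3,     # occasional red accent
    )
```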

 

I also researched ways of creating film-like, analogue textures on digital animations: adding grain, glitches, TV lines, holographic glows and other such adornments. The resulting clips worked very well with the music and enhanced the detail of the worlds I had created to represent it. I was particularly pleased with images where it was difficult to discern whether what was being watched had been virtually created or filmed.
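This is the sort of treatment meant, sketched in Python with NumPy and Pillow rather than the compositing tools actually used: per-frame Gaussian grain plus faint scanlines. The filenames and strengths are purely illustrative.

```python
import numpy as np
from PIL import Image

def filmify(path_in, path_out, grain=18.0, line_strength=0.15):
    """Add film grain and faint TV scanlines to one rendered frame."""
    img = np.asarray(Image.open(path_in).convert("L"), dtype=np.float32)
    h, w = img.shape
    # Gaussian grain, regenerated per frame so it flickers like film stock.
    img += np.random.normal(0.0, grain, size=(h, w))
    # Darken every other row slightly to fake interlaced TV lines.
    img[::2, :] *= (1.0 - line_strength)
    Image.fromarray(np.clip(img, 0, 255).astype(np.uint8), "L").save(path_out)

filmify("dots_000.png", "dots_000_grain.png")
```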

 

For live performance I mapped all my clips to individual keys on my MIDI keyboard, and various effects parameters to my Launch XL controller. To make sure there was still an element of audio-visual synchronization during important sections, I set up reactive parameters within Resolume so that certain frequencies would reveal specific images or cause desired changes in specific image components.
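The mapping itself is configured inside Resolume, but the underlying idea can be sketched like this: measure the energy of a low-frequency band in each incoming audio block and smooth it into a 0-1 value that could drive, say, a layer’s opacity. The band edges, normalisation and smoothing factor are guesses, not Resolume’s actual internals.

```python
import numpy as np

SR = 44100          # assumed sample rate
smoothed = 0.0

def opacity_from_audio(audio_block, lo=30.0, hi=120.0, attack=0.6):
    """Map sub-bass energy in one block of samples to a 0..1 opacity."""
    global smoothed
    spectrum = np.abs(np.fft.rfft(audio_block))
    freqs = np.fft.rfftfreq(len(audio_block), d=1.0 / SR)
    band = spectrum[(freqs >= lo) & (freqs <= hi)].mean()
    level = min(band / 50.0, 1.0)            # crude normalisation
    smoothed = attack * level + (1 - attack) * smoothed
    return smoothed

# Example with a synthetic 60 Hz sine block standing in for live input:
t = np.arange(2048) / SR
print(opacity_from_audio(np.sin(2 * np.pi * 60 * t)))
```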

I began considering the difficulty of creating a 50-minute video with this approach. Whilst I’d be able to create some very accurate interactions through live manipulation, triggering video clips would still remain haphazard, since all the pieces had strange time signatures and no particular rhythm I could easily follow. To overcome this I recorded in Ableton Live a MIDI map of when I wanted certain images to appear. My software was rigged so that any MIDI note played in Ableton would trigger a specific video in Resolume. In this way I was able to make a rough recording that I later edited meticulously, so that image cuts would be as accurate as possible.
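To illustrate the note-to-clip rig (Resolume can learn MIDI mappings directly; the port name, note numbers and trigger_clip() below are hypothetical stand-ins for that configuration), a listener built on the mido library could forward each note-on from Ableton’s playback to a video clip slot:

```python
import mido

# Hypothetical mapping from MIDI note numbers to clip names.
NOTE_TO_CLIP = {60: "particles_sphere", 62: "dust_light", 64: "red_tentacles"}

def trigger_clip(name):
    print(f"trigger video clip: {name}")    # stand-in for the Resolume side

# Virtual MIDI port names vary by system and routing setup.
with mido.open_input("from Ableton") as port:
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0:
            clip = NOTE_TO_CLIP.get(msg.note)
            if clip:
                trigger_clip(clip)
```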


With the MIDI map done, I could then “play” it in time with the music whilst focusing on recording live effects manipulation through Resolume. In theory this should have been an easy process, but my computer couldn’t handle all these tasks at once, especially when there were too many effects happening at the same time. It kept crashing and would stop recording, sometimes after only 8 seconds had passed.

I wasted a whole day trying a different procedure, this time editing to the waveform in Premiere Pro. I eventually had to accept that this method could not give me the versatility and hands-on manipulation I could achieve with Resolume. Furthermore, in Resolume I can layer clips and manipulate them in time with the music with such ease that attempting similar effects through keyframing became furiously frustrating.

 

I bit the bullet and recorded my live effects manipulation through Resolume in sections, sometimes only a few seconds long. However, once I’d gone through the whole album and pieced everything together in Premiere Pro, I was amazed at how well it all turned out. The sections flowed seamlessly from one to the next, and my live improvisation came surprisingly close to expressing the music flawlessly, sometimes through happy accidents.

 

Live performances worked well too. The issue of not always triggering the video in time with a particular change in the music was largely solved by the audio-reactive features in Resolume, which, although a little limited, did the job fine.
