Nov 28 / Ruth Montgomery blogspot

Residency and live performance for Dada Festival/Syndrome in Liverpool

DADA Performance Rough cut from davelynch on Vimeo.


Taking place at the Bluecoat, the space was set up with a large central screen, an electric piano keyboard to the left and a condenser microphone to the centre right. Off to the side of the space, the Frozen Music Collective set up their equipment and computer, and in the central part of the stage a bass woofer sat inside a scaffold rig for a cymatics experiment: the study of visible sound vibration.

The performances began with Ruth Montgomery, who played a beautifully delicate and impressively dexterous improvisation on the flute. Towards the end of the piece, other technological sounds began to interweave with her playing: a low hum and other non-musical sounds, including a low-level theremin-like tone which, for lack of a better description, created a sense of pure science fiction.

Following this, the Frozen Music Collective discussed how they use technology and science to create visual art from the electromagnetic energy emitted by the brain in response to stimulus, and from levels of brain activity. This explained the images which had appeared on the main screen throughout the performance: a chart showing the spectrum of brainwaves (delta, theta, alpha, beta and gamma) and their associated frequencies. Colour bursts and pulses had been appearing on the screen during Ruth's performance, and I spotted that deaf pianist Danny Lane, though he hadn't yet played, was present on stage and wearing a small electrode on the side of his head.
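For readers curious about the chart described above, the brainwave bands and their frequency ranges can be sketched roughly as follows. This is a minimal illustration, assuming common textbook cutoffs; exact boundaries vary between sources, and this is not the collective's own software.

```python
# Approximate EEG band boundaries in Hz (common textbook values;
# exact cutoffs differ between sources).
BANDS = [
    ("delta", 0.5, 4.0),
    ("theta", 4.0, 8.0),
    ("alpha", 8.0, 13.0),
    ("beta", 13.0, 30.0),
    ("gamma", 30.0, 100.0),
]

def band_for(freq_hz):
    """Return the name of the brainwave band containing freq_hz."""
    for name, low, high in BANDS:
        if low <= freq_hz < high:
            return name
    return None

print(band_for(10.0))  # alpha (the band associated with relaxed wakefulness)
```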

The electrode was part of an EEG setup that had been recording impulses from Danny, in particular the muscular impulses from blinking that were triggering the bursts on screen. The signals were processed live through the Frozen Music Collective's computers and software, and came into play as the experiments progressed through the improvised duets between the musicians.
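As a rough illustration of the mechanism described above, a blink in an EEG recording shows up as a large-amplitude spike, so a simple threshold crossing can fire a visual burst. This is only a hedged sketch of that idea; the collective's actual signal chain is not described in the piece, and real pipelines would filter and debounce the signal first.

```python
def blink_triggers(samples, threshold=150.0):
    """Yield sample indices where the signal amplitude (in microvolts)
    first crosses the threshold from below -- a crude blink detector.
    Each yielded index is where a visual burst would be fired."""
    above = False
    for i, v in enumerate(samples):
        if not above and abs(v) >= threshold:
            above = True
            yield i          # rising edge: fire a burst
        elif above and abs(v) < threshold:
            above = False    # re-arm once the spike has passed

# Illustrative signal: two blink-like spikes amid low-level activity.
signal = [5, 8, 200, 220, 30, 4, 180, 12]
print(list(blink_triggers(signal)))  # [2, 6]
```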

Alongside the technology, the improvisations drew on the parallels between musical direction and British Sign Language (BSL), an invaluable tool that had helped these musicians during their research and rehearsal process over Skype in the run-up to the performance.

A mimetic 'juggling' performance from Ruth communicated tangibly to the audience how closely the two are related in creating a way of interpreting action and response to make music. The way the performance transitioned from live musical performance to tech demonstration was fascinating. Even though some things did not quite go to plan technically, you felt part of the experiment, learning alongside the amazing neuroscientists and coders who worked to keep the connection between what they were measuring and what they were processing to perform with simultaneously.

It would be impossible to explore in depth everything I witnessed and learned here, but one of the most profound things the experience left me with was the revelation that the physical sense of hearing should not be, and evidently is not, a hindrance to a deaf musician. Plenty of hearing people are not musical, and as a musical person myself, it made me reflect on what exactly it is within me that responds in order to have an innate 'feel' for musicality, and to be able to process and produce music in response to it.

I saw how studying music theory and learning to read a score might allow someone to master co-ordination in solo and ensemble work, but the fluidity and imagination required to improvise is by no means a challenge unique to deaf musicians. Having a dedicated group of collaborators all asking the question 'Where does music come from?' was what made this event so special, and it culminated in a superb evening of 'experiment' that left you inspired, whether you are a hearing person or part of the deaf community.

This event was part of DaDaFest 2014 and took place at the Bluecoat. You can find out more about future Syndrome live art events online.


Mixing science, music and the visual arts to explore the nature of performance and deafness, using realtime brainwave scanning to generate a live improvised score. Susan Bennett witnesses a performance that was the culmination of a 4-day residency with the Frozen Music Collective, deaf musicians Ruth Montgomery and Danny Lane, and a team of neuroscientists and coders.

Syndrome 3.1

It was strange to watch someone's brain signals pulsating, gyrating, expanding and contracting on a full-size screen in a motion akin to amoebic breathing. Areas of theta, delta, alpha, beta and gamma brain activity, high, low and just plain ordinary, mapped out the boundaries of perception, while the central core flashed intermittently in star bursts according to the intensity of someone's reaction to a haunting, lilting flute played by Ruth Montgomery. It was weirdly wonderful.

We were taking part in a presentation for DaDaFest showing the products of a four-day residency by a team of neuroscientists, coders, musicians from Music and the Deaf, and the Frozen Music Collective, a new music and multimedia group exploring the symbiosis of the human brain and technology.

Exploration was the key word; it came up time and time again. Matching neuron activity to technology is usually the province of medical science, but at the Bluecoat in Liverpool the alternative artistic interpretations were simply crazily beautiful, beguiling and startling. I pondered, irreverently, what Freud would have done with a tool such as this.

For there was no hiding a natural reaction. Once musician Danny Lane was wired up to the headset, his instinctive response to music was plain to all who could see the screen. His brain signals were transmitted to computer programmes, which analysed, filtered and refined the impulses in a way that mystified, so that they could be mixed, played and enjoyed.

We were treated to a precisely timed and mimed duet between the deaf musicians, tossing illusory balls between them, keyboard notes accenting the actions. An on-screen BSL conversation morphed into many hands, Ganesh-like arms and distorted features as the flautist, wired up to one of these headsets, commanded music, live shapes and performance as well as directing the keyboard player. Her brain activity was turned into sound and light via a graphical interface to produce a live score.

No, I don't pretend to understand how all this works, but I watched with interest the demonstration which aimed to show how shapes could be related to musical notes. Using an elaborate Pensieve-type bowl familiar to Harry Potter fans, and with the lights dimmed, we were enthralled as a crucible of light, topped up by copious jugs of water, formed pools of luminous green which became visual representations of musical vibrations.

In response, shapes formed, dissolved and swirled like the essence of life itself. Fragments squirmed and multiplied, then shot off, scattering in all directions around a central core of white pulsing dotted light, while we were projected further and further away, millions of light years backwards through the universe. Music followed us, or maybe we took it with us: little trills on the flute, ripples off the keyboard, deep pulsing throbbing reverberations, rising triumphantly, sighing back to earth. Part mystical, part musical, part meditative, it was indeed an exploration.

So what next? More exploration… inclusion of visually impaired people in this brainy experiment… more time to reflect, refine the tools and techniques?  Watch this collective space!
