Last month I was involved, along with a few others, in a research and development (R&D) project in London led by Japanese musician Nao Masuda. It was an experimental process that looked into ways of visually describing rhythm and notation through the use of Sign Language and physical movement.
When Nao first contacted me, she told me her ideas and vision for the project. I thought it was interesting that a hearing person with very little signing experience herself wanted to explore the subject of music and turn it into a kind of visual theatre. Although she had exposure to sign language in her line of work, her intention was not to make it a “linguistic” project, or even to provide therapy to “help” deaf people access music, but purely to look at it as a unique, standalone art form.
Nao has a very strong background in music theatre, playing many instruments, singing and composing. Generally, music in theatre is created for and around a story – the dialogue, acting, dance choreography, lights, props, and movement as sound effects, for example. In this case, however, Nao wanted to step away from that and create a physical and visual transcription of the music, as opposed to a “personal interpretation” inspired by dance. Since she regarded this as equivalent, in some sense, to using Sign Language, she sought advice, input and opinions from those with fluent signing skills and professional artistic backgrounds.
As a result she set up a team consisting of herself and me as musicians, with Caroline Parker MBE (deaf sign song actress), Mark Smith (choreographer from Deaf Men Dancing and Paralympics artistic director), Jacob Casselden (deaf actor from Tribes) and Avatara Ayuso (a hearing Spanish choreographer) as collaborators.
The two weeks spent with all of them were a phenomenal experience. I honestly have to say I learnt so much during that time, and a lot of energy was put into the research. Instead of simply going straight into doing ‘music sign language’, lots of questions and discussions spiralled off in various interesting directions, which actually shaped the research process. Even Nao was overwhelmed!
For example, Jacob has absolutely no hearing (he does not wear hearing aids at all), but through drumming and movement he has a completely natural and impeccable sense of rhythm and pulse. After watching Nao play the piano and Celtic harp, his questions were: ‘Did the pitch go up and down?’, ‘What’s the rhythm?’ and ‘I cannot feel or hear the flute; what is Ruth playing?’. As strange as it may sound to some people, music is very important to him; inside him is a world of silence, but looking out at buses driving by and people walking, he sees a lot of ‘noises’. He is addicted to feeling sounds entering his body as they interrupt the silence, and he feels he can ‘hear’ them.
Sign song actress Caroline Parker, who has been performing and translating songs into British Sign Language for the last 30 years, has just been recognised with an MBE for theatre and drama. Although Caroline can hear very well with the use of her analogue hearing aids, she was asking questions such as ‘Can you clarify its tone?’, ‘Why do these sounds clash with each other, but not this and that put together?’ and ‘When do major or minor keys come in?’
Questions like these demonstrated the need to find out as much information as possible about music, in order to understand the true behaviour of sound and match it in physical performance. Clashing sounds (dissonance) cause discomfort to the listening ear, and this must be shown physically, perhaps through facial expressions. What about the flute’s vibrato (pitch warble), staccato (short, detached notes) and dynamics (loudness and softness)?
Therefore, during this process Nao and I spent some time explaining pitch, instrumental behaviour, how music is written and recorded, scales, rhythm work, time signatures, note relationships, dissonance and so on. All of this helped everyone make informed choices when it came to composing the music and Sign Language movements.
The very first thing we did was look at each instrument’s language of sound. Nao plays the Japanese taiko drums, Celtic harp, bass guitar, piano, djembe, cajón, cymbals, bells and chimes, and we translated their sounds into words.
During the research, Mark Smith came up with moves that were part dance choreography, part sign language. For example, the Shime Daiko (a high-pitched Japanese drum) became: tangy orange flavour; sharp; strong.
More feedback is in the attachment: Music in Motion Workshops – feedback from hearing participants.
As there are 12 notes in each octave, ranging from low to high – the chromatic scale – choreographers Mark and Avatara devised a move for each of the 12 notes. Physically, they could repeat the 12 notes three times, starting below middle C.
After that, Nao wrote a gentle piece of music for flute and keyboard. We took the chromatic scale movements created by Mark and Avatara as a true representation of each of the notes and rhythms written by Nao. Caroline signed the melody (flute), Jacob signed the piano and Mark signed the bass (also piano).
If you watch our video below, notice how Caroline’s body language shows that the flute is warm, smooth and playing mostly in the middle register. Jacob had the busy job of signing the steady, relaxed semiquaver runs below middle C, while Mark had the full-length root chord on the keyboard. Mark sitting on the floor showed that the root chord was at a lower octave.
Before the performers could visually sign and interpret Nao’s piece, we looked into ways of making the music accessible and understandable for Jacob. Although he can read music and see where the pitches and rhythms are placed, it is not the same. Caroline therefore came up with the smart idea of recording my flute playing the melody on her BlackBerry, which gave him instant access to the pitch vibrations. He then held the BlackBerry in one hand, close to his chest, while the other hand felt the bass being played by Nao on the keyboard. Feeling the melody on the BlackBerry and the bass separately helped him understand how the three separate parts work together (polyphony). I think you will find that his reaction says it all.
Lastly, Caroline, Jacob and Mark came up with their own “Sign Languages” using the five notes of the pentatonic scale, for Nao and me to follow on our instruments.
Interestingly, I think you can see two contrasting performances. The first, written by Nao, is pretty, tuneful and has direction. Her piece had some minor clashes, but they got ‘resolved’ by sounding ‘right’ again. It was structured in ternary form, a three-part musical structure schematised as A–B–A: the A sections were musically identical and the B section offered a simple contrast. This made the music and movements flowing and safe. It began and ended with a pleasing chord, and the performers signed while staying in one fixed place.
Caroline, Mark and Jacob’s piece, however, feels a little unpredictable in terms of what is going to happen musically. It involved a lot of moving around, a chair entry and eye contact. Their composition was based on improvised ideas: moving to swap places, the chair, head nodding, body spinning, smiling and sour faces, and eye contact. Their arm movements displayed a huge range of pitch, speed and articulation. It was interesting and very convincing, with a lot of beautiful note clashes and an ‘improper’-sounding finish, but they were very satisfied with their performance.
It has been a huge learning curve in such a short, crazy space of time. I absolutely loved the project and the artists I worked with. Sign language is so adaptable and can fit in very well with music. Although I wrote at the very beginning that the R&D project was not essentially about providing ‘music’ access to deaf people, but purely about visual music as a Sign Language art form, I am beginning to think that deaf people would warm to the idea.