Music and the Brain: Pt. 3 // Joel Eaton
In this final post in the series we’ll follow up on some of the ideas from the last article on using external stimuli for brainwave control, and look at examples of innovative applications where people have used brainwaves to make some truly stunning works of art.
Firstly, a quick recap. Previously we looked at a method of providing four separate means of control via gazing at icons that flash at different speeds. These four controllers were connected to different musical instruments, allowing a patient with paralysis to play musical notes along to a backing track.
Essentially, these four controllers can be connected (or ‘mapped’) to any type of musical command with a little bit of technical wizardry. One of the most interesting applications of this technology, especially when performing with it or communicating it to an audience, is how it can be used alongside traditional musical instruments. With that in mind, for the last few months I’ve been working on a performance piece, with composer Eduardo Miranda, for a string quartet and a brain quartet. The brain quartet choose different sections of a musical score for each musician to play: each member of the brain quartet controls the score for one member of the string quartet, who reads the notes from a computer screen. The notes are synchronised for the musicians to play in time, and every 24 seconds (exactly 8 bars of notes) the brain quartet choose the next piece of music.
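To make the mapping idea concrete, here is a minimal sketch of how a detected control channel might select the score fragment a musician sees next. Everything here — the fragment names, the function, the 1–4 channel numbering — is a hypothetical illustration, not the software actually used in the piece:

```python
# A toy sketch of mapping four brain-control channels to score choices.
# All names are hypothetical; the piece's actual software is not described
# in this article.

SECTION_SECONDS = 24  # one choice every 24 seconds (8 bars of notes)

# Hypothetical bank of score fragments each brain performer can choose from.
SCORE_FRAGMENTS = {
    1: "fragment_A",
    2: "fragment_B",
    3: "fragment_C",
    4: "fragment_D",
}

def next_fragment(control_channel: int) -> str:
    """Map a detected control channel (1-4) to the score fragment
    displayed on that musician's screen for the next 8 bars."""
    if control_channel not in SCORE_FRAGMENTS:
        raise ValueError("expected a control channel in the range 1-4")
    return SCORE_FRAGMENTS[control_channel]
```

The point of the sketch is simply that once a reliable four-way control signal exists, ‘mapping’ it to a musical command is ordinary software plumbing — the hard part is the brainwave detection itself.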
Perhaps the most exciting outcome of this (to me anyway) is that each member of the brain quartet does not act alone during the performance: the combined music of the quartet acts as a feedback loop influencing the musical direction. One of the brain performers commented that they felt part of a greater body, where the overall music affected their decision-making and that, in a sense, they were working together. Building technology where people (of all abilities) work together towards a common goal, such as composing music, is fascinating, especially when the language is music. So what does it look and sound like? Below is an exclusive clip of the rehearsal for the piece’s première at the Peninsula Arts Contemporary Music Festival. A full documentary on the making of the piece and the performance is currently in production.
Moving away from this explicit control and back towards one of the first implementations of brain music (see Alvin Lucier’s Music for Solo Performer, discussed in the first article) is an area that offers some really amazing and controversial potential. What if we could measure human emotions in brainwaves? A lot of research has shown that it is possible to detect emotional ‘indicators’ within brainwaves. It’s good to remember that emotions are complex, intangible and extremely subjective concepts. However, actual levels of primitive emotions (although extremely crude) are detectable within EEG readings. It can even be said that the levels of alertness and relaxation detected by Lucier are crude emotional ‘states’, and people are currently exploiting this to create art by communicating emotions. One of the most beautifully presented works is by New York artist Lisa Park. In her project Eunoia, Park sonifies her EEG (converts her brainwaves to sound) through speakers attached to dishes filled with water, creating a stunning and intimate audiovisual reflection of her brainwaves.
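To give a flavour of what ‘sonifying’ EEG can mean in code, here is a toy sketch that estimates signal power in the alpha band (roughly 8–12 Hz, commonly associated with relaxation) and maps it onto the pitch of a tone. This is one illustrative mapping under assumed parameter values — it is not Park’s actual method:

```python
import math

def band_power(samples, sample_rate, low_hz, high_hz):
    """Crude power estimate in a frequency band via a direct DFT
    (no external libraries; fine for short illustrative windows)."""
    n = len(samples)
    power = 0.0
    for k in range(n // 2):
        freq = k * sample_rate / n
        if low_hz <= freq <= high_hz:
            re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = -sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

def sonify(alpha_power, min_hz=220.0, max_hz=880.0, ceiling=100.0):
    """Map alpha-band power onto a pitch range: more relaxation,
    higher tone. The 'ceiling' normalisation value is an assumption."""
    level = min(alpha_power / ceiling, 1.0)  # clamp to 0..1
    return min_hz + level * (max_hz - min_hz)
```

In practice the resulting frequency would drive an oscillator or synthesiser; the design choice worth noting is that the expressive content comes entirely from how the artist chooses the mapping, not from the raw signal itself.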
So what’s next in this unique area of brainwave control and music? Well, how far we can go with brainwaves ultimately comes down to our ability to extract meaningful information from the brain, and our understanding here is rapidly increasing. The Holy Grail is being able to measure thought patterns in brainwaves: literally reading the mind. But then, if we could do that, surely making music would be one of the last things on our minds, right?
Moving slowly towards this, then, is the field of detecting crude levels of emotion within brainwaves. It is a controversial claim, but one with quite a lot of scientific backing nonetheless. But what if we could measure people’s emotional responses to music? For one thing, we could automatically self-medicate with music in response to our mood. If you’re feeling down, your iPod would know exactly what to play to help cheer you up. But what if artists and musicians could directly tap into the moods of their audience? Imagine watching your favourite artist perform, knowing they were selecting songs or music directly in response to how you were feeling at the time. These are the kinds of questions I’m going to be investigating next, through developing a new brainwave system built for audience and performer interaction.
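As a toy illustration of mood-responsive music selection, the sketch below maps crude estimates of valence (pleasantness) and arousal (excitement) — a common two-dimensional model of emotion — onto track categories. The 0–1 scaling, the thresholds and the track categories are all illustrative assumptions, not a description of any real system:

```python
# A toy sketch of 'self-medicating' music selection from crude emotional
# indicators. The model, thresholds and categories are illustrative only.

TRACKS = {
    ("low", "low"): "calm, uplifting track",      # feeling down and flat
    ("low", "high"): "soothing track",            # agitated or stressed
    ("high", "low"): "gentle, energising track",  # content but sluggish
    ("high", "high"): "upbeat track",             # happy and energetic
}

def pick_track(valence: float, arousal: float) -> str:
    """Choose a track category from crude valence and arousal
    estimates, each assumed to be scaled to the range 0-1."""
    v = "high" if valence >= 0.5 else "low"
    a = "high" if arousal >= 0.5 else "low"
    return TRACKS[(v, a)]
```

Even this crude four-way split hints at the appeal of the idea: the listener never has to name their mood, because the system responds to the measured indicators directly.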
© Joel Eaton is a composer, programmer, maker and researcher currently working in the field of Brain-Computer Music Interfacing.