Friday, March 18, 2011

Mind music


Here’s my latest news story for Nature. Eduardo Miranda is working very much at the experimental edge of electronic music – what I’ve heard has an intriguing ethereal quality which grows on you (well, it did on me).
__________________________________________________

A pianist plays a series of notes, and the woman echoes them on a computerized music system. Then she plays a simple improvised melody over a looped backing track. It doesn’t sound like much of a musical challenge – except that the woman, a stroke victim, is paralysed but for eye, facial and slight head movements. She is making the music purely by thinking.

This is a trial of a computer-music system that interfaces directly with the user’s brain, via electrodes on the scalp that pick up the tiny electrical impulses of neurons. The device, developed by composer and computer-music specialist Eduardo Miranda of the University of Plymouth in England and computer scientists at the University of Essex, should eventually enable people with severe physical disabilities, caused for example by brain or spinal-cord injuries, to make music for recreation or therapeutic purposes.

“This is surely an interesting avenue, and might be very useful for patients”, says Rainer Goebel, a neuroscientist at the University of Maastricht in the Netherlands who works on brain-computer interfacing.

Quite aside from the pleasure that making music offers, its value in therapy – for example, its capacity to awaken atrophied mental and physical functions in neurodegenerative disease – is well attested. But people who have almost no muscle movement at all have generally been excluded from such benefits and can enjoy music only through passive listening.

The development of brain-computer interfaces (BCIs) that enable users to control computer functions by mind alone offers new possibilities for such people. In general these interfaces rely on the user’s ability to learn how to self-induce particular mental states that can be detected by brain-scanning technologies.

Miranda and colleagues have used one of the oldest of these techniques: electroencephalography (EEG), in which electrodes on the scalp pick up faint neural signals. The EEG signal can be processed quickly, allowing fast response times. The instrumentation is cheap and portable in comparison to brain-scanning techniques such as magnetic resonance imaging (MRI) and positron-emission tomography (PET), and operating it requires no expert knowledge.
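The article doesn’t describe the signal processing in detail, but a minimal sketch (in Python, with illustrative sampling rate and window sizes – these are my assumptions, not figures from the paper) shows why EEG lends itself to fast response times: the raw signal can be analysed in short, overlapping windows, so a control decision is available a fraction of a second after the user’s brain state changes.

```python
import numpy as np

FS = 256          # samples per second (illustrative EEG sampling rate)
WINDOW = FS       # 1-second analysis window
STEP = FS // 4    # update the control decision every 250 ms

def stream_windows(eeg):
    """Yield overlapping 1-second windows from a recorded or live EEG buffer."""
    for start in range(0, len(eeg) - WINDOW + 1, STEP):
        yield eeg[start:start + WINDOW]

# Five seconds of simulated single-channel EEG noise
eeg = 1e-6 * np.random.randn(5 * FS)
print(sum(1 for _ in stream_windows(eeg)), "control decisions from 5 s of data")
```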

Whereas previous efforts on BCIs have tended to focus on simple tasks such as moving cursors or other screen icons, Miranda’s team sought to achieve something much more complex: to enable the user to play and compose music.

Miranda says he became aware of the then-emerging field of BCIs over a decade ago while researching how to make music using brainwaves. “When I realized the potential of a musical BCI for the well-being of severely disabled people”, he says, “I couldn’t leave the idea alone. Now I can’t separate this work from my activities as a composer – they are very integrated.”

The trick is to teach the user how to associate particular brain signals with specific tasks by presenting a repeating stimulus – auditory, visual or tactile, say – and getting the user to focus on it. This elicits a distinctive, detectable pattern in the EEG signal. Miranda and colleagues show several flashing ‘buttons’ on a computer screen, each one triggering a musical event. The users ‘push’ a button just by directing their attention to it.
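This description matches what BCI researchers call steady-state visually evoked potential (SSVEP) selection, in which each button flickers at its own rate and attending to one boosts the EEG power at that frequency. The sketch below assumes that approach; the button labels, flicker frequencies and detection method are illustrative, not details from the paper.

```python
import numpy as np

# Hypothetical flicker frequencies (Hz) for four on-screen buttons
BUTTON_FREQS = {"melody": 8.0, "chords": 10.0, "loop": 12.0, "stop": 15.0}

def power_at(eeg, fs, freq, width=0.5):
    """Mean FFT power of a 1-D EEG segment within +/- width Hz of freq."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    mask = (freqs >= freq - width) & (freqs <= freq + width)
    return spectrum[mask].mean()

def select_button(eeg, fs):
    """Return the button whose flicker frequency shows the strongest EEG response."""
    scores = {label: power_at(eeg, fs, f) for label, f in BUTTON_FREQS.items()}
    return max(scores, key=scores.get)

# Example: 2 s of simulated data in which the user attends to the 12 Hz button
fs = 256
t = np.arange(2 * fs) / fs
eeg = 1e-6 * np.sin(2 * np.pi * 12 * t) + 1e-7 * np.random.randn(len(t))
print(select_button(eeg, fs))  # -> "loop"
```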

For example, a button might be used to generate a melody from a pre-selected set of notes. The intensity of the control signal – how ‘hard’ the button is pressed, if you like – can be altered by the user by varying the intensity of attention, and the result is fed back to them visually as a change in the button’s size. In this way, any one of several notes can be selected by mentally altering the intensity of ‘pressing’.
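Under the same assumptions, the second control dimension could be implemented by binning the strength of the detected response into one of a few pre-selected notes, mirroring how ‘pressing harder’ picks a different note. The note set, calibration bounds and normalisation here are hypothetical.

```python
# Hypothetical pre-selected notes and calibration bounds
NOTES = ["C4", "E4", "G4", "C5"]   # lowest to highest

def note_from_intensity(intensity, i_min, i_max):
    """Map a control-signal intensity (e.g. EEG power at the attended flicker
    frequency) onto one of the pre-selected notes; i_min and i_max would be
    calibrated for the individual user."""
    span = max(i_max - i_min, 1e-12)
    level = min(max((intensity - i_min) / span, 0.0), 0.999)  # clamp to [0, 1)
    return NOTES[int(level * len(NOTES))]

# Example: a moderately strong signal selects the third note
print(note_from_intensity(0.6, i_min=0.0, i_max=1.0))  # -> "G4"
```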

With a little practice, this allows users to create a melody just as if they were selecting keys on a piano. And as with learning an instrument, say the researchers, “the more one practices the better one becomes.” They describe it in a forthcoming paper in the journal Music and Medicine [1].

The researchers trialled their system with a female patient at the Royal Hospital for Neuro-disability in London, who is suffering from locked-in syndrome, a form of almost total paralysis caused by brain lesions. During a two-hour session, she got the hang of the system and was eventually playing along with a backing track. She reported that “it was great to be in control again.”

Goebel points out that the patients here still need to be able to control their gaze, which people suffering from total locked-in syndrome cannot. In such partial cases, he says, “one can usually use gaze directly for controlling devices, instead of an EEG system”. But Miranda points out that eye-gazing alone does not permit variations in the intensity of the signal. “Eye gazing is comparable to a mouse or joystick”, he says. “Our system adds another dimension, which is the intensity of the choice. That’s crucial for our musical system.”

Miranda says that, while increasing the complexity of the musical tasks is not a priority, music therapists have suggested it would be better if the system were more like a musical instrument – for instance, with an interface that looks like a piano keyboard. He admits that it’s not easy to increase the number of buttons or keys beyond four, but is confident that “we will get there eventually”.

“The flashing thing does not need to be on a computer screen”, he adds. It could, for example, be a physical electronic keyboard with LEDs on the keys. “You could play it by staring at the keys”, he says.

References
1. Miranda, E. R., Magee, W. L., Wilson, J., Eaton, J. & Palaniappan, R. Music and Medicine (published online), doi:10.1177/1943862111399290.
