Brain Control, EyeHarp and Instruments in the Phonos Series
Phonos — Cultura i Tecnologia Musical
Phonos Concert Series 2011–12
3 July 2012
Pompeu Fabra University (Barcelona, Spain)
Phonos, through its various activities, is tied to two research groups at Pompeu Fabra University in Barcelona: the Music Technology Group (MTG) and the Synthetic Perceptive, Emotive and Cognitive Systems (SPECS) group. During the European Future Technologies Conference (FET) 2009 “Science beyond Fiction” in Prague, we presented the “Multimodal Brain Orchestra: Art through Technology”. This integrated artistic performance demonstrated concepts of interactive, affect-based, self-generated media content, along with new ways of media-based, non-verbal communication between active and passive groups of users through direct interfaces to the brain and the body. Users were able to see how their emotional experience could be modulated by auditory or visual media.
In the 3 July Phonos concert in Barcelona, the Music Technology Group (MTG) at Pompeu Fabra University demonstrated new approaches to music performance via brain control and the EyeHarp. The concert was initially designed around the EyeHarp, a musical instrument guided by the musician’s gaze using eye-tracking technologies, with a potential expressiveness similar to that of traditional musical instruments. The EyeHarp was developed by Zacharias Vamvakousis, currently preparing his PhD at Pompeu Fabra University, who was also the performer/composer for some pieces in the concert.
The EyeHarp consists of a self-built, low-cost eye-tracking device that communicates with an intuitive musical interface. The system allows performers and composers to produce music by controlling sound settings and musical events using eye movement. The main objective of the EyeHarp is to allow people with motor disabilities to play music using only their eyes. The EyeHarp offers several layers: one is used for building the rhythmic and harmonic musical background, and another for playing melodies on top of that background. In this way, the performer is able to control the rhythmic, harmonic and melodic components of his/her composition in real time, as well as the timbre of the instrument. The instrument’s timbre is determined by giving the performer control over the spectral envelope and the attack-decay time of the sound produced. In addition, the performer has control over the articulation and other temporal aspects of sound such as glissando and vibrato. [1. For more information about the EyeHarp, visit Zacharias Vamvakousis’ blog.]
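The timbral controls described above can be illustrated with a minimal additive-synthesis sketch. This is not the EyeHarp’s actual implementation; the function name, parameters and the linear attack-decay envelope are assumptions made for illustration. The `spectral_env` list stands in for the spectral envelope the performer shapes (relative amplitudes of the harmonics), and `attack`/`decay` stand in for the attack-decay time control.

```python
import numpy as np

SR = 44100  # sample rate in Hz (an assumed value)

def synthesize_note(freq, duration, spectral_env, attack, decay):
    """Render one note by additive synthesis (illustrative sketch).

    spectral_env -- relative amplitudes of the harmonics, standing in
                    for the performer-controlled spectral envelope.
    attack/decay -- seconds; a simple linear attack-decay amplitude
                    envelope stands in for the attack-decay control.
    """
    t = np.linspace(0, duration, int(SR * duration), endpoint=False)
    # Sum the harmonics, each weighted by the spectral envelope.
    tone = sum(a * np.sin(2 * np.pi * (k + 1) * freq * t)
               for k, a in enumerate(spectral_env))
    tone /= max(sum(spectral_env), 1e-9)  # keep output within [-1, 1]
    # Linear attack ramp up, linear decay ramp down, full level between.
    env = np.minimum(1.0, t / max(attack, 1e-9))
    env = np.minimum(env, np.maximum(0.0, (duration - t) / max(decay, 1e-9)))
    return tone * env
```

Changing the weights in `spectral_env` changes the timbre while the pitch and envelope stay fixed, which is the separation of controls the paragraph describes.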
In addition to the pieces performed on the EyeHarp, the concert included several pieces involving brain control. With Nostalgia, Carlos Vaquero demonstrated the transformation of his flute sound through brain control acting on a Max/MSP patch; soft pastel colours in the accompanying images created a nostalgic atmosphere. Sueños [Dreams] transformed live guitar sounds produced by composer/guitarist Rafael Ramírez via the brain control system. Cascabel, on the other hand, was an interactive fight between two guitar players using brain control to force each other towards a kind of quietude.
Most of the pieces in the concert involving brain control used brain-computer interfaces (BCIs) as sensors to monitor the emotive state of the musicians, then used this information to transform the music performance. An algorithm developed by Rafael Ramírez in the MTG was used to detect emotion from electroencephalogram (EEG) signals. The signals were filtered and processed to extract arousal and valence values in real time. In addition, machine learning techniques were applied to classify emotional states into high/low arousal and positive/negative valence. Once the instantaneous emotional state of the musician was determined, this information was used to transform the output of the music performance in real time.
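The pipeline described above can be sketched in a few lines. This is not Ramírez’s algorithm; it uses a common heuristic from the affective-computing literature, assumed here for illustration: arousal estimated from the beta/alpha band-power ratio, and valence from frontal alpha asymmetry between a right (F4) and left (F3) electrode. All names, thresholds and the sampling rate are assumptions.

```python
import numpy as np

FS = 128  # assumed EEG sampling rate in Hz

def band_power(signal, fs, low, high):
    """Mean spectral power of `signal` inside the [low, high] Hz band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def arousal_valence(f3, f4, fs=FS):
    """Estimate (arousal, valence) from two frontal EEG channels.

    Heuristic (an assumption, not the MTG algorithm): arousal as the
    beta/alpha power ratio; valence as frontal alpha asymmetry.
    """
    alpha = lambda x: band_power(x, fs, 8, 12)
    beta = lambda x: band_power(x, fs, 13, 30)
    arousal = (beta(f3) + beta(f4)) / (alpha(f3) + alpha(f4) + 1e-12)
    valence = np.log(alpha(f4) + 1e-12) - np.log(alpha(f3) + 1e-12)
    return arousal, valence

def classify(arousal, valence, a_thresh=1.0, v_thresh=0.0):
    """Quantize into high/low arousal and positive/negative valence."""
    return ("high" if arousal > a_thresh else "low",
            "positive" if valence > v_thresh else "negative")
```

In a real-time setting these functions would run on short sliding windows of the EEG stream, and the resulting quadrant (for instance, high arousal with negative valence) would drive the transformation of the live sound.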
Rafael Ramírez, Zacharias Vamvakousis and their research team in the MTG are currently investigating new ways of detecting emotions from brain activity in real time and using brain-computer interfaces as sensors for enhancing music performance.