We recently had the opportunity to show off some of our latest brain-computer music interface (BCMI) work, and to hear from some leading minds in the biosensing space, at an event hosted by Dolby and Emotiv and presented by the Manhattan Producers' Alliance at the Center for New Music in San Francisco.
Teaming up with the pioneering Del Sol String Quartet, Intonic demonstrated a real-time brain-computer interface network: InteraXon Muse headsets captured the musicians' EEG, a sophisticated software layer called NeuroPype processed the signals, and the resulting measure of the quartet's collective level of meditation controlled the selection of segments of a modular composition.
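To make the control flow concrete, here is a minimal sketch of how a derived "collective meditation" signal might drive segment selection. This is purely illustrative: the function name, score range, and thresholds are our assumptions, not the actual NeuroPype pipeline or the logic used in the performance.

```python
# Hypothetical illustration: average per-performer "meditation" scores
# (assumed normalized to 0-1) and map the collective level to one of
# three segments of a modular composition. Thresholds are invented.

def select_segment(meditation_scores, thresholds=(0.33, 0.66)):
    """Map the quartet's mean meditation score (0-1) to a segment index."""
    collective = sum(meditation_scores) / len(meditation_scores)
    low, high = thresholds
    if collective < low:
        return 0  # low collective calm -> segment 0
    elif collective < high:
        return 1  # moderate -> segment 1
    return 2      # deeply meditative -> segment 2

print(select_segment([0.8, 0.7, 0.9, 0.75]))  # high collective calm
```

In a real pipeline the scores would arrive as a continuous stream and the selection would likely be smoothed over time to avoid rapid segment flipping; the sketch only shows the mapping step.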
Watch the performance here:
We also had the pleasure of hearing Dolby's Alex Brandemeyer speak about the neuroscience of audition and how the company is researching ways to incorporate its findings into its products.
You can watch Dr Brandemeyer's talk here:
Then we were treated to a live experiment from Dr Jason Carl Rosenberg, a composer and music cognition researcher based in San Francisco, who explored with the audience the idea of a 'melodic cloze': our ability to accurately predict the end of a musical phrase. Emotiv fitted three audience members with EEG headsets to monitor their brainwaves as they performed the task. There were, indeed, a few stress spikes as they were asked to sing the correct endings of the phrases!
Watch the demonstration here:
The evening's events all centered on a common theme: what is going on 'under the hood' when we perform or listen to music? How can we make sense of the decisions we make when engaging with new and unexpected musical experiences? Technology cannot yet directly reflect these experiences, but we are beginning to see a fuzzy outline of what our brains are doing and how they mirror the outside world in fascinating ways.