Brains at the Neuraltones launch performance at Carnegie Hall

I had the opportunity to be part of a team launching a new non-profit organization, the Neuraltones Foundation, through a unique “music meets brain” performance at Carnegie Hall on 6 October 2024. The performance concept, “Boundless Horizon,” came from the amazing Chenyi (Tina) Avsharian, who is the chair of the foundation, COO of Shar Music, and the world-renowned gold-medal violinist who performed the concert. New York Classical Review wrote a wonderful piece on the evening.

The Neuraltones Foundation’s mission is to support access to early-childhood music education for underserved populations; the foundation has proposed a three-year functional brain study tracking the developmental trajectory of students aged five to nine to demonstrate the efficacy of its program.

Here is a clip from sound check.

I initially flew to Ann Arbor, Michigan, to record the data during a mock performance comprising eight solo pieces and three parts of a violin duet, plus two audience members who were restricted to either only hearing or only seeing the performance. Tina wanted to display animated playbacks of this data on stage, as temporally locked as possible to her live performance. There were only a few weeks between data collection and the deadline for finalizing the content in the multimedia presentation that would play on the screen behind her.

I did not have any commercial software that would contribute more than deartifacting to the raw data, so I needed to analyze and visualize the cleaned data using MNE, a Python library for working with EEG data.
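
As a minimal sketch of the starting point, assuming the recordings were exported in a format MNE can read (the filename and montage here are hypothetical, not the actual setup):

```python
import mne

# Hypothetical filename; the actual export format from the recording
# system is an assumption (MNE also reads BDF, BrainVision, FIF, etc.).
raw = mne.io.read_raw_edf("mock_performance_solo01.edf", preload=True)

# Attach standard 10-20 electrode positions for later topographic plotting.
raw.set_montage("standard_1020", on_missing="warn")
print(raw.info)
```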

As you might imagine, the raw data was really noisy with movement, EMG, and EOG artifacts. However, I couldn’t simply exclude those sections from the analysis because we needed a recording the same length as the live performance. I removed EOG with ICA, since lateral eye movements from reading a score would contaminate many bands; and because I did not anticipate working with coherence or phase, I was not worried about ICA distorting those measures. I wasn’t interested in delta for this application, so I applied a high-pass filter at 5 Hz, which got rid of a lot of the movement artifact. I also low-passed at 20 Hz, knowing that EMG would persist below that cutoff.
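
A sketch of that cleaning pipeline in MNE might look like the following; the component count and the use of a frontal channel as an EOG proxy are my assumptions, not the original settings:

```python
import mne
from mne.preprocessing import ICA

raw = mne.io.read_raw_edf("mock_performance_solo01.edf", preload=True)
raw.set_montage("standard_1020", on_missing="warn")

# Band-pass 5-20 Hz: drops delta (and much of the movement artifact)
# while accepting that some EMG will persist below the 20 Hz cutoff.
raw.filter(l_freq=5.0, h_freq=20.0)

# ICA to remove ocular activity, e.g. lateral eye movements from score reading.
ica = ICA(n_components=15, random_state=97)
ica.fit(raw)

# With no dedicated EOG channel assumed, a frontal electrode serves as a proxy.
eog_indices, eog_scores = ica.find_bads_eog(raw, ch_name="Fp1")
ica.exclude = eog_indices
raw_clean = ica.apply(raw.copy())
```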

The next issue was how to represent the salient activity in a way that a lay audience could intuit. I decided to convert the time series at each electrode position into a spectrogram, which would give me power across frequency bands over time. I had collected a baseline recording, eyes open (EO) while sitting quietly, to determine each participant’s peak alpha frequency (PAF). I then established baseline power at each electrode location using a frequency band of ±2 Hz around their PAF. (So, for example, if someone had a PAF of 11.5 Hz, I was interested in power from 9.5 to 13.5 Hz.)
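
In code, finding the PAF and the baseline band power might look like this sketch; the alpha search range of 7-14 Hz and the filename are my assumptions:

```python
import numpy as np
import mne

# Eyes-open resting baseline; filename is hypothetical.
baseline = mne.io.read_raw_edf("baseline_eyes_open.edf", preload=True)

# PSD over a broad alpha search range (7-14 Hz is an assumption).
spectrum = baseline.compute_psd(fmin=7.0, fmax=14.0)
psds, freqs = spectrum.get_data(return_freqs=True)  # (n_channels, n_freqs)

# Peak alpha frequency: the bin with maximal mean power across channels.
paf = freqs[np.argmax(psds.mean(axis=0))]
lo, hi = paf - 2.0, paf + 2.0  # e.g. PAF of 11.5 Hz -> 9.5-13.5 Hz

# Baseline power per electrode: total power in the PAF-centered band.
band = (freqs >= lo) & (freqs <= hi)
baseline_power = psds[:, band].sum(axis=1)  # (n_channels,)
```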

I then summed the PAF-centered bins under task (playing music) to produce a new time series, and subtracted off the baseline power so that I would have power relative to rest. I also flipped the sign, because I wanted less power in the band around the PAF to mean greater activation. This rests on the premise that reduced alpha reflects greater desynchronization of the local cortical populations engaged in the task.
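
A sketch of that transformation, assuming the cleaned task data is a (channels × samples) array; the spectrogram window length is my choice, and the baseline power would need to be computed in the same units as this spectrogram:

```python
import numpy as np
from scipy.signal import spectrogram

def activation_series(task_data, sfreq, lo, hi, baseline_power):
    """Per-channel 'activation' over time in the PAF-centered band.

    task_data: (n_channels, n_samples); baseline_power: (n_channels,).
    Window length and overlap are assumptions, not the original settings.
    """
    freqs, times, sxx = spectrogram(
        task_data, fs=sfreq, nperseg=int(sfreq), noverlap=int(sfreq // 2)
    )
    band = (freqs >= lo) & (freqs <= hi)
    band_power = sxx[:, band, :].sum(axis=1)  # (n_channels, n_times)

    # Subtract baseline, then flip the sign so that *less* PAF-band power
    # (alpha desynchronization) reads as *greater* activation.
    return -(band_power - baseline_power[:, None]), times
```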

Once a meaningful data transformation was complete, creating the animated topomaps was straightforward: construct the EvokedArray data structure expected by the visualization method, assign the transformed data to each electrode position, export each frame, and then interpolate for smooth playback as a 24 fps AVI file, which could easily be incorporated into the final presentation.
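
A sketch of the frame generation, assuming `activation` is the transformed (channels × frames) array and `raw_clean` the cleaned recording from the earlier sketches; treating each frame as a “time point” of an EvokedArray is the trick, and the frame rate and ffmpeg step are my assumptions:

```python
import os
import mne
import matplotlib.pyplot as plt

# Describe the frame series: channel names from the cleaned recording,
# with the playback frame rate standing in for a sampling frequency.
info = mne.create_info(ch_names=raw_clean.ch_names, sfreq=24.0, ch_types="eeg")
evoked = mne.EvokedArray(activation, info, tmin=0.0)
evoked.set_montage("standard_1020", on_missing="warn")

# Render one topomap per frame; interpolation to smooth video happens downstream.
os.makedirs("frames", exist_ok=True)
for i, t in enumerate(evoked.times):
    fig = evoked.plot_topomap(times=t, show=False, colorbar=False)
    fig.savefig(f"frames/frame_{i:05d}.png")
    plt.close(fig)

# e.g. stitch with: ffmpeg -framerate 24 -i frames/frame_%05d.png out.avi
```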

For the duets, I also produced a second animation overlaying the two players’ time series from a single electrode location, with yellow shading marking the moments when both players’ peaks and nadirs aligned within a small window, as a measure of synchrony.
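
A sketch of that overlay with matplotlib; the 0.25 s alignment window and the peak-finding approach are my assumptions for “a small window”:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import find_peaks

def plot_duet_synchrony(sig_a, sig_b, times, window=0.25):
    """Overlay two players' series from one electrode; shade aligned extrema."""
    fig, ax = plt.subplots()
    ax.plot(times, sig_a, label="Player A")
    ax.plot(times, sig_b, label="Player B")

    # Shade where both players' peaks (and, sign-flipped, nadirs)
    # fall within `window` seconds of each other.
    for sign in (1.0, -1.0):
        peaks_a, _ = find_peaks(sign * sig_a)
        peaks_b, _ = find_peaks(sign * sig_b)
        for i in peaks_a:
            if np.any(np.abs(times[peaks_b] - times[i]) <= window):
                ax.axvspan(times[i] - window, times[i] + window,
                           color="yellow", alpha=0.3)
    ax.legend()
    return fig
```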

Given more time, I might have generated a 3D brain instead of a flat map. And we can certainly argue whether there are better metrics. But given the time constraints, this got the point across to the audience.

Particularly interesting was the degree to which synchrony could be observed in the topomaps during the duets.

I enjoyed the challenge and opportunity of this project. The team at Neuraltones/Shar Music was fantastic to work with. And, of course, it was cool to hang out with them during sound check, backstage, and at the after party.