LiveScience
Christoph Schwaiger

Listen to Pink Floyd's 'Another Brick in the Wall,' as decoded from human brain waves

(Image: Pink Floyd bass player Roger Waters raising his arms and smiling on stage at a 2012 concert)

(Audio credit: Bellier et al., 2023, PLOS Biology, CC-BY 4.0 (https://creativecommons.org/licenses/by/4.0/))

By recording and decoding people's brain activity as they listened to Pink Floyd's "Another Brick in the Wall," scientists recreated recognizable snippets of the 1979 protest song. 

In some clips, you can make out a Roger Waters-like voice crooning the well-known chorus — but in others, the anthem sounds much muddier. Still, the researchers say that the work adds "another brick in the wall" of our understanding of how the brain processes music and could have future applications in brain-computer interfaces (BCIs) that help people communicate. 

Previous studies had reconstructed intelligible speech by decoding it from brain activity, and research has shown that music can be reconstructed using similar techniques. The functional overlap of the brain structures involved in processing these two types of complex acoustic signals makes this possible.


(Audio credit: Bellier et al., 2023, PLOS Biology, CC-BY 4.0 (https://creativecommons.org/licenses/by/4.0/))

In the new study, published Tuesday (Aug. 15) in the journal PLOS Biology, researchers wanted to better understand how humans process music, with an eye toward developing BCIs. Such devices could help people who can mentally form words but can't physically speak, like those with locked-in syndrome, to communicate. 

BCIs incorporate models that translate brain activity into words, but lack models that capture musical elements, like pitch, melody, harmony and rhythm. Such models could help users better convey the emotion behind their words, senior author Robert Knight, a professor of psychology and neuroscience at the University of California, Berkeley, told Live Science. For example, they may be able to turn a robotic-sounding "I love you" into a declaration with a more human ring to it, he said.

The team analyzed the brain activity of 29 people who listened to Pink Floyd's "Another Brick in the Wall, Part 1." Each participant had epilepsy and had undergone a procedure called intracranial electroencephalography (iEEG), during which 2,668 electrodes were placed on their cortex, the wrinkly surface of the brain. 

Of those, 347 electrodes were most relevant for processing music. Rhythm perception was tied to a specific portion of the superior temporal gyrus (STG), a part of the brain known to be key for auditory processing. Most of the other key electrodes were in the sensorimotor cortex, which processes and responds to sensory information, and the inferior frontal gyrus, which is linked to language comprehension and production. 

The participants were asked to listen to the music without focusing on any details. The electrodes then picked up the electrical activity of their neurons, capturing how different musical elements were encoded in different brain regions. The scientists decoded that data into the song's acoustics using regression-based models, which reveal the relationship between different variables and an anticipated outcome. These models spit out a spectrogram, a visual representation of sound frequencies through time, which the team reconstructed as an audio file. 
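
To illustrate the general idea, here is a minimal, hypothetical sketch of regression-based spectrogram decoding. It is not the authors' pipeline: it uses scikit-learn's ridge regression and librosa in Python, simulates the electrode recordings with random numbers, and inverts the predicted spectrogram to audio with the Griffin-Lim algorithm rather than the study's own reconstruction method; file names and parameters are placeholders.

```python
# Hypothetical sketch of regression-based spectrogram decoding (not the study's code).
import numpy as np
import librosa
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

sr, hop = 16000, 512                                 # assumed sample rate and STFT hop
audio, _ = librosa.load("song.wav", sr=sr)           # the stimulus waveform (placeholder file)
spec = np.abs(librosa.stft(audio, hop_length=hop))   # target: magnitude spectrogram (freq x time)

# Simulated stand-in for neural features (e.g. per-electrode power),
# resampled to one value per spectrogram frame: shape (n_electrodes, n_frames).
n_frames = spec.shape[1]
neural_data = np.random.randn(347, n_frames)

# Give each frame a short window of preceding neural activity by stacking lagged copies.
lags = 10
X = np.concatenate([np.roll(neural_data, k, axis=1) for k in range(lags)], axis=0).T
y = spec.T                                           # (n_frames, n_freq_bins)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, shuffle=False)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)             # linear regression with an L2 penalty

pred_spec = np.clip(model.predict(X_te), 0.0, None).T     # predicted magnitude spectrogram
waveform = librosa.griffinlim(pred_spec, hop_length=hop)  # rough audio reconstruction
```

In this toy version the decoded audio is meaningless because the "brain activity" is random noise; with real recordings, the same kind of linear mapping from lagged neural features to spectrogram bins is what lets a predicted spectrogram be rendered back into sound.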

The result: a melody that resembled — but was not identical to — the original one played to the participants.      

"There are certain segments of the song that you can clearly tell that the reconstruction is 'Another Brick in the Wall,'" Knight said. "There's certain segments you really can't …  It's too muddy."

"We're not trying to say we produced high-fidelity Pink Floyd," he added, but they did manage "to get a highly reliable spectrogram" from relatively few, well-placed electrodes.

Knight thinks the reconstructed song's quality would improve with higher-density electrodes; the ones the team used were spaced around 5 millimeters apart. "But we know that the cortex actually has independent information at one to one and a half millimeters," Knight said. Another limitation was that the researchers didn't probe participants' familiarity with the Pink Floyd song or their general music knowledge.

Pietro Avanzini, a neuroscience researcher at Italy's National Research Council who was not involved in the study, called the work fascinating, as it reveals which parts of a person's neural machinery process different musical features. Moreover, it highlights differences in how each person's brain reacts to the same stimulus, "giving value (and potentially a neural basis) to the variability of our perceptual experience," he said.

Was there a reason the scientists chose to study "Another Brick in the Wall," in particular?

"I think we all like Pink Floyd," Knight said.
