Scientists say they have discovered how brain rhythms are used to process music, a finding that shows how our perception of notes and melodies can help us better understand the auditory system.
The study, which appears in the journal Proceedings of the National Academy of Sciences this week, points to a new role for “cortical oscillations,” or rhythmic repetitions of nerve cell activity in the brain.
These oscillations are involved in detecting musical sequences, and musical training may enhance their function, according to the study.
“We’ve isolated the rhythms in the brain that match rhythms in music,” explains Keith Doelling, a doctoral student at New York University and the study’s lead author. “Our findings show that the presence of these rhythms enhances our perception of music and of pitch changes.”
The study found that musicians have stronger oscillatory mechanisms than non-musicians, a finding whose importance goes beyond the value of musical instruction, the authors said.
“What this shows is we can be trained, in effect, to make more efficient use of our auditory-detection systems,” said study co-author David Poeppel, a professor at the university. “Musicians, through their experience, are simply better at this type of processing.”
Previous research has shown that brain rhythms synchronize very precisely with speech, enabling us to parse a continuous stream of sound into syllables, words, and phrases, even though the signal we hear contains no spaces or punctuation.
But it hasn’t been clear what role such cortical brain rhythms, or oscillations, play in processing other types of natural and complex sounds, such as music.
The researchers conducted three experiments using a technique called magnetoencephalography, which allows measurements of the tiny magnetic fields generated by brain activity.
Participants in the study were asked to detect short pitch distortions in 13-second clips of classical piano music by Bach, Beethoven, and Brahms that varied in tempo, from half a note per second to eight notes per second.
For music that is faster than one note per second, both musicians and non-musicians showed cortical oscillations that synchronized with the note rate of the clips, the authors said. In other words, everyone effectively used these oscillations to process the sounds, although musicians’ brains synchronized more to the musical rhythms.
But only musicians showed oscillations that synchronized with unusually slow clips.
This difference, the researchers say, may suggest that non-musicians are less able to process the music as a continuous melody, hearing it instead as individual notes. Moreover, musicians detected pitch distortions much more accurately, an advantage mirrored in their cortical oscillations. Brain rhythms, they add, therefore seem to play a role in parsing and grouping sound streams into “chunks” that are then analyzed as speech or music.
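The core measurement idea, detecting a cortical oscillation locked to the note rate of a clip, can be illustrated with a minimal "frequency tagging" sketch: if brain activity synchronizes with notes arriving at a fixed rate, its power spectrum should show a peak at that rate. This is only an illustration under assumed parameters, not the study's actual MEG analysis pipeline; the sampling rate, note rate, and noise level below are hypothetical.

```python
import numpy as np

fs = 250.0           # sampling rate in Hz (hypothetical MEG-like recording)
duration = 13.0      # seconds, matching the 13-second clips in the study
note_rate = 4.0      # notes per second (within the 0.5-8 range tested)

t = np.arange(0, duration, 1.0 / fs)
rng = np.random.default_rng(0)

# Simulated recording: an oscillation locked to the note rate, buried in noise.
signal = 0.5 * np.sin(2 * np.pi * note_rate * t) + rng.normal(0.0, 1.0, t.size)

# Power spectrum via the real-input FFT; if the activity is synchronized,
# the bin nearest the note rate should dominate its neighborhood.
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

# Compare power at the note-rate bin against the average of nearby bins,
# skipping the bins immediately adjacent to the peak.
peak_bin = int(np.argmin(np.abs(freqs - note_rate)))
neighbors = np.r_[spectrum[peak_bin - 10:peak_bin - 2],
                  spectrum[peak_bin + 3:peak_bin + 11]]
snr = spectrum[peak_bin] / neighbors.mean()
print(f"Power at {freqs[peak_bin]:.2f} Hz is {snr:.1f}x its neighbors")
```

In this toy setup the synchronized component stands out clearly above the noise floor; in real MEG data the same logic is applied per participant and per tempo, which is how the authors could compare synchronization between musicians and non-musicians across note rates.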