Neurons in the brain that respond to the sound of singing have been identified in a new study. However, these neurons do not respond to any other type of music. Professor Samuel Norman-Haignere of the Del Monte Institute for Neuroscience, University of Rochester, is the first author on the paper explaining these findings, published earlier this year in Current Biology. According to Norman-Haignere’s statement, “The work provides evidence for relatively fine-grained segregation of function within the auditory cortex, in a way that aligns with an intuitive distinction within music.”
Near regions that are selective for speech and music is a specific area of the brain that is dedicated to singing. Electrocorticography, or ECoG, is a method of locating seizure-related activity in epilepsy patients who have electrodes implanted in their brains; it also measures the brain’s electrical activity far more precisely than noninvasive methods. “The increased precision allowed us to locate the song-responsive neuron subpopulation. We now have a bird’s eye view of how the human auditory cortex is organized, and our findings suggest that different neural populations respond selectively to different categories, such as speech, music, and singing.”
fMRI has previously been used to study participants’ brains while they listened to various types of speech and music. Norman-Haignere mapped the locations of song-selective neural populations by combining fMRI data from a previous study with data from the new ECoG study.
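The general idea of combining the two modalities can be illustrated with a toy regression: if fMRI yields response profiles for a few neural populations across a shared set of sounds, each ECoG electrode’s response can be expressed as a weighted mixture of those populations. This is a minimal sketch, not the study’s actual pipeline; all array shapes, names, and the simulated data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: an fMRI analysis yields component response profiles,
# i.e. how each candidate neural population responds to a shared sound set.
n_sounds, n_components, n_electrodes = 165, 4, 12
fmri_components = rng.normal(size=(n_sounds, n_components))

# Simulated ECoG electrodes: each one picks up a mixture of those
# populations, plus measurement noise.
true_weights = rng.random(size=(n_components, n_electrodes))
ecog = fmri_components @ true_weights + 0.1 * rng.normal(size=(n_sounds, n_electrodes))

# Least squares: express each electrode's response as a weighted sum of
# the fMRI-derived components, revealing which population dominates it.
weights, *_ = np.linalg.lstsq(fmri_components, ecog, rcond=None)
print(np.abs(weights - true_weights).max())  # recovery error is small
```

With enough shared stimuli, the regression recovers the mixing weights well despite the noise, which is the sense in which the coarser fMRI maps can help localize what the sparse electrodes are recording.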
As the study’s co-senior author Dr. Josh McDermott of the McGovern Institute for Brain Research and the Center for Brains, Minds and Machines (CBMM) puts it: “This methodological advance is a significant step forward. People have been doing ECoG for the past 10 or 15 years, but it has always been limited by the issue of sparse recordings. Sam is the first to combine the improved resolution of electrode recordings with fMRI data to better localize the overall responses.”
Knowing when and where neurons respond to different sounds in the auditory cortex is critical to developing an understanding of how the brain processes speech and music. Nature Human Behaviour has published a new method for measuring the timescale over which different brain regions integrate information. “We’re trying to figure out how long each type of neuron takes to process information. A 100-millisecond window suggests that neurons are looking at phonemes, not whole sentences,” said first author Norman-Haignere. “Prior to this research, we didn’t have a general-purpose method for estimating integration times in the brain.”
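One intuitive way to think about an integration window is that a unit’s response to a sound segment should stop depending on the surrounding context once the segment is longer than the window. The toy sketch below, a simplification with an invented averaging “neuron” rather than any method from the paper, shows this: two responses to the same segment in different contexts diverge early on and converge once the window contains only shared samples.

```python
import numpy as np

rng = np.random.default_rng(0)

def unit_response(signal, window=100):
    # Toy "neuron": averages its input over the last `window` samples.
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="valid")

# Embed the same 300-sample segment in two different random contexts.
segment = rng.normal(size=300)
ctx_a = np.concatenate([rng.normal(size=200), segment])
ctx_b = np.concatenate([rng.normal(size=200), segment])

resp_a = unit_response(ctx_a)[-len(segment):]
resp_b = unit_response(ctx_b)[-len(segment):]

# Early in the segment the 100-sample window still covers context, so the
# responses differ; past ~100 samples they depend only on the segment.
early_diff = np.abs(resp_a[:100] - resp_b[:100]).mean()
late_diff = np.abs(resp_a[150:] - resp_b[150:]).mean()
print(early_diff > late_diff)  # True: context dependence vanishes
```

The point at which the two responses converge estimates the window length, which is the logic behind asking whether a region works on phoneme-scale or sentence-scale stretches of sound.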
Researchers will be able to better map how the brain processes information if they can figure out when that processing occurs. “In order to create models that more accurately reflect what happens in the brain, we must first understand how information is coded in various brain regions. We’re getting closer and closer to figuring out how these representations are connected to perception with each step we take in this research.”