How does the brain process and store words we hear?
US researchers studying the brain’s auditory lexicon find implications for stroke survivors and others with brain disorders.
Neuroscientists from Georgetown University Medical Centre, US, say the brain’s auditory lexicon, a catalogue of verbal language, is actually located in front of the primary auditory cortex, not behind it, a finding that upends a century-old understanding of this area of the brain. The new understanding, published in Neurobiology of Language, matters because it may affect recovery and rehabilitation following a brain injury such as a stroke.
The research team had previously shown the existence of a lexicon for written words at the base of the brain’s left hemisphere, in a region known as the Visual Word Form Area (VWFA), and subsequently determined that newly learned written words are added to the VWFA. The present study sought to test whether a similar lexicon exists for spoken words in the so-called Auditory Word Form Area (AWFA), located anterior to the primary auditory cortex.
“Since the early 1900s, scientists believed spoken word recognition took place behind the primary auditory cortex, but that model did not fit well with many observations from patients with speech recognition deficits, such as stroke patients,” said Dr Maximilian Riesenhuber, Professor in the Department of Neuroscience at Georgetown University Medical Centre and senior author of this study.
“Our discovery of an auditory lexicon more towards the front of the brain provides a new target area to help us understand speech comprehension deficits.”
In the study, led by Dr Srikanth Damera, 26 volunteers underwent three rounds of functional magnetic resonance imaging (fMRI) scans to examine their spoken word processing abilities. The study used a technique called fMRI rapid adaptation (fMRI-RA), which is more sensitive than conventional fMRI in assessing both the representation of auditory words and the learning of new words.
“In future studies, it will be interesting to investigate how interventions directed at the AWFA affect speech comprehension deficits in populations with different types of strokes or brain injury,” explained Riesenhuber. “We are also trying to understand how the written and spoken word systems interact. Beyond that, we are using the same techniques to look for auditory lexica in other parts of the brain, such as those responsible for speech production.”
Dr Josef Rauschecker, Professor in the Department of Neuroscience at Georgetown and co-author of the study, added that many aspects of how the brain processes words, either written or verbal, remain unexplored.
“We know that when we learn to speak, we rely on our auditory system to tell us whether the sound we’ve produced accurately represents our intended word,” he concluded.
“We use that feedback to refine future attempts to say the word. However, the brain’s process for this remains poorly understood – both for young children learning to speak for the first time and for older people learning a second language.”