Speaking and listening share a large part of brain infrastructure

Published: 16 August 2011

When it comes to speech, the human brain has two main tasks: to articulate it and to understand it.

For many years psychologists have debated whether these two functions use the same regions of the brain.

Now scientists at the University of Glasgow, Radboud University and the Max Planck Institute for Psycholinguistics have found the answer.

In a paper published in the journal Psychological Science, researchers showed that formulating speech and comprehending it do use the same regions of the brain – but areas that control the movement of our mouths play no role in understanding speech.

Dr Laura Menenti, of the Institute of Neuroscience and Psychology at the University of Glasgow and first author of the paper, said: “Until now, most studies of speech focused on comprehension.

“This was largely due to the fact that analysing brain activity using Functional Magnetic Resonance Imaging (fMRI) requires a subject to remain completely still, but talking means moving the head.

“The Donders Institute at Radboud University, where I conducted the study, has, however, developed technology that allows scientists to scan the brains of moving subjects and understand what happens when we form speech.”

The authors used fMRI to measure brain activity in people who were either listening to sentences or speaking sentences. The subjects were shown a picture of an action taking place and had to produce a sentence to describe it, for instance saying ‘The man feeds the woman.’

The researchers then were able to examine how the subjects’ brains produced the meaning, the correct words, and the grammatical structure for these sentences.

They found that the same areas were activated for these different linguistic speech tasks in people who were speaking and people who were listening.

However, when it came to actually producing and perceiving the sounds that constitute speech, the regions involved differed.

The research also sheds some light on what is happening in the brains of people who have suffered a stroke.

Dr Menenti said: “It often appears that people who can’t speak can still understand speech. But this study suggests that if speaking is impaired, so is comprehension.

“Instead, people may be relying on other cues to figure out what they hear, for example if they hear the words ‘apple’, ‘eats’ and ‘man’, even if their brain isn’t able to understand the syntax of the sentence they will still figure out that the man was eating the apple, not the other way around.”

The research team comprised Peter Hagoort, Laura Menenti, Sarah Gierhan and Katrien Segaert.


For more information contact Stuart Forsyth in the University of Glasgow Media Relations Office on 0141 330 4831 or email stuart.forsyth@glasgow.ac.uk or Laura Menenti on 0141 330 6165 or laura.menenti@glasgow.ac.uk
