McGill University

News

McGill researchers present major findings at US neurosciences conference

Published: 28 October 1997

Topic: Organization for language in the absence of sound: A PET study of deaf signers processing signed languages and hearing controls processing speech

A new study shows that rather than being born exclusively to speak, the human brain may be specialized to make sense of specific patterns that are found in all human language, be it language on the tongue or language on the hands. The finding challenges the accepted notion that speech is essential to language.

"This specialization for language-specific patterns in the brain is what gives humans our universal capacity to acquire language across widely varying home settings," says the studyÂ’s first author Laura Ann Petitto of Â鶹ÇøÂ’s Psychology Department and the Montreal Neurological Institute (McDonnell-Pew Centre for Cognitive Neuroscience, McConnell Brain Imaging Centre). "Yet at the same time, this specialization is highly modifiable, as it can use multiple pathways in brain development depending upon environmental input."

This surprising result comes from Positron Emission Tomography (PET) studies of blood flow in the brains of 11 profoundly deaf people using two autonomous signed languages (five using American Sign Language, six using Langue des Signes Québécoise) and 10 hearing people using spoken language. It is well known that those who use speech have increased cerebral blood flow at discrete places in the left side of the brain. The researchers asked whether these places are specialized for making sense of speech and sound per se, or are more general neural pathways tuned to specific types of patterns encoded in natural language.

Petitto, co-investigator Robert Zatorre and their colleagues Kristine Gauna, Deanna Dostie and Jim Nikelski discovered that both deaf and hearing people had strikingly similar patterns of blood flow in the left side of their brains, even though signed languages have evolved on the hands in the absence of speech and sound. Deaf people's brains showed increased blood flow in areas that until now have been regarded as being exclusive to speech.

But there were also tantalizing differences between deaf and hearing brains. While deaf people's brains had increased blood flow in the traditional speech areas, in some cases they also had increased blood flow that reached toward the primary visual areas of the brain. This demonstrates that the brain has an impressive ability to recruit and redeploy its different parts to accommodate the specific nature of the input. In this case, a visual language recruited parts of the visual system for linguistic purposes, Petitto says.

"The findings are controversial because it shows that speech and language are not one and the same thing," Petitto says. "The research also provides powerful insights into the brainÂ’s neurological plasticity during early cognitive development, especially regarding language. Extreme differences in early sensory experience can still result in entirely normal neurological organization in the brain, due to the brainÂ’s profound capacity to reorganize itself during early life."

In four experimental conditions, subjects saw videotapes of 1) a baseline visual fixation point, 2) meaningless finger movements and 3) meaningful signs; in condition 4) they saw signed nouns and generated appropriate signed verbs. Hearing controls received the same conditions, except that in the last condition they saw a printed word and generated a spoken verb.

As expected, comparisons of conditions 3 and 1 showed similar visual cortical activation in deaf and hearing subjects. However, left temporal-occipital activation was observed in the deaf but not the hearing subjects. This indicated that, for the deaf subjects, the hand movements were perceived as being truly linguistic and consequently were processed in the same brain areas as spoken words. Understandably, this was not so for the hearing subjects, who did not know sign language and hence perceived the signs only as meaningless visual stimuli.

Importantly, comparisons of conditions 4 and 1 demonstrated left inferior prefrontal cortex activation in both deaf and hearing subjects. "This finding is especially exciting. This region is believed to be where the brain searches and retrieves information from semantic memory. Surprisingly, we found that the identical brain regions were also used with sign languages," says Zatorre.

By contrast, hearing controls showed no activation of these areas when they viewed real signs in condition 3 as compared with moving fingers in condition 2, confirming that they saw these stimuli only as non-linguistic visual movements. Surprisingly, further comparisons of conditions 4 and 1 revealed that the deaf subjects showed activation in the superior temporal gyrus, the traditional auditory processing area, suggesting that this area may have been reorganized to accommodate visual processing -- even in a language that does not use sound.
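The logic of these comparisons is the classic PET "subtraction" method: the blood-flow image acquired in a control condition is subtracted, voxel by voxel, from the image acquired in an active condition, and the subjects' difference images are tested against zero. As a rough illustration only -- the data, array shapes, and threshold below are invented placeholders, not the study's actual analysis pipeline -- a minimal sketch in Python might look like this:

    # Illustrative sketch of a PET subtraction ("contrast") analysis.
    # All values here are simulated placeholders, not the study's data.
    import numpy as np

    N_SUBJECTS, N_VOXELS = 11, 10_000  # e.g., 11 deaf signers; voxel count is arbitrary

    # Simulated normalized blood-flow images, one per subject per condition:
    # 1) fixation baseline, 2) meaningless finger movements,
    # 3) meaningful signs, 4) verb generation from signed nouns.
    rng = np.random.default_rng(0)
    conditions = {c: rng.normal(size=(N_SUBJECTS, N_VOXELS)) for c in (1, 2, 3, 4)}

    def contrast(active: int, control: int) -> np.ndarray:
        """Voxel-wise paired t-statistics for an active-minus-control subtraction."""
        diff = conditions[active] - conditions[control]  # per-subject difference images
        return diff.mean(axis=0) / (diff.std(axis=0, ddof=1) / np.sqrt(N_SUBJECTS))

    # The three comparisons described above: 3 vs 1, 4 vs 1, and 3 vs 2.
    for active, control in [(3, 1), (4, 1), (3, 2)]:
        t_map = contrast(active, control)
        n_sig = int((t_map > 3.5).sum())  # arbitrary illustrative threshold
        print(f"condition {active} - condition {control}: {n_sig} voxels above threshold")

In the real study, voxels whose t-statistics exceed a significance threshold in, say, the 4-minus-1 contrast are the candidate "activation sites" reported for each group.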

"The discovery of common activation sites in signed and spoken language suggests that these sites are dedicated to processing specific patterns unique to natural language, rather than to speech or sound per se," says Petitto. "At the same time, this pattern sensitivity appears to be modifiable and can reorganize itself using multiple pathways as a result of the modality of the language in the infantÂ’s environment."

Petitto expects the new findings will contribute to our understanding of the brain's impressive plasticity in early life. They will also be useful in the neuropsychological assessment of deaf patients, and they can serve as the basis for designing educational programs for deaf children and public policy for delivering social services to deaf individuals.

The next step in this research is to further explore exactly how the human brain is "wired" for one of the most important components of what it means to be human: language. The researchers will try to identify the precise components of language in the environment, common to both sign and speech, that the brain picks up and responds to. They will also try to determine how early this exposure must occur for the brain to achieve functionally normal wiring.

Funding: This research was made possible by the generous support of the Natural Sciences and Engineering Research Council of Canada, the Medical Research Council of Canada, and the McDonnell-Pew Centre for Cognitive Neuroscience, Brain Imaging Centre of the Montreal Neurological Institute.

About signed languages

Signed languages are naturally evolved, non-invented languages. There is serious scholarly speculation that they have existed since the onset of spoken languages. Like spoken languages, signed languages are not universal. Many wholly autonomous natural signed languages are used within distinct Deaf cultures around the world. Signed languages are not the signed counterpart of the majority spoken language. For example, the signed language used in Quebec among culturally French Deaf people, called "Langue des signes Québécoise" (or LSQ), and the signed language used by Deaf people in France (FSL) constitute distinct languages, and two Deaf people from these respective French cultures would need an interpreter to communicate. The same is true of the signed languages used by culturally English Deaf people in Canada and the United States, called American Sign Language (or ASL), and by Deaf people in Britain and elsewhere around the world.

Over 40 years of intensive research by linguists and psychologists has demonstrated that natural signed languages possess the identical levels of language organization found in spoken language: specifically, the phonological (or sub-lexical), morphological, and syntactic levels of language organization. Signed languages also convey the same full semantic and grammatical expressive range as any spoken language, and they possess similar general discourse (or conversational) rules -- all of which were previously thought to be exclusive to spoken languages. In educational contexts only, there are hand codes that have been invented to mirror the structure of a specific spoken language (for example, "Signed English" or "Signed French"). Unlike naturally evolved signed languages, these are indeed invented codes. Like Morse code, they are not natural languages -- and there are no communities of Deaf people anywhere in the world that use these invented codes outside of restricted pedagogic contexts.

About the first author, Dr. Laura Ann Petitto

Dr. Petitto is a cognitive neuroscientist at McGill University, where she is Professor of Cognitive Psychology in the Department of Psychology and a Research Scientist in the McDonnell-Pew Centre for Cognitive Neuroscience, Brain Imaging Centre, of the Montreal Neurological Institute. Dr. Petitto is internationally renowned for her work on the biological bases of language, especially during early language acquisition. She is also renowned for her discoveries concerning how young deaf and hearing children acquire natural signed languages from their deaf parents. Her research on the biological bases of language spans over 20 years, beginning with her role on the well-known Columbia University research project called "Project Nim Chimpsky." As the project's "Primary Teacher" and "Project Research Coordinator," she lived with the chimpanzee and attempted to teach him American Sign Language in an environment simulating that of a human child. Most recently, she discovered that profoundly deaf babies exposed to signed languages from birth babble with their hands; indeed, manual babbling possesses the same syllable structure, and occurs on the same developmental timetable, as hearing babies' vocal babbling (see Science, March 21, 1991). Dr. Petitto received her Master's and Doctoral degrees from Harvard University in 1981 and 1984, respectively, and has since built a vibrant laboratory in Cognitive Neuroscience at McGill, which she directs.
