The evolution of human language has long been one of the most intriguing challenges in psychological science. But an age-old and largely dismissed theory that language evolved from gestures is gaining scientific credence through several converging lines of research, including exciting findings from brain imaging studies.

Some researchers are optimistic that the new evidence could revolutionize our understanding of the origins of human language and how we view the evolution of cognitive skills in general.

"A lovely picture is beginning to emerge," says McGill University psychologist Laura Ann Petitto, PhD, who studies language acquisition in hearing and deaf children. "In time, you might see something tantamount to a paradigm shift in evolutionary circles."

The picture Petitto and others see is one of a brain that evolved to process certain signals--be they acoustic, as with speech, or visual, as with signed languages--in terms of a grammatical structure. The precursor to this ability, the theory states, lies in our ancestors' ability to process general hand movements, such as grasping or picking, that themselves have a simple grammatical structure. That is, they contain an agent, an action and an object, as when a monkey (self as agent) grabs (action) a piece of food (object).

Such a theory suggests that the ability to process language grew in our distant evolutionary ancestors out of less complex abilities still seen in modern-day nonhuman primates.

"This kind of theory is much more satisfying" than the more traditional idea that language developed specially in humans, says Michael Arbib, PhD, director of the University of Southern California Brain Project. "You don't have the grammar built in. Rather you have a normal set of abilities, evolved from more basic abilities in our prelinguistic ancestors, that allow us to learn grammar."

Linking gesture and speech

Key to the connection between hand movements and language is the argument that our ancestors developed a mechanism for observing another's actions and comprehending at an abstract level the meaning of those actions.

Some of the most compelling data linking the hands and language come from combining monkey neurophysiology with human brain imaging studies. In particular, Giacomo Rizzolatti, PhD, and his colleagues at the Institute of Human Physiology of the University of Parma used cell recordings to find what they call "mirror neurons" in the F5 area of the monkey brain. This area sends signals to the monkey's motor cortex, which coordinates movements. The mirror neurons fire when monkeys grasp objects with their hands and when they merely watch researchers make their own meaningful hand movements, such as placing objects on a table, grasping food or manipulating objects.

This finding provides a connection between "doing" and "communicating about doing," says Arbib, who has worked with Rizzolatti to formulate a hypothesis about how language may have evolved from hand movements. The mirror neurons appear to represent a system that matches action to observation of action, enabling monkeys to guide their own actions and to comprehend the actions of other monkeys.

Making the finding even more interesting is research that links the monkey's F5 area to Broca's area, a part of the human brain associated with language. With that in mind, Rizzolatti, along with Daniela Perani, MD, and her colleagues at the Institute of Neuroscience and Bioimaging-CNR and the University of Milan, conducted a brain imaging study in humans inspired by the neurophysiological findings in monkeys.

They found activation in Broca's area when people observed manual actions, similar to the activation seen in monkeys' F5 area. Not only that, but a new, as-yet-unpublished study by Perani and colleagues finds that Broca's area is also active during lip-reading. In fact, the more accurately participants recognized syllables and attributed the correct word to a pattern of lip movements, the more active Broca's area was.

These findings indicate that Broca's area in humans--related through evolution to monkeys' F5 area--is linked not only with language processing, as traditionally known, but also with hand and mouth actions, says Perani.

Oral and optical patterns

Finding this connection between the monkey brain and the human brain is critical to any evolutionary theory of language, says McGill's Petitto. Also significant, she says, is finding evidence that the human brain processes hand signals as efficiently as it does acoustic signals.

Petitto has uncovered such evidence. In a series of now-classic studies, she found that deaf children acquire sign language on exactly the same developmental trajectory as hearing children acquire spoken language. In fact, children raised in households in which one parent speaks and the other signs show no preference for learning spoken language over signed language.

"Whatever controls the timing of spoken language also controls the timing of sign language, and it certainly isn't based exclusively on sound," says Petitto. "The brain is working over some other computation, not raw sound."

She hypothesizes that the human brain has evolved a sensitivity to certain patterns that natural language has exploited. And these patterns can appear on the hand or on the tongue.

To test this theory, Petitto teamed up with Robert Zatorre, PhD, at the Montreal Neurological Institute's McDonnell-Pew Centre for Cognitive Neuroscience, where she is also a research scientist, to examine where in the brain adults process the building blocks of signed or spoken language.

They used positron emission tomography (PET) to look at the superior temporal gyrus within the secondary auditory cortex--a brain area that the scientific community largely agrees processes the phonetic and syllabic units of spoken language.

In the study, Petitto, Zatorre and their colleagues imaged the brains of hearing and deaf people as they viewed various stimuli. Among the many test conditions, the researchers included meaningless hand movements that happen to form phonetic and syllabic units in sign language.

When hearing participants saw these hand movements, their brains showed massive activation in the visual cortex but, as expected, no activation in the left-hemisphere areas associated with language. When deaf participants viewed the same movements, however, they showed activity not only in the visual cortex but also in the exact tissue that is active in hearing people when they listen to phonetic and syllabic units of spoken language. The finding is particularly interesting because that tissue receives its input from the ears via the primary auditory cortex and had been thought to process only auditory information.

"It's a remarkable finding," says Petitto. "The pathways into that tissue are from the ear, but in deaf people this is tissue that never heard sounds. And yet we saw robust activation in that tissue when they were looking at moving hands that represented phonetic and syllabic information. What it suggests is that the brain evolved sensitivity to very specific patterns that are critical to natural language; in this case, patterns of the phonetic and syllabic levels of language organization."

A primitive grammar of action

The human brain also appears to be tuned to grammatical patterns present in signed and spoken languages, says linguist Sherman Wilcox, PhD, of the University of New Mexico, who, along with Gallaudet University anthropologists David Armstrong, PhD, and William Stokoe, PhD, argues that language evolved out of gestures. Grammatical patterns give languages structure. In English, for example, an active sentence requires the subject to come before the verb and the object to come after it. Similar patterns exist in signed languages.

Rizzolatti and Arbib argue that there is an inherent, though basic, grammar to nonhuman primate manual actions: When a monkey grasps a raisin, for example, "grasp" is the verb, "raisin" is the object and "self" is the subject. And when they watch others perform actions--a researcher grasping a raisin, for example--"grasp" is the verb, "raisin" is the object and the person is the subject. The ancestral brain, they posit, developed a general mechanism for understanding these basic grammatical patterns. In humans, this mechanism evolved to handle even more complex types of grammar that could be linked to pantomimed gestures, then abstract gestures and finally signed and spoken languages, they argue.
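To make the parallel concrete, the mapping Rizzolatti and Arbib describe can be sketched as a simple data structure. The sketch below is purely illustrative--the Event type and its fields are hypothetical shorthand, not the researchers' model--and shows how an observed manual action and a simple active sentence share the same agent-action-object skeleton.

    from dataclasses import dataclass

    # Illustrative only: a hypothetical rendering of the agent-action-object
    # structure that, on this theory, is shared by manual actions and
    # simple sentences.
    @dataclass
    class Event:
        agent: str   # who acts ("self" or an observed other); the subject
        action: str  # what is done; the verb
        target: str  # what is acted upon; the object

        def as_sentence(self) -> str:
            # The same triple reads directly as a subject-verb-object clause.
            return f"{self.agent} {self.action}s the {self.target}"

    # A monkey grasping a raisin and a researcher grasping a raisin differ
    # only in the agent slot; the underlying structure is identical.
    print(Event("monkey", "grasp", "raisin").as_sentence())      # monkey grasps the raisin
    print(Event("researcher", "grasp", "raisin").as_sentence())  # researcher grasps the raisin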

Many parts of this theory are still speculative, admits Arbib. But some psychologists and linguists have for years linked action and spoken language, calling speech "articulatory gestures." Instead of motions of the hands, speaking uses motions of the mouth and vocal cords, explains Arbib.

Few would go so far as to say that, without the hand, humans would never have evolved language. But, in his book, "The Hand" (Vintage Books, 1998), neurologist Frank R. Wilson, MD, argues that we can better understand how cognitive skills such as language evolved by examining them in the context of manual action.

"Any theory of human intelligence," writes Wilson, "which ignores the interdependence of hand and brain function, the historic origins of that relationship, or the impact of that history on developmental dynamics in modern humans, is grossly misleading and sterile."

Further Reading

  • Arbib, M., & Rizzolatti, G. (1997). Neural expectations: A possible evolutionary path from manual skills to language. Communication and Cognition, 29, 393-424.

  • Armstrong, D. (1999). "Original Signs: Gesture, Sign, and the Sources of Language." Washington, D.C.: Gallaudet University Press.

  • Armstrong, D., Stokoe, W., & Wilcox, S. (1995). "Gesture and the Nature of Language." Cambridge/New York: Cambridge University Press.

  • Gibson, K., & Ingold, T. (Eds.). (1993). "Tools, Language and Cognition in Human Evolution." Cambridge/New York: Cambridge University Press.

  • Petitto, L.A. (2000). On the biological foundations of language. In H. Lane & K. Emmorey (Eds.), "The signs of language revisited" (pp. 447-471). Mahwah, N.J.: Lawrence Erlbaum Associates.