Cover Story

In a sampling of studies around the world, children with autism have verbally and physically engaged with robots, and in some cases these children have even demonstrated affect in those interactions.

However, the researchers conducting such studies caution that the results thus far are mostly anecdotal. The studies have been limited in size and scope, and autism researchers aren't sure what the results mean or if they will ultimately be of use in diagnosing or even treating autism.

"Any behavior work is difficult; behaviors are messy and difficult to quantify," says engineer Brian Scassellati, PhD, director of the social robotics lab at Yale University.

That said, his Yale team of robotics researchers and mental health specialists believes that, with time, robots may help shed light on the mysteries of autism. Most immediately, the Yale team hopes to use robotics to help standardize and improve the quality of autism diagnosis.

Baby steps

The team aims to give clinicians comparative tools so that they can better measure potentially autistic behaviors, Scassellati explains.

For instance, to diagnose autism, a clinician must assess a child's social skills by observing features such as eye contact and gaze, facial expressions, and gestures. Diagnostic protocols exist against which a physician or psychologist can measure his or her observations, but the observations themselves are subjective and can vary widely from clinician to clinician. Scassellati believes that a robot's potential to objectively record measures like gaze tracking and social distance, combined with the ability to program it to perform specific behaviors (and only those behaviors) that prompt a reaction from the child, can produce more quantifiable diagnoses and a clearer picture of the social deficits inherent in autism.

Robotics groups in different parts of the world have reported that when children with autism interact with robots, they show behaviors they don't normally show with adults, such as engagement and motivation, he says. However, the robots' specifications and appearance have varied widely, from tank-like machines to a Barbie doll with a mechanical compartment in her chest to something that looks like a hamster rolling around the floor.

"We don't know which specifications are effective and why," explains Scassellati. Answering these questions will take time and will proceed in small stages, he emphasizes.

Building a better diagnosis

One such step is a collaboration between Scassellati's group and the Yale Developmental Disabilities Clinic. In a pilot study run by Scassellati and psychiatrist Fred Volkmar, MD, director of the clinic, researchers observed a small group of children, some with autism and some without, interacting with a simple, commercially produced robot that had a human-like face on a box-like body. The robot was not capable of reacting to anything the children did, but in some trials a researcher hidden behind a two-way mirror would manipulate the robot to respond to the children's actions. Most of the children, including some of those with autism, touched and verbalized with the robot, notes Volkmar.

However, the children without autism would eventually get bored with the "non-contingent" robot, the one that didn't respond to their actions. The children with autism had a much higher tolerance for the non-responsive robot, yet they rarely exhibit such quasi-social behaviors as touching and verbalizing in their everyday environment, notes Scassellati. Building on that, he and Volkmar want to isolate certain robotic actions to see what is garnering the response.

"Is it important that the robot have an identifiable face? Does contingency make a difference?" asks Scassellati. And, more broadly, what makes the reaction of children with autism different? Ultimately, Scassellati wants to combine this information with findings from other researchers at the clinic about gaze tracking, social distance and prosody (not what children say, but the tone and intent of what they say) to build the diagnostic tools he envisions. Yale psychologist Ami Klin, PhD, is working with one of Scassellati's students to develop computational models that would classify social behavior.

Once researchers start to understand what is behind the social deficits in autism, they may be able to use the robots to teach skills, says Volkmar. We usually acquire social skills early in life: babies learn to babble responsively and read facial expressions within the first year, he explains. "How do we teach something that comes so easily to most of us?"

One of the problems is that teaching social skills is an inherently social situation, notes Volkmar. Robots sidestep that. Not only can researchers better isolate specific social behaviors with robots, they can also program them to react the same way over and over again, he explains. That's useful for quantifying a child's reaction, but it may also be an advantage in bringing a narrow, consistent focus to teaching certain social skills, such as communicating eye-to-eye or standing an appropriate distance from others, says Volkmar. People with autism generally function better with predictability, he adds.

A therapy robot for autism, if one is even possible, will be a long time coming, notes Scassellati. But the possibilities are certainly enticing.