In Brief

Vanderbilt University psychology professor Craig A. Smith, PhD, was skeptical when roboticist Nilanjan Sarkar, PhD, approached him about creating a robot that can sense and respond to a person's emotional state. But more than a year later, the two professors are taking strides to make this robot a reality.

So far, they have programmed a small mobile robot to successfully differentiate between some basic human emotional states. To do this, the robot collects physiological data from button-sized sensors that people wear. The sensors monitor a person's heart rate, skin conductance and muscle activity, including some facial muscle activity (such as a clenched jaw). The robot then bases its emotional reading on this physiological data.

For example, the Vanderbilt team recently programmed a small mobile robot to pick up on anxiety. The robot will explore a room, and when it senses high anxiety--based on the physiological data--it will approach the person and say: "I sense you are anxious. Is there anything I can do to help?"
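The kind of detection described above can be sketched in a few lines of code. Everything in this sketch is hypothetical: the signal names, the resting baselines and the threshold are illustrative stand-ins, not details of the Vanderbilt system. The one idea it borrows directly from the article is that readings are compared against a person's own baseline, since people express the same emotion differently.

```python
# Hypothetical sketch of anxiety detection from physiological sensor data.
# Signal names, baselines and the threshold are illustrative, not taken
# from the Vanderbilt system.
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Reading:
    heart_rate: float        # beats per minute
    skin_conductance: float  # microsiemens
    muscle_activity: float   # arbitrary EMG units

def z_score(value, baseline):
    """Standardize a value against a person's own resting samples."""
    mu, sigma = mean(baseline), stdev(baseline)
    return (value - mu) / sigma if sigma else 0.0

def seems_anxious(reading, baselines, threshold=1.5):
    """Flag anxiety when readings rise well above the wearer's baseline.

    `baselines` maps each signal name to a list of resting samples, so
    the detector is tailored to the individual rather than a population.
    """
    scores = [
        z_score(reading.heart_rate, baselines["heart_rate"]),
        z_score(reading.skin_conductance, baselines["skin_conductance"]),
        z_score(reading.muscle_activity, baselines["muscle_activity"]),
    ]
    return mean(scores) > threshold

# Resting samples recorded for one hypothetical wearer.
baselines = {
    "heart_rate": [62, 65, 63, 66, 64],
    "skin_conductance": [2.0, 2.2, 2.1, 1.9, 2.0],
    "muscle_activity": [10, 11, 9, 10, 10],
}

calm = Reading(heart_rate=64, skin_conductance=2.0, muscle_activity=10)
tense = Reading(heart_rate=95, skin_conductance=4.5, muscle_activity=25)

if seems_anxious(tense, baselines):
    print("I sense you are anxious. Is there anything I can do to help?")
```

A real system would need far more than a threshold on three signals, but the sketch shows why per-person baselines matter: the same heart rate that is unremarkable for one wearer could be a strong anxiety signal for another.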

Preliminary work on this project by Sarkar and his team was described in the November issue of Robotica (Vol. 20, No. 6). Subsequent progress on the project is detailed in a manuscript currently under review at the same journal.

Smith and Sarkar say these findings are a major first step toward what they eventually hope to create--a robot that can differentiate among a variety of emotional states, such as boredom, fatigue, optimal task engagement, anxiety, frustration and anger.

"It's to be used as a personal assistant where, without [a person] explicitly saying 'help,' the robot can still see that person is in distress and can come to that person's aid," Smith says. People express the same emotion differently, however, so Smith realizes that any system based on detecting emotions needs to be tailored to a person's individual patterns.

He has developed three tests to help measure such individual psychological patterns: an anagram problem-solving task, math problems that become increasingly difficult and a sound discrimination test in which a participant is asked to determine if tones are similar or dissimilar under time pressure. Smith seeks to determine whether people experience engagement, boredom, fatigue, frustration or anger as they perform the tasks.

Smith's partner, Sarkar--a Vanderbilt mechanical engineer--sees "emotionally sensitive" robots eventually proving useful in education and the workplace. For example, a robot could monitor a student's boredom or frustration levels during a tutorial, and adjust feedback and problem difficulty accordingly. Also, Sarkar says, a robot could detect when a person becomes inattentive or fatigued while working on a factory assembly line.

Says Smith of the project: "It's an interesting collaboration with a team of psychologists and engineers bringing together very different perspectives and two different worlds in how research gets done."