Tuesday, October 28, 2008

New technology helps people with autism interpret facial expressions

From the intro to a story in The Boston Globe:

PROVIDENCE, R.I. - TJ can usually tell when someone is happy. Happiness is easy, but negative emotions are trickier. Sorrow can look like laughter; sadness can pass for fatigue. The expressions that signify guilt and hunger, he said, "are the exact same thing."
TJ Conway, 16, has Asperger's syndrome, a neurological condition related to autism that makes it hard for him to interpret facial expressions. Is that smile sarcastic, apologetic, polite, angry, or just happy? TJ is not always sure. It's a disability that can strain his dealings with the world.
Today he is holding a small computer, about the size of a hardcover book, that, like TJ, is trying to solve the complex problem of recognizing facial expressions. Point the device at a person's face - as one would with a camera - and, ideally, it will identify the emotion that person is feeling.
Although the computer prototype has many bugs to work out, it represents the next generation of assistive technology that, if successful, could help people with Asperger's - including the half dozen or so teens in the Providence classroom with TJ - make sense of the emotional world around them. The device is a product of MIT's Media Lab, where computer scientists are studying ways to help machines think emotionally, like humans.
The prototype is called an iSet, which stands for "interactive social emotional toolkit." It's a tablet with a camera on one side and a screen on the other, like an oversized cellphone. TJ points the device at another teen, and holds it steady until a choppy black-and-white video image appears on the screen, along with such labels as "agreeing," "disagreeing," "concentrating," and "confused."
When the computer thinks it's identified a facial expression, a colored dot appears above the corresponding label. The dot grows larger as the computer becomes more confident it has identified the correct emotion.
The software behind the iSet combines commercially available face-recognition programs with machine-learning algorithms that allow the computer to compare new facial expressions to ones it has already seen, and to estimate the probability that a given expression signals a given emotion. It was developed by MIT researcher Rana el Kaliouby.
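
The Globe piece doesn't include el Kaliouby's code, but the pipeline it describes - tracked facial features fed to a learned classifier that outputs a probability for each emotion label, visualized as a growing dot - can be sketched in a few lines. Below is a minimal, hypothetical Python illustration: the feature extractor is stubbed with random numbers, the classifier is a generic scikit-learn model, and none of the names come from the actual iSet software.

    # Hedged sketch, not the real iSet code: a classifier maps facial-expression
    # features to a probability for each label, and each probability is scaled
    # into a dot size, like the growing colored dot on the iSet screen.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    LABELS = ["agreeing", "disagreeing", "concentrating", "confused"]
    rng = np.random.default_rng(0)

    def fake_features(n_frames):
        # Stand-in for what a face tracker would output per video frame
        # (e.g., eyebrow raise, mouth curvature, head tilt); synthetic here.
        return rng.normal(size=(n_frames, 8))

    # Train a toy probabilistic classifier on synthetic labeled examples.
    X_train = fake_features(400)
    y_train = rng.integers(0, len(LABELS), size=400)
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    def dot_radius(prob, min_px=2.0, max_px=20.0):
        # The dot grows as the computer becomes more confident in a label.
        return min_px + prob * (max_px - min_px)

    # Classify one new "frame" and report per-label confidence.
    frame = fake_features(1)
    for label, p in zip(LABELS, clf.predict_proba(frame)[0]):
        print(f"{label:13s}  p={p:.2f}  dot={dot_radius(p):4.1f}px")

The real system would replace the stubbed features with measurements from the device's camera, but the shape of the computation - probabilities per label, rendered as confidence dots - matches what the article describes.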
Some social anomalies were apparent among the teens testing the device. One boy talks in blurted lists of facts. Another over-enunciates his words. Some are uncomfortable with direct eye contact. They call themselves "Aspies." Once a month they meet here, at the Groden Center, a school for children with autism and other disabilities. It's a rare opportunity for them to socialize on their own terms.
"There's a comfort level," said Virginia Law, TJ's mother. "They all know that they each have their own little quirks. Outside their group, it's harder."
"I think my awareness of other people's emotions is slightly less than other people's," said Stephen Shea, 16. "I tend to say the wrong thing more often than other people."
For all of them, the challenge is to consciously recognize cues, like a smile or a raised eyebrow, that most of us read by instinct.