Monday, February 22, 2010
Brains can tell the difference between words in ASL, gestures
From Science News:
SAN DIEGO — The brain can apparently tell the difference between a word and a gesture — even when the word is a gesture.
Karen Emmorey, a cognitive neuroscientist at San Diego State University, has been looking at how the brains of deaf people interpret American Sign Language. She showed 10 subjects pictures of objects that have actions associated with them — a cup for “drink,” say, or a broom for “sweep.” She asked participants either to sign the word that goes with the picture or to pantomime using the object. In some cases, like “drink,” the word and the gesture are the same: Subjects pretended to hold a cup in one hand and brought it to their mouths. For other words, like “sweep,” the sign and the pantomime look different.
By taking positron emission tomography images of the brain as subjects signed, Emmorey found that the brain broadcast participants’ intentions: Different regions of the brain lit up when the deaf subjects signed than when they pantomimed, even when the word and gesture were identical.
"For sign production we find language regions engaged,” Emmorey said February 19 at the American Association for the Advancement of Science meeting. But when subjects were pantomiming, the brain regions that lit up were those associated with grasping, manipulation and motor planning.
“The fact that many signs are iconic doesn’t change the fundamental organization of language, nor does it change the neural systems that underlie language,” she said. The work has been submitted for publication in Language and Cognitive Processes.