Japanese researchers at NHK Science & Technology Research Laboratories are developing an animated sign language translation system that automatically converts strings of Japanese words into gestures.
The goal is to improve sign language broadcasts for deaf viewers, especially in the case of disaster or breaking news.
Subtitles are all well and good, of course, but people who were born deaf and for whom signing is their primary form of communication may have more difficulty understanding them.
NHK’s system, when presented with a string of Japanese text, automatically analyzes it and converts it into sign language delivered by a snappily-dressed avatar in what appears to be a virtual newsroom. The system can also replace words that don’t translate directly with synonyms that do.
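The synonym-substitution step described above can be pictured as a dictionary lookup with a fallback. The sketch below is purely illustrative and assumes nothing about NHK's actual implementation; the dictionaries, the `to_sign_glosses` function, and the sample entries are all hypothetical.

```python
# Hypothetical sketch of word-to-sign conversion with synonym fallback.
# None of these names or entries come from NHK's system.

# Toy lexicons; a real system would use large curated dictionaries.
SIGN_DICTIONARY = {
    "地震": "EARTHQUAKE",   # jishin
    "避難": "EVACUATE",     # hinan
    "大きい": "BIG",        # ookii
}
SYNONYMS = {
    "巨大": "大きい",       # kyodai (huge) -> ookii (big)
}

def to_sign_glosses(words):
    """Map each Japanese word to a sign gloss; if a word has no
    direct sign, try a synonym that does; otherwise flag it."""
    glosses = []
    for word in words:
        if word in SIGN_DICTIONARY:
            glosses.append(SIGN_DICTIONARY[word])
        elif word in SYNONYMS and SYNONYMS[word] in SIGN_DICTIONARY:
            glosses.append(SIGN_DICTIONARY[SYNONYMS[word]])
        else:
            glosses.append(None)  # left for manual correction
    return glosses

print(to_sign_glosses(["巨大", "地震"]))  # ['BIG', 'EARTHQUAKE']
```

The `None` placeholder mirrors the human-in-the-loop step the article mentions: words the system can't translate are surfaced for a person to fix.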
Deaf people who watched the animations were able to understand them at a basic level, senior research engineer Naoto Kato says in the video, but as a whole the translation can lack fluency.
As the system is still imperfect, it has a built-in manual component where humans can adjust the translation to fix mistakes.
Saturday, June 4, 2011
From Popular Science. A video of the system is on the Popular Science site.
Posted by BA Haller at 5:14 PM