Sunday, February 8, 2009
USF developing smart chair for people with disabilities
From MedGadget.com:
Many people in this world suffer from severely debilitating conditions that leave them paralyzed and completely dependent on the assistance of others. Some advances, like automated wheelchairs driven by special input devices, have helped to improve the quality of life for these folks. To push assistive technology even further, researchers at the University of South Florida are refining a wheelchair fitted with its own robotic arm. The system uses EEG to read the user's brain waves and sends translated signals to the roboarm, directing it to move accordingly.
The USF news office reports that the BCI system – developed, used and modified by USF psychology professor Emanuel Donchin and colleagues – captures P-300 brain wave responses and converts them into actions. Donchin and colleagues harnessed the P-300 signal to let patients who cannot move, such as those with locked-in syndrome or Lou Gehrig’s Disease (ALS), “type” on a virtual keyboard by thought alone, with the P-300 response serving as the virtual “finger.” Researchers in the USF Department of Mechanical Engineering’s Center for Rehabilitation Engineering and Technology, in collaboration with the Cognitive Psychophysiology Laboratory in the Department of Psychology, modified the BCI further to meet the specific requirements of the wheelchair-mounted robotic arm (WMRA).
“We modified the BCI system to display a matrix of several options that include actions or directions that the user would like to have the WMRA perform,” said Redwan Alqasemi, a researcher in the USF Department of Mechanical Engineering’s Center for Rehabilitation Engineering and Technology. “The user wears a head cap fitted with electrodes to measure P-300 electroencephalogram (EEG) activities in the brain. While the movement options intensify on a screen and flash at certain frequencies, the user concentrates on the option desired to trigger the desired P-300 brain signal. The electrodes detect the signal, relate it to the desired action, then the WMRA control system translates the brain signal to the robotic arm, which carries out the desired movements,” said Alqasemi.
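To make that selection scheme concrete, here is a minimal Python sketch of a P-300-style option matrix: each option is flashed repeatedly, a classifier scores the EEG segment that follows each flash, and the option with the strongest accumulated response is chosen as the command for the arm. The option list, the `p300_score` placeholder classifier, and the repetition count are illustrative assumptions, not USF's actual implementation.

```python
import random

# Hypothetical set of WMRA actions the on-screen matrix might offer;
# the article does not list the real options.
OPTIONS = ["arm up", "arm down", "arm left", "arm right",
           "open gripper", "close gripper", "reach forward", "retract"]

def p300_score(eeg_epoch, flashed_option, target_option):
    """Placeholder for a P-300 classifier.

    A real system would score the post-flash EEG epoch with a trained
    classifier; here we simply simulate a stronger response when the
    flashed option is the one the user is concentrating on.
    """
    return random.gauss(0.0, 1.0) + (3.0 if flashed_option == target_option else 0.0)

def select_option(target_option, repetitions=10):
    """Flash every option `repetitions` times and accumulate P-300 scores."""
    scores = {opt: 0.0 for opt in OPTIONS}
    for _ in range(repetitions):
        for opt in random.sample(OPTIONS, len(OPTIONS)):  # flash in random order
            eeg_epoch = None  # stand-in for the EEG segment following this flash
            scores[opt] += p300_score(eeg_epoch, opt, target_option)
    # The option whose flashes evoked the strongest summed response wins.
    return max(scores, key=scores.get)

if __name__ == "__main__":
    chosen = select_option(target_option="open gripper")
    print(f"Command sent to WMRA controller: {chosen}")
```

Accumulating scores over repeated flashes is the usual way P-300 spellers trade selection speed for reliability, since a single EEG epoch is far too noisy to classify on its own.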
Early testing by human users has shown that the WMRA can be controlled “without the user moving a muscle.” The WMRA does not use any pre-programmed movements unless chosen by the user.
According to Rajiv Dubey, professor and chair of the USF Department of Mechanical Engineering and director of the Center for Rehabilitation Engineering & Technology, the design of intelligent therapeutic and assistive robotic systems such as the WMRA is based on sensor-fusion technology that maps limited human input into complex motion through “sensor-assisted scaled teleoperation.”
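As a rough illustration of what “sensor-assisted scaled teleoperation” can mean, the sketch below blends a coarse, low-bandwidth user command with a direction inferred from the arm's sensors and scales the result into an end-effector velocity. The `assist_gain` blending scheme, function names, and numbers are assumptions for illustration only; the article does not describe USF's actual control law.

```python
import numpy as np

def assisted_velocity(user_input, sensed_target_dir, assist_gain=0.7, speed=0.05):
    """Blend coarse user input with a sensor-derived direction.

    user_input:        rough 3-D direction from the limited interface (may be noisy)
    sensed_target_dir: vector toward a target detected by the arm's sensors
    assist_gain:       0 = pure teleoperation, 1 = fully sensor-guided
    Returns a scaled end-effector velocity command (m/s).
    """
    u = user_input / (np.linalg.norm(user_input) + 1e-9)
    t = sensed_target_dir / (np.linalg.norm(sensed_target_dir) + 1e-9)
    blended = (1.0 - assist_gain) * u + assist_gain * t
    return speed * blended / (np.linalg.norm(blended) + 1e-9)

# Example: the user roughly indicates "forward", the sensors see a cup slightly
# to the right; the commanded motion is pulled toward the cup.
user_cmd = np.array([1.0, 0.1, 0.0])
to_cup = np.array([0.8, 0.6, 0.0])
print(assisted_velocity(user_cmd, to_cup))
```

The point of the scaling is that a few noisy bits of user intent are enough to steer a full six-degree-of-freedom motion once sensor data fills in the fine detail.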
“Our Rehabilitation Engineering & Technology Program is aimed at designing and developing rehabilitation robotic systems that maximize the manipulation and mobility functions of persons with disabilities,” said Dubey. “The result will be that mobility-impaired persons can live more independently, with improved quality of life and even better employment outcomes.”
The WMRA holds particular promise for people with “locked-in syndrome,” a condition of total paralysis that leaves them unable to move but intellectually intact, and one that has gained wider attention thanks to the book and subsequent film The Diving Bell and the Butterfly. Even at this stage of development, the WMRA offers hope for a better quality of life for people with all levels of mobility challenges.