Disability, John Spletzer believes, should no longer pose any obstacle to mobility. A blind person may not be able to see or a paraplegic to walk, but each can access the technology available to the rest of the world. And that technology has the potential to serve as a person’s feet, hands and eyes and thus restore his ability to interact with his environment.
Spletzer, an associate professor of computer science and engineering, recently received a five-year CAREER Award from the National Science Foundation (NSF) to develop a robotic wheelchair that navigates on its own, with no human guidance or remote control, through a crowded city.
Armed with high-fidelity lasers and detailed maps, the “smart” wheelchair will avoid stationary objects like parking meters and light poles as well as “random events” like pedestrians and bicyclists. It will transport users who may not be able to see or walk to their doctor’s appointments, to the pharmacy, to the grocery store.
Spletzer’s latest endeavor is an outgrowth of two older projects. Two years ago, he and his students worked with researchers from the University of Pennsylvania and Lockheed Martin to convert a Toyota Prius into a robot equipped with laser and camera sensors. The driverless vehicle, “Little Ben,” was one of only six cars, out of 89 original entries from several countries, to complete the 57-mile course in the 2007 DARPA Urban Challenge for robotic vehicles.
And in a continuing collaboration, Spletzer and engineers from Freedom Sciences LLC have invented the Automated Transport and Retrieval System (ATRS), which enables wheelchair users to get into and out of their vehicles, stow and retrieve their chairs, and drive while sitting in traditional auto seats that meet federal safety regulations. Freedom Sciences began selling the ATRS in the summer of 2008, shortly after the system was approved by the U.S. Food and Drug Administration (FDA).
“Our goal now,” says Spletzer, who directs Lehigh’s VADER (Vision, Assistive Devices, and Experimental Robotics) Laboratory, “is to extend the autonomy of the wheelchair so it can navigate completely in an urban setting and take you wherever you need to go.
“At the same time, we want to download Little Ben’s hardware, convert it to software, which is much less expensive, and upload it to the robotic wheelchair. This will give the chair the maps and images it needs to be able to interact with its environment.”
In order to “see” and respond to its environment as it navigates around a city, says Spletzer, a robot must possess two things: sensors that detect and recognize familiar landmarks and a database with maps that show where those landmarks will be.
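To make that division of labor concrete, here is a minimal sketch in Python of what those two ingredients might look like. The class names, fields, and coordinates are illustrative assumptions for this article, not details of Spletzer’s actual system.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Landmark:
    """A point feature (pole, building corner) surveyed into the map."""
    x: float      # map-frame position, meters
    y: float
    kind: str     # e.g. "pole" or "corner"

@dataclass
class Detection:
    """The same kind of feature, as the onboard LIDAR reports it."""
    range_m: float        # distance from the sensor, meters
    bearing_rad: float    # angle relative to the chair's heading

# A map database is, at minimum, a list of landmarks the robot can
# expect to re-observe as it drives (these coordinates are made up).
map_db: List[Landmark] = [
    Landmark(12.4, 3.1, "pole"),
    Landmark(30.0, -5.2, "corner"),
]
```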
Spletzer and his students have taken a cue from Google Street View, which allows Internet users to take virtual tours of distant cities, block by block and building by building, by looking at thousands of stored images. These images are of little use to robots equipped with lasers, says Spletzer, because robots do not see what humans see. But the concept is applicable to the robotic wheelchair.
“To create Google Street View, people drive vehicles around cities, take thousands of images and make maps. We’re making similar maps that are useful for robots, not people. Our robots respond to different cues than humans respond to. Whereas people see the real world and all its details, robots using lasers recognize things like poles and building corners that reduce to a very exact point and are thus easy to track.”
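The observation that poles and corners “reduce to a very exact point” suggests a simple feature extractor. The sketch below is a hypothetical illustration, not the VADER Lab’s code: it splits a 2-D laser scan into clusters at range discontinuities and keeps only physically narrow clusters as pole candidates. The function name and thresholds are assumptions made for the example.

```python
import math
from typing import List, Tuple

def extract_pole_candidates(
    scan: List[Tuple[float, float]],  # (bearing_rad, range_m) per laser beam
    gap_m: float = 0.3,               # range jump that separates objects
    max_width_m: float = 0.25,        # poles are thin; reject wide clusters
) -> List[Tuple[float, float]]:
    """Return (x, y) centroids of thin clusters -- pole-like point features."""
    if not scan:
        return []
    # Convert polar returns to Cartesian points in the sensor frame.
    pts = [(r * math.cos(a), r * math.sin(a)) for a, r in scan]

    # Split the scan wherever the range jumps between adjacent beams.
    clusters, current = [], [0]
    for i in range(1, len(scan)):
        if abs(scan[i][1] - scan[i - 1][1]) > gap_m:
            clusters.append(current)
            current = []
        current.append(i)
    clusters.append(current)

    poles = []
    for idx in clusters:
        width = math.dist(pts[idx[0]], pts[idx[-1]])
        # Keep clusters several beams long but physically narrow: they
        # reduce to a very exact point and are therefore easy to track.
        if len(idx) >= 3 and width <= max_width_m:
            cx = sum(pts[i][0] for i in idx) / len(idx)
            cy = sum(pts[i][1] for i in idx) / len(idx)
            poles.append((cx, cy))
    return poles
```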
Spletzer and his students are fitting the robotic wheelchair with a low-cost LIDAR (an acronym for light detection and ranging) laser similar to, but much less expensive than, those that enabled Little Ben to detect other vehicles, highway lane markers and the edge of the pavement. They have made high-fidelity, 3-D laser maps of portions of South Bethlehem and of the Stabler Arena parking lot on Lehigh’s Goodman Campus. The team’s robotic wheelchair, guided by LIDAR but not GPS, has traversed a 1-kilometer route and arrived within 20 centimeters of its destination.
“We have a server vehicle drive around and make a hi-fi 3-D map of the environment,” says Spletzer. “The robotic wheelchair can download this map and navigate the environment, halfway in the real world, halfway in the virtual world.
“The robot identifies landmarks—trees, poles, building faces and corners—in the real world and looks for them in the laser map. Once it finds them, it will be able to accurately estimate its position in the real world. It doesn’t need GPS, because of the accuracy of the server vehicle maps and because of the LIDARs.”
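Once observed landmarks have been matched to landmarks in the laser map, estimating the robot’s position is a classical least-squares problem. The sketch below shows one standard closed-form solution, a 2-D Horn/Umeyama rigid alignment; it assumes the data association has already been done, and it is not necessarily the algorithm Spletzer’s team uses.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def estimate_pose_2d(
    observed: List[Point],  # landmark positions in the robot's own frame
    mapped: List[Point],    # the same landmarks' positions in the laser map
) -> Tuple[float, float, float]:
    """Closed-form least-squares rigid fit (2-D Horn/Umeyama): returns the
    robot's (x, y, heading) in the map frame from matched landmark pairs."""
    n = len(observed)
    assert n == len(mapped) and n >= 2, "need at least two matched landmarks"

    # Centroids of both point sets.
    ox = sum(p[0] for p in observed) / n
    oy = sum(p[1] for p in observed) / n
    mx = sum(p[0] for p in mapped) / n
    my = sum(p[1] for p in mapped) / n

    # Cross-covariance terms between the centered point sets.
    sxx = sxy = syx = syy = 0.0
    for (px, py), (qx, qy) in zip(observed, mapped):
        ax, ay = px - ox, py - oy
        bx, by = qx - mx, qy - my
        sxx += ax * bx
        sxy += ax * by
        syx += ay * bx
        syy += ay * by

    # Optimal rotation angle in closed form.
    heading = math.atan2(sxy - syx, sxx + syy)

    # Translation that carries the observed centroid onto the map centroid.
    c, s = math.cos(heading), math.sin(heading)
    tx = mx - (c * ox - s * oy)
    ty = my - (s * ox + c * oy)
    return tx, ty, heading
```

Because a fit like this leans on surveyed landmarks and precise laser returns rather than satellite signals, it illustrates how a chair could localize accurately without GPS.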
Meanwhile, says Spletzer, the robot learns from experience by comparing the new objects it sees in the environment with the images in its database.
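That learning step can be pictured as folding confirmed observations back into the landmark database. The following sketch is an assumed, simplified version of such an update; the matching radius and the representation are invented for illustration.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def update_landmark_db(db: List[Point], detections: List[Point],
                       match_radius_m: float = 0.5) -> List[Point]:
    """Fold fresh observations (already transformed into the map frame)
    into the landmark database: a detection near an existing entry simply
    re-confirms it; an unmatched detection is remembered as a new object."""
    for d in detections:
        if not any(math.dist(d, lm) <= match_radius_m for lm in db):
            db.append(d)  # previously unseen landmark: learn it
    return db
```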
At a cost of $250,000 to produce, Little Ben was the least expensive robotic car to complete the 2007 Urban Challenge.
By contrast, the cost of purchasing an ATRS is comparable with the cost of purchasing a van or SUV and revamping it to allow a person to drive while seated in the wheelchair, as most wheelchair drivers now do.
Spletzer’s goal is to combine the greater autonomy of Little Ben with the affordability of the ATRS.
“The ATRS is priced to sell and is effective,” says Spletzer. “But it is autonomous only within the immediate vicinity of a person’s vehicle. Little Ben drove itself 57 miles and achieved a high level of autonomy. But very few people are going to buy a robotic car for $250,000.
“We want to develop a cost-effective solution for mobile robot localization and mapping, to go from what we have now with the ATRS, which is a low-cost system with low-level autonomy, to a system that is low-cost with high-level autonomy. To do that, we need to replace hardware, which is expensive, with software, which is cheap—once you write the code and download it.”
Spletzer, who receives numerous letters and emails about his work, expects people to find new uses for the robotic wheelchair once it is developed.
“In a big city, courier services could use robotic vehicles instead of bicyclists to deliver packages across town. And the robotic wheelchair would be useful for quadriplegics whose cognitive functions are intact, but who find it difficult and tedious to operate a wheelchair.
“Our team received a letter from one man who said his wife could not operate her wheelchair along the narrow halls of her house. Because she lacked sufficient motor skills, she was always bumping into something.
“We want to provide a solution for people like this that doesn’t require them to interact with their environment. We want them to just be able to say to their wheelchair, ‘Take me to the drug store’ or ‘Take me to the doctor.’”
Spletzer credits two of his graduate students—Mike Sands (computer science) and Chao Gao (computer engineering)—with making the laser maps and pushing along the robotic wheelchair project.
He marvels at the rate of change in robotics in the decade or so since his own days as a graduate student at Penn.
“When I was a graduate student, we were driving remote-control cars in the lab. Ten years later, we were creating robotic cars that could drive autonomously in the real world. That’s a quick evolution.
“Ten years from now, who knows where we’ll be?”