Monday, June 25, 2007
Guessing Robots Predict Their Environments For Better Navigation
Engineers at Purdue University are developing robots able to make "educated guesses" about what lies ahead as they traverse unfamiliar surroundings, reducing the time it takes to navigate those environments. The method relies on a new software algorithm that enables a robot to create a partial map as it travels through an environment for the first time; the robot then refers to this partial map to predict what lies ahead.
The more repetitive the environment, the more accurate the prediction and the easier it is for the robot to successfully navigate, said C.S. George Lee, a Purdue professor of electrical and computer engineering who specializes in robotics.
"For example, it's going to be easier to navigate a parking garage using this map because every floor is the same or very similar, and the same could be said for some office buildings," he said.
Both the simulated and the actual robots in the research used information from a laser rangefinder and an odometer to measure the environment and create maps of the layout.
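The article does not say which map representation the Purdue group uses. A common choice when combining a laser rangefinder with odometry is an occupancy grid, and the sketch below shows that general idea under that assumption; the names (update_grid, GRID_SIZE) and all numbers are illustrative, not taken from the paper.

```python
import math

# Minimal occupancy-grid update from one laser scan, assuming the robot's
# pose is known from odometry. Illustrative sketch only, not the map
# representation used in the Purdue paper.

GRID_SIZE = 100          # cells per side
CELL = 0.1               # metres per cell

def update_grid(grid, pose, scan):
    """Mark the cell struck by each laser beam as occupied.

    pose: (x, y, heading) from odometry, in metres / radians
    scan: list of (bearing, range) pairs from the laser rangefinder
    """
    x, y, heading = pose
    for bearing, rng in scan:
        # Convert the beam endpoint into world coordinates...
        ex = x + rng * math.cos(heading + bearing)
        ey = y + rng * math.sin(heading + bearing)
        # ...then into grid indices.
        gx, gy = int(ex / CELL), int(ey / CELL)
        if 0 <= gx < GRID_SIZE and 0 <= gy < GRID_SIZE:
            grid[gy][gx] += 1     # accumulate occupancy evidence

grid = [[0] * GRID_SIZE for _ in range(GRID_SIZE)]
# One made-up pose and a scan that sees a wall 2 m ahead across a 90-degree arc.
pose = (5.0, 5.0, 0.0)
scan = [(math.radians(a), 2.0) for a in range(-45, 46, 5)]
update_grid(grid, pose, scan)
print(sum(v > 0 for row in grid for v in row), "cells marked occupied")
```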
The algorithm modifies an approach called SLAM, which originated in the 1980s. The name SLAM, for simultaneous localization and mapping, was coined in the early 1990s by Hugh F. Durrant-Whyte and John J. Leonard, then engineers at the University of Oxford in the United Kingdom.
SLAM uses data from sensors to orient a robot by drawing maps of the immediate environment. Because the new method uses those maps to predict what lies ahead, it is called P-SLAM.
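To see what "simultaneous localization and mapping" means in miniature, here is a toy one-dimensional example: a robot drives along a corridor and repeatedly measures the distance to a single wall feature, and a Kalman filter jointly estimates the robot's position and the feature's position. It is only a schematic of the SLAM idea, not the 2-D, laser-based formulation in the Purdue work, and every name and number below is made up for illustration.

```python
import random

random.seed(0)
Q, R = 0.02, 0.01                 # motion noise and sensor noise variances

# State estimate [robot position r, feature position m] and 2x2 covariance P.
r_est, m_est = 0.0, 5.0           # initial guess: feature roughly 5 m away
P = [[0.0, 0.0], [0.0, 4.0]]      # robot pose known, feature very uncertain

true_r, true_m = 0.0, 5.3
for step in range(20):
    u = 0.2                                          # commanded forward motion
    true_r += u + random.gauss(0, Q ** 0.5)          # actual (noisy) motion
    z = true_m - true_r + random.gauss(0, R ** 0.5)  # noisy range reading

    # Predict: the robot moves by u; its uncertainty grows by Q.
    r_est += u
    P[0][0] += Q

    # Update with measurement z = m - r  (Jacobian H = [-1, 1]).
    innovation = z - (m_est - r_est)
    S = P[0][0] - 2 * P[0][1] + P[1][1] + R          # H P H^T + R
    K0 = (-P[0][0] + P[0][1]) / S                    # Kalman gain, robot row
    K1 = (-P[1][0] + P[1][1]) / S                    # Kalman gain, feature row
    r_est += K0 * innovation
    m_est += K1 * innovation
    # Covariance update P = (I - K H) P, expanded by hand for the 2x2 case.
    P = [[P[0][0] + K0 * (P[0][0] - P[1][0]), P[0][1] + K0 * (P[0][1] - P[1][1])],
         [P[1][0] + K1 * (P[0][0] - P[1][0]), P[1][1] + K1 * (P[0][1] - P[1][1])]]

print(f"robot: estimated {r_est:.2f} m / actual {true_r:.2f} m")
print(f"feature: estimated {m_est:.2f} m / actual {true_m:.2f} m")
```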
"Its effectiveness depends on the presence of repeated features, similar shapes and symmetric structures, such as straight walls, right-angle corners and a layout that contains similar rooms," Lee said. "This technique enables a robot to make educated guesses about what lies ahead based on the portion of the environment already mapped."
Research findings were detailed in a paper that appeared in April in IEEE Transactions on Robotics, published by the Institute of Electrical and Electronics Engineers. The paper was authored by doctoral student H. Jacky Chang, Lee, assistant professor Yung-Hsiang Lu and associate professor Y. Charlie Hu, all in Purdue's School of Electrical and Computer Engineering.
Potential applications include domestic robots and military and law enforcement robots that search buildings and other environments.
The Purdue researchers tested their algorithm both in simulated robots and in a real robot navigating the corridors of a building on the Purdue campus. Findings showed that a simulated robot using the algorithm was able to successfully navigate a virtual maze while exploring 33 percent less of the environment than would ordinarily be required.
Future research will extend the concept to four robots working as a team, operating with ant-like efficiency to explore an unknown environment by sharing the mapped information through a wireless network. The researchers also will work toward creating an "object-based prediction" that recognizes elements such as doors and chairs, as well as increasing the robots' energy efficiency.
Robots operating without the knowledge contained in the maps must rely entirely on sensors to guide them through the environment. Those sensors, however, are sometimes inaccurate, and mechanical errors also cause the robot to stray slightly off course.
The algorithm enables robots to correct such errors by referring to the map, so they navigate more precisely and efficiently.
"When the robot makes a turn to round a corner, let's say there is some mechanical error and it turns slightly too sharp or not sharply enough," Lee said. "Then, if the robot continues to travel in a straight line that small turning error will result in a huge navigation error in the long run."
The research has been funded by the National Science Foundation.
In separate work, Purdue undergraduate students in a senior design class have developed a prototype firefighting robot called Firebot.