WASHINGTON: The US military wants robots that can work alongside soldiers without needing constant remote-control attention to keep them from knocking into things. That isn’t as easy as it sounds. Computers can now out-process the human mind at crunching vast quantities of data, but when it comes to handling physical objects, even state-of-the-art robots make human toddlers look coordinated.
That clumsiness is something Georgia Tech professor Mike Stilman is working to cure. With a three-year, $900,000 grant from the Office of Naval Research, which supports research into everything from robotics to railguns, Stilman is trying to develop a robot that can not only avoid obstacles but can use them as improvised tools. He and his team call the project “MacGyver.”
Why is the military interested in all this? Don’t worry, Stilman isn’t building a prototype Terminator. (The military already has lethal robotic assassins, anyway; they’re called Predator drones. Of course, the Terminators did not have a man in the loop deciding when to shoot. They just had a man in their sights…).
Stilman’s team has its eye on nicer applications. They have suggested their robot may one day be able to lift rubble off a wounded soldier and carry him to safety, or pry open a stuck door to let troops out of a burning building. That’s the kind of military robot we can all live with, as long as it obeys.
“It comes from a line of research that starts with my Ph.D. thesis,” completed at Carnegie Mellon in 2007, explained the 30-year-old Stilman. “Robots should be able to interact with their environments; they shouldn’t just get stuck. They should be able to move things out of the way.”
Stilman did his initial work at the Digital Human Research Center in Tokyo, programming a robot called HRP2 to push chairs and tables out of its path. “This is kind of the next step,” he said, “going beyond just moving things out of the way” to assessing those objects’ utility as tools and employing them to perform a task.
“How can you build a bridge? How can you build a lever? A wedge? Any object can be used as one of these things,” Stilman said. “You can think of even chains of objects — think about a Rube Goldberg machine,” he said, where one object acts on another which acts on another.
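Read literally, that is a planning problem: pick objects and uses so that, chained together, they provide everything the task needs. Here is a minimal sketch in Python of how such a search might be framed; the object names, affordance labels, and toy breadth-first planner are illustrative assumptions, not the project’s actual code.

```python
# Hypothetical sketch of the "chain of objects" idea: treat each nearby
# object as offering one or more affordances (lever, bridge, wedge) and
# search for a sequence of uses that covers what the task needs.
from collections import deque

# Each (made-up) object maps to the affordances it might provide.
AFFORDANCES = {
    "pipe": ["lever"],
    "plank": ["bridge", "lever"],
    "doorstop": ["wedge"],
}

def plan_tool_chain(objects, goal_affordances):
    """Breadth-first search for an ordered list of (object, use) pairs
    that together satisfy every affordance the task requires."""
    start = (tuple(), frozenset(goal_affordances))
    queue, seen = deque([start]), {start}
    while queue:
        plan, remaining = queue.popleft()
        if not remaining:
            return list(plan)          # every required affordance is covered
        for obj in objects:
            if any(obj == step[0] for step in plan):
                continue               # each object used at most once here
            for use in AFFORDANCES.get(obj, []):
                if use in remaining:
                    nxt = (plan + ((obj, use),), remaining - {use})
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
    return None                        # no chain of objects gets the job done

# Example: prying a door might need a wedge to hold it and a lever to force it.
print(plan_tool_chain(["pipe", "plank", "doorstop"], {"lever", "wedge"}))
```

A real system would also have to weigh geometry, reachability, and physics at every step; the point of the sketch is only the shape of the search.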
Human beings benefit from a few hundred million years of evolution when they try to understand and manipulate physical objects, but robots have to start almost from scratch. Stilman and his team are building on earlier work in object recognition, but that technology is just the beginning: the robot has to look at something and decide, okay, that is 60 percent likely to be a table. Then it has to move around, check out the object from multiple angles, and, in Stilman’s big innovation, push on it, pull on it, and see whether it can bear the robot’s weight.
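In code, that amounts to a perceive-then-probe loop. The sketch below assumes a hypothetical robot interface; every class and method name (classify, push, test_weight, and so on) is an illustrative stand-in, not a real API.

```python
# A minimal sketch of the loop described above: estimate what an object is
# with some confidence, view it from more angles until that confidence is
# high enough, then physically probe it. All interfaces here are assumed.
CONFIDENCE_THRESHOLD = 0.9

class SimulatedRobot:
    """Toy stand-in for a real perception and manipulation stack."""
    def __init__(self):
        self._views = 0
    def classify(self, obj):
        # Confidence grows with each additional viewpoint (toy model).
        self._views += 1
        return "table", min(0.45 + 0.15 * self._views, 1.0)
    def viewpoints_around(self, obj):
        return ["front", "left", "back", "right"]
    def move_to(self, viewpoint):
        pass
    def push(self, obj):
        return True        # the object gave way slightly when pushed
    def pull(self, obj):
        return False       # it did not move when pulled
    def test_weight(self, obj):
        return True        # it held the robot's weight

def assess_object(robot, obj):
    """Classify obj, refine the estimate from new angles, then probe it."""
    label, confidence = robot.classify(obj)          # e.g. ("table", 0.6)
    for viewpoint in robot.viewpoints_around(obj):
        if confidence >= CONFIDENCE_THRESHOLD:
            break
        robot.move_to(viewpoint)
        label, confidence = robot.classify(obj)      # re-estimate from new angle
    movable = robot.push(obj) or robot.pull(obj)     # did it give at all?
    load_bearing = robot.test_weight(obj)            # could it bear the robot?
    return {"label": label, "confidence": confidence,
            "movable": movable, "load_bearing": load_bearing}

print(assess_object(SimulatedRobot(), "unknown object"))
```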
That’s a big deal for robots. When Stilman and his Japanese colleagues got a robot to push a chair out of its path a few years ago, it was the first time, to their knowledge, that a robot had autonomously figured out an obstacle in its path was movable and then moved it out of the way.
Stilman now wants to teach a robot to use objects as tools, for example climbing up on a table to reach something or levering open a stuck door with a pipe. Humans have been improvising tools this way since before they were human (monkeys can fish ants out of an anthill with a stick), but robots can’t. Teaching them how would be a huge step forward in robots’ ability to navigate the physical world.