LAS VEGAS: The US military depends on drones. But amidst the justifiable excitement over the rise of the robots, it’s easy to overlook that today’s unmanned systems are not truly autonomous but rather require a lot of human guidance by remote control — and bad design often makes the human’s job needlessly awkward, to the point of causing crashes. Fixing that is the next big challenge for the unmanned industry.
“Too many screens with too much information, folks” — that’s the bottom line, said Col. John Dougherty, a Predator operations commander with the North Dakota National Guard, speaking at a workshop on the first day of the 2012 conference of the Association for Unmanned Vehicle Systems International (AUVSI) here in Vegas. “I am tired of all these black panels all over the place,” Dougherty went on, urging designers to “de-clutter for sanity.” But instead, he lamented, “they keep strapping the stuff on,” adding more and more sub-systems, each with its own unique and user-unfriendly display.
“Human factors was not integrated into the original design of the Predator,” Dougherty said. “They were never given the time,” because what was originally a technology demonstration project proved so valuable it was rushed into widespread use. As a result, he said, the percentage of major mishaps caused by “human factors” is, ironically, higher for Predators than for manned aircraft.
It’s even harder to design a control system for troops operating unmanned systems in the field, instead of from a relatively pristine command center. Something as simple as having to look down at a handheld display can distract a foot soldier from the threats around him, and the light from the screen can give away his position at night, said Army Staff Sergeant Stanley Sweet, an unmanned ground vehicle trainer at Fort Benning and veteran of two tours in Iraq.
Often, Sweet went on, when engineers develop control systems, “they want to use a touchscreen, which looks neat — [but] the sand, the dirt, the mud, how is it going to affect the screen?” he asked. “How is it going to hold up? My experience is they don’t.” Controls meant for foot troops have to be physically rugged and conceptually uncomplicated, more like game controllers than like militarized iPads. Infantrymen have no time to navigate complex menus while wondering, “Oh, by the way, am I going to get shot at,” said Sweet. “If the technology is slow, it will not be used.”
Such painful experiences can sour combat veterans on promising new technologies. Today’s robots, for example, require at least one human operator apiece and often more: each Predator has a pilot and a sensor operator, much like a manned reconnaissance plane, except that the crew is not physically on board. Technology enthusiasts believe the next generation will reverse that ratio, allowing one operator to control several autonomous systems at once, or even a self-organizing “swarm” of many robots.
But, said Air Force Lt. Col. Anthony Tvaryanas, when you describe such visions to the young veterans of real-world unmanned operations in Afghanistan and Iraq, “who have experience in today’s limited systems, the ones with huge usability barriers, they have a hard time buying this.”
As these jaded operators rise up the chain of command and shape institutional cultures, Tvaryanas said, it’s their hearts and minds that unmanned-systems advocates have to win. The key is fixing the human factors — which heretofore have been engineering afterthoughts. “We see this time and time again: We do not start involving the humans until late, until we’re comfortable that the technology is mature,” said Tvaryanas. But the later a problem is discovered, he warned, the harder and more expensive it is to fix.
So the workshop spent much of its time exploring new ways to ease human control of unmanned systems. Many presenters were looking at intelligent software agents that could take some of the burden of directing unmanned systems off the operator’s shoulders. The idea that seemed to excite the panelists most was giving the robots a “playbook,” a set of standard responses to common situations that the human operator could invoke when appropriate, like a coach or quarterback having a football team run plays.
The Army’s Aviation and Missile Research, Development and Engineering Center (AMRDEC), working with NASA’s Ames Research Center, has already developed several standard “plays” and tested them both in simulation and with actual aircraft. Most of them are simply “variations of ‘go and look at something,’” AMRDEC senior research engineer Lisa Fern explained to Breaking Defense: they are shortcuts so the human operator can get drones moving in the desired direction quickly, without inputting a lot of detailed instructions. (The military is much more hesitant about giving drones comparable autonomy in, say, firing weapons.) Just tell the drones to run a “play” on a designated area, and the software will take care of plotting how each drone will fly to the area and orbit over it, then send them on their way.
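To make the play-calling concept concrete, here is a minimal sketch of what such a control layer might look like in code. Everything in it, from the Drone record to the "surveil" play and its parameter values, is invented for illustration; it is not the actual AMRDEC/NASA Ames software.

```python
# Hypothetical sketch of a "playbook" control layer; not AMRDEC's software.
import math
from dataclasses import dataclass

@dataclass
class Drone:
    ident: str
    x: float  # current position, km east of a reference point
    y: float  # current position, km north of a reference point

def surveil_play(drones, target, orbit_radius_km=1.0):
    """A 'go and look at something' play: space the drones evenly on an
    orbit around the target and return each one's station point. The
    operator supplies only the target; the geometry is computed for them.
    (Real software would also plot the transit route from each drone's
    current position; this sketch only assigns the stations.)"""
    tx, ty = target
    stations = {}
    for i, drone in enumerate(drones):
        angle = 2 * math.pi * i / len(drones)
        stations[drone.ident] = (tx + orbit_radius_km * math.cos(angle),
                                 ty + orbit_radius_km * math.sin(angle))
    return stations

PLAYBOOK = {"surveil": surveil_play}  # one entry per standard play

def call_play(name, drones, target):
    """The operator's entire input: a play name and a target area."""
    return PLAYBOOK[name](drones, target)

flight = [Drone("UAV-1", 0.0, 0.0), Drone("UAV-2", 0.5, -0.5)]
print(call_play("surveil", flight, target=(10.0, 4.0)))
```

The appeal of the pattern is that the operator’s vocabulary stays small, a handful of play names plus a target, while the tedious geometry lives in software.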
Researchers’ next priority is giving human operators more flexibility to customize the playbooks’ software-generated flight paths for specific situations. The Air Force Research Laboratory and contractor SIFT have a program called FLEX-IT that lets an operator switch quickly back and forth between calling plays, which the drones then execute autonomously, and taking direct control, setting a selected drone’s exact flight path or even flying it with a joystick. (The program’s full name is Flexible Levels of Interaction – Interface Technologies).
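As a hedged illustration of that handoff, rather than SIFT’s actual interface, the core mechanism is a fast, reversible switch between autonomous play execution and direct operator commands:

```python
# Illustrative mode-switching sketch; class and method names are invented,
# not the FLEX-IT program's real API.
from enum import Enum, auto

class Mode(Enum):
    PLAY = auto()    # drone follows the software-generated plan
    MANUAL = auto()  # operator sets heading and speed directly

class DroneController:
    def __init__(self, ident):
        self.ident = ident
        self.mode = Mode.PLAY
        self.command = "executing current play"

    def take_manual_control(self, heading_deg, speed_kts):
        # Operator overrides the play for this one drone...
        self.mode = Mode.MANUAL
        self.command = f"fly heading {heading_deg:03d} at {speed_kts} kts"

    def release_to_play(self):
        # ...then hands it back, and the play logic resumes planning.
        self.mode = Mode.PLAY
        self.command = "executing current play"

ctrl = DroneController("UAV-1")
ctrl.take_manual_control(heading_deg=270, speed_kts=120)
print(ctrl.mode, ctrl.command)   # MANUAL override in effect
ctrl.release_to_play()           # back to autonomous play execution
```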
Beyond playbook-based control of several unmanned systems is the farther-future vision of a “swarm,” large numbers of robots able to coordinate their own behavior with only broad guidance from a single human operator. “I grew up on a farm,” said Col. Dougherty. Watching dogs guide flocks of sheep and other herding animals, he said, “it’s amazing what they can do.” That, he argued, is the model for swarm-style autonomy: The robots are the herd animals, autonomous but not very smart, while the human operator acts as the sheep dog.
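A toy sketch shows why the herding metaphor appeals to engineers: give each robot two simple local rules (drift toward a single operator-set goal point, and keep clear of neighbors) and coherent group movement emerges with no per-robot commands. The rules and gain values below are invented for illustration and do not represent any fielded swarm algorithm.

```python
# Toy 2-D swarm: the operator moves one goal point; the "herd" follows.
import math

def step(positions, goal, sep_dist=1.0, goal_gain=0.1, sep_gain=0.05):
    """One update tick. `positions` is a list of (x, y) tuples."""
    new_positions = []
    for i, (x, y) in enumerate(positions):
        # Rule 1: drift toward the single operator-set goal.
        dx = goal_gain * (goal[0] - x)
        dy = goal_gain * (goal[1] - y)
        # Rule 2: push away from any neighbor closer than sep_dist.
        for j, (ox, oy) in enumerate(positions):
            if i == j:
                continue
            dist = math.hypot(x - ox, y - oy)
            if 0 < dist < sep_dist:
                dx += sep_gain * (x - ox) / dist
                dy += sep_gain * (y - oy) / dist
        new_positions.append((x + dx, y + dy))
    return new_positions

swarm = [(0.0, 0.0), (0.3, 0.1), (0.1, 0.4)]
for _ in range(50):                       # the herd walks toward the goal
    swarm = step(swarm, goal=(5.0, 5.0))
print(swarm)                              # all robots end up near (5, 5)
```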
More autonomy isn’t always the solution, however. When operators do have to take more direct control of unmanned systems, they are badly hampered simply by not being in the vehicle. “In a prior life I was in the airplane, I was there, so a whole bunch of information was being fed to me simply because I was in it,” like whether the aircraft was accelerating or not, said Col. Dougherty, a former F-16 pilot. When the operator is in a command post on the ground, however, his screens may tell him the vehicle is moving ahead or swiveling its sensor array, but his inner ear and his peripheral vision are both telling him he’s standing still.
Some of the solutions on offer at the workshop included stereo images to improve depth perception, audio cues in three dimensions to alert operators to what’s happening behind them, and virtual-reality “telepresence” goggles that let the operator turn his head to see to the side, instead of sitting still and watching images slide past on a screen.
What’s essential, said Dougherty, is to break down the cultural preconceptions in the Air Force and elsewhere about what a proper control interface looks like. What works for manned aircraft may not translate to unmanned. “I don’t need a cockpit to feel good about myself,” he said. “What you need is an appropriate interface, [whether] it’s a dome that I’m immersed in or it’s a series of flat panels or something that comes down over my eyes with gloves.” Our thinking about how best to control the new unmanned technology is still catching up to the possibilities.