WASHINGTON: Air Force acquisition czar Will Roper’s long-held vision of an AI co-pilot, a real-world version of Star Wars’ famous R2-D2 droid, took a step closer to reality last night with a flight test of artificial intelligence technology working as a crew member on a U-2 spy plane.

“This is cool. An AI ‘guy in the back’ sharing the mission load is a real step forward,” commented Mark Gunzinger, Mitchell Institute director of future concepts, in an email.

“I think it’s fantastic,” ‘Hawk’ Carlisle, head of the National Defense Industrial Association and a retired general who ran Air Combat Command from 2014 to 2017, told me in an interview today. “I think most everybody sees it as a continuation [of] the evolution that’s occurred so far” in fourth- and fifth-generation fighters, an evolution “that makes that person in the loop, that young woman, or young man, that much more effective, that much more capable.”

Roper has been talking about the possibility of AI pilots since his days as head of DoD’s Strategic Capabilities Office. And, as Breaking D readers know, the Air Force also has been working on AI-piloted drones designed to team with crewed aircraft under the Skyborg program. Indeed, Roper, a sci-fi buff, has nicknamed the future ‘brain’ for Skyborg ‘R2-D2.’ The first Skyborg prototypes are expected to fly this coming July.

The AI algorithm, called ARTUµ, did not simply operate the on-board radar; it controlled it, hunting for ground-based enemy air defenses during the 2.5-hour reconnaissance mission flown from Beale AFB in California. “ARTUµ’s primary responsibility was finding enemy launchers while the pilot was on the lookout for threatening aircraft, both sharing the U-2’s radar,” the Air Force press release issued this morning said.

“With no pilot override, ARTUµ made final calls on devoting the radar to missile hunting versus self-protection,” Roper explained in an op-ed in Popular Mechanics today. “The flight was a small step for the computerized copilot, but it’s a giant leap for ‘computerkind’ in future military operations.”
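Roper’s description boils down to a sensor-tasking arbitration problem: on each radar revisit, the software must choose between hunting launchers and protecting the aircraft. The Air Force has not said how ARTUµ makes that call; the sketch below is a purely hypothetical, threshold-based arbiter (RadarTask, SensorPicture and choose_radar_task are invented names) meant only to illustrate the kind of decision being automated.

```python
from dataclasses import dataclass
from enum import Enum, auto

class RadarTask(Enum):
    HUNT_LAUNCHERS = auto()   # search for ground-based missile launchers
    SELF_PROTECTION = auto()  # scan for threatening aircraft

@dataclass
class SensorPicture:
    air_threat_score: float   # 0..1 estimate that a hostile aircraft is a factor

def choose_radar_task(picture: SensorPicture, threat_threshold: float = 0.6) -> RadarTask:
    """Pick the radar's task for the next revisit.

    A real system would weigh far more factors (geometry, fuel, rules of
    engagement); this toy arbiter simply switches to self-protection once the
    estimated air threat crosses a threshold.
    """
    if picture.air_threat_score >= threat_threshold:
        return RadarTask.SELF_PROTECTION
    return RadarTask.HUNT_LAUNCHERS

# Example revisit: low estimated air threat, so keep hunting launchers.
print(choose_radar_task(SensorPicture(air_threat_score=0.2)))
```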

ARTUµ, which Roper said in his op-ed was based on the µZero AI program that has been used to beat humans at chess and Go, was developed by Air Combat Command’s U-2 Federal Laboratory, co-located at Beale. In October, the lab notched up another first by updating a U-2 Dragon Lady’s software while the aircraft was in flight, using Kubernetes, open-source software that runs applications in isolated containers and manages their deployment so that a change to one piece of code is less likely to break the overall system. Researchers trained ARTUµ “to execute specific in-flight tasks that otherwise would be done by the pilot,” the Air Force press release said, using “over a half-million computer simulated training iterations.”
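The release gives the iteration count but not the training setup. µZero-style agents are typically trained by running a policy through a simulator over and over, scoring each run and nudging the policy toward higher scores. The loop below is a minimal, hypothetical sketch of that pattern; Policy, simulated_mission and the toy reward logic are stand-ins invented here, not the U-2 Federal Laboratory’s actual code.

```python
import random

class Policy:
    """Stand-in for a learned radar-tasking policy (e.g., a neural network)."""
    def __init__(self):
        self.bias = 0.5  # probability of choosing 'hunt launchers' each step

    def act(self, state: float) -> str:
        return "hunt" if random.random() < self.bias else "protect"

    def update(self, reward: float, lr: float = 1e-4):
        # Toy update: nudge the hunt/protect balance toward higher mission scores.
        self.bias = min(1.0, max(0.0, self.bias + lr * reward))

def simulated_mission(policy: Policy, steps: int = 50) -> float:
    """Run one synthetic mission and return a scalar score."""
    reward = 0.0
    for _ in range(steps):
        state = random.random()        # fake sensor state
        choice = policy.act(state)
        threat_present = state > 0.7   # fake air-threat indicator
        if threat_present and choice == "protect":
            reward += 1.0              # reward self-protection under threat
        elif not threat_present and choice == "hunt":
            reward += 1.0              # reward launcher hunting otherwise
        else:
            reward -= 1.0
    return reward

policy = Policy()
for iteration in range(500_000):       # the release cites "over a half-million" iterations
    score = simulated_mission(policy)
    policy.update(score)
```

In practice the policy would be a neural network and the simulator a far richer mission model; the point is simply that the half-million iterations refer to simulated missions, not real flights.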

“The flight was part of a precisely constructed scenario which pitted the AI against another dynamic computer algorithm in order to prove the new technology,” the Air Force release added, but provided no detail about the ‘bumpers,’ or constraints, placed on the test.

Gunzinger said it’s unclear what the boundaries on the AI were because there is “not enough in the announcement to really tell.” That said, he noted that “the ‘second competing algorithm’ is interesting. [I] suspect they used a proven algorithm as a baseline to compare performance of ARTUµ against the same known target set. When I say ‘known’ target set, it was probably known to the operator but not ARTUµ.”
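Gunzinger’s reading describes a standard evaluation pattern: run the new agent and a trusted baseline against the same fixed, repeatable target set and compare scores. A generic sketch of that comparison, with purely illustrative agents and scoring:

```python
from statistics import mean
from typing import Callable, List

def evaluate(agent: Callable[[int], float], scenario_seeds: List[int]) -> float:
    """Average an agent's mission score over a fixed, repeatable target set.

    Each seed deterministically defines the same simulated scenario, so two
    agents can be compared under identical conditions.
    """
    return mean(agent(seed) for seed in scenario_seeds)

# Hypothetical agents: each maps a scenario seed to a mission score.
def baseline_agent(seed: int) -> float:
    return (seed % 7) / 7          # placeholder scoring

def candidate_agent(seed: int) -> float:
    return ((seed % 7) + 1) / 8    # placeholder scoring

seeds = list(range(100))           # the "known" target set, fixed in advance
print("baseline :", evaluate(baseline_agent, seeds))
print("candidate:", evaluate(candidate_agent, seeds))
```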

“I think there’s a lot of work still to do,” said Carlisle, who has been following the work at Beale. “I think technologically we’ll get there … When you think about that, there’s things that are uniquely qualified for AI. For flying an instrument approach, having an AI copilot makes sense.”

That said, he noted that the real question will be “really understanding where a person in the loop needs to be. There’s things machines can do faster than the human mind,” but “you have to have the appropriate intersection where the people are part of the decision process.”

Teal Group’s Richard Aboulafia said in an email this morning that “AI has always had potential to improve targeting and force application in full-up shooting wars. Full-up shooting wars, thankfully, are quite rare.” For at least the past two decades, ground-centric counterinsurgency fights, no-fly zone enforcement, maritime patrol, and other proxy conflicts have defined the American way of war, all missions where “AI is of very limited utility,” Aboulafia said.