
A Ukrainian soldier launches a hand-held drone in 2015 (Petro Zadorzhnyy/AFP via Getty Images)

WASHINGTON — For the US Air Force, few near-term initiatives seem as poised to change the nature of battle for the service as its collaborative combat aircraft (CCA) drone wingman program — which, if current goals hold, could start production within five years.

The Air Force's push ahead with CCA reflects the culmination of years of work, but it also underscores a new reality of modern warfare highlighted by conflicts raging around the globe: Drones are here to stay, and their importance seems likely only to grow as militaries find ways to innovate and increase their utility.

Air Force Secretary Frank Kendall showed his service is serious about the CCA concept this year, committing to spend over half a billion dollars in fiscal 2024 alone to chart a course toward a first tranche of 1,000 uncrewed systems. CCAs, officials say, could serve as missile trucks, perform sensing or electronic warfare missions and even draw enemy fire.

Yet CCA represents the more exquisite end of contemporary drone warfare. At the other end, swarms of commercially available, cheap suicide drones and loitering munitions have proven devastatingly effective in Ukraine, Nagorno-Karabakh and now Israel. Taking note of that development, Deputy Secretary of Defense Kathleen Hicks unveiled the Pentagon's own initiative to field thousands of drones within two years, an effort dubbed Replicator that chiefly aims to deter China.

RELATED: 5 companies in early running for Air Force’s CCA drone wingmen

Where the Pentagon sees potential in drone warfare, it also sees substantial risk to its own operations. The threat posed by enemy drones has become so overwhelming that the Army now expects every soldier to eventually be trained to fend them off, and industry, alongside the service, is racing to field counter-UAS (uncrewed aerial system) technologies. The need for counter-UAS weapons is so acute, according to the Pentagon's top weapons buyer, that it now constitutes a "crisis."

[This article is one of many in a series in which Breaking Defense reporters look back on the most significant (and entertaining) news stories of 2023 and look forward to what 2024 may hold.]

The AI Question

The sprint to field an army of drones also puts the Pentagon squarely at the forefront of perhaps the most pressing tech-related ethical question facing humanity: how to responsibly fuse artificial intelligence (AI) into daily tasks. Among the services, the Air Force, through CCA, is probably the furthest along in deploying AI in a way that resembles futurists' fears, where an intelligent system could theoretically wield weapons.

That course of development makes CCA all the more intriguing in the hands of Kendall, a former human rights lawyer. Kendall is pushing the pedal to the floor on CCA as a necessary means of fielding "affordable mass" that can offset the consequences of an aging, shrinking aircraft inventory and effectively deter China.

In the process, the secretary is not shying away from the critical ethical questions his goals raise. Though no stranger to the debate, Kendall weighed in on the extent to which a human should be involved in executing AI-powered tasks — or stay "in the loop" — during a recent panel discussion at the Reagan National Defense Forum. While some expect humans to remain intimately involved in decision-making for AI operations, Kendall made clear that such a structure would be a recipe for failure.

"If the human being is in the loop, you will lose," he said bluntly, though he stressed that people can remain involved in supervising AI operations. Kendall went on to describe work by DARPA demonstrating that an AI "routinely" beats a manned aircraft in combat, which he warned is "the reality we're going to have to face."

However, Air Force officials for now mostly downplay the idea of machines firing at foes independently. For example, a top Air Force official at an AFCEA summit on Dec. 14 echoed Kendall's comments about losing if a human is kept in the loop. But Kristyn Jones, who is performing the duties of the Air Force under secretary, then added, "Like every other system we use, [AI technologies] will be governed by the laws of armed conflict, and there will be a human in the loop for all lethal decisions."

Later at the AFCEA summit, Tim Grayson, a special assistant to Kendall, described CCA autonomy as mostly amounting to "glorified autopilot." CCA autonomy is for "execution of functions, not decision making of those functions," he explained.

And for testers on the cutting edge of AI-powered development like Bill "Evil" Gray, the chief test pilot at the Air Force's Test Pilot School at Edwards Air Force Base, operators are under no illusion: Fielding AI is fraught and must be done with care.

“We see it as a very difficult problem with very real risks. And we are cautiously pushing ahead,” Gray said in a November interview with Breaking Defense at Edwards. 


A ground crew preps the X-62 Variable In-flight Simulator Aircraft (VISTA) at Edwards Air Force Base to fly on Nov. 30, 2023 (Michael Marrow/Breaking Defense)

Gray, who pilots the Air Force’s X-62 VISTA aircraft that serves as a key autonomy testbed, emphasized that, at least for VISTA, no plans exist to evolve the platform into an armed combatant. Even so, the plane plays an important role in refining AI, making it all the more critical that users employ the jet, and the software that can be loaded into it, responsibly.

“I don’t work with a bunch of mad scientists,” Gray said. “It’s quite the opposite in fact.”