Russian Uran-9 armed unmanned ground vehicle (photo via Reddit)

WASHINGTON: Russia has created a new robotic combat unit of Uran-9 unmanned ground vehicles, which have been battle-tested in Syria, though with mixed results. It’s also developing an experimental unmanned version of its T-14 Armata tank, unmanned derivatives of the Cold War T-72 and BMP-3, and new long-range drones called Okhotnik and Altius.

But Russia’s quest for battle robots faces many of the same technical and policy problems as the US, said CNA and CNAS scholar Samuel Bendett, and Vladimir Putin is on a much tighter budget. Russia isn’t manufacturing useless Potemkin robots for propaganda purposes, but it isn’t building the Terminator, either.

Sam Bendett (CNA photo)

In many ways, Bendett told me in an interview, the US and Russian military robotics programs are much alike. Both have grand ambitions for highly autonomous war machines; both struggle with the limits of current unmanned systems that require constant human supervision; both worry that future AI might undermine human control.

US officials warn that Russia and China lack the ethical self-restraint of Western nations when it comes to battlefield automation. But, Bendett said, Russian leaders at least sound a lot like Americans in their insistence that a human, not a computer, must make the decision to use lethal force – at least for now.

“Russian MOD statements point to a US-like approach with a human firmly in the loop to make the final strike decision,” Bendett told me. “[It] is said across the MOD that a robotic machine should never be fully autonomous, since it won’t be able to replicate a human soldier’s full decision-making experience, and therefore cannot match true human ‘intellect.’”

“At the same time,” he noted, “there are official discussions, statements and deliberations coming from the MOD and its affiliated institutions that point to an ever-growing role for military robotics in future combat — and the almost-inevitable full autonomy for these machines.”

“There are multiple projects in Russia where the set goal is to have the robotic system navigate to target on its own and wait for the final human approval to strike,” he said. “The new crop of heavy killer drones such as Okhotnik and Altius would supposedly have on-board AI for target selection, identification and even destruction, along with the ability to plot a course to and from target.”

Russia’s new T-14 Armata tank on parade in Moscow. The Armata program has been repeatedly cut and delayed, but now manufacturer Rostec plans an unmanned testbed version. (Photo: Vitaly V. Kuzmin, http://www.vitalykuzmin.net/?q=node/604)

But that Russian capability may be more rhetoric than reality right now. The Uran-9 mini-tank, for instance, is routinely described as being able to autonomously navigate to and destroy a human-designated target. Yet, he said, “all public data we have on its testing still shows a human operator directing the vehicle” by remote control, much like the US Army’s Robotic Combat Vehicle testbeds. Combat experience in Syria also showed Uran-9 to be lightly armored and logistically burdensome.

Despite the emphasis on autonomy, Bendett said, “all Russian autonomous ground robots and drones today operate with constant human supervision.”

Besides physical unmanned vehicles, Russia is also experimenting with AI decision-making systems for capabilities that require lightning-quick reactions, like electronic warfare and missile defense. Even here, however, human control is paramount.

“We know that Russia is seeking to use AI in early-warning radars and in radar systems specifically tasked with nuclear warning,” Bendett said. “But the human is the key decision-maker in such systems, given Russian concerns that an AI system may launch weapons on its own after analyzing ‘dry facts,’ without the benefit of additional input that is based on human operator experience.

The mobile radar for the S-300 air defense system. (Vitaly Kuzmin photo)

“Specifically, there was an exercise where such an AI system took data from radars, aircraft and ground-based anti-aircraft systems in Crimea; an exercise involving pulling and analyzing data by AI from Russia’s long-range radars and radar at S-300 and S-400 systems across the country; and an exercise at the Caspian Sea that pulled data from ship-based and other assets,” Bendett said. “This data could converge at the National Defense Management Center (NDMC) that opened in 2014, [which] is tasked with overseeing all data and information about Russian military at home and abroad. Last year, NDMC announced that it would utilize AI for decision-making, while humans will be firmly in control of final decisions.”

That sounds a lot like the US military’s experiments in Joint All-Domain Command & Control (JADC2), where an AI-driven network shares targeting data and other intelligence among forces on land, sea, air, space, and cyberspace. But as the American experience with JADC2 shows, getting such grand concepts to work in practice can be tremendously difficult.

Putin famously said in 2017 that the country that mastered AI would “become ruler of the world.” To this day, Bendett told me, “he and the MOD [Ministry of Defense] leadership speak of the importance of AI in weapons systems, the need for military autonomy.”

But, “of course, at this point it’s mostly rhetoric and future projections,” Bendett continued. “Current technological reality does not allow for the development of truly autonomous military systems” – yet.