Army photo

A soldier using the touchscreen-controlled, AI-assisted 50mm cannon.

ABERDEEN PROVING GROUND, MD: One touch, one kill — is this the user-friendly future of warfare?

I sat in front of a touchscreen, watching black-and-white infrared video of the gunnery range outside. Lined up on the left edge of the screen were still-image close-ups of what an experimental AI had decided were valid targets: dummies representing enemy infantry and vehicles, plus a real pick-up truck. Disengage the safety, tap a target with your finger, and, 20 yards away, an unmanned turret automatically slews to aim its 50mm cannon at that target.

Army photo

A VR version of the ATLAS/ALAS-MC fire control interface.

If one more button had been enabled, I could have opened fire. But the Army didn’t enable that particular option for reporters checking out its experimental AI targeting system, ATLAS. In fact, the 44 soldiers who’ve tested the system since August haven’t gotten to shoot live ammunition either. Combat Capabilities Development Command (DEVCOM) is currently getting the soldiers’ feedback on the interface, not the gun itself. Sometime in the next month, Army officials said, they plan to live-fire ATLAS here at Aberdeen’s Edgewood range.

After that live-fire – the exact date is still TBD – the ATLAS program will analyze the data and swiftly report to senior Army leaders, said center scientist Richard Nabors. That will wrap up the current phase of the not-quite two-year-old ATLAS project, he told reporters, and the Army’s decision on what to do next could come “very shortly.”

Real soldiers’ feedback from this fall is “critical” to that decision, Nabors emphasized. What’s happening this fall at Aberdeen isn’t a formal developmental test – that happens much later in the development process – but something the service calls a Soldier Touch Point (STP). It’s part of the Army’s new push to get real troops’ input on new technology early and often during R&D, when it’s easier and cheaper to make changes and fix problems.

Army photo

The AI-assisted 50mm gun, installed on the Army’s ALAS-MC turret, which in turn is on a General Dynamics Griffin chassis.

The system those soldiers — and a few lucky reporters — got to try out is an experimental combination of several separate research projects.

  • The artificial intelligence itself, the Advanced Targeting & Lethality Aided System, is a joint development by DEVCOM’s C5ISR Center at Aberdeen and its Armaments Center in Picatinny. ATLAS currently takes video feeds from infrared cameras; future upgrades could bring in radar and other sensors. The AI algorithms look for patterns in those images, recognize potential targets, highlight those images onscreen for the human operator, and provide detailed targeting data to a connected fire control system (see the code sketch after this list).

    Sydney J. Freedberg Jr. photo

    Mike Peck of General Dynamics shows the difference between the new 50mm round (left) and the current 25mm (right).

  • That fire control system, in this case, is another Armaments Center experiment, ALAS-MC (Advanced Lethality & Accuracy System for Medium Caliber). The version of ALAS-MC in use at Aberdeen replaces the traditional panoply of buttons, switches, and scopes with a single touchscreen display. Future upgrades will include a prioritization algorithm that looks at the targets spotted by ATLAS and tells the operator which ones are the biggest threats and must be dealt with first.
  • The weapon is the prototype XM913 chaingun, built by Northrop Grumman. For this exercise, the gun is integrated into an experimental unmanned turret, which, in turn, is mounted on the Griffin armored vehicle chassis developed by General Dynamics. The chaingun’s 50mm rounds are many times larger and more powerful than the current 25mm used on the Army’s M2 Bradley. They fly twice as far and include both armor-piercing and air-bursting explosive variants.
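
To make those hand-offs concrete, here is a minimal Python sketch of the detect, display, tap, and slew flow the components above describe. It is an illustration of the architecture under stated assumptions, not the Army's software: every class name, field, and number in it is hypothetical.

```python
"""Illustrative sketch only: how an ATLAS-style detector, an ALAS-MC-style
touchscreen fire control, and a turret drive could hand data to each other.
All names and values here are hypothetical, not Army code."""

from dataclasses import dataclass
from enum import Enum, auto
from typing import List


class TargetClass(Enum):
    INFANTRY = auto()
    VEHICLE = auto()
    UNKNOWN = auto()


@dataclass
class Detection:
    """One candidate target the AI has highlighted for the human operator."""
    target_id: int
    target_class: TargetClass
    azimuth_deg: float  # bearing from the turret
    range_m: float      # estimated distance
    thumbnail: bytes    # still-image close-up shown on the edge of the screen


class AtlasDetector:
    """Stand-in for the image-recognition stage: infrared video in, detections out."""

    def scan(self, ir_frame: bytes) -> List[Detection]:
        # A real system would run trained recognition models here;
        # this stub just returns canned examples.
        return [
            Detection(1, TargetClass.INFANTRY, azimuth_deg=12.5, range_m=800.0, thumbnail=b""),
            Detection(2, TargetClass.VEHICLE, azimuth_deg=-3.0, range_m=1200.0, thumbnail=b""),
        ]


class TouchscreenFireControl:
    """Stand-in for the touchscreen: show candidates, act on the operator's tap."""

    def __init__(self) -> None:
        self.safety_engaged = True

    def on_operator_tap(self, detection: Detection) -> None:
        if self.safety_engaged:
            print("Safety engaged: tap ignored.")
            return
        # Slew the unmanned turret onto the chosen target. A human still has
        # to command the actual shot; the AI never fires on its own.
        print(f"Slewing turret to azimuth {detection.azimuth_deg:+.1f} deg, "
              f"range {detection.range_m:.0f} m ({detection.target_class.name}).")


if __name__ == "__main__":
    detections = AtlasDetector().scan(ir_frame=b"")
    fire_control = TouchscreenFireControl()
    fire_control.safety_engaged = False  # operator disengages the safety
    fire_control.on_operator_tap(detections[0])
```

Run as-is, the sketch simply prints the slew command for the first highlighted target once the simulated safety is disengaged; the point is the clean seam between detector, fire control, and turret, which is what lets the real projects stay independent of one another.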

Now, none of these projects is dependent on the others. You could install them on different vehicles, plug in a larger or smaller weapon, or connect more sensors. (For example, there’s a plan to upgrade ATLAS’s infrared camera to a higher-resolution third-gen FLIR.) Even without the AI element, the ALAS touchscreen would still be an improvement over current fire controls, and the 50mm would still be better than the current 25mm.

Put all of them together and they truly start to shine. When I tapped the infantry target on the screen, the system not only automatically slewed the 50mm gun to bear but automatically selected the ideal ammunition. In this case, it was a burst of multiple explosive shells, fused to explode amidst the hostile troops. The system even adjusts the detonation points to the shape of the enemy formation. (Two well-placed airbursts can kill as many infantry as 96 non-explosive rounds, Army scientists have calculated.) When I tapped a vehicle instead, the system recommended a single round of armor-piercing ammo. Other options include anti-building, with the rounds fused to explode only after penetrating the wall, and anti-helicopter, which also uses airburst rounds but with different fuse settings.
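
As a rough illustration of that automatic ammunition selection, the sketch below maps a target type to a recommended fire solution following the behavior described above: airbursts for infantry, a single armor-piercing round for vehicles, delayed fuses against buildings, differently fused airbursts against helicopters. The enumerations, field names, and round counts are hypothetical, not the actual ALAS-MC logic.

```python
"""Illustrative sketch of the kind of ammunition recommendation described
in the article. Categories and values are hypothetical, not ALAS-MC rules."""

from dataclasses import dataclass
from enum import Enum, auto


class TargetType(Enum):
    INFANTRY = auto()
    VEHICLE = auto()
    BUILDING = auto()
    HELICOPTER = auto()


class FuseMode(Enum):
    AIRBURST = auto()          # detonate amid or above the target
    POINT_DETONATION = auto()  # explode on impact
    DELAYED = auto()           # explode only after penetrating a wall


@dataclass
class FireSolution:
    round_type: str
    fuse: FuseMode
    burst_size: int  # how many rounds to fire (illustrative numbers)


def recommend(target: TargetType) -> FireSolution:
    """Pick a round type, fuse setting, and burst size for the tapped target."""
    if target is TargetType.INFANTRY:
        # A couple of well-placed airbursts can do the work of dozens of
        # non-explosive rounds against troops in the open.
        return FireSolution("high-explosive airburst", FuseMode.AIRBURST, burst_size=2)
    if target is TargetType.VEHICLE:
        return FireSolution("armor-piercing", FuseMode.POINT_DETONATION, burst_size=1)
    if target is TargetType.BUILDING:
        return FireSolution("high-explosive", FuseMode.DELAYED, burst_size=1)
    # Helicopters also get airbursts, but with different fuse timing.
    return FireSolution("high-explosive airburst", FuseMode.AIRBURST, burst_size=2)


if __name__ == "__main__":
    for target in TargetType:
        print(target.name, "->", recommend(target))
```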

Army photo

The gunner’s station in an M1 Abrams, in action in Iraq.

Compare that to current, manual controls. On the M1 Abrams heavy tank today, the commander scans in all directions for threats and targets, then picks one for the gunner to engage. The commander can slew the turret to point the gun in the right direction. But then it’s up to the gunner to get the target in the cross-hairs, select the 120mm cannon or the coaxial machinegun depending on the target, tell the ballistic computer what kind of ammo it’s firing (the actual round is manually loaded by another crewman), use a ranging laser to determine the precise distance, and fire. And just in case that sounded simple, you have to hit all the right switches and toggles by muscle memory, without looking at them, because you never want to take your eyes off the gunsight.
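
To put that contrast in concrete terms, the short sketch below simply enumerates the serial steps of the manual engagement described above next to the one-tap, AI-assisted flow. The step lists are a paraphrase of this article, not Army doctrine, and the names are invented for illustration.

```python
"""Illustrative comparison of the two workflows described in the article."""

MANUAL_ABRAMS_ENGAGEMENT = [
    "commander scans for threats and picks a target",
    "commander slews the turret roughly onto the target",
    "gunner lays the cross-hairs precisely on the target",
    "gunner selects the 120mm cannon or the coaxial machinegun",
    "gunner tells the ballistic computer which ammo type is loaded",
    "gunner lases the target for a precise range",
    "gunner fires",
]

AI_ASSISTED_ENGAGEMENT = [
    "ATLAS highlights candidate targets on the touchscreen",
    "operator disengages the safety and taps a target",
    "system slews the gun and recommends ammunition and fusing",
    "operator commands the shot",
]

if __name__ == "__main__":
    print(f"Manual sequence: {len(MANUAL_ABRAMS_ENGAGEMENT)} serial steps")
    print(f"AI-assisted sequence: {len(AI_ASSISTED_ENGAGEMENT)} steps")
```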

A well-trained crew can go through this process in seconds, but the Army suggests ATLAS can be faster. Seconds do count in combat. Historical data shows the survivor of a tank battle is usually the side that sees the other and opens fire first. The slogan of the ATLAS project is “time is a weapon.”

Army photo

A soldier operates the ALAS-MC fire control while a civilian technician monitors the system.

ATLAS could be faster still if the Army were willing to take the human gunner out of the loop and allow the AI to fire the gun. That won’t happen unless U.S. policy changes. The American military – unlike those of Russia and China – is profoundly wary of what autonomous weapons could do without human supervision. The US sees its highly trained troops as an asset that AI should empower, not an obstacle for AI to bypass.

Instead of replacing humans altogether, the Army’s Next Generation Combat Vehicle initiative wants to use AI to let one human do the work of two, combining the traditional commander and gunner roles. (Other AI could replace or assist the driver.) That would allow a single human to remotely operate multiple robots, and it would let manned vehicles get by with smaller crews.

I’m not sure this is a good idea. Sure, AIs like ATLAS can help spot targets and aim the gun, but modern battlefield networks can bring in so many additional sources of information – feeds from drones, unmanned ground robots, jet fighters, even satellites – that you may need a human crewmember whose whole job is making sense of it all, separate from the AI-assisted gunner. But that’s the kind of question this experiment and others are trying to answer, before troops find out what works or doesn’t work the hard way, with lives at stake.