SAVANNAH, Ga. — For the first time the Army plans to stress test its Next Generation Command and Control’s integrated data layer on classified networks at the service’s sprawling Project Convergence Capstone 5 event, according to two officers involved in the program.
The Army demonstrated the integrated data layer’s abilities during the NetModX exercise in New Jersey back in September, but Convergence will be a bigger test.
“Our last demonstration […] was in New Jersey but that was more to push the data across tactical transport modalities. But this is going to be the first time we’ll run on secure classified networks. We’re pulling live data and then integrating it in all the different warfighting functions. So it’s a pretty big, pretty big push,” Col. Matt Skaggs, director of tactical applications and architecture at Army Futures Command, said in an interview on the sidelines of the Army’s Technical Exchange Meeting Tuesday.
“That’s candidly never been done before,” he added.
The integrated data layer is a user interface where data from sensors across multiple domains comes together to give warfighters information on enemy targets and the other details they need to make key calls, the kind of data-sharing capability the Pentagon has been pursuing since launching its broader Combined Joint All Domain Command and Control (CJADC2) push. The data layer creates a framework where, for example, artillery, aviation, operational and other systems can talk to each other, so the warfighters operating those systems don't have to digest each stream of data separately, Skaggs said.
“The initial focus for our strategy is building an integrated data layer for our ecosystem, and that allows all the different applications that normally would be on a different data model to be on the same foundation. So everybody can have the same view of the operational environment,” Skaggs said.
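Skaggs did not describe how that shared foundation is implemented. As a rough, purely illustrative sketch of the concept, the Python snippet below maps two differently structured (and entirely hypothetical) sensor feeds into one common record that every application could read; none of the field names reflect NGC2's actual schema.

```python
# Minimal sketch of a shared data model: field names and structure are
# illustrative assumptions, not NGC2's actual schema.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class BattlefieldEntity:
    """One common record that artillery, aviation and intel apps all consume."""
    entity_id: str
    affiliation: str          # e.g. "hostile", "friendly", "unknown"
    latitude: float
    longitude: float
    source_sensor: str        # which feed reported it
    observed_at: datetime


def from_air_sensor(raw: dict) -> BattlefieldEntity:
    """Map a hypothetical airborne-sensor message into the shared model."""
    return BattlefieldEntity(
        entity_id=raw["track_id"],
        affiliation=raw.get("iff", "unknown"),
        latitude=raw["lat"],
        longitude=raw["lon"],
        source_sensor="airborne",
        observed_at=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
    )


def from_ground_sensor(raw: dict) -> BattlefieldEntity:
    """Map a hypothetical ground-sensor message into the same model."""
    return BattlefieldEntity(
        entity_id=raw["id"],
        affiliation=raw.get("classification", "unknown"),
        latitude=raw["position"]["lat"],
        longitude=raw["position"]["lon"],
        source_sensor="ground",
        observed_at=datetime.fromtimestamp(raw["time"], tz=timezone.utc),
    )
```

The point of the sketch is only that once both feeds resolve to the same record type, every downstream application sees the same view of the operational environment.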
Skaggs, a former intelligence officer, said that without an interface like the integrated data layer, making sense of all that data was a far more complicated process.
“So as an intel guy, we would pull data off of all the airborne sensors and all the ground-based sensors, and it would give us a view of what the enemy is doing,” he said. “But as you have separate data ingress and each of those data fields are getting processed differently, they’re not compatible. So we would have five or six different views of the battlefield, and so what we’re trying to do now by combining all that data and curating it all in the same place, that allows us to overlay all of those aforementioned fields at the same time.”
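Continuing the illustration, and again only as a notional sketch rather than anything the Army has described, the payoff of "curating it all in the same place" is that every feed lands in one store, so a single query returns the overlaid picture instead of five or six separate ones. The sensor names, entity IDs and coordinates below are invented.

```python
# Toy single-store "curation" sketch: one list holds observations from every
# feed, so one query yields the combined picture instead of a per-sensor view.
from collections import defaultdict

observations = [
    # (sensor, entity_id, lat, lon) -- all values invented for illustration
    ("airborne", "T-001", 34.0510, 68.7020),
    ("ground",   "T-001", 34.0522, 68.7011),
    ("sigint",   "T-007", 34.4000, 68.9000),
]


def overlay_by_entity(obs):
    """Group every sensor's reports on the same entity into one view."""
    picture = defaultdict(list)
    for sensor, entity_id, lat, lon in obs:
        picture[entity_id].append((sensor, lat, lon))
    return dict(picture)


if __name__ == "__main__":
    for entity, reports in overlay_by_entity(observations).items():
        print(entity, "->", reports)
```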
Col. Mike Kaloostian, director of transportation and network security for Army Futures Command, said the integrated data layer is “absolutely the focal point” of the Army’s Next Generation Command and Control initiative — the service’s plan to create an integrated C2 structure focusing on data centricity at all echelons.
He said that with the integrated data layer approach, the service will be able to focus more on the interpretation of data instead of collecting it.
“If we’re going to handle that type and process that type of data and push it all the way down to the edge, a tactical formation in the field […] we have to think about it differently, right?” Kaloostian asked.
The new data layer will help the service focus on three areas, he added: edge computing, diversified data transport and software-defined networking.
“Those are the three areas that we’re really focused on,” Kaloostian said. “Moving there, orchestrating all of that to ensure that we’re getting data to the edge, and that data is flowing appropriately amongst the echelons, [both] horizontally and vertically. That’s the focal point.”
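Kaloostian did not go into how that orchestration works under the hood. As a purely notional sketch of what software-defined selection among diverse transports could look like, the snippet below picks the best available path from a set of hypothetical links; the transports, latency figures and selection rule are all invented for illustration.

```python
# Notional transport-selection sketch (not an Army implementation): a
# software-defined layer chooses among several available paths so data
# keeps flowing to the edge when one transport degrades.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Transport:
    name: str           # e.g. "satcom", "line-of-sight radio"
    available: bool
    latency_ms: float


def pick_transport(paths: List[Transport]) -> Optional[Transport]:
    """Pick the lowest-latency path that is currently up, if any."""
    usable = [p for p in paths if p.available]
    return min(usable, key=lambda p: p.latency_ms) if usable else None


if __name__ == "__main__":
    paths = [
        Transport("satcom", True, 600.0),
        Transport("line-of-sight radio", True, 40.0),
        Transport("fiber", False, 5.0),
    ]
    chosen = pick_transport(paths)
    print("Routing over:", chosen.name if chosen else "no path available")
```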
But the integrated data layer won’t stop there: it will also have artificial intelligence and machine learning capabilities baked in to perform data correlation and normalization, Skaggs said.
“If we have an example of correlations: we have three different sensors up, and they’re coming back with three slightly different views of where an entity would be on a battlefield. The AI can infer where the correlation of that collection sigma is, and then put where that dot is.”
He added that this can also help differentiate between enemy systems and the Army’s own systems.
“Machines have a hard time parsing that, unless you manually correlate it. So the AI will pull out different data fields and see where the overlap is and then render that into an object that’s usable by the rest of the applications,” Skaggs said.
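Skaggs did not name the algorithms involved. One very simple way to picture the correlation step he describes, in which three near-coincident sensor reports get fused into a single dot, is a confidence-weighted average; the toy sketch below uses invented positions and weights, and real track correlation is considerably more involved.

```python
# Toy correlation sketch: three sensors report slightly different positions
# for what is likely the same entity; a confidence-weighted average places
# the single fused "dot". All numbers are invented for illustration.

reports = [
    # (sensor, lat, lon, confidence)
    ("airborne radar", 34.0510, 68.7020, 0.9),
    ("ground sensor",  34.0522, 68.7011, 0.7),
    ("sigint",         34.0498, 68.7035, 0.5),
]


def fuse(reports):
    """Return a confidence-weighted position for the correlated reports."""
    total = sum(conf for _, _, _, conf in reports)
    lat = sum(lat * conf for _, lat, _, conf in reports) / total
    lon = sum(lon * conf for _, _, lon, conf in reports) / total
    return lat, lon


if __name__ == "__main__":
    lat, lon = fuse(reports)
    print(f"Fused track at ({lat:.4f}, {lon:.4f})")
```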