CYBERCON: If the Pentagon gets cloud computing right — and that’s a big if indeed — it just might square the circle between accelerating acquisition and improving security, a senior cybersecurity official said here Tuesday.
“By nature, there is a trade between the objective of agile software development and the assurance properties that we’re trying to achieve,” said Mitchell Komaroff, the principal cybersecurity advisor to DoD CIO Dana Deasy. But, he said, there are ways to reconcile them.
Right now, the Defense Department has the worst of both worlds. Cybersecurity certification can be so laborious and bureaucratic that software is often obsolete before it’s finished testing. Yet the final product isn’t necessarily secure anyway, because new threats emerge faster than the Pentagon can upgrade its defenses.
So being too slow on acquisition is bad for security as well. But speeding up acquisition creates risks of its own. For one thing, China is gleefully pilfering US trade secrets, including from the defense sector, and using them to improve its military. Getting the Pentagon to move faster doesn’t win any arms races if Beijing steals the new tech as fast as we can field it.
“We’ve in effect become the R&D base for our adversaries’ capabilities,” warned Maj. Gen. Thomas Murphy, director of the Pentagon’s Protecting Critical Technology Task Force. “My task force isn’t here to stop stealing because stealing is bad, although it is. It’s to stop it because it’s causing the erosion of the lethality of the joint force.
“We talk a lot about acquisition going faster, we have to have the speed,” Murphy said at yesterday’s CyberCon conference. “That’s great, it’s clearly slow and cumbersome the way it is, but let’s not get out in front of our skis and get so ahead, so fast, that we are not considering security.”
Murphy argues cybersecurity needs to be as important as speed, not an afterthought to it. “We need to [put] security in our requirements and acquisition process,” he said. “Until we grade people on security as well as cost, schedule, and performance, why would you go and do the extra credit work on security?”
Yes, Murphy said, there could still be a waiver process to skip security requirements when speed is of the essence – but it needs to be controlled at a much higher level than it is today. “Today there are far too many instances where there’s an O-4 or O-5 [a major or lieutenant colonel] program manager making risk decisions,” he said. In the system he’s putting together, “maybe it’s the service secretary.”
Centralizing decisions can certainly improve consistency and control – but it rarely speeds things up.
DevSecOps to the Rescue?
So, I asked, how do you reconcile security and speed? “The DevSecOps approach comes to the rescue,” Komaroff said.
See, the Defense Department is desperately trying to catch up with Silicon Valley – and stay ahead of Russia and China – by borrowing a private-sector practice known as DevOps. Instead of having development and operations done by separate teams with little contact, DevOps merges the two, so the people writing new code can get instant feedback from the people who actually have to use it, and the users can request upgrades directly from the developers. The common variant Komaroff is referring to is DevSecOps, which brings security experts into the mix so they can check the code both as it’s being written and while it’s being used.
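To make the idea concrete, here is a minimal sketch, in Python with entirely hypothetical stage names, of how a DevSecOps pipeline folds security checks into the same loop developers and operators already run: the code is scanned as it's written and probed again while it's in use, and any failure goes straight back to the team. It illustrates the pattern, not any tool the Pentagon actually uses.

```python
# Minimal DevSecOps-style pipeline sketch. Security gates run at build time
# and again against the running service, instead of a one-time certification
# at the end. All stage names and checks are illustrative placeholders.

from dataclasses import dataclass, field


@dataclass
class PipelineResult:
    stage: str
    passed: bool
    findings: list[str] = field(default_factory=list)


def static_security_scan(source_dir: str) -> PipelineResult:
    """Check the code as it is written (static analysis, dependency audit)."""
    findings: list[str] = []  # a real pipeline would invoke scanning tools here
    return PipelineResult("static_scan", passed=not findings, findings=findings)


def build_and_test(source_dir: str) -> PipelineResult:
    """The developers' normal build and unit tests."""
    return PipelineResult("build_test", passed=True)


def runtime_security_check(service_url: str) -> PipelineResult:
    """Check the code as it is used (dynamic tests, runtime monitoring)."""
    findings: list[str] = []  # a real pipeline would probe the deployed service here
    return PipelineResult("runtime_check", passed=not findings, findings=findings)


def run_pipeline(source_dir: str, service_url: str) -> bool:
    """Security gates sit inside the same loop the developers already run."""
    for check in (lambda: static_security_scan(source_dir),
                  lambda: build_and_test(source_dir),
                  lambda: runtime_security_check(service_url)):
        result = check()
        print(f"{result.stage}: {'pass' if result.passed else 'FAIL'}", result.findings)
        if not result.passed:
            return False  # feedback goes straight back to the developers
    return True


if __name__ == "__main__":
    run_pipeline("./src", "https://example.internal/service")
```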
One best practice that DevSecOps teams often use is to keep most of the code constant and introduce new features as plug-and-play modules that don't affect the fundamentals of how the software works. That way, you can upgrade one part of the software without disturbing the others, including the parts that handle security.
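A rough sketch of that pattern, again with made-up names: the small core below stays fixed, and new features plug into it through a stable interface, so adding a capability never touches the code paths that have already been vetted.

```python
# Illustrative "stable core, plug-in features" sketch. New capabilities
# register against a fixed interface; the core (including its
# security-relevant code paths) does not change when a feature is added.

from typing import Callable, Dict

# The stable core: a small, rarely-changed dispatcher.
_FEATURES: Dict[str, Callable[[dict], dict]] = {}


def register_feature(name: str):
    """Plug a new module into the core without modifying the core itself."""
    def decorator(handler: Callable[[dict], dict]):
        _FEATURES[name] = handler
        return handler
    return decorator


def handle_request(feature: str, payload: dict) -> dict:
    """Core logic stays constant; only the registered modules evolve."""
    if feature not in _FEATURES:
        raise KeyError(f"unknown feature: {feature}")
    return _FEATURES[feature](payload)


# A new feature ships as a self-contained module:
@register_feature("report_status")
def report_status(payload: dict) -> dict:
    return {"status": "ok", "unit": payload.get("unit")}


print(handle_request("report_status", {"unit": "alpha"}))
```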
This technique is particularly applicable for cloud computing, Komaroff said. The way cloud works, the provider has to set up a foundation of hardware and software on which individual users’ data and custom software must reside. If the provider sets certain cybersecurity specifications, the clients must comply – in fact, if they don’t, their software may not be able to run on the cloud at all. Conversely, the users no longer have to reinvent the cybersecurity wheel for each of their databases and applications: They can rely on the cloud to protect them much of the time.
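As a toy illustration of that enforcement, the check below admits a workload only if it meets the provider's security baseline. The specific rules and field names are invented for the example; they are not drawn from any real DoD or commercial cloud baseline.

```python
# Toy admission check: a workload that does not meet the provider's security
# baseline is simply refused. Rules and field names are hypothetical.

BASELINE_MIN_TLS = 1.2


def baseline_violations(workload: dict) -> list[str]:
    """Return the list of baseline violations (empty means compliant)."""
    violations = []
    if not workload.get("encryption_at_rest", False):
        violations.append("data must be encrypted at rest")
    if workload.get("tls_min_version", 0) < BASELINE_MIN_TLS:
        violations.append("TLS version below the provider minimum")
    if workload.get("public_network_access", True):
        violations.append("direct public network access is not allowed")
    return violations


def admit(workload: dict) -> bool:
    """Non-compliant software does not get to run on the cloud at all."""
    problems = baseline_violations(workload)
    if problems:
        print("rejected:", "; ".join(problems))
        return False
    print("admitted")
    return True


admit({"encryption_at_rest": True, "tls_min_version": 1.3, "public_network_access": False})
admit({"encryption_at_rest": False, "tls_min_version": 1.0, "public_network_access": True})
```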
The goal, Komaroff said, is to design the cloud’s overall architecture – the “container” into which users’ code must fit – to be “loaded with as many security properties” as possible. Then you keep that foundation as stable as possible, making changes only slowly and deliberately and with extensive testing. Meanwhile, on a different and much shorter cycle, you can allow what’s built on top of that foundation – the “business logic” used by a given client – to evolve rapidly.
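Here is one way to picture that two-speed model, as a hedged sketch rather than anyone's actual architecture: the foundation class enforces the security properties (access control, audit logging) and changes rarely, while the business logic handed to it can be rewritten and redeployed as often as the mission requires.

```python
# Sketch of the two-speed model: security properties live in a slow-moving
# foundation layer, while the business logic plugged into it changes on a
# much faster cycle. Class and method names are illustrative only.

from typing import Callable
import logging

logging.basicConfig(level=logging.INFO)


class SecureFoundation:
    """Slow-moving layer: access control and audit logging for every call."""

    def __init__(self, authorized_tokens: set[str]):
        self._authorized = authorized_tokens

    def run(self, token: str, business_logic: Callable[[dict], dict], request: dict) -> dict:
        if token not in self._authorized:  # security property: access control
            logging.warning("rejected unauthenticated request")
            raise PermissionError("not authorized")
        logging.info("audit: running %s", business_logic.__name__)  # security property: audit trail
        return business_logic(request)  # fast-moving layer: the client's own code


# Business logic can be rewritten and redeployed frequently
# without touching the foundation:
def supply_report_v1(req: dict) -> dict:
    return {"items": req.get("items", []), "version": 1}


def supply_report_v2(req: dict) -> dict:  # a rapid update, shipped days later
    return {"items": sorted(req.get("items", [])), "version": 2}


foundation = SecureFoundation(authorized_tokens={"token-123"})
print(foundation.run("token-123", supply_report_v1, {"items": ["fuel", "ammo"]}))
print(foundation.run("token-123", supply_report_v2, {"items": ["fuel", "ammo"]}))
```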
Of course, none of this is easy to do. The Defense Department has hundreds of different cloud computing systems already, many of them incompatible, and it keeps adding more. The Pentagon has struggled just to run a fair and open competition for a general-purpose cloud, called JEDI, which is meant to replace many but not all of these existing clouds with a new, unified, and highly secure system. Actually building the new cloud – and building it securely – will be harder still.