Opinion

Defending the homeland means fighting disinformation

Disinformation is not just a policy issue, but a wider military threat to homeland security, writes William Coffin in this op-ed.

A US Air Force client system technician types on his computer at Maxwell Air Force Base, Alabama, Feb. 19, 2025. (U.S. Air Force photo by Senior Airman Elizabeth Figueroa.)

Hour by hour, America is under attack from foreign actors — not via bombs, but via the screens we surround ourselves with. Disinformation campaigns from Russia and China infiltrate platforms, email inboxes, televisions, and everyday conversation, systematically targeting American society. The evolution of artificial intelligence has enhanced these attacks with convincing deepfakes, automated campaigns, and new methods to exploit vulnerabilities.

This is a direct attack on America itself, as disinformation is a highly effective tool used by China and Russia to erode societal resilience and, by extension, challenge homeland defense by influencing American decision-making and shaping American perceptions. And if America is under attack, then the US military has a role to play in keeping the homeland safe. At a time when there is little societal trust, the military needs to step up into a higher-profile role directly combating foreign disinformation efforts.

The information environment is a vital layer of homeland defense, a cognitive battlefield where beliefs and perceptions are shaped and contested. Adversaries use it to spread false narratives, influence decision-making, and erode resilience. Disinformation exploits friction points and undermines trust in institutions. Homeland defense means more than intercepting missiles and guarding the border; it means contesting the information space adversaries currently occupy in America’s cognitive domain.

Russia’s disinformation campaign follows the “firehose of falsehood” model: a constant stream of false information across social media so pervasive it overwhelms countermeasures. In 2020, Russian troll farms reached 140 million American users monthly, with platforms’ algorithms amplifying Russian disinformation in newsfeeds. These posts exploit divisions and target disaffected groups to serve Russian objectives.

China approaches things somewhat differently. Beijing aggressively pursues what analysts describe as “information pipes,” acquiring influence in mainstream outlets such as the Associated Press and Reuters to funnel state propaganda to American audiences. China places stories from state-owned media, such as the Global Times, in front of users who may not recognize them as official propaganda and instead interpret them as legitimate news. When the BBC reported on human rights abuses against Uyghur Muslims, the PRC responded by deploying an organized network of trolls and 57 fake news websites to challenge the BBC’s credibility.

Artificial intelligence has exponentially increased these threats. Deepfakes have evolved from crude forgeries to sophisticated impersonations that fool government officials, including cases where AI-generated voices successfully impersonated the Secretary of State. Russia and China now use AI to generate content at an unprecedented scale, creating highly targeted and personalized disinformation that adapts in real time to maximize impact.

America’s response has been inadequate. Previous efforts stalled or collapsed under political pressure. The Biden administration’s DHS Disinformation Governance Board, intended to help the public “decipher disinformation” and coordinate threat response, was disbanded three weeks after backlash over government overreach. With the dissolution of the Office of the Director of National Intelligence’s Foreign Malign Influence Center and the State Department’s Global Engagement Center, America now lacks a dedicated capability to counter foreign disinformation at home.


Solution: The Military’s Role In Information Warfare

The Trump administration’s 2025 National Security Strategy prioritizes protecting “this country, its people, its territory, its economy, and its way of life from military attack and hostile foreign influence.” With over half of Americans getting news from social media, disinformation is a direct hostile influence. The DoD should make countering it a priority and bring its resources to bear, including U.S. Northern Command and U.S. Cyber Command, which have both the expertise and the mandate to counter these information threats as part of homeland defense.

The military’s most powerful tool against disinformation is trust. Eighty-two percent of Americans trust the military, far more than trust Congress (39 percent), the presidency (45 percent), or the news media (33 percent). When fake accounts and narratives spread, Americans need a trusted voice. The military, as one of the most trusted institutions, can provide that voice and counter the distrust that disinformation exploits.

The Defense Department already employs analysts who integrate intelligence to detect malign influence and predict adversary actions. The same methods that protect deployed forces can protect citizens at home from disinformation. The Army is already standing up units focused on disinformation in the Indo-Pacific and European theaters. The military should declassify intelligence that exposes disinformation, identifies fake accounts, and warns about threats to the homeland. This “prebunking” approach, inoculating the public against disinformation before it takes root, proved remarkably effective when the Biden administration declassified intelligence to expose Russia’s invasion plans and counter Kremlin disinformation about Ukraine, stripping away Putin’s pretexts and beating Russia in the information war.

It must be acknowledged that military involvement in counter-disinformation efforts carries significant societal risks. Political actors on both sides have used accusations of “fake news” and “disinformation” to undermine legitimate coverage and criticism. Government officials have at times amplified false or unverified narratives for political gain.

To address these risks, Congress, the White House, the Pentagon, and media institutions must develop and enforce clear frameworks that distinguish foreign adversary operations from domestic political disagreements. As foreign adversaries become increasingly adept at exploiting the information environment and exacerbating societal divides, the military must avoid entanglements in political debates about truth. However, in defending against coordinated foreign information warfare targeted at American security, the military’s intelligence resources and institutional credibility can make a decisive, necessary contribution.

Navigating the political aspects of the information war and preventing the politicization of the military requires strict adherence to DoD doctrine and regulations, operating within existing authorities, and transparency throughout operations short of revealing capabilities to foreign adversaries. The Army’s field manual for public affairs operations, FM 3-61, provides foundational definitions and operational frameworks that can establish critical guardrails for the Joint Force’s role in countering disinformation. Codifying that document into enforceable regulations, such as a DoD directive on combating disinformation, would set clear boundaries for military members engaged in the information war.

DoD Directive 5240.01, which governs DoD intelligence activities, shows how such directives keep the military from overstepping its bounds. It enforces boundaries around intelligence activities across the Joint Force and the Intelligence Community, defines military “intelligence activities,” and mandates reports to policymakers to ensure transparency. The directive also ensures compliance with Executive Order 12333, which safeguards the rights of American citizens and prevents intelligence activities from infringing on those rights.

Counter-disinformation efforts demand similar protection through a directive or regulation to prevent their misuse. Specifically, regulations should not only define disinformation but also establish that military counter-disinformation activities may target only foreign state-sponsored operations attributed by the intelligence community, require oversight from the Office of the Secretary of Defense, and define the operational role of the assigned combatant commander in protecting the civil-military divide.

Regulation and transparency must work in tandem to help these efforts avoid politicization. The Department of Defense should regularly brief Congress on these efforts to provide transparency, allow for oversight, and ensure efforts remain focused on foreign adversaries. The department must also regularly engage the media on these efforts to build trust and ensure accountability.

Disinformation isn’t just a policy problem; it’s a military threat to homeland security. Russia and China are waging information warfare against American society, and America must fight back with the institutions best equipped to win: the Department of Defense.

By declassifying intelligence, exposing adversary influence operations, and prebunking disinformation before it spreads, the military can defend the cognitive domain as effectively as it defends our physical borders. The question isn’t whether the military should engage in this fight; it’s whether America can afford to leave this battlefield undefended any longer. Protecting our information space and national unity demands urgent, decisive steps starting today.

William Coffin is a strategic intelligence professional with expertise in disinformation and operational planning. Previously serving as a Senior Intelligence Planner at the Department of the Army, the Defense Intelligence Agency, and the United States European Command, he possesses over fifteen years of experience in intelligence and global strategic operations. X: @Wcoffin

The views expressed are solely those of the author.