Should U.S. pledge not to make first cyberstrike?

During World War II,
the United States faced two enemy regimes that deserved to be put down by any means: Nazi
Germany and Imperial Japan.

Yet neither the militaries of those regimes nor those of the United States and its
allies used the terrible chemical weapons that devastated the Western front late in World
War I. Such weapons were difficult to use and would have caused severe counterstrikes by
the opponents.

In the nuclear world of the Cold War, the potential for global destruction forced the
United States and the U.S.S.R. to find ways to avoid a final confrontation. Note that the
United States, faced with dozens of Soviet front-line divisions poised to take Western
Europe, refused to renounce first use of nuclear weapons if the Soviets attacked.

One byproduct of the planning for Armageddon was the Internet, a network for routing
voice and data around areas hypothetically eliminated in a nuclear exchange. Now there is
much talk in Washington about cyberwarfare—the notion that U.S. forces would try to
cripple the information technology networks of potential opponents.

The idea has permeated even popular culture. The premise of the movie
“Independence Day” is that a U.S.-created computer virus could install itself
and quickly cripple an alien spaceship as big as Saddam Hussein’s ego.

One Air Force general I know has a mantra: “Whoever controls the electromagnetic
spectrum will win the battle.” He says it seriously, but sometimes I think he’s
pulling my leg. Still, the need to jam an enemy’s radar and attack his command,
control, communications, computers and intelligence assets is already established doctrine.

The United States demonstrated its capabilities in this area as early as the Gulf War,
unmasking weaknesses in what appeared to be the significant strengths of Iraq’s army.

War is a dirty business. If strikes on an enemy’s IT bring victory with fewer
casualties, then we can’t afford not to do it.

But there is a huge difference between the use of cyberwar techniques to cripple an
enemy’s military assets, and letting loose a computer virus strain that is likely to
also rampage through the civilian portions of the enemy’s infrastructure.

The first difficulty is defining what counts as military. When the same assets serve
both civilian and military purposes, the distinction blurs; by that logic, the Baghdad
phone exchange was a legitimate military target. But the United States has far more
computing assets per capita than most other nations. Our financial system, our Social
Security accounts and our utilities are all IT-dependent and could be wiped out in a
cyberwar.

With so much to lose, the U.S. policy interest in cyberwar issues becomes more
difficult to discern.

So, perhaps it’s time to ask a question. Should the United States adopt a public
promise to not be the first to use cyberwarfare against civilian assets?

A good argument can be made against any such public promise, since the likes of
Muammar Qaddafi and Saddam ought to have as many worries as possible. But even if the
Defense Department makes no public promise to forgo such action, it seems a suitable
topic for a National Security Decision Directive. The safety of our own IT assets, including
those of the government, is a far more critical need than our society is willing to acknowledge.

Homefront hackers can disable key systems. Remember the kid in Massachusetts who took
over a Bell Atlantic Corp. telephone switch and killed local airport systems while planes
were landing? Think what would happen if a skilled foreign government put serious
resources into planning and carrying out such an attack.

Part of the U.S. Cold War doctrine was to use our soldiers to infiltrate enemy lines
and cripple power plants, pipelines and other infrastructure elements. Soon we may face
the prospect of attacking infrastructure from a distance, through computers.

A corollary of a no-civilian doctrine would be to make clear that any attack on U.S. IT
assets—military or civilian—would result in massive military retaliation.

If we are to express our values and fears through movies and television, perhaps more
to the point than “Independence Day” is “Star Trek: The Next
Generation,” in which we witness Capt. Jean-Luc Picard’s morally based refusal
to inject the Borg collective with a virus. Perhaps Picard’s fictional call is one
worth considering in the real world of cyberwar.  

Stephen M. Ryan is a partner in the Washington law firm of Brand, Lowell &
Ryan. He has long experience in federal information technology issues. E-mail him at [email protected].

