We could begin by working through the existing global security framework. NATO allies, for example, could collaborate by sharing forensic intelligence from cyberattacks and building better detection and response techniques.
Separately, countries could create international working groups to discuss how to react to attacks and what to do in the days or weeks before we know where they came from.
It’s unrealistic to expect that any single country will unilaterally disarm its cyberarsenals while the threat remains. But governments could begin to discuss what constitutes a reasonable response when one state is attacked by another in cyberspace.
Otherwise, it’s only a matter of time before a nation under cyberattack responds by bombing the likely culprit even before the evidence is conclusive.
The United States is uniquely positioned to lead this effort and point the world toward a goal of an enforceable cyberwarfare treaty. Many of the institutions that would be instrumental in informing these principles are based in the United States, including research universities and the technology industry. Part of this effort would involve leading by example, and the United States can and should establish itself as a defender of a free and open internet everywhere.
The process needs to be transparent; an effective framework to govern international behavior cannot be created or administered in secret. Since most cyberwar is conducted covertly, governments avoid any public acknowledgment of their own abilities and shy away from engaging in any sort of “cyberdiplomacy.” Statecraft conducted in secret fails to create public norms for deterrence.
The challenge of organizing such an effort in this fraught, unstable international system might seem daunting. Cyberweapons have already been used by governments to interfere with elections, steal billions of dollars, harm critical infrastructure, censor the press, manipulate public conversations about crucial issues and harass dissidents and journalists. The intensity of cyberconflict around the world is increasing, and the tools are becoming cheaper and more readily available.
The cost of inaction is severe. In her Pulitzer Prize-winning history of the outbreak of World War I, “The Guns of August,” Barbara Tuchman describes how a single catastrophic event — the assassination of the heir presumptive to the Austro-Hungarian throne — led to a chain reaction that ignited a global conflict. The assassination was the catalyst, but the ingredients for the chain reaction, in the form of complex military and diplomatic entanglements, had been in place for some time. True, it’s an imperfect analogy in a world that seems to be disentangling more by the day, but it highlights the perils of escalation when powerful countries confront new threats.
We could soon be faced with a similar moment involving cyberwar. A broad international commitment might be the only thing that can prevent the next cyberwar from becoming the next Great War. If we don’t improve our preparedness to meet the challenges of our multidimensional world, we risk proceeding so far down a path of escalation that conflict becomes inevitable.