While the United States has no peers in conventional military power, it is especially vulnerable – as a free and democratic society – to cyber misinformation campaigns, a Stanford scholar says.
Herbert Lin, a senior research scholar for cyber policy and security at Stanford’s Center for International Security and Cooperation (CISAC), is the co-author of a new draft working paper that spells out the perilous risks facing democratic, wired-up countries around the world.
America’s adversaries are seeking “asymmetric” methods for social disruption, rather than direct military conflict, Lin said.
“Cyber warfare is one asymmetric counter to Western (and especially U.S.) military advantages that depend on the use of cyberspace,” wrote Lin and his co-author Jackie Kerr, a research fellow at the Lawrence Livermore National Laboratory.
This new type of cyber aggression is aimed at winning – and confusing – hearts and minds, the very control centers of human existence, Lin said.
As a result, “information/influence warfare and manipulation,” or IIWAM as Lin calls it, has profound implications for Western democracies, even though much of it may not be illegal under international law. The approach rests on one party’s deliberate use of information against an adversary to confuse, mislead, and ultimately influence the choices and decisions the adversary makes.
A recent case in point is the 2016 Russian hacking surrounding the U.S. presidential election and the accompanying surge of so-called “fake news.”
Lin points out that while misinformation campaigns are not new, the technology to spread them far and wide around the globe is. He noted that the patron saint of distorting reality for warlike purposes is Sun Tzu, who wrote that “the supreme art of war is to subdue the enemy without fighting.”
While traditional cyber attacks typically hit hard targets like computer systems, cyber “influence” campaigns are conducted over longer periods of time and rely on soft power – propaganda, persuasion, culture, social forces, confusion and deception, Lin said.
Words and images
How does it work? Lin explains:
“Victory is achieved by A when A succeeds in changing B’s political goals so that they are aligned with those of A. But such alignment is not the result of B’s ‘capitulation’ or B’s loss of the ability to resist – on the contrary, B (the losing side) is openly willing.” That is, such a victory centers on subverting the opponent’s will rather than on destroying his military forces.
The ammunition in these cyberspace battles is “words and images” – the kind that persuade, inform, mislead, and deceive so that the adversary cannot respond militarily. Like a “fake news” story, these attacks typically fall below the legal thresholds of “use of force” or “armed attack” and, at least in an international legal sense, do not trigger a military response.
The target is the “adversary’s perceptions,” which reside in the “cognitive dimension of the information environment.” In other words, such cyber warfare focuses on “damaging knowledge, truth, and confidence, rather than physical or digital artifacts,” according to Lin. The battleground, in short, is “brain-space.”
Additionally, IIWAM injects fear, anger, anxiety, uncertainty, and doubt into the adversary’s decision-making processes, he added. Success means altering those perceptions so that the target makes choices favoring the aggressor.
“Sowing chaos and confusion is thus essentially operational preparation of the information battlefield – shaping actions that make the information environment more favorable for actual operations should they become necessary,” the researchers wrote.
These cyber manipulations often prey upon cognitive and emotional biases inherent in human psychology, Lin said.
For example, media channels such as Fox News play to “confirmation bias” in individuals with a right-of-center orientation, as MSNBC does for those with a left-of-center orientation, he wrote. Confirmation bias is the tendency to interpret new evidence as confirmation of one’s existing beliefs or theories.
“Naming and shaming” is probably ineffective against many of the nation-states conducting cyber disinformation campaigns, Lin said. And the idea that a government like that of the U.S. can respond quickly to misinformation created in the private sphere is also unlikely to be effective.
What, then, might work? Lin suggests new tactics are needed, as no existing approach seems adequate. For example, Facebook is deploying a new protocol that lets its users flag questionable news sites. Google has banned fake-news websites from using its online advertising service. Twitter, YouTube, and Facebook shut down accounts that they determine are promoting terrorist content. He noted that a recent Facebook letter from CEO Mark Zuckerberg states that, “Our approach will focus less on banning misinformation and more on surfacing additional perspectives and information, including that fact checkers dispute an item's accuracy.”
But such measures are unlikely to stem the “rising tide of misinformation conveyed” through cyber warfare, Lin said, because they mostly require users to do additional mental work.
Wired world riskier
Today’s Internet-driven Western world offers countless opportunities for cyber influence mischief, Lin wrote.
“Democracy has rested on an underlying foundation of an enlightened, informed populace engaging in rational debate and argument to sort out truth from fiction and half-truth in an attempt to produce the best possible policy and political outcomes,” Lin wrote.
Cyber manipulators have exploited the gap between those ideals and reality in democratic systems – “rendering it much more questionable” – through the tremendous reach and speed of misinformation, he said. Many countries cannot cope with the onslaught of such focused efforts, which makes the democratic process look weak and unstable in the eyes of its citizens. The same dynamic does not apply equally around the world.
“Cyber weapons pose a greater threat to nations that are more advanced users of information technology than to less-developed nations,” Lin wrote.
He said that less developed countries lack much Internet infrastructure, while authoritarian countries wield tight control over expression – North Korea is an example of both.
Herbert Lin, Center for International Security and Cooperation: (650) 497-8600, firstname.lastname@example.org
Clifton B. Parker, Center for International Security and Cooperation: (650) 725-0224, email@example.com