Rudi Keller @CDTCivilWar
During the Soviet era, Russia used its propaganda tools to plant believable lies in foreign media, intending to sow discord among allies of the United States or weaken it in the eyes of other nations, Air Force Lt. Col. Jarred Prier wrote in his journal article “Commanding the Trend: Social Media as Information Warfare.”
Now, as the indictments handed down Friday by Special Counsel Robert Mueller show, Russian disinformation campaigns manipulate opinion here. They have been so successful, Prier said in an interview, that a large segment of the public will not believe his finding that a Russian cyber warfare team targeted the 2015 turmoil at the University of Missouri.
“There are people who at face value don’t believe what you said because you said Russia did something,” Prier said. “On the opposite side, the political left is so willing to believe anything that has to do with Russia right now.”
Prier is currently serving as director of operations for the 20th Bomb Squadron. He has studied the social media propaganda techniques of the Islamic State and Russia and found similar tactics used to serve different strategic goals. He spoke to the Tribune by telephone Wednesday.
Adopting the #PrayForMizzou hashtag in the hours after former UM System President Tim Wolfe resigned, Russian cyber trolls and their robotic repeaters stoked fear of a violent white backlash, Prier found in his peer-reviewed research, published in November 2017 in Strategic Studies Quarterly.
Some of the fear was well-grounded. A threat from inside Missouri posted on Yik Yak led to the arrest of Hunter Park in Rolla. But much of it was baseless, fed by Russian Twitter accounts including one with the handle @FanFan1911 and a user name of “Jermaine,” whose avatar was a photo of a black man. @FanFan1911 tweeted falsely that the Ku Klux Klan was marching on the campus backed by police.
Prier, a 2003 MU graduate, traced the activities of @FanFan1911 and other Russian troll actors while doing master’s degree research at the Air University for the School of Advanced Air and Space Studies. He remembered @FanFan1911 specifically because he called the Twitter user a liar on Nov. 11, 2015.
He’s not 100 percent certain that @FanFan1911 was a Russian account, he said. But given the way the user’s targets shifted – from Europe to the United States, back to Europe during the Syrian refugee crisis and again to the U.S. during the election – and the way robots were set up to retweet him, the account fits every measure he has available.
“The final discriminator was after Hillary Clinton used ‘basket of deplorables’ in a speech, all the accounts I had been monitoring changed their names to deplorables-something or other,” Prier said. “It was bizarro world.”
Prier’s findings about how Russians inserted themselves into MU’s problems make up only a small portion of his article, which is a broader look at the social media tactics employed by the Islamic State and Russia to achieve their strategic goals and why U.S. policymakers should treat social media as a new field of competition.
The title of Prier’s article is an allusion to Giulio Douhet’s seminal 1921 work on air power, “Command of the Air.” After World War I, Douhet imagined massive fleets of bombers that would reduce cities to rubble, demoralizing inhabitants and forcing their leaders to surrender.
Douhet correctly imagined the extent of future air power but not the result. In his concluding paragraph, Prier puts defense in the social media field on par with protecting infrastructure and information subject to hacking.
“This was not the cyber war we were promised,” Prier wrote. “Predictions of a catastrophic cyberattack dominated policy discussion, but few realized that social media could be used as a weapon against the minds of the population.”
Prier’s work is now being read at the National Intelligence University, where agents are trained.
HOW IT WORKED
On May 21, 2016, about a dozen white supremacists gathered outside the Houston Da’wah Islamic Center, attracted by a Facebook post by a group calling itself Heart of Texas for a protest event to “Stop the Islamization of Texas.” A counter-demonstration, also organized via Facebook by a group calling itself United Muslims of America, drew about 50 counterprotesters for an event to “Save Islamic Knowledge.”
Both events were organized by Russian agents who spent $200 to manipulate behavior on a local level in the United States, the Senate Intelligence Committee revealed Nov. 1, 2017.
“It is an interesting notion to have forces from outside come in and try to manipulate attitudes and public behaviors by inciting different groups to take action,” said Peverill Squire, professor of political science at MU. “It casts modern day politics in a different light.”
In the indictment, Mueller charged that Russia spent $1.25 million per month to influence the 2016 election. The activity began in 2014 and the indictment names the Internet Research Agency, identified by Prier as the likely home of the Twitter trolls he researched, first among 16 defendants.
The short-term result of the Russians’ focus on MU was to sow fear. The long-term damage to MU’s reputation was a false impression that the 2015 protests were violent. The episode served Russia’s strategic goal of reducing the U.S. presence on the world stage by focusing public attention on internal divisions, Prier said.
“They want to force the American public to go over into a corner and argue amongst themselves,” Prier said.
Prier’s analysis is “spot on,” said Cooper Drury, an MU professor of political science who researches foreign policy issues. The Russian long-term goal is not the victory of any political party but a weaker U.S., he said.
“If that is what your goal is, disruption, then the greater polarization you can get inside a democracy the more successful you will be,” Drury said.
The indictment states that Russia used its social media campaigns for the benefit of Donald Trump in the Republican Party and Sen. Bernie Sanders in the Democratic Party. The propaganda worked especially well because it created a false impression that vast numbers of people were agitating for a particular view, Prier said.
“At that time there was a kind of symbiotic relationship between legitimate American conservative thought and these Russian trolls,” he said. “These Russian trolls were driving clicks. Clicks are what keeps the business moving.”
It is the persuasion effect, said Mitchell McKinney, professor of communication at MU. Propaganda that is easily identified is likely to be discounted as false by most people, he said. Social media helps mask the source, and volume creates believability, he said.
“So bombarded at every turn, they insert messages that may seem plausible or in the environment of uncertainty or environment of fear, insert message that might be accepted,” McKinney said.
The most successful are validated when they are reported by trusted news organizations, he said.
Prier’s findings that the Russians used a network of human and robotic accounts to spread their messages fit what Mike Kearney, an assistant professor of journalism, found while writing his doctoral thesis on Twitter use in the 2016 election. He found hundreds of accounts that stopped tweeting as soon as the election was over, Kearney said.
“What doesn’t surprise me is that there is a lot of activity on Twitter that I don’t think is authentic in the way that we would think of it,” Kearney said.
In the fall of 2015, Prier was a major on a fellowship at Georgetown University’s Institute for the Study of Diplomacy, where he studied Islamic State social media. Part of his time was spent working at the State Department, he said.
The protests at MU exploded from a local news story to a major national and international story and a top topic for days on social media sites.
Prier didn’t take a screen grab of @FanFan1911’s tweet about the KKK, which included a picture of a black child with a bruised face and the fake accusation that he had been beaten on campus. He can’t be sure exactly when it was inserted into the stream but he remembers calling @FanFan1911 a liar and tweeting back the source of the photo, a story about a child beaten by police in 2013 in Ohio.
“I was livid because these were people saying things about my university and they were making me mad,” Prier said.
In 2015, the problem posed by ISIS social media was its successful recruiting, Prier said.
The accounts that targeted MU also sent messages amplifying ISIS propaganda, which seemed strange at the time. That is why he returned to them for study at the Air University. He spent hours researching accounts, creating spreadsheets in which he identified the accounts he believed were run by live humans and those that were automatic repeater accounts.
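The kind of human-versus-repeater sorting Prier describes can be illustrated with a toy heuristic. The thresholds, field names and function below are hypothetical, invented for illustration only; they are not drawn from Prier’s research, which did not publish criteria at this level of detail:

```python
# Hypothetical sketch: flag likely "repeater" (bot) accounts by the share of
# retweets in a timeline and by posting cadence. All thresholds are invented
# for illustration, not taken from Prier's methodology.

def classify_account(tweets):
    """tweets: list of dicts with 'is_retweet' (bool) and
    'seconds_since_prev' (float, or None for the first tweet)."""
    if not tweets:
        return "unknown"
    # Fraction of the timeline that is retweets rather than original posts.
    retweet_share = sum(t["is_retweet"] for t in tweets) / len(tweets)
    # Median gap between consecutive posts, in seconds.
    gaps = [t["seconds_since_prev"] for t in tweets
            if t["seconds_since_prev"] is not None]
    median_gap = sorted(gaps)[len(gaps) // 2] if gaps else float("inf")
    # Repeater accounts mostly retweet, at machine-like speed.
    if retweet_share > 0.9 and median_gap < 60:
        return "likely repeater (bot)"
    return "likely human-operated"
```

A real analysis would also weigh account age, name changes and coordination across accounts, which is closer to how Prier describes spotting the deplorables-renaming wave.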
“FanFan and about a dozen accounts I saw, they were mostly attack dogs, attacking journalists and trying to build a narrative,” he said.
Prier and the MU faculty interviewed for this article agreed that the best defense for individuals is a healthy skepticism of ideas spread on social media. Prier’s findings about how MU became enmeshed in the Russian social media campaign were surprising but show how important it is to be careful of ideas from unknown sources, McKinney said.
“I was surprised just on the level of, this was such an immediate or personal issue for all of us at the university,” McKinney said. “Then to see what we had learned or were reading in terms of Russian involvement through social media in our national elections and at the national level, that that sort of targeting events in our country would even be down at the local level.”
The polarization of political life in the U.S. wasn’t created by Russian social media, Drury said. The traditional media, once trusted as a neutral provider of information, now has outlets that openly take sides, he said.
“Democrats don’t like to watch Fox News and Republicans don’t watch MSNBC, unless they want to get their blood pressure up,” Drury said.
Prier’s article seems pessimistic, Kearney said, as though there was no defense against being manipulated.
“But the corollary is that it makes it more easy to share and find information by ourselves,” Kearney said. “It is certainly direction in the progress of free information. It is easy for us to point to the bad, especially when it takes form or takes shape in ways that we didn’t expect.”
That was what he did when he called @FanFan1911 a liar, Prier said. But it was like spitting into a hurricane – it did not calm the tempest.
It is up to all providers of information – platforms like Twitter, outlets such as the Tribune and especially politicians – to be careful, Prier wrote. The platforms could ban robot accounts, which would eliminate trend creation but would hurt advertisers, he wrote.
“Journalists should do a better job of vetting sources rather than just retweeting something,” Prier said. “And the last piece of advice I give is that politicians got to quit using it.”