Illustration by Coelho / RFE/RL

Radio Free Europe / Radio Liberty

Russian Disinformation 2.0: Sowing chaos, confusion, and anger in the West


Since the outbreak of Russia’s conflict with Ukraine in 2014, Moscow’s campaign of disinformation — particularly targeting EU and NATO countries — has moved into overdrive. Only now, particularly following allegations of Russian interference in the 2016 U.S. presidential election, are the full extent and capabilities of this complex network coming into focus.

Last April, shortly before Dutch voters cast ballots in a referendum on the European Union’s Association Agreement with Ukraine, a strange video appeared online in which several men who purported to be Ukrainian ultranationalists burned a Dutch flag and threatened terrorist strikes if the referendum failed.

“We will find you everywhere: in the cinema, at work, in your bedroom, public transport,” the masked speaker said. “We have our guys in the Netherlands, ready to follow any order.”

“Literally within half an hour after the video appeared, it began to be circulated very actively, viral distribution,” says Aric Toler, a researcher with Bellingcat, an organization that uses open-source information to investigate various claims about the wars in Ukraine and Syria. “When we looked more closely at the accounts that were being used and at who was behind them, we came to the conclusion that these are people from the infamous ‘troll factory’ [in St. Petersburg].”

Bellingcat researchers issued a report in which they documented how the fake video originated, how fake screenshots were created and disseminated purporting to prove that the video originated with Ukraine’s notorious right-wing Azov Battalion, and how social-media accounts originating at the St. Petersburg office of the Internet Research Agency, the “infamous troll factory” Toler referred to, reposted the video while posing as concerned Dutch citizens. The video ended up with more than 150,000 views.

The Dutch video was just one small shot in what has become an unrelenting volley of tendentious disinformation connected with Russia that is aimed at fomenting fear and discord inside the European Union and the NATO alliance. The Kremlin is no longer merely trying to argue Russia’s geopolitical point of view or buff Moscow’s image – it is now actively reaching into the political processes of democratic countries using all the tools of modern global communications.

Over the last two years, a vast net of websites, social-media pages, and troll accounts has emerged in all the member countries of those two blocs, issuing a nonstop stream of conspiracy theories, fear-mongering, and outright lies with a much more ambitious agenda.

In July 2016, the EU’s East StratCom Task Force, which monitors Russian disinformation spread in the EU, issued a summary report (PDF) of the 1,649 bogus stories it had exposed in its first 10 months of work — stories in 18 different languages from hundreds of different sources. The Ukraine-based Stop Fake website has exposed more than 1,000 fake stories and allegations in the last 2 1/2 years.

A report (PDF) by the RAND Corporation calls the phenomenon Russia’s “firehose of falsehood.” In order to poison the well of public discourse, the Kremlin draws on a variety of tactics. Yevhen Fedchenko, head of the journalism department of Kyiv’s Mohyla Academy and a co-founder of Stop Fake, says that in Ukraine "the Kremlin developed a toolbox of instruments that is easily transferable to basically any other country and can be employed very quickly in case of necessity.” As examples, Fedchenko cites the so-called Lisa Case in Germany, the Dutch referendum, the 2014 Scottish independence referendum, and the recent U.K. referendum on leaving the European Union. Since then, the same tools have been applied to the election in the United States and, increasingly, to influence upcoming elections in France, Germany, and the Netherlands.

The Kremlin “manipulates local issues in different countries,” Fedchenko says. “It finds a soft spot over there; it finds cracks in the domestic political agenda; and then it tries to widen them, to use and manipulate already-existing problems or emerging problems.”

The “toolbox” Fedchenko mentions includes Russian state media, hastily assembled aggregating news websites, dubious think tanks like Canada’s Center for Research on Globalization, shady nongovernmental organizations like World Without Nazism, and a hardworking army of trolls spreading their “reports” far and wide.

This mechanism can whisk a story headlined “Putin Tells United States Citizens Not To Give Up Your Guns” on the website Ulta Din (motto – “A media to build Humankind, Society and Nation”) around the world in minutes. Other stories on the website seem designed to stir up Islamophobia, such as “Christian had arms chopped off after refusing to convert to Islam’’ and “Muslim Pilot of Flight 804 Converted The Plane Into A Portable Mosque.”

In a report, NATO’s StratCom office calls the tactics a novel form of “information and psychological warfare” that amounts to “the weaponization of online media.” These tactics are powerful, the report says, because “this type of ‘warfare’ is continuously ongoing and hard to detect.”

Indeed, categorically identifying who is behind a given source is often difficult. Untraceable anonymous accounts can quickly set up websites using popular blogging platforms. “It is complicated to identify its source, particularly as more often than not it is waged from several sources simultaneously,” the NATO report continues. “Such a warfare strategy penetrates all levels of society at a very low cost. Even if the audience does not necessarily believe in the planted information, the abundance of unvetted information of itself leads to a persistent distrust of public information and the media.”

Jessikka Aro is a Finnish investigative journalist who was among the first to write extensively about the St. Petersburg troll factory and who is writing a book about the “cross-border influence” of the Kremlin’s “troll empire.”

"Many people have told me that they started to lose touch with what is true and what is not true — for example, in the Ukraine war, especially at the beginning when it was really difficult to find out what was happening there because there were so many cover-up operations going on and even the media didn't quite find out what was going on and who was waging the war,” Aro says. “So when the trolls were spreading disinformation in different forums, people would get really confused."

"The trolls would even say things that other trolls would then deny,” she adds. “So it is all about just creating chaos and arguments between people who are participating in this."

Russia’s tactics ideally suit the current global information environment, Fedchenko says. During the Cold War, the Soviet Union had to bribe or blackmail mainstream foreign journalists in order to get its messages out. “But now you don't really need to be part of the national media landscape,” Fedchenko says. “You can do it from Russia. You can set up different media outlets in different languages. Physically, they might be 1,000 kilometers away, but as soon as it operates in the Estonian language, or Czech, or German, it still immediately becomes a part of the national media landscape and you can easily get your ideas to a national audience.”

That broad scope also includes the United States. Throughout 2016, indications emerged that the Russian online disinformation machine sought to influence the U.S. presidential election. Adrian Chen, a writer for The New Yorker who studies the troll phenomenon, said in a Longform podcast in December 2015 that many social-media accounts of Russian trolls that he had been following for years “have turned into conservative accounts, like fake conservatives.”

“I don’t know what’s going on, but they’re all tweeting about [Republican Party presidential nominee] Donald Trump and stuff,” he said. “I feel like maybe it’s some kind of really opaque strategy of electing Donald Trump to undermine the U.S. or something.”

Fellow journalists Michael Crowley and Julia Ioffe have reached the same conclusion.

To take just one example, the amateur news aggregator website Inspire To Change World (the stated goal of which is to “inspire people to question everything” in this “time for revolution”) has posted articles with headlines like “Trump Supporters Rising Up, Disgusted With Elites, No Trust In Media,” “US Regime Controlled Media In Full Meltdown Mode Over Trump’s NATO Remarks,” and “Trump Predicts ‘Great Relationship’ With Russia, Putin If Elected.”

U.S. intelligence officials in January issued a report that concluded “with high confidence” that “Russian President Vladimir Putin ordered an influence campaign in 2016 aimed at the U.S. presidential election.” The purpose of the campaign was not merely to support a preferred candidate, but more broadly “to undermine public faith in the U.S. democratic process.”

The U.S. intelligence report did not divulge all the evidence that had been gathered, but some journalists have published more detailed reporting that points to the same conclusions. According to research by the website Motherboard, there is “very strong” forensic evidence that recently leaked e-mails from the servers of the Democratic National Committee were obtained by Russian hackers before being distributed on a suspicious WordPress blog called Guccifer 2.0. According to the Motherboard report, at least some of the documents were “modified” by hackers using Russian-language settings and a user name that referred to the founder of the Soviet secret police, Feliks Dzerzhinsky.

Since the U.S. election, there is growing evidence that many of the same tactics – and often even the same accounts and sources – are being directed to influence important elections in France and Germany.

“There’s a lot of evidence that there are now targeted attempts to massively attack [German Chancellor Angela] Merkel, including with bots,” German political scientist Simon Hegelich told Bloomberg. “A lot of accounts that pretty obviously are pro-Trump bots are now joining the anti-Merkel debate.”

Jakub Janda, of the European Values think tank in Prague, wrote in The Observer in December that “Merkel will be the next target of full-scale disinformation and influence operations of the Kremlin and its proxies.”

Janda’s think tank has issued a report (PDF) documenting at least 39 pro-Russian disinformation websites actively publishing in Czech.

And that is just the tip of the iceberg. One after another, European countries have been issuing reports or setting up government units aimed at countering Russian efforts to tip public opinion. In January 2016, Finland’s government organized 100 officials in a program to study and counter growing disinformation aimed at undermining the government.

Rene Nyberg, a former Finnish ambassador to Moscow, told Foreign Policy magazine that Finland is able to counter Moscow’s disruptive messaging because “Merkel is the main course.”

“We’re just a side dish,” Nyberg said.

The RAND Corporation’s report, “The Russian ‘Firehose Of Falsehood’ Propaganda Model,” argues that the Russian tactics are particularly effective because they exploit known psychological traits — that people trust information that they see from multiple sources, that people assume information from multiple sources reflects multiple perspectives, and that information that is endorsed by a large number of people is more reliable.

“The experimental psychology literature suggests that, all other things being equal, messages received in greater volume and from more sources will be more persuasive,” the report says. “Quantity does indeed have a quality all of its own.”

The report notes the advantages of the Kremlin’s disregard for the truth in the current media environment. Stories that have been proven fake continue to circulate long after being debunked, and people often remember the stories while forgetting their refutations. Psychologists call this the “sleeper effect,” when information from a questionable source comes to be regarded as reliable after the source is forgotten or when information that has been retracted continues “to shape people’s memory and influence their reasoning.”

A study by Columbia University found that 59 percent of links shared on social media have never actually been clicked.

“This is typical of modern information consumption,” study co-author Arnaud Legout was quoted as saying by The Washington Post. “People form an opinion based on a summary, or a summary of summaries, without making the effort to go deeper.”

And there are indications that the Russian disinformation toolbox is even more capable than has been seen so far — and that those running it are more ruthless in unleashing it than many believe.

On September 11, 2014, many residents of St. Mary Parish, Louisiana, in the southern United States, received official-looking SMS messages warning them of an explosion at a local chemical plant. Accounts on Twitter simultaneously began circulating a faked screenshot of a local newspaper carrying a story about the purported explosion. Pages about the “incident” were created on Wikipedia and Facebook. A fake video was posted to YouTube and then rapidly cross-posted elsewhere. It was asserted that the Islamic State extremist group claimed responsibility for the attack.

The incident was thoroughly unpacked by Chen, writing in The New York Times Magazine in June 2015. He traced the entire hoax back to the Internet Research Agency troll factory in St. Petersburg.

“The Columbian Chemicals hoax was not some simple prank by a bored sadist,” Chen wrote. “It was a highly coordinated disinformation campaign, involving dozens of fake accounts that posted hundreds of tweets for hours, targeting a list of figures precisely chosen to generate maximum attention. The perpetrators didn’t just doctor screenshots from CNN: they also created fully functional clones of the websites of Louisiana TV stations and newspapers. The YouTube video of the man watching TV had been tailor-made for the project…. It must have taken a team of programmers and content producers to pull off.”

The FBI opened an investigation into the incident.