Russian info war matches its land war: Loud, but unsophisticated
AP
Kyiv, Ukraine
The letter circulating on Telegram recently offered an explanation for why Ukrainian fighters in Mariupol continue to resist despite overwhelming odds. Ukrainian soldiers, it said, face execution if they are found to have surrendered to Russian forces.
But Ukrainian fact-checkers quickly leaped into action to debunk it – a task that proved relatively easy. The letter contained layout anomalies when compared with other National Guard documents, as well as linguistic errors. That, combined with the illogic of the letter’s claims – why would the Ukrainian military threaten to execute soldiers no longer within its command structure? – showed it was another instance of Russian disinformation.
Russia’s invasion of Ukraine has been accompanied by a high-volume, multilingual disinformation campaign that has jolted fact-checking experts in Ukraine and the West into action. But experts have been surprised at just how unsophisticated that campaign has been. They say Russia has been taking the same approach to deploying fakes as it does to soldiers on the battlefield: large numbers, but with poor ammunition.
Why We Wrote This
Russia’s war in Ukraine isn’t just on the battlefield. It’s online, too. But experts have been surprised that Russia’s disinformation strategies are as unsophisticated as its real-life ones.
“The strongest side of Russian disinformation is not the quality, but continuation and repetition,” says Ukrainian fact-checker Oksana Iliuk, on a Zoom call from the southwestern town of Chernivtsi, near the border with Romania. “They do not care about each fake being sophisticated and deliberate. They just want to flood the information space.”
That doesn’t make the fakes any less dangerous. The atrocities against civilians in the Ukrainian town of Bucha – which Russian officials and media falsely claimed were staged “fakes” involving the use of “crisis actors” – underscored the seriousness of the task of separating fact from fiction in times of war.
“The strategy of the Russians is what they call the fire hose of falsehood,” says Sam Gregory, program director at Witness, a New York-based organization using technology and video to defend human rights. “Basically pumping out lots of different, contradictory accounts. You are not trying to establish a conclusive truth. You’re trying to muddy the waters, make people believe that they can’t really believe anything, and then make it easier for people to throw their hands up in the air and say, ‘Well, we just don’t know what is going on here.’”
“It’s almost absurd”
Many expected that Russia’s disinformation campaign would be sharper and slicker, drawing on technological advances and lessons learned in Syria. But the quality of the fakes circulated by pro-Kremlin, Russian-language accounts on platforms like Telegram has been underwhelming. Often the material is recycled from earlier stages of the conflict and quickly debunked.
Fakes – fabricated stories with no clear source or factual basis – have been a constant of the war and have taken many forms. They spread across messaging platforms like Viber and Telegram and social media networks like Facebook, which is particularly popular with Ukrainians. On TikTok, video game footage quickly emerged, passed off as scenes from the conflict in Ukraine. Early on, Ukrainian soldiers received SMS messages urging them to lay down their weapons and go home. Few did, in part because they knew that kind of attack could be coming.
“It’s actually been surprising how bad Russian disinformation has been,” said Eliot Higgins, founder of Bellingcat, a Netherlands-based investigative journalism group focused on social media fact-checking and open-source intelligence, at a Chatham House event. “I mean, we’ve had years of disinformation from Syria and Ukraine being debunked and kind of thought we would see Russia upping its game. ... Russia’s disinformation has not improved. It’s almost as if they’ve swapped the words Syria with Ukraine and jihadists with Nazis in many cases. It’s almost absurd, but it’s still very important to address that information.”
In the case of the Bucha massacre, Russian authorities tried to cast doubt on Russian military involvement by challenging the timeline of events, as well as by claiming that the arm of one of the dead could be seen moving in a video of the scene. Analyzing Telegram channels – a major vehicle for Russian-language disinformation in Ukraine – Ms. Iliuk says she identified 18 variations of the message that the massacre was staged by Ukrainians. They advanced theories such as that the killings were a ploy to get more weapons from the West, or that Bucha’s civilians were killed by Ukrainian territorial defense forces.
But on-site media investigations and multiple cross-checks of photos and videos debunked the Russian claims. Experts concluded that the seeming “movement” was an optical illusion caused by a raindrop on the windshield of the car from which the video was filmed. And a New York Times investigation backed by satellite images conclusively showed that the corpses were already there while Russian forces still controlled Bucha.
“The war is not just on the ground; it is in the information space,” says fact-checker Alona Romanyuk, who launched the website putinlies.com.ua. “A lot of the fakes for February and the first days of March aimed to spread panic. ... A lot of Russian fakes were stillborn. But a lot of these fakes [did] spread panic.”
Equipping an international audience
Russian disinformation is not a new phenomenon. Ukrainians have been dealing with it since 2014; indeed, the country boasts a healthy constellation of fact-checking and media-literacy organizations.
But the Russian messaging isn’t solely for Ukraine. “Russia uses the information space to explain to their own audience and foreigners why they brought about a war,” says Ms. Romanyuk, who is also an analyst at Detector Media, a Ukrainian think tank focused on media literacy and battling misinformation. “There were no reasons to invade Ukraine and do these terrible things, to destroy Ukrainian cities and kill civilians. There isn’t any reason. But in the information space, Russia explains it.”
But “the international audience is at a bigger risk of believing Russian fakes than Ukrainian people because we have faced Russian disinformation for years,” she adds. “We [Ukrainians] are used to how aggressive it can be.”
So Ukrainian fact-checkers are working to educate broader audiences, too. One key example is Russian President Vladimir Putin’s now-infamous essay on the so-called historical unity of Russia and Ukraine, published about a month before putinlies.com.ua was launched. The article has served as a foundation for Mr. Putin’s efforts to justify war against Ukraine, including its theme that Ukraine is a failed state run by Nazis.
Ms. Romanyuk worked with a team of historians, journalists, and teachers to review the 130 claims contained in the article. Within those claims, they found 105 manipulations – statements mixing truths with lies, things taken out of context, misinterpretations of the facts, or labeling – and 58 fakes, or falsehoods. Now they are working to translate that website into English to offer the Ukrainian take on history to international audiences.
European fact-checkers are also busy battling Russian disinformation. The European Digital Media Observatory, a disinformation expert network and online platform, created a specialized 14-person task force to tackle disinformation about the war in Ukraine. Fake stories that appear in one European country – such as a report that Polish clinics were kicking out cancer patients to accommodate Ukrainian refugees – typically take a day or two to spread to another language or geography.
“The same networks that were spreading COVID-19 disinformation are now spreading disinformation on the war in Ukraine,” notes Paula Gori, the organization’s secretary-general. “The same accounts and also the same channels ... and often the same strategies: It’s a fiction. So COVID-19 doesn’t exist. The war is staged.”
The flood of information
Ultimately, experts say, the risk from Russia’s disinformation campaign isn’t in its sophistication, which is lacking. (Ms. Romanyuk notes that one of the few apparent Russian uses of a “deep fake” video forgery, of Ukrainian President Volodymyr Zelenskyy allegedly surrendering, was “school level” and “really funny.”) It’s in the sheer volume of it, and in how easily it seeps into the massive amount of news that people already consume, especially via images.
“Disinformation spreads more with images because images have a powerful impact on our brains,” says Ms. Gori. “That’s why we see lots of videos or pictures, and unfortunately, in a war, this is even more impactful.” Part of the problem is that average news consumers rush through information and rarely stop to look at images critically, she says. They lack or are unfamiliar with tools that fact-checkers use to identify fake or recycled images.
And even well-intentioned amateur efforts can sometimes muddy the waters further, warns Mr. Gregory. The best defense? He points to the SIFT method: stop, investigate the source, find alternate coverage, and trace the original. Ms. Iliuk notes that many of the fakes in circulation now have already appeared and been debunked before, allowing for easy cross-checks with a simple Google search.
“If you read emotional news, don’t share it,” says Ms. Romanyuk. “Go to the kitchen, have a cup of tea or a coffee. When you calm down, only then you can make a conclusion. Is it true or fake?”