Whose reality? Propaganda in the age of social media

The Pentagon logo is seen behind the podium in the briefing room at the Pentagon in Arlington, Virginia, Jan. 8, 2020. (Al Drago/Reuters)

January 28, 2020

Truth, it is often said, is the first casualty of war, and in the wake of the assassination of Maj. Gen. Qassem Soleimani of Iran on Jan. 3, misinformation ran rampant.

In the days following the U.S.-led drone strike, the White House’s rationales for the attack shifted wildly. Initially, the Trump administration claimed that the general was plotting an “imminent” attack against Americans. Less than a week later, it was to prevent an attack on a U.S. embassy, or four embassies. Or maybe it was U.S. military bases. In the end, “it doesn’t really matter,” the president tweeted, “because of [Soleimani’s] horrible past!”

At the same time, social networks like TikTok, Instagram, and Facebook began to overflow with doctored photos and outdated images, false rumors about the draft, and fake tweets from administration officials. In one sense, this is nothing new; war fighting has always had a psychological component, which includes spreading lies. But in another sense, the informational fog speaks to the defining epistemological issue of our era: the fracturing of the American public’s shared sense of reality.

Why We Wrote This

Propaganda has always been a part of conflict. But in the wake of Maj. Gen. Qassem Soleimani's death, Americans got a glimpse of how easily misinformation can spread in this social media age. What to do?

“The objective seems to be to question the idea that anything is certain,” says University of Southern California historian Nicholas Cull. “In a world of uncertainty, the only one you can trust is the strongest guy in the room.”

A fog of misinformation

One rumor claimed that the U.S. had reinstated the draft. Another claimed that LGBT people were exempt; another, that felons were. By the end of the day of the drone strike, the servers for the U.S. Selective Service System’s website had crashed.

“The two-day period after the news broke was very confusing to young people,” says Katy Byron, the editor and program manager of MediaWise, a Poynter Institute for Media Studies project aimed at helping teenagers identify online misinformation.

Professor Cull points out that draft rumors are nothing new; a prevalent rumor in the spring of 1942 claimed Jews were exempt. But, he says, modern technology confounds our efforts to tell fact from fiction. “We’re living in one of those dangerous moments right now. [There is a] tremendous instability from the overlap of social media and big data-powered targeting of social media that we are in the process of learning to deal with.”

Programmed to spread

Misinformation on social media presents a thorny problem. Recommendation algorithms, like those that determine what appears in your Facebook news feed or YouTube’s “up next” list, cannot by themselves distinguish truth from lies. All they can do is measure how people “engage” with the content: whether they scroll past it or linger on it, whether they click on the thumbs, hearts, or other icons associated with the post, and how much time they spend interacting with it.

This algorithmic agnosticism often leads to perverse results. Social networks rank content higher when it prompts more engagement, regardless of whether it is true or whether it improves the engager’s life. And the more highly ranked a piece of content is, the more likely people are to see it. That means that if you come across a Facebook post you think is false and write a thoughtful point-by-point rebuttal of it, you’ve made it more likely that the misinformation will appear in more people’s news feeds.

What’s more, today’s media landscape is immeasurably more fragmented than it was in 1967. To reach its audience, misinformation – and its deliberate subset, disinformation – no longer needs to travel the highways afforded by the major news outlets; instead, it can travel the backroads created by niche targeting. That means that what you see may not be what everyone else, including professional fact-checkers, sees.


“The beauty of freedom of speech, historically, is that the freedom of speech is constrained by the ability to observe and respond to problematic speech,” says Josh Pasek, a media professor at the University of Michigan. “The challenge in this social media environment is that you can’t necessarily do that.”

Professor Pasek suggests that one way for the government to curb misinformation without running afoul of free speech protections would be to require social media companies to be transparent about who is targeting what content to which users. 

“You don’t have the right to not have your speech known about just because you’re trying to hide it from someone,” he says.

How to respond?

If you spot misinformation in your news feed, one option, Professor Cull suggests, might be to go outside the infrastructure. In other words, if a friend posts a false news report on Twitter, instead of responding via Twitter and inadvertently boosting the original post’s engagement, you could send that friend an email instead.

Ms. Byron of MediaWise, whose project is backed by Google’s charitable arm, Google.org, advises online readers to ask three simple questions: Who is behind the information? What is the evidence? And what are other sources saying?

If you’re still not sure, she says, you can tag the post with #IsThisLegit and @mediawise. “We’ll help you figure out if what you’re seeing is real or not,” she says.

Professor Cull says that the combination of rapid technological shifts and political instability places society in a dangerous situation, but, if we survive this moment, our minds will eventually adjust to the new media landscape, just as we adjusted to television and radio in the past.

“In a couple of years,” he says, “we’ll be able to look back on it and think, ‘Wow, can you imagine that we didn’t understand that? Now we understand it. It’s second nature. We can move beyond it.’”