Online misinformation is rampant. Four tips on stopping it.
New York
This spring, Amber pored over Facebook posts as a self-appointed fact checker. She’d seen how the president’s remarks at an April news briefing became mangled through memes.
No, Donald Trump didn’t urge Americans to ward off the coronavirus by drinking bleach. But his remarks, taken out of context, spiraled across websites, social media, and late-night comedy.
Amber, a human resources specialist from Crystal River, Florida, says she calmly commented on friends’ posts with a link to a video of the event for reference.
Why We Wrote This
Nobody likes being wrong. But what if corrections came from someone you trust? Experts urge Americans to fight misinformation as a shared responsibility.
“I find that helps me not create an enemy with them,” says Amber, who asked to omit her last name because of her employer’s restrictions on speaking to the press. If she insults someone’s intelligence, she adds, “they’re just going to discount everything I said and also try to insult me back.”
About two-thirds of U.S. adults got at least some of their news from social media in 2018. Americans are more likely to share misinformation (inaccurate content shared unknowingly) than disinformation (inaccurate content shared deliberately to deceive), experts say. Both can deepen divisions in an already polarized society, and – at worst – inspire violence.
As tech companies and researchers continue working to thwart a range of “information disorder,” experts say citizens like Amber can also play important roles in combating misinformation – not just as critically minded consumers, but as concerned family and friends. Consider it a cyber civic duty.
“They can act as trusted messengers,” says Emma Margolin, senior research analyst at Data & Society.
Postelection, media scholar Claire Wardle hopes to see more emphasis on society-based strategies.
“Fundamentally, if none of us shared any of this stuff, then we wouldn’t have a problem,” says Dr. Wardle, co-founder and U.S. director of First Draft News. She notes that disinformation actors are powerless without us sharing their content: “We have to be part of the solution. We just haven’t had these conversations enough.”
PEN America, a nonprofit that advocates for writers and human rights, is among a growing group of organizations offering media literacy training. Its tip sheet on how to talk constructively with acquaintances about misinformation came in response to demand from training participants, says Nora Benavidez, director of PEN America’s U.S. Free Expression Programs.
“People are so hungry for permission to be empathetic,” says Ms. Benavidez, who wrote the tip sheet.
Of course, correcting your leftist ex or right-wing uncle can be awkward at best, and not every attempt will be a win. Here are some suggestions to start.
Fact-check first.
Is the post about that candidate actually false? Or does flattering coverage of them simply make your blood boil? Before confronting others, make sure your facts are straight – no matter your social media savvy.
“We need to [debunk] the myth that experts are never going to be victims, or believe in false or misleading content, because I absolutely have,” says Ms. Benavidez at PEN America. “That’s how insidious and potent disinformation is.”
Beware of partisan websites posing as unbiased news sources, as well as misleading headlines. Beyond sleuthing out the source, see if reputable fact-checking sites – such as the Poynter Institute’s PolitiFact, The Washington Post Fact Checker, Snopes, or FactCheck.org – have already vetted the claim. A simple reverse image search on Google can help verify an image by revealing where it first appeared online.
Alert the platform early.
Correct misinformation and disinformation as early as possible. Experts encourage users to flag troubling posts to the social media platforms themselves, which may then remove the content. Again, the earlier the better – before the masses have been exposed to it, says Ms. Margolin at Data & Society.
If a platform removes flagged content after it goes viral, the act of removal could backfire as it “becomes its own story,” she adds. Users may accuse the platform of bias, or censorship of certain worldviews.
The “illusory truth effect” underscores the need to act fast: We’re more prone to believe information – even if false – that we’ve encountered more than once.
The effect holds true even when prior knowledge contradicts what we’re reading, psychologist Lisa Fazio’s research suggests. This impacts our experience online, where content proliferates.
“We’re seeing [misinformation] multiple times, and if we don’t do anything to stop it, that repetition will likely increase our belief,” says Dr. Fazio, assistant professor of psychology at Vanderbilt University.
Public or private outreach? Consider context.
Whether to confront your doom-scrolling Facebook friend publicly or privately about his problematic post requires some thought. Research suggests that seeing someone else publicly corrected can reduce onlookers’ own misperceptions.
Yet due to the way social media algorithms work, publicly engaging with a post can actually amplify its false or misleading content. Public call-outs may also be seen as shaming. As PEN America’s tip sheet notes: “If something was just posted, you might try sending a private note politely pointing out that it’s incorrect.”
If you’re craving a technique, consider the “truth sandwich” from linguist George Lakoff: lead with the truth, point out the lie, then conclude with the truth, so the correct information is what’s most memorable.
It also helps to offer credible sources, says Kathleen Carley, director of the Center for Informed Democracy & Social Cybersecurity at Carnegie Mellon University. “As opposed to just pointing out that something is wrong, provide an alternative,” she says.
Lead with empathy.
Our social media posts are tied to our identities. That means attacking someone’s content can be perceived as a personal attack, says Dr. Wardle of First Draft News. That could make them double down.
“It’s about having empathy for why somebody might believe something … rather than saying ‘you’re wrong,’” says Dr. Wardle.
In lieu of lecturing, empathetic outreach could focus on a user’s online actions or behaviors rather than their content. Dr. Wardle suggests invoking a shared commitment to a healthier democracy, such as, “I’m worried that maybe this is a deliberate tactic to try and make us angry, or to divide us, or to confuse us.”
Correcting false content online is a nonpartisan act. It requires everyone’s vigilance, as we can all fall prey to our emotions, confirmation bias, and other mental shortcuts. As Dr. Fazio says, “It’s not a Republican problem, it’s not a Democrat problem ... it affects all of us.”
Hone your verification skills with resources from First Draft News, MediaWise, the Shorenstein Center, and PEN America’s online training.