Is Facebook's 'disputed' news tag too weak to succeed?
The new system flags fake news articles and provides links to fact-checking sites debunking them, but users will still be able to share the articles as they have in the past.
Facebook on Friday released a new system to fight the spread of fake news, amid ongoing pressure from groups calling for more accurate information on a site many rely on as a significant news source. While the review of news articles by independent fact-checkers could have a positive effect, some critics say the new feature does not go far enough to stop the proliferation of false or misleading information.
The new feature allows some users to report news as fake, and flags certain articles containing untrue or misleading information based on the assessments of fact-checking sites such as Snopes and PolitiFact. Stories with dubious facts will be marked by Facebook as "disputed" and accompanied by a link to an article explaining why, though users will still be allowed to share the stories if they wish.
"This is a good first step, but I don't necessarily think that we're going to see a dramatic decrease in fake news any time soon," says Elizabeth Cohen, a professor of communication studies at West Virginia University in Morgantown, W.Va.
Dr. Cohen, who specializes in the psychology of social media, tells The Christian Science Monitor in an email that there is a lot of gray area in fake news that makes it difficult to deal with, especially online.
"The lines between editorial content, bias, half-truths, and blatant lies can be so blurred, I'm not sure that stopping the proliferation of fake news should even be the aim," she writes. "The best-case scenario is that a system like this will raise people’s level of awareness about the presence of misinformation in their news diet."
According to the Pew Research Center, 62 percent of US adults got news from social media last year, and 18 percent reported turning to social media often for their news. That's a majority of adults who could potentially be susceptible to the distorted reports or outright hoaxes that often run rampant on social media.
And that puts companies like Facebook in a tough position.
Unlike traditional news media outlets, which are committed to informing their readers and viewers with accuracy and integrity, social media platforms were designed for entertainment. But with so many people turning to them as a convenient way to learn about current events, sites such as Facebook have found themselves forced to take political and social stands that, as businesses, they never wanted to make.
"I do think Facebook is taking a risk by becoming an actor in a very political situation," says Cohen. "By attempting to police fake news, the company could be perceived as partisan and it could lose user trust."
Fake news began to make real headlines leading up to the November election, when the internet was inundated with sensational, attention-grabbing, and untrue headlines meant to appeal especially to potential Trump voters. Notable examples included the assertion that Pope Francis had endorsed then-candidate Donald Trump for president, and that Hillary Clinton was linked to a child sex ring operating out of a pizza parlor.
The latter case nearly ended tragically when an armed man went to investigate the parlor in question. Fortunately, no one was injured.
These sensational headlines, and others like them, spread like wildfire, in part because of their very implausibility, earning Facebook a great deal of ad revenue in the process. The articles' popularity, in turn, caused many people to doubt legitimate news organizations that seemed to be refusing to cover the "real" events described in such articles, even after those stories were debunked.
"The problem is that we are too credulous of news that reinforces our predispositions and too critical of sites that contradict them," Brendan Nyhan, a political scientist at Dartmouth College in Hanover, N.H., told the Monitor in December.
These "echo chambers" of opinion and confirmation bias are nothing new, but the increasing politicization of fake news has only made the problem worse. Before the election, most fake news was aimed toward conservative readers, but since then, there has been a spike in fake news for liberals as well. This proliferation can make it harder for many people to differentiate what information is true, even after being presented with the facts.
"There's a psychological phenomenon called the 'backfire effect' in which people tend to hold on to incorrect beliefs more strongly after being presented with factual information that contradicts those beliefs," says Cohen. "In other words, sometimes when you give people facts to refute misperceptions, it actually makes them more resistant to the truth."
She adds that the "backfire effect" is not a new issue, and that there is no reason to believe it would be substantially exacerbated by the new Facebook system.
But potential backfiring isn't the only problem with the new "disputed" flags. Getting a fake news article fact-checked takes time, during which the article can be shared thousands of times. And users will still be able to share flagged articles; the only obstacle is a quick pop-up warning alerting the user to the dubious nature of the link they are about to share.
"It's a huge mistake to expect this system to help change people’s beliefs. If you're expecting that, you're going to be disappointed," says Cohen. "Many will continue to reject the truth no matter how many times it's fact-checked."
But for people who still haven't made up their minds, she says, this new system could be an important step in the right direction.
"Just about everyone has trouble navigating all of the news out there today," Cohen adds. "This has the potential to help people make decisions."