Facebook to flag 'worst of the worst' fake news with fact-checking partners

The social networking giant says it will make it easier for users to report fake news, flag fake stories with the help of fact-checkers, and disrupt financial incentives that drive fake news.

This photo combo of images provided by Facebook demonstrates some of the new measures Facebook is taking to curb the spread of fake news on its huge and influential social network. The company is focusing on the "worst of the worst" offenders and partnering with outside fact-checkers to sort honest news reports from made-up stories that play to people's passions and preconceived notions.

Facebook via AP

December 16, 2016

Facebook will enlist the help of fact-checkers to begin flagging fake news stories, the social networking giant announced Thursday, outlining a number of tests and features designed to address critiques that its platform is a hotbed of politically consequential misinformation.

Shortly after last month's elections, Facebook chief executive Mark Zuckerberg said it was a "crazy idea" to think his platform had influenced the outcome. But the company appears to have changed its tone as it walks a tightrope: taking some responsibility for the spread of false information online while trying to avoid accusations of partisanship or censorship.

"We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we're approaching this problem carefully," Adam Mosseri, the vice president of Facebook News Feed, wrote in a blog post. "We've focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third party organizations."

Mr. Mosseri said the company is focusing on four key areas of improvement: making it easier for users to report an article as fake, flagging stories as disputed, ensuring that those who share disputed stories know what they are sharing, and disrupting the revenue streams that currently drive much of the fake news industry.

To flag fake news, Facebook has begun working with third-party fact-checkers, each of whom must agree to abide by a five-part code of principles assembled by The Poynter Institute. Forty-three organizations worldwide have signed on to Poynter's statement so far, including ABC News, the Associated Press, PolitiFact, Snopes, and The Washington Post Fact Checker – all of which have agreed to produce fair, transparent, and nonpartisan work.

Facebook will append an alert to any story the fact-checkers determine to be false.

"If the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why," Mosseri wrote.

Users will still be permitted to read and share fake news on Facebook, but they will be confronted with warnings making clear that the veracity of a particular article has been questioned. Illustrations accompanying Mosseri's blog post depict a pop-up window that requires users to acknowledge that fact-checkers dispute a story before sharing it.

Additionally, flagged stories cannot be made into ads or promoted on Facebook.

A number of websites – including Liberty Writers News, Alex Jones' Info Wars, and Ending the Fed – have spread lies and conspiracy theories that were particularly popular among supporters of President-elect Donald Trump, as The Christian Science Monitor's Story Hinckley reported on Thursday.

"The problem is that we are too credulous of news that reinforces our predispositions and too critical of sites that contradict them," Brendan Nyhan, a political scientist at Dartmouth University, told the Monitor.

"Facebook created the platform and the election created the topic that would deliver the hits and shares," he added.

As policymakers and the public continue to discuss the real-world implications of fabricated information online, Facebook says it will continue to refine its approach.

"We're excited about this progress, but we know there's more to be done," Mosseri wrote. "We’re going to keep working on this problem for as long as it takes to get it right."