With tweaks to trending box, Facebook targets fake news
Facebook rolled out changes to its trending topics box on Wednesday, in its latest response to the proliferation of fake news items that earned the company censure following the US presidential election.
Trending topics, the company said in a news release, will now factor in user reports that an item is spam or fake news, and will identify groups of articles shared on the platform rather than counting only the mentions a particular topic earns. The presentation of the box is changing, too: headlines from an attributed publisher now appear below the topic name, and the same topics will appear uniformly across different regions.
Slight as they are, the modifications refocus attention on Facebook’s difficulty in balancing its role as a conduit for popular interest with its duties as a curator of information.
The company has veered between competing approaches. In August, The Wall Street Journal notes, it fired the teams of contractors it had hired to select headlines – and to weed out dubious items that had gained traction – amid accusations that the teams squeezed out news from conservative sources. That decision gave way to the laissez-faire approach of the election season, which saw the rise of intentionally fabricated items circulating in "echo chambers" that tended to reinforce users' existing beliefs.
The persistent popularity of hoax items has also revealed a pattern in how their readers see the news in general, as The Christian Science Monitor reported this week:
Jack Zhou, an instructor in environmental politics at Duke University in Durham, N.C., says some occupants of so-called "news bubbles" may prefer to accept fake news as truth. "The state of fragmented media may dull the potential practical impact of inoculation messages, particularly in terms of the audiences serviced by those media," Mr. Zhou, who has researched the identity politics of climate change, tells the Monitor in an email.
After all, sites with fake news are only catering to their audiences. Paul Levinson, a communications professor at Fordham University in New York, told the Monitor in December that, "These bubbles have not been imposed upon the public – it was what the people want. As long as social media continues to provide a very easy forum for these news bubbles ... it is not going to stop."
Facebook chief executive officer Mark Zuckerberg initially rejected criticism of how the company curated its election-season items. Within weeks, though, the company started tinkering. In mid-November, Mr. Zuckerberg outlined a list of ways the company could improve, and in December it began carrying some of them out, enlisting third-party fact-checkers to flag fake news items, as the Monitor reported:
[Vice president of Facebook News Feed Adam] Mosseri said the company is focusing on four key areas of improvement: making it easier for users to report an article as fake, flagging stories as disputed, ensuring that those who share disputed stories know what they are sharing, and disrupting the revenue streams that currently drive much of the fake news industry....
Facebook will append an alert to any story the fact-checkers determine to be false.
"If the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why," Mosseri wrote.
Wednesday’s changes amount, in part, to finding credibility in numbers: where a single hoax story could previously go viral, the algorithm will now factor in how many publishers are reporting on the topic in question, in addition to taking into account the publishers' "historical engagement."
"If just one story or post went viral, it wouldn’t make it into the trending as it might previously," Will Cathcart, a Facebook vice president of product management, told the Journal. "It really takes a mass of publishers writing about the same topic to make the cut."