Facebook's news algorithm promotes 9/11 'truther' article

Is it a mistake for Facebook to rely on algorithms when it comes to sensitive topics?

In this May 16, 2012, file photo, the Facebook logo is displayed on an iPad in Philadelphia. After switching from human curation to a new algorithm, Facebook's Trending Topics section linked out to a 9/11 'truther' article on the anniversary of the attacks.

Matt Rourke/AP/File

September 11, 2016

On the 15th anniversary of the September 11 attacks, Facebook commemorated the event in a strange way – by promoting an article claiming the attacks were staged.

The article, which purported to show proof that “bombs were planted in Twin Towers,” appeared in Facebook’s Trending Topics section on Friday. In the weeks since switching from human curators to an automated algorithm to select Trending Topics, the feed has promoted a series of questionable links, including thinly veiled hit pieces and even pornography.

In the wake of such missteps, experts and users have decried the algorithm’s apparent inability to navigate sensitive stories. Do some topics need a human touch?


On Facebook, Trending Topics are chosen based on what users are actively posting about, while featured articles are chosen in-house and cycled hourly. Located in the upper right sidebar, Trending Topics receive high visibility on the social media platform.

So when one topic linked out to a 9/11 "truther" article, you can be sure that a lot of people saw it. The piece, which originally appeared in a UK tabloid called the Daily Star, referred to theories that the attacks had been orchestrated by the US government.

"We're aware a hoax article showed up there and as a temporary step to resolving this we've removed the topic," a Facebook spokesman said in a statement.

On August 27, Facebook announced that it would replace human news curators with a new algorithm. The change was prompted by claims that the Trending Topics section, which was curated mostly by young, Ivy League-educated journalists, had a liberal slant. An algorithm, representatives said, could reduce those biases and present “a breadth of ideas” to users.

Facebook said it would employ a team of real people to monitor the links for quality control. But ever since the switch, the section has been marred by problematic links. One led to an erroneous claim that Fox News anchor Megyn Kelly had been fired for “backing Hillary.” Another referred to Meghan McCain, the daughter of Arizona senator John McCain, as “Miss Piggie.”


Such incidents expose a weakness in Facebook’s algorithm: a lack of human sensitivity.

Algorithms are good at relating phrases and photos across huge swaths of data. They can tell you what topics are trending, and what stories are getting the most clicks. But most can’t recognize a hoax, or distinguish between a news story and a cruel gossip piece. Are there some cues that can’t be taught to technology?

Last week, a Norwegian newspaper posted Nick Ut’s Pulitzer Prize-winning photo of Kim Phuc, originally taken in 1972 after a napalm attack in Vietnam. Facebook’s algorithm soon deleted the post because it contained child nudity: algorithms such as these are designed to “optimize” a solution rather than find the “correct” one.

For some, the deletion of Ut's photo raised troubling questions of censorship. "The media have a responsibility to consider publication [of stories] in every single case," wrote Espen Egil Hansen, editor-in-chief of Norway’s largest newspaper, in an open letter to Facebook CEO Mark Zuckerberg. "This right and duty, which all editors in the world have, should not be undermined by algorithms encoded in your office in California."

At press time, the 9/11 anniversary topic was still absent from Facebook’s news feed, replaced by related topics such as “Air Force One” and “Lower Manhattan.”