Does Facebook censor conservative news stories?
Facebook's algorithm – which determines what appears in the site's news feed – is routinely criticized as opaque, making it difficult to know exactly how the site influences what its users see.
But former Facebook workers say they frequently suppressed news stories that involved conservative figures or those that might be of interest to conservative readers, tech site Gizmodo reports, adding more fuel to an argument that the social media site's impact on how we consume news isn't all benign.
Several former "news curators" for Facebook told Gizmodo that their actions often stopped stories about the Conservative Political Action Conference, Mitt Romney, Sen. Rand Paul (R) of Kentucky and others from appearing in the trending news section, even if many users were viewing and sharing them on the site.
What's more, they told the site, they were also instructed to artificially "inject" selected stories from a large, ranked list of trending topics into the feed users saw, and to remove stories about Facebook itself. As Gizmodo reports:
In other words, Facebook's news section operates like a traditional newsroom, reflecting the biases of its workers and the institutional imperatives of the corporation. Imposing human editorial values onto the lists of topics an algorithm spits out is by no means a bad thing – but it is in stark contrast to the company's claims that the trending module simply lists 'topics that have recently become popular on Facebook.'
The report may also bolster criticisms that the site holds too much influence over what readers see.
A study by researchers affiliated with Facebook last year found that the site's algorithm had relatively little influence on how much "cross-cutting material" – news representing the views of a user's friends with opposing politics – reached readers.
As the Monitor's Pete Spotts reported at the time, this algorithm reduced the cross-cutting content by slightly less than 1 percent, while a user's own bubble of friends reduced such content by about 4 percent, the Facebook study found. But some researchers were skeptical of the idea that the algorithm was relatively blameless.
"Selectivity has always existed. But now we're living in different world," Dietram Scheufele, who specializes in science communication at the University of Wisconsin at Madison, told the Monitor. Facebook "is enabling levels of selectivity that have never been possible before," he added.
Since then, those concerns have mounted. In April, the site's head, Mark Zuckerberg, delivered harsh criticism during the company's annual developer conference that many took to be a shot at presidential candidate Donald Trump.
"I hear fearful voices calling for building walls and distancing people they label as others, for blocking free expression, for slowing immigration, reducing trade.... Instead of building walls, we can build bridges," he said.
Mr. Zuckerberg's comments, along with a question from an internal poll of Facebook employees that reportedly read, "What responsibility does Facebook have to help prevent President Trump in 2017?" led to questions about whether the site would try to manipulate the outcome of the election through what users see in the news feed.
Mr. Trump himself dismissed the speculation, telling Fox News, "I’m very successful on Facebook. Somebody said I'm one of [Facebook's] great stars – so ... I don't think they'll be doing very much," while chief operating officer Sheryl Sandberg told the Indian news site NDTV in 2014 that "Facebook would never try to control elections."
But some media critics were alarmed. "Facebook essentially acts as a utility, this isn't like a newspaper, this isn't like the Boston Globe publishing a fake page criticizing Donald Trump," Northeastern University journalism professor Dan Kennedy said on PBS' Beat the Press on April 29.
"This is more like the phone company acting in a way that – Donald Trump sets up a phone bank to try and reach potential donors and for some reason their calls don't go through ... regardless of what you think about Trump, this is potentially very dangerous," Professor Kennedy added.