Will Facebook's 'trending' kerfuffle alter the way we read news?

Facebook said it had found no evidence of 'systematic political bias' in its trending news feed. But it is changing how it chooses news, offering insight into its algorithm. 

[Photo: A Facebook employee walks past a sign at Facebook headquarters in Menlo Park, Calif., in March 2013. The company said it had found no evidence of "systematic political bias" in its trending news feed, but a firestorm of controversy has also led to more transparency, some say. Jeff Chiu/AP/File]

May 25, 2016

As it negotiates a controversy about how it develops "trending" topics, Facebook is confronting thorny questions about political bias that have long dogged traditional news organizations.

But the Facebook "scandal" highlights anew an important issue in a democracy frequently facing political stalemate: What role do digital news sources play in bridging – or reinforcing – political polarization?

The social network responded to its conservative critics Monday with a series of changes to how it will choose trending news stories for its 1.1 billion daily users. It's not clear whether the changes to this increasingly important news platform will increase or reduce political bias.

Facebook's "Trending" news feed may also be less influential, say observers, than what actually appears in each users news feed. That's because, unlike the trending feed, a user's news feed is affected by both an algorithm and who their friends are, though how much influence that algorithm has is still disputed.

"At the end of the day, really I don't think this is accusing Facebook of anything. This personalization is all around us: your Netflix feed, on Twitter, Google searches alone, we’re surrounded by this kind of hyper-personalization," says Jon Keegan, a data journalist at The Wall Street Journal.

Mr. Keegan recently highlighted the issue of readers living in their own biased news bubble: drawing data from a Facebook study released last year, he created the "Blue Feed, Red Feed" project, a side-by-side look at what the news feeds of a conservative user and a liberal user might look like on controversial issues. What do news sources on each side say about the latest on transgender bathrooms, the Freddie Gray police trials, or the California primary, for example?

Users who identify as either very conservative or very liberal tend to drive political discussions, according to a 2014 Pew Research Center study, a trend that carries over to sites such as Facebook.

"I think that the design of Facebook's product makes this kind of difficult for people," Keegan says. "If say somebody was a conservative, and they wanted to broaden their perspective and include some [of the liberal magazine] Mother Jones in their mix of their news, they would have to 'like' that page." 

That's problematic, he says, because of the public shaming that could occur. The Pew study found that consistent liberals are more likely to "block" or "unfriend" people because of their politics, while conservatives are more likely to have mainly close friends who share their political views to begin with. 

The Facebook study the Wall Street Journal used to create its project has faced criticism of its own.

By focusing only on the small group of users who listed a political affiliation on the site, the company's researchers were able to suggest that users bore more responsibility for polarization than Facebook's own algorithm did, wrote Christian Sandvig, an associate professor at the University of Michigan, in a blog post last year.

Where does this leave Facebook and its supposedly liberal trending news feature?

On Monday, Facebook told a US senator that it had found no evidence of "systematic political bias" in how it selects trending stories, which are edited by human curators, but that it would still revamp how the feature works to eliminate the sway of individual biases.

In a 12-page letter to Sen. John Thune (R) of South Dakota, the company said it would "eliminate our reliance on external websites and news outlets to identify, validate, or assess the importance of trending topics." Instead, Facebook will rely solely on what users are sharing with one another, no longer verifying stories against a list of "trusted" news outlets that included Fox News and The New York Times.

Perhaps unwittingly, the controversy puts Facebook in a similar position to news organizations themselves, which often have to navigate accusations of bias, says Natalia Jomini Stroud, an associate professor at the University of Texas at Austin who directs the Engaging News Project.

"There are similarities here in how Facebook is responding – traditional news organizations also are responsive to their audiences and try to avoid bias in their reporting," she writes in an email to the Christian Science Monitor. But although the controversy will change how Facebook evaluates news sources, its impact is uncertain, she says. 

Facebook CEO Mark Zuckerberg met with prominent conservative leaders last week to address concerns that the site's trending feed was suppressing conservative news and pulling from a select group of media sources. But some conservatives argued the meeting didn't address the issue of the site's role as a media provider.

"It is being criticized not as a corporation, but in its chosen role as a media entity. Facebook curates the news; it is a news source for the vast majority of Americans," wrote The Federalist's Ben Domenech in a post on Friday, blaming "a corporate culture that failed to represent its community and left out the perspective of half the nation." 

TV host Glenn Beck, who attended the meeting, instead praised Mr. Zuckerberg for his "thoughtfulness," writing in a Thursday post on Medium: "I hope that they want to be open, but I will fight for their right to be who they want to be even if I do not like their decision."

Senator Thune, who had harshly criticized the site, said Monday's letter marked a step toward more transparency. "The seriousness with which Facebook has treated these allegations and its desire to serve as an open platform for all viewpoints is evident and encouraging, and I look forward to the company's actions meeting its public rhetoric," he said in a statement.

In its letter to Thune, the social network also acknowledged that, when it came to the trending news feed, it couldn't rule out "isolated improper actions or unintentional bias" in the way human curators select stories.

The changes, including revamping the terminology the site uses in its trending feed, were a first step, wrote Colin Stretch, Facebook's general counsel.

"We currently use people to bridge the gap between what an algorithm can do today and what we hope it will be able to do in the future," he wrote.

But Jonathan Koren, a former Facebook developer who says he worked on algorithmic ranking for Facebook from 2014 to 2015, was skeptical. "There's a popular belief that somehow if you just define and automate, and throw enough data [at] a problem, Data Mining / Machine Learning / Artificial Intelligence (or whatever we're calling it this week) will solve all our problems in a perfectly rational and objective manner," he wrote on LinkedIn.

Algorithms also aren't necessarily neutral: they depend on what information is fed into them, notes Professor Stroud.
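
To see how that can happen, consider a toy sketch with invented numbers; this is not Facebook's data or method, only an illustration of how a ranking rule with no explicit politics can still inherit the slant of whoever supplies most of its input.

```python
# Hypothetical engagement data -- invented numbers, not Facebook's.
engagement = {
    "story_left":  {"liberal_likes": 900, "conservative_likes": 50},
    "story_right": {"liberal_likes": 40,  "conservative_likes": 100},
}

def rank_by_total_likes(data):
    """Rank purely by total engagement. No political rule appears anywhere,
    yet the output mirrors whichever group contributed the most activity."""
    return sorted(data, key=lambda story: -sum(data[story].values()))

print(rank_by_total_likes(engagement))  # ['story_left', 'story_right']
```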

The controversy has, at minimum, shed more light on how Facebook chooses news for readers. "It does bring up big questions, and everybody is very curious how these algorithms work. I think everybody would be better off when these companies are as transparent as possible," Keegan says.

The Journal's "Blue Feed. Red Feed" tool is one way to push that process along by allowing users to directly compare what different users may be seeing, a sort of opposing-views button on a slew of controversial issues.

The paper added an option to see news on "Freddie Gray" after one of the six officers charged in his death was acquitted in Baltimore on Monday.

"I hope it starts a lot of good conversations with people," says Keegan. "It seems to have struck a chord with people that this phenomenon [of political polarization] is real, and Facebook is one of the places where it’s happening."