‘Does Facebook reward outrage?’ What leaked papers show the company knew.

Facebook whistleblower Frances Haugen gives evidence to the joint committee for the Draft Online Safety Bill, as part of government plans for social media regulation, in London, Oct. 25, 2021. Ms. Haugen says Facebook is making extremism worse and has outlined how it could improve online safety.

Annabel Moeller/UK Parliament/AP

October 29, 2021

Facebook, the social media behemoth with more than 2.8 billion users worldwide, has been weathering storms of criticism ever since founder Mark Zuckerberg launched it in 2004 as an online project connecting college students.

But the 17-year-old company newly rebranded as Meta – which also includes Instagram and WhatsApp, with their billions of global users – is now confronting an unprecedented level of scrutiny after a former Facebook employee secretly copied tens of thousands of pages of the company’s internal research. These appear to show the company has been well aware that its platforms contribute to a host of social ills, even as its profits reach new heights.

This trove of private information, which has been dubbed the Facebook Papers, reveals how the company’s own studies found its platforms damage the well-being of teenage girls, how its algorithms seize on human rage and foment the spread of misinformation and civic strife, and how it has enabled human trafficking and ethnic violence in countries in which it makes little effort to moderate content.

Why We Wrote This

Many companies have put short-term profit over long-term values. The Facebook whistleblower says no one at her former company was “malevolent,” but the misalignment of incentives, she and others say, has led to social discord.

As a result, Congress, federal regulators, and a consortium of news organizations have been combing through these documents as lawmakers consider how to rein in the company’s outsized influence over the flow of global information.

“The Facebook Papers are forcing a rare level of organizational transparency that has heretofore been missing from the public debate about the potential benefits and harms of these social media platforms,” says Matthew Taylor, a professor at the School of Journalism and Strategic Media at Middle Tennessee State University. “We should be asking, why is there such a lack of transparency? Coca-Cola can keep its secret formula, but we still know enough about the product to study its effect on public health. We can’t say the same for Facebook.”


What are the Facebook Papers?

Earlier this year, data scientist Frances Haugen, who worked on Facebook’s civic integrity team, grew alarmed at the way the company chose to optimize its profits instead of addressing the harms it knew it was helping to cause.

“When we live in an information environment that is full of angry, hateful, polarizing content it erodes our civic trust, it erodes our faith in each other, it erodes our ability to want to care for each other,” Ms. Haugen told “60 Minutes.” “The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”

So she started to copy thousands of pages of company documents before she left. She realized, she said, “I’m gonna have to do this in a systematic way, and I have to get out enough that no one can question that this is real.” 

At first an anonymous whistleblower, Ms. Haugen turned these papers over to the Federal Trade Commission. Her legal team also provided a redacted version to Congress and the news media.

“Good faith criticism helps us get better, but my view is that we are seeing a coordinated effort to selectively use leaked documents to paint a false picture of our company,” Mr. Zuckerberg said this week. “The reality is that we have an open culture that encourages discussion and research on our work so we can make progress on many complex issues that are not specific to just us.”


What do the Facebook Papers reveal?

In one study first reported by The Wall Street Journal, Facebook researchers found that its platforms “make body image issues worse for 1 in 3 teen girls,” a company document said. On Instagram, company researchers found, 13.5% of teenage girls reported the platform makes thoughts of suicide and self-injury worse, and 17% said it makes eating disorders such as anorexia worse. 

Facebook officials stand by the study’s findings but argue that they have been taken out of context, “cherry picked” from a host of other company studies that found positive effects of Instagram on the well-being of teenage girls.

In other studies, Facebook documented how its platforms spread misinformation. In a 2019 study, “Carol’s Journey to QAnon,” researchers designed an account for a fictional 41-year-old mom who followed the Facebook pages of Fox News, former President Donald Trump, and other conservative figures. Within just two days, Facebook’s algorithm recommended that she join a page devoted to the conspiracy theory QAnon.

After the Jan. 6 insurrection, one Facebook employee posted on an internal message board, “We’ve been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control.” 

While Facebook did invest considerable time and effort in removing Stop the Steal groups and others rooted in conspiracy theories, its efforts in other countries have not measured up, company documents suggest.

When its platforms were being used to traffic maids in the Mideast two years ago, Apple threatened to pull Facebook and Instagram from its App Store. In company documents, Facebook acknowledged that it was “under-enforcing on confirmed abuse activity,” the Associated Press reported.

How does Facebook’s business model make such problems worse?

More fundamentally, the Facebook Papers raise questions about social media’s underlying business model, which provides a “free” service in exchange for the kind of intimate self-revelations that become a gold mine for marketers.

By relentlessly analyzing user behavior online, the company in effect attempts to predict the kind of information each user is most likely to engage with, and thereby keep them on the site longer. This kind of behavioral analysis has revolutionized advertising, allowing sellers to pinpoint the users most likely to make a purchase.
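In rough terms, that loop can be pictured as scoring each candidate post by its predicted engagement and sorting the feed accordingly. The sketch below is a hypothetical illustration of the idea; none of the names, model details, or numbers come from Facebook’s actual systems.

```python
# A hypothetical sketch of engagement-optimized feed ranking.
# Nothing here is Facebook's real code; the only idea taken from the
# reporting is "predict what a user will engage with, show that first."

from dataclasses import dataclass

@dataclass
class Candidate:
    post_id: str
    p_click: float    # model's predicted probability of a click
    p_comment: float  # model's predicted probability of a comment

def rank_feed(candidates: list[Candidate]) -> list[Candidate]:
    """Order candidate posts by expected engagement, highest first."""
    return sorted(candidates, key=lambda c: c.p_click + c.p_comment, reverse=True)

feed = rank_feed([
    Candidate("vacation_photos", p_click=0.10, p_comment=0.02),
    Candidate("outrage_headline", p_click=0.35, p_comment=0.20),
])
print([c.post_id for c in feed])  # the provocative post ranks first
```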

But the means by which it does so often appeal to the darker sides of human nature.

“One of the consequences of how Facebook is picking out that content today is it is optimizing for content that gets engagement, or reaction,” Ms. Haugen told “60 Minutes.” “But its own research is showing that content that is hateful, that is divisive, that is polarizing – it’s easier to inspire people to anger than it is to other emotions.”

“No one at Facebook is malevolent, but the incentives are misaligned, right?” she continued. “Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume.”

Scholars have used the term “affective engagement” to describe the algorithmic strategies that discover and then amplify the kinds of information that keep users’ emotions aroused.

Five years ago, after Facebook introduced a suite of new emoji – anger, laughter, love, etc. – to buttress its famous Like button, it also programmed its algorithms to give these emotional reactions five times more weight than a simple like. 

Some employees wondered right away whether this new strategy would amplify conflict.  

“Quick question to play devil’s advocate: will weighting Reactions 5x stronger than Likes lead to News Feed having a higher ratio of controversial than agreeable content?” one employee posted on an internal message board. “I.e. if I post a story that I bought a coffee (pretty boring example I know) I might invite a few Likes from friends. However, if I post ‘Steve Bannon Punches Hillary’ I’ll probably get more polarized reactions with Angry emojis and thus (5x?) more distribution.”
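Taken at face value, the weighting the employee describes is simple arithmetic. Here is a minimal, hypothetical sketch of it; the 5x multiplier comes from the reporting, but the function, field names, and sample counts are invented for illustration.

```python
# Minimal sketch of the reported reaction weighting: emoji reactions
# counted five times as much as a Like when ranking posts. All names
# and sample counts below are hypothetical.

LIKE_WEIGHT = 1
REACTION_WEIGHT = 5 * LIKE_WEIGHT  # anger, laughter, love, etc.

def engagement_score(likes: int, reactions: int) -> int:
    """Combine raw interaction counts into a single ranking signal."""
    return likes * LIKE_WEIGHT + reactions * REACTION_WEIGHT

# The employee's thought experiment, with made-up numbers:
coffee_post = engagement_score(likes=30, reactions=0)   # "I bought a coffee"
outrage_post = engagement_score(likes=2, reactions=25)  # mostly Angry emoji

print(coffee_post)   # 30
print(outrage_post)  # 127 -> fewer total interactions, far wider distribution
```

Under a scheme like this, the angrier post wins distribution even when fewer people interacted with it at all, which is precisely the dynamic the employee flagged.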

In a 2018 internal document titled “Does Facebook reward outrage?” researchers found that the more negative comments a Facebook post drew, the more clicks its link received. “The mechanics of our platform are not neutral,” one staffer wrote, according to CNN.
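One simple way to picture the kind of analysis that document describes is to check whether posts with more negative comments also draw more link clicks. The sketch below is hypothetical: the sample numbers are invented, and only the question being asked comes from the reporting.

```python
# Hypothetical sketch of the question the 2018 document asked: do posts
# that draw more negative comments also draw more link clicks? The
# numbers below are invented purely for illustration.

from statistics import correlation  # Python 3.10+

negative_comments = [2, 8, 15, 40, 60]    # per post
link_clicks = [120, 210, 340, 900, 1400]  # same posts, same order

r = correlation(negative_comments, link_clicks)
print(f"Pearson r = {r:.2f}")  # a strong positive r would mirror the finding
```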

Some critics say Facebook’s focus on affective engagement has hurt the company itself in the long run.

“I guess what surprises me most is sort of the short-sightedness that you see here,” says David Richard, of the communications department at Emerson College in Boston. “Enabling circumstances for conflict – maybe in the short term it sells ads, but in the long term, it hurts the country, it hurts the general business climate and the economy, and in turn, that will hurt Facebook. That’s the surprise to me, that the ‘smartest people in the room’ didn’t see that.”