Facebook will work with Germany to crack down on racist posts. What now?

Questions about the social media site's policies on hate speech remain as it announces it will work with German officials to curb posts directed at migrants.

A Facebook login page, displayed on a computer screen in Ottawa, Canada, in 2009. Adrian Wyld/AP file

September 15, 2015

As a debate rages in Europe over a growing tide of xenophobic social media posts directed at migrants, one key question emerges: Are users participating in an impassioned political debate about the refugee crisis, or simply expressing racist views?

That distinction came into focus on Monday, when Facebook announced that it would work with the German Justice Ministry to crack down on racist and xenophobic posts on the social media site.

In a joint news conference with the ministry, Facebook executives said they would form a task force to examine posts flagged by the site’s users as racist or xenophobic. The task force would determine whether such posts were protected as free speech or violated local laws, which prohibit hate speech directed against a person or a group because of their ethnic or religious background. The offense is punishable by up to three years in prison.


“The idea is to better identify content that is against the law and remove it faster from the Web,” German Justice Minister Heiko Maas said on Monday, according to the Wall Street Journal.

But the US-based company said it did not plan to revise its existing policies on what types of posts are allowed on the site. Currently, the site’s community standards prohibit posts that attack a person’s religious affiliation; racial, ethnic, or national identity; sexual orientation; or disability.

A key concern, the site says, is to ensure that its policies do not stifle political debates. Currently, Facebook allows the exchange of “satire, humor and social commentary” related to what users may consider hate speech, noting that one key goal of the site is to allow its users to “challenge ideas, institutions, and practices.”

Because free speech laws in Europe differ drastically from those in the US – and even more so in countries such as China or Egypt – defining what constitutes a “political debate” on social media sites that are used all over the world is often difficult, observers say.

“This is a complex issue,” says Jillian York, director of international freedom of expression at the Electronic Frontier Foundation, in an e-mail to the Monitor. “Personally, I don’t think that companies should be in the business of regulating speech.”


Ms. York says Facebook’s current standards are somewhat ill-defined. Using algorithms or similar automated technology to screen posts online has the potential to filter out legitimate speech, she notes.
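York’s concern about automated screening can be illustrated with a deliberately crude sketch. The Python snippet below is purely hypothetical (the watched terms and sample posts are invented, and nothing here reflects Facebook’s actual moderation systems), but it shows how a context-blind keyword filter sweeps up a local news update and a volunteer appeal along with an abusive post:

# Illustrative sketch only: a toy keyword filter of the kind York warns about,
# not Facebook's moderation system. The watched terms and sample posts are invented.

FLAGGED_TERMS = {"migrants", "refugees", "deport"}

def is_flagged(post: str) -> bool:
    # Flag a post if it contains any watched term, with no regard for context.
    words = {w.strip(".,!?").lower() for w in post.split()}
    return bool(words & FLAGGED_TERMS)

sample_posts = [
    "Deport them all, they don't belong here.",               # abusive
    "The council votes tonight on housing for refugees.",     # legitimate local news
    "Volunteers needed to welcome migrants at the station.",  # legitimate appeal
]

for post in sample_posts:
    label = "FLAGGED" if is_flagged(post) else "ok"
    print(f"{label:7} | {post}")

All three posts trip the filter – precisely the over-filtering of legitimate speech York describes. Distinguishing the first post from the other two requires human judgment about context, not keyword matching.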

“In this case, I think Facebook's statements – though not necessarily their position – are wrong,” she adds. “Whether or not you think hate speech should be banned, there's no ‘legitimate debate’ that includes mocking and saying hateful things about refugees, full stop.”

But exactly what types of posts should be removed, and how much responsibility falls to Facebook and other sites to remove speech flagged by users as racist or xenophobic, is still an open question, as the Monitor previously reported in July.

During the meeting on Monday, Facebook executives argued they should not be responsible for removing posts that are not prohibited by either German law or Facebook’s own policies. The site currently has a team of German speakers that removes posts that are found to be in violation, the Wall Street Journal reported.

The site must also balance free speech concerns with the urgency of Germany’s refugee crisis.

The country is expected to receive 800,000 applications for asylum this year, and attacks on refugee centers and anti-refugee demonstrations are also increasing dramatically. Officials said there were about 200 acts of violence directed against migrants in the first six months of this year, more than in all of last year, the Los Angeles Times reported in August.

“It’s not like Facebook can go and say there’s a definitive right answer, but can they innovate on their platform to focus on a particular country’s needs,” says Daniel Castro, vice president of the Information Technology and Innovation Foundation, a technology-focused think tank based in Washington.

“I think it becomes less about regulating speech, and more about ensuring that the platforms are being used for good,” Mr. Castro says, noting that Facebook could introduce more specialized tools to hide comments that may be offensive or to limit them to specific users rather than leaving them open to all. “The more control you give users, the more ambiguity and flexibility you allow, and I think that’s a good thing.”
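A minimal sketch of what such user-side controls might look like follows (hypothetical Python; the field names and rules are assumptions for illustration, not a real Facebook feature or API):

# Illustrative sketch only: a hypothetical per-user visibility rule of the kind
# Castro describes. Field names and logic are invented for this example.
from dataclasses import dataclass
from typing import Optional, Set

@dataclass
class Comment:
    author: str
    text: str
    audience: Optional[Set[str]] = None   # None means visible to everyone
    reported_as_offensive: bool = False   # set once enough users report it

def visible_to(comment: Comment, viewer: str, viewer_hides_reported: bool) -> bool:
    # Each viewer's own settings decide what they see; nothing is deleted centrally.
    if comment.reported_as_offensive and viewer_hides_reported:
        return False
    if comment.audience is not None and viewer not in comment.audience:
        return False
    return True

c = Comment(author="alice", text="...", audience={"bob", "carol"})
print(visible_to(c, "bob", viewer_hides_reported=True))   # True: in the audience
print(visible_to(c, "dave", viewer_hides_reported=True))  # False: limited to specific users

The point of the sketch is the shape of the control, not the rules themselves: decisions about what to see stay with each viewer rather than in a central takedown queue.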

During Monday’s meeting in Berlin, executives from the site also announced that Facebook would provide financial support to groups that collect examples of online hate speech and launch a campaign to encourage online speech that counters hate, the Journal reported.

But York of the Electronic Frontier Foundation, who is based in Berlin, wondered about the ultimate impact of a collaboration between government and the private sector to curb racist posts online.

“I think we need to be having a conversation about why we think corporations are best tasked with regulating speech,” she says by e-mail. “In Germany, it's particularly interesting – Facebook bans nudity (which you can see on any public street here) but fails to take down ‘hate speech’ that the government wants it to. What does that mean for a society?”