Is Germany’s bold new law a way to clean up the internet or is it stifling free expression?
Berlin
It was a seemingly innocuous tweet: The police in the German state of North Rhine-Westphalia (NRW) were extending New Year’s greetings to residents. In addition to a missive in German, the department sent its well wishes in French, English, and Arabic.
The last one didn’t sit well with Beatrix von Storch, a member of the far-right Alternative for Germany party. “What the hell is wrong with this country? Why is the official police page in NRW tweeting in Arabic?” she asked in a tweet of her own. Harking back to New Year’s Eve 2015, when groups of young men, many of whom authorities described as immigrants from North Africa, sexually assaulted women during the holiday revelry in Cologne, Germany, she wrote: “Do you think it is to appease the barbaric, gang-raping hordes of Muslim men?”
Several hundred kilometers south, editors at the satirical magazine Titanic saw the tweet and seized an opportunity to mock the politician. They changed their profile picture on Twitter to an image of Ms. von Storch and began parodying her. “The last thing that I want is mollified barbarian, Muslim, gang-raping hordes of men,” they tweeted.
On Jan. 1, Twitter suspended the accounts of both von Storch and Titanic. Von Storch posted the content of her tweet on Facebook, and that post was deleted, too. It was the first day of enforcement of a new German law forcing social media companies to promptly remove hate speech and other illegal content posted on their networks.
In the struggle to deal with the explosion of abusive and hateful content on social media sites, Germany is staking out one of the most aggressive and far-reaching positions of any country in the world – and is being closely watched as a result.
The new law, known as NetzDG in Germany, requires large social media companies to remove illegal content from their sites, in most cases within 24 hours, or face heavy fines. Proponents say it’s a much-needed effort to bring Germany’s laws on hate speech, written decades before the internet existed, in line with modern realities and to curb the rampant proliferation of abusive content.
Yet the move has provoked a strong outcry, both from free speech advocates and political parties across the ideological spectrum in Germany. Critics charge that the law endangers freedom of expression by outsourcing censorship to unaccountable private companies and will lead to a purging of content that isn’t illegal. The New Year’s Day saga with von Storch and Titanic encapsulates the controversy swirling around the law.
“Right there, you’ve got some of the core kinds of speech that you often presume would be protected in democratic societies – political speech and satire,” says Emma Llansó, director of the Free Expression Project at the Center for Democracy and Technology in Washington, D.C. “Some of this content may have very well violated German law, but that is something for German courts to decide. It’s not something that private companies are competent to decide.”
Germany highlights just one dimension of rising global concern over social media content. Around the world, governments and citizens are worried about how the platforms have become a home for not only hate speech, but also “fake news,” invasions of privacy, and voter manipulation. Look no further than the investigations in Washington over whether Russia used social media to try to influence the 2016 presidential election, and the recent outcry in the wake of news that a consulting firm linked to Donald Trump’s presidential campaign harvested data on some 50 million Facebook users.
Yet it is the success or failure of Germany’s law that may well set the tone for what other countries around the world do in regulating social media, at least on hate speech. Several European countries are already considering similar plans, while some authoritarian governments are citing Germany’s move as justification for their own crackdowns on freedom of expression.
More than anything, NetzDG illustrates one of the most fundamental tensions facing societies in the Digital Age: how to curb online hate speech and abusive content without trampling freedom of expression.
“In Germany we have made a decision, especially with our history, that you cannot spread hatred against ethnic minorities,” says Stefan Heumann of the Berlin-based Foundation for a New Responsibility, which studies the intersection of technology and society. “We don’t accept that. We have laws outlawing that.... It is important to enforce those in the online world as well. Most of the public actually supports that those laws should be enforced online. Then the big question is, how do you do that?”
***
The story of how NetzDG was created begins in the summer of 2015, at the height of the flow of refugees into Europe. A sense of optimism pulsated throughout Germany’s southern state of Bavaria, especially in the Munich train station. It was the physical entry point for most of the newcomers and symbolized the vaunted Willkommenskultur, or “welcome culture,” that blossomed across the country. Yet a profound unease existed as well. Arsonists began attacking refugee housing centers, and other opponents of the great migration flooded Facebook with abusive comments.
One refugee, Anas Modamani, arrived that summer from Syria along with the rest of the masses. Mr. Modamani would have remained just an anonymous case file among the 1.3 million other asylum-seekers who registered with the German government, except for one seemingly innocuous, spontaneous act: He took a “selfie” with German Chancellor Angela Merkel when she visited his refugee shelter in September 2015.
At first, Germans proudly circulated the photo over social media as a sign of the country’s spirit of benevolence: Ms. Merkel smiling in a baby blue blazer, Modamani exuding a look of serenity. But immigration critics quickly picked up the image and used it for their own purposes, mainly to attack Merkel but tarnishing Modamani in the process.
The photo appeared on Facebook after a group of young refugees from Syria and Libya set a homeless man on fire in Berlin, accompanied by false claims that Merkel had posed with one of the perpetrators. After the bombings at the Brussels airport and a subway station in March 2016, Modamani’s face appeared again, this time beneath a caption claiming that Merkel had taken a selfie with a “terrorist.” “I cried when I saw it,” Modamani told Al Jazeera.
Later, an altered image showed him in the foreground of a photo of the truck used to kill 12 people at the Berlin Christmas market in 2016. “I want to live in peace in Germany. I fled from the war and bloodshed in Syria to live in safety,” he told the news outlet. “I was too afraid to leave my house after I saw what people wrote about me. This is not just my problem. It’s a problem of our time.”
Chan-jo Jun, a fast-talking lawyer who zips around his sleek offices in the postcard-perfect town of Würzburg, north of Munich, thought he saw a clear legal answer to the problem of hateful words and falsified information posted on the internet.
Stopping the posts about Modamani, he reasoned, should have been as simple as notifying Facebook that they were illegal under German law. The photos were clearly altered, the rumors clearly false, and they were invading Modamani’s privacy. “When management has positive knowledge of a specific case, they have to act or else they would be personally liable,” says Mr. Jun. “I thought if I could just demonstrate once that everyone should just report illegal content to [Facebook] management, they’d have to react or else be punishable by law.”
But each time he asked the company to take down the photos he got the same response: The images didn’t violate Facebook’s “community standards,” guidelines that the company has set up to define what is offensive and therefore removable.
He concluded the company didn’t care about criminal law in Germany and instead looked at him as a nuisance – “some stupid lawyer in Würzburg,” as he puts it. He took Facebook to court in a case that turned into one of the most closely watched in Germany. In March 2017, Jun lost the trial, but he believes it helped bring the problem of hate speech to the attention of lawmakers, who, as passions flared over migration, were becoming the target of vitriolic attacks themselves. Suddenly, they could empathize with a refugee who was being maligned by postings that a private company refused to take down.
“There was lots of anti-Muslim rhetoric and violent speech against politicians,” says Mr. Heumann of the Foundation for a New Responsibility. “The law in part came because politicians were so personally affected by it.”
Because of its history, Germany has long had strong regulations on hate speech. Inciting hatred against a religious or ethnic group is illegal, as is assaulting the human dignity of others.
Jun thinks his loss ultimately led to a recognition among lawmakers that Germany’s laws against hate speech, created decades before the internet existed, aren’t tough enough to police the social media landscape.
“It was better with a lost case than a won case,” he says, “because otherwise people would say, ‘You just have to go to court to get justice. They don’t have to change anything. You just keep suing Facebook.’ ”
Facebook’s lack of responsiveness in removing material that users had flagged spurred lawmakers to act, too.
“Facebook dug its own pit by completely ignoring every problematic behavior on their platform,” says Matthias Spielkamp, founder and executive director of AlgorithmWatch.org, a Berlin-based advocacy group that seeks to bring accountability to automated decision-making systems. Mr. Spielkamp doesn’t support the law but believes Facebook’s “arrogance” ultimately backfired. “Now this is quite a mess,” he says, “because what in the end resulted was a law that no one is really happy about. Not even the government.”
***
NetzDG, passed by the German parliament last June, requires social media companies with more than 2 million users to create a system to receive and respond to complaints of allegedly illegal content.
They have 24 hours to remove illegal postings in most cases and a week for more complicated evaluations. Under the law, they must also issue a public report on their actions every six months. Failure to comply can lead to fines of up to €50 million ($62 million). Deciding whether flagged content is illegal requires monitors to consult 22 provisions of Germany’s criminal code.
Last year Facebook added 3,000 people worldwide to its “community operations” team of 4,500, though a spokesperson said the increase was not in response to NetzDG. In Germany, the company has also outsourced monitoring to two companies in Berlin and Essen employing 1,200 people.
A Twitter spokesman declined to comment on how it was enforcing the law, but referenced a Human Rights Watch statement assailing the statute. A Facebook spokesperson said in a statement that the company has “devoted significant time and resources” to complying with NetzDG but criticized the law as well.
“We believe the best solutions will be found when government, civil society and industry work together and that this law as it stands now will not improve efforts to tackle this important societal problem,” said the statement. “We feel that the lack of scrutiny and consultation do not do justice to the importance of the subject. We will continue to do everything we can to ensure safety for the people on our platform.”
It’s not just social media companies that oppose the law: NetzDG has provoked a strong backlash in Germany by a wide variety of political groups, from the far-right Alternative for Germany (AfD) party to the liberal Free Democratic Party to leftist parties. Critics say the law will lead to over-censorship because of the combination of hefty fines and the short time window for companies to respond to complaints. Concern looms, too, that satirical and humorous content will get indiscriminately removed from sites.
Sophie Passmann knows something about what that feels like. She is a 24-year-old comedian who lives in Cologne. Every New Year’s Eve she becomes the butt of her friends’ jokes because of one of Germany’s most enduring year-end traditions: the airing of “Dinner for One” across German television. A British sketch, it’s about the 90th birthday party of an Englishwoman named Miss Sophie who has outlived all her friends. So her butler makes his way around the dinner table, impersonating each of the guests.
“It’s terrible for a person called Sophie,” says Ms. Passmann. “My friends think it’s funny to call me Miss Sophie all night.” So this year she shot back. On the day NetzDG went into effect, she posted a tweet at 9 a.m. that mocked the idea, often put forth by the far-right, that refugees were destroying German culture: “As long as it’s a tradition in Germany to watch ‘Dinner for One,’ refugees can totally come to Germany and destroy our culture.”
It was not her best joke, she concedes, but it should have been obvious that it was humor. By that evening, however, she’d received multiple notifications from Twitter that users found it offensive, and by nightfall the company had removed it. She says her parents don’t love her sense of humor, so she called them for their opinion. They agreed that it wasn’t the most adept one-liner. “But they understood it was a joke,” she says. Twitter should have, too, but its reflex is to err on the side of avoiding a fine, she believes.
Figuring out what is hate speech or likely to incite people to violence, and what is satire or humor, isn’t always easy. Ms. Llansó of the Center for Democracy and Technology in Washington says if you’re a company trying to manage risk and being forced to navigate more than 20 provisions of the German criminal code, you are going to be “fairly aggressive in your interpretation of what might violate the law.”
Another complaint is that there is no appeal process once material is removed and little explanation for why it was taken down. When Twitter suspended the account of Titanic magazine, staffers at the publication tried to contact Twitter but couldn’t reach a person. All they could do was send an email. Twitter reinstated Titanic’s account 48 hours later without explanation.
But perhaps the most serious criticism of the law is that it moves the responsibility for regulating speech from the government to private companies that have no accountability. This raises issues of transparency as well as definition: Speech that is illegal may be different from speech that violates a company’s community standards or terms of service. How do you know why something has been removed? Should it really even be removed?
“The law is a slap in the face for all democratic principles because, in a constitutional state, courts rather than companies make decisions about what is unlawful and what is not,” Sahra Wagenknecht, parliamentary leader of the radical Left party, told the German press.
Given all the criticism of NetzDG, German lawmakers are looking at ways to reform it only a few short months after it took effect. Some legislators want to create an appeal process for users whose content has been deleted. Many also want to establish an independent body that would assess complaints and decide whether to take down content, rather than have the companies do it themselves.
All this is important because other countries are looking at adopting their own version of NetzDG, which critics find particularly alarming.
The European Commission has already called for social media platforms to take more responsibility for content, while the British and French governments are developing plans to press social media companies to do more to identify and remove terrorist or hateful material.
But what concerns free speech advocates the most are moves by authoritarian or authoritarian-leaning governments. Russia, the Philippines, and Singapore have all cited Germany’s law as they have pressed forward with efforts to restrict speech.
While NetzDG certainly has its share of detractors, it has also had its salutary effects, even some critics admit. At the very least, says Spielkamp of AlgorithmWatch.org, it has made more Germans aware of the fundamental value of free speech.
“Not everything that harms your feelings needs to be taken down,” he says. “You have to accept in a democracy it is legal, and should be legal, [for others] to say things that you don’t agree with.”
The law has also spurred a deeper discussion about whether Germany needs to back off its historical sensitivity about hate speech. Not surprisingly, editors at Titanic are among those wanting to widen the guardrails on free expression.
The pages of the satirical magazine regularly offer a Rorschach test on where the country stands on the limits of political and provocative expression. The publication has been the subject of more than 40 libel, slander, and other lawsuits since it was founded in 1979. Moritz Hürtgen, social media editor at Titanic, who was behind the AfD parody, says the magazine’s target is always the powerful.
“The way we understand satire is that it always punches up but never down,” he says. That means major churches, companies, or politicians are fair game. Those without power – refugees, for example – are not.
Tim Wolff, Titanic’s editor in chief, believes modern Germany should stop reflexively shielding itself from ugly words and comments. He calls NetzDG “very post-Third Reich German.”
“It tries to hide ugliness instead of really getting rid of it,” he says. “It is shifting all the work to Twitter or Facebook instead of showing what is really going on in people’s minds.”
He draws a parallel between the law and Germany’s banning of “Mein Kampf” and denials of the Holocaust. “It always works that we are ashamed of that so don’t show it. It’s not getting to the core of the problem.”
Jun, the lawyer in Würzburg, has a different view. He says Germany is seeing far too much hate speech today – accepting behavior online that society would never tolerate in conventional public discourse. He believes online content should be elevated to – not allowed to bring down – the standards of acceptable speech.
“What has happened in the past 1-1/2 or two years is that people have come to the insight that complete freedom in social media is not a guarantee of freedom of society,” he says. “It is quite the contrary, that we may need to intervene and regulate social media, in order to protect our constitutional values.”