Europe criticizes Trump Twitter ban – but not for the reason you'd expect
Berlin
When Twitter banned Donald Trump, the chattering class in Europe began twittering.
The American president was stripped of his digital megaphone? Based on what criteria? And, ultimately, who should be able to decide?
German Chancellor Angela Merkel was clear: She found Twitter’s decision “problematic” – not because of the ban, but because of who imposed it. Rather than private companies having final say over the “fundamental right” to free expression, any such limits should come through “the law and within the framework defined by legislators,” her spokesman said.
Why We Wrote This
Whose responsibility is it to decide when someone should be banned from social media: private enterprise or public authorities?
Tyson Barker, head of technology policy at the German Council on Foreign Relations, checked the Twitter account of Germany’s most prominent far-right nationalist. Still active. “The platforms’ approach is being applied selectively,” he says, “and at their heart these are political decisions.” Better, he argues, would be an agile, trust-based regulatory ecosystem that includes government, civil society, and users checking each other.
In 2017, the German government took a first step toward such a regulatory system, passing a law that compels social media platforms to identify and remove “illegal” content. A few years into the most ambitious attempt yet by a Western democracy to police online content, it remains unclear whether the move has civilized online discourse or simply pushed dark speech off moderated platforms and into closed groups on services such as Telegram.
Yet with Mr. Trump’s social media bans animating debate, the German law under continual revision, and a Europe-wide version now before the European Parliament, a long-overdue global conversation is finally underway.
“We’re playing catch-up to 15 years of internet development,” says Lisa Dittmer, internet freedom advocacy officer for Reporters Without Borders. “It’s ridiculous it’s taken this long. This issue has been pushed to the forefront in a huge way now. The positive side is we’re starting this debate of how to democratize internet spaces.”
How they do it in Germany
Under Germany’s Network Enforcement Act, large social media platforms such as Twitter and Facebook must allow users to flag content. Once content is flagged, the onus is on the platforms to review it and, within 24 hours, remove whatever they deem “illegal” under Germany’s criminal code.
In other words, German lawmakers set the legal framework, and social media platforms must operate within it.
“Germany basically says government should regulate speech, not private companies,” says Ian Rosenberg, an American media lawyer and author of “The Fight for Free Speech: Ten Cases That Define Our First Amendment Freedoms.” “That is absolutely opposite to the American constitutional tradition, which is that the government should be prohibited from regulating private speech. They are totally antithetical to each other.”
It’s worth noting that Germans are traditionally more comfortable than Americans with state limits on speech, given the country’s dark history with Nazi-era propaganda and hate speech.
The German law targets sedition, public incitement, terrorism-related content, and the symbols of unconstitutional organizations as “illegal.” Platforms have hired hundreds of new moderators to review flagged content, and must publish transparency reports every six months.
The major platforms are largely complying. During the first half of 2020, Twitter removed or blocked 120,000 tweets. Facebook’s compliance has been less transparent: In 2019 it was fined for incomplete reporting, and that year it deleted or blocked only 349 pieces of content.
Facebook’s flag mechanism wasn’t “user-friendly,” says Amélie Heldt, a Berlin-based researcher on platform governance. “Users simply could not see or understand” how to make a complaint.
But it doesn’t always work
While analysts say platforms are largely attempting to comply, the climate of online conversation does not appear to have become more civil. The law also has clear problems.
First, 24 hours is hardly enough time to act prudently, and platforms face practically no downside to removing more content than necessary in order to avoid fines. That creates “an incentive for overblocking, for removing content that, when further assessed, is in fact legal,” says Jana Gooth, a legal policy advisor in the European Parliament.
Second, the law compels platforms to do the job of German law enforcement in deciding what is “illegal,” a particularly grave responsibility, since the government does not systematically review those decisions, says Stephan Mundges, a digital communications researcher at TU Dortmund University.
At the same time, the law gives users no way to appeal removals, a failing that the Europe-wide Digital Services Act, now before the European Parliament, attempts to address.
Further, many of the conspiracy theories being shared, such as climate change denial or false claims about the coronavirus, do not constitute illegal speech under German law. German nationalist Attila Hildmann, for example, still has an active Twitter account, despite having organized protests against coronavirus restrictions, ranted against Ms. Merkel’s legitimacy, and spoken at an August 2020 rally from which people attempted to breach the Reichstag, the German parliament building.
In fact, the vast majority of removed posts are taken down not under German law, but under Twitter’s and Facebook’s own community standards, which do not depend on user complaints.
Some critics fear the German law sets a bad precedent for authoritarian governments. A Danish think tank reported in 2019 that at least 10 countries with varying levels of civil rights protections, including Russia, Vietnam, and Kenya, have directly or indirectly cited the German law as justification for their own online censorship rules.
Meanwhile, dark speech is clearly migrating to unmoderated messaging services such as Signal and Telegram. Mr. Hildmann, for example, often uses Twitter to direct his followers to Telegram, where groups can have up to 200,000 members and content is not moderated.
The debate over who should have the responsibility to regulate online content, and how they should exercise it, has a complicated path ahead of it, say internet experts.
“It’s not clear cut,” says Mr. Rosenberg, the media lawyer. “It’s not like the European model has eliminated hateful discrimination or violence against minorities. It can be a good thing [that Americans] don’t allow government to regulate speech. Both systems have something to learn from each other.”
Ultimately, societies must address the roots of extremism, says Ms. Dittmer of Reporters Without Borders. “This is also a societal problem, not just a digital policy issue,” she points out. “You also need to ask questions: Why do we have extremism, and how can we reach people before they fall into these networks?”