Libya, Egypt riots: Can anti-Islam speech be shut down?

An attack on the US consulate in Libya has drawn widespread attention to an anti-Islam film that enraged rioters. But can – and should – the circulation of this type of material be stopped?

September 12, 2012

Tuesday's deadly attack on the US consulate in Benghazi, Libya, that ended with the death of US Ambassador J. Christopher Stevens has drawn widespread attention to an anti-Islam film that enraged rioters on the scene. As the Monitor's Dan Murphy notes, the situation is reminiscent of the riots over the Muhammad cartoons published in 2005 by Denmark's Jyllands-Posten newspaper. He writes:

In some ways, it was the beginning of an era of manufactured outrage, with a group of fringe hate-mongers in the West developing a symbiotic relationship with radical clerics across the East. The Westerners deliberately cause offense by describing Islam as a fundamentally violent religion, and all too often mobs in Muslim-majority states oblige by engaging in violence.

He correctly notes that the film's authors "cannot be blamed for the violence … that blame goes to the perpetrators." Still, some may argue that the best way to end this vicious cycle of hateful message-and-response is to stifle the message. But – putting aside the question of whether this is the best course of action – is it even a legal option? Can the US government act to stop the circulation of offensive material like this? Can someone else?

The answer in the American system of government is simple: no. Any attempt by the federal government or a US state government to silence speakers like the filmmakers – regardless of how repellent their message might be – would be a restraint on their freedom of speech and a violation of the First Amendment. Even were the government to pass a law against hate speech, the First Amendment would still trump such a law and render it unconstitutional.


But while the Bill of Rights leaves no room for a hate speech law in the US, the situation is quite different in Europe. While free speech is indeed protected in Europe both by the European Union and national governments, generally speaking those protections do not extend to hate speech. Many European nations, including France, Germany, and Britain, forbid expressions of hatred against a person or group of people based on their race, religion, gender, nationality, sexual orientation, and the like. As such, much of Europe would be able to crack down on messages like the film at issue now, based on its clear anti-Muslim agenda.

There is another set of entities that could act to stymie circulation of these hate messages, and indeed are well placed to do so: the websites hosting the offensive content. The First Amendment and other free speech protections generally apply only to state actors, and have no effect on private actors like Google, for example, which owns YouTube, where clips of the offending film have been posted. Should Google or other Web hosts be so inclined, they are free to decide which content they publish and which they remove without fear of running afoul of free speech protections.

A hosting website may be constrained in what it can do with a user's content under the site's terms of use, which is a kind of contract between the host and the user. But host sites often explicitly warn in their terms of use that users are not allowed to post certain kinds of content: see, for example, YouTube's express prohibition of hate speech. And many sites reserve the right to remove any user content at all, for any reason. So YouTube could remove the offending film clips from its website if the company so desired, and the filmmaker would have no recourse.

Still, web hosts are under no obligation to police their users' content, and indeed, were they to do so, it could impair the ability of individuals to express themselves. Again, take YouTube: The company is the largest streaming-video host in the world. If YouTube began aggressively censoring content according to what its board or management thought was morally right, users with different views would be left without an outlet to publish their content. And while that may be acceptable to the general public in cases of hate speech, it might not be quite so acceptable if YouTube decided that speech on a topic with larger support – say, gay marriage – ought to be censored.