Facebook graphic content woes: When are beheading videos okay?
Family Online Safety Institute (FOSI) CEO Stephen Balkam can’t get the video out of his mind even though he only watched about half of it. He certainly doesn’t want his teenage daughter or any of the other 94 percent of American teen social media users who have Facebook seeing it either.
“The screen shot is a man holding up a woman’s [severed] head,” he says.
Facebook found itself in a tricky situation as a video of a man in a mask sawing off a woman’s head circulated on the social network, despite repeated requests to take it down. The ensuing outcry prompted Facebook to clarify its policy on explicit content, and highlighted the gray area between shareable and questionable content.
FOSI is one of Facebook’s safety advisors, a group of organizations that Facebook consults on safety issues from time to time. However, it was Mr. Balkam and FOSI who came to Facebook last May, when they were alerted to a gruesome beheading video making the rounds on the site, and requested it be taken down. Though Facebook said at the time that it would review its policy on explicit content, Balkam found this month that the video was on the site yet again – and the social network would not be taking it down.
“It was a surprise and not a good surprise,” he says.
Facebook initially said the second appearance of the video had been uploaded to condemn violence and therefore should remain online as protest. But a day later, the video was taken down, and Facebook released a “Fact Check” statement explaining that the video had been removed for its irresponsible glorification of violence. The site also said it would take a broader approach toward flagged content, allowing contentious graphic material to remain only when it condemns violence and is shared in a responsible way – meaning graphic content could still appear on the site.
So when are beheading videos okay?
A Facebook spokesperson says the process takes a broad range of factors into consideration. Whenever content is flagged, an employee at Facebook will assess it by looking at who uploaded the image or video, where on the site it was shared, and other contextual factors.
This is where things get a bit more confusing. For example, the origin of the beheading video is tough to discern because it was shared across much of the network, but it is believed to have been filmed in Mexico and linked to gang violence. The video was first posted in May and taken down by Facebook, then posted again in October and removed last Tuesday.
The key? Context. The Facebook spokesperson says the second posting was likely meant to dare Facebook into taking it down, as a way to highlight how graphic content is allowed on the site. At first, Facebook deemed this a form of condemning violence. After the outcry, however, the company decided the video was irresponsible.
Tarleton Gillespie, an associate professor of Communication at Cornell University, is working on a book about technology companies and content policies. He thinks the confusion stems from the enormous range of purposes Facebook serves, and the range of contexts that comes with them.
“Right now Facebook is supposed to be, wants to be, the way teens share their favorite interests, the way families keep track of each other, the way companies interact with customers, and the way political activists agitate for their causes,” Mr. Gillespie writes in an e-mail interview. “But it is extremely hard to be all of those things at the same time, and have coherent policies that fit.”
There are several examples of graphic content being used to condemn terrible events. A photo of the mangled legs of a man injured in the Boston Marathon bombing circulated on the social media site, and though it generated some controversy, it was largely seen as an example of the devastating effects of terrorism. In some parts of the world, Facebook is the only means people have to get content out beyond their community. In Syria, for instance, activists disseminate violent photos and videos in hopes of spreading word about the destruction in often-inaccessible corners of the conflict.
However, this clarification in content policy comes on the heels of two updates to Facebook’s privacy policy that could make future content questions more difficult. First, Facebook recently announced that everyone on the social network would be searchable via Facebook search, removing a potential layer of anonymity for those who wanted to enjoy the network with one group of friends but not everyone at once. Second, Facebook decided that images and updates from teens would no longer be limited to "friends of my friends" – teens now have the option to make their posts visible to all users. The site says both policy changes push users to be more responsible about what content they share with which audiences, rather than focusing on who has access to their content in general. But it is often difficult to tell where a post will end up, how it will be shared, or how far it will spread across the increasingly interconnected network, says Gillespie.
“Not only is it hard to know exactly who 'friends of my friends' are, but the mechanisms they've built for making a single post more or less private are arcane and confusing,” he writes. “So it is a little unfair to ask users to be more careful about who they post to while having made it much more difficult to tell who you're posting to.”
And seeing such content, even with a warning label, can be damaging, especially for teen users, according to recent research. Sahara Byrne, a professor in the department of Communication at Cornell University, has found that this content can have a “boomerang effect”: simple exposure to violent videos, even wrapped in warnings and condemnation, can promote that behavior in impressionable young minds.
With 1.1 billion people using the site, Facebook will have to keep finding ways to make the service work for all corners of its network.
“I just think that Facebook and other social media sites are having to contend with new forms of reporting that had not existed before on traditional medias,” Balkam says. “Particularly Facebook where you have over a billion users having a conversation. It is not surprising that there will be many, many fights over what is acceptable.”
[Editor's note: The original version of this story misstated Facebook's new policy toward posts from teenagers. Teens now have the option to share posts publicly.]