YouTube joins Twitter and Facebook to ban QAnon content
Oakland, California
YouTube is following the lead of Twitter and Facebook, saying that it is taking more steps to limit QAnon and other baseless conspiracy theories that can lead to real-world violence.
The Google-owned video platform said Thursday it will now prohibit material targeting a person or group with conspiracy theories that have been used to justify violence.
One example would be videos that threaten or harass someone by suggesting they are complicit in a conspiracy such as QAnon, which paints United States President Donald Trump as a secret warrior against a supposed child-trafficking ring run by celebrities and “deep state” government officials.
The QAnon phenomenon sprawls across a patchwork of secret Facebook groups, Twitter accounts, and YouTube videos. QAnon has also been linked to real-world harm, including reported kidnappings and dangerous claims that the coronavirus is a hoax. Although confined to the backwaters of the internet for years, QAnon posts reached millions of people via social media this year.
Pizzagate is another internet conspiracy theory – essentially a predecessor to QAnon – that would fall into the banned category. Its promoters claimed children were being harmed at a pizza restaurant in Washington, D.C. A man who believed in the conspiracy entered the restaurant in December 2016 and fired an assault rifle. He was sentenced to prison in 2017.
YouTube is the third of the major social platforms to announce policies intended to rein in QAnon, a conspiracy theory they all helped spread.
Twitter announced a crackdown on QAnon in July, though it did not ban its supporters from the platform outright. It did ban thousands of accounts associated with QAnon content and blocked related URLs from being shared. Twitter also said it would stop highlighting and recommending tweets associated with QAnon, and that it expected the measures to reduce the visibility of more than 150,000 accounts worldwide.
Facebook, meanwhile, announced last week that it was banning groups that openly support QAnon. It said it would remove pages, groups, and Instagram accounts for representing QAnon – even if they don’t promote violence.
The social network said it will consider a variety of factors in deciding whether a group meets its criteria for a ban, including the group’s name, its biography or “about” section, and discussions within the Facebook page or group or the Instagram account. (Facebook owns Instagram.) Mentions of QAnon in a group focused on a different subject won’t necessarily lead to a ban. Administrators of banned groups will have their personal accounts disabled as well.
Facebook’s move came two months after it announced a softer crackdown, saying it would stop promoting QAnon groups and their adherents. But that effort faltered because of spotty enforcement.
YouTube said it had already removed tens of thousands of QAnon videos and eliminated hundreds of channels under its existing policies – especially those that explicitly threaten violence or deny the existence of major violent events.
“All of this work has been pivotal in curbing the reach of harmful conspiracies, but there’s even more we can do to address certain conspiracy theories that are used to justify real-world violence, like QAnon,” the company said in Thursday’s blog post.
Experts said the move shows that YouTube is taking threats around violent conspiracy theories seriously and recognizes the importance of limiting the spread of such conspiracies. But with QAnon increasingly creeping into mainstream politics and U.S. life, they wonder if it is too late.
“While this is an important change, for almost three years YouTube was a primary site for the spread of QAnon,” said Sophie Bjork-James, an anthropologist at Vanderbilt University who studies QAnon. “Without the platform, Q would likely remain an obscure conspiracy. For years YouTube provided this radical group an international audience.”
This story was reported by The Associated Press.