YouTube bans QAnon, other conspiracy content that targets individuals

YouTube said Thursday that it would no longer allow content that targets individuals and groups with conspiracy theories, specifically QAnon and its antecedent, “pizzagate.”

“Today, we are taking another step in our efforts to curb hate and harassment by removing more conspiracy theory content used to justify real-world violence,” the company announced on its blog.

The new rules, an expansion of YouTube’s existing hate and harassment policies, will prohibit content that “threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate,” the post read.

YouTube said it would be enforcing the updated policy immediately and plans to “ramp up in the weeks to come.”

YouTube’s move to rid the platform of QAnon content follows similar recent changes by other social media platforms. In July, Twitter removed QAnon accounts and restricted QAnon content. Last week, Facebook said it would remove groups, pages and Instagram accounts that identified with QAnon.

QAnon is a conspiracy theory that baselessly claims high-profile Democrats and Hollywood celebrities are ritually sacrificing children as part of a cabal that President Donald Trump is fighting. Online, QAnon followers relentlessly attack public and private figures they imagine to be part of the satanic cabal. Some followers have taken their violent fantasies into the real world, allegedly committing violent crimes, including murder, prompting the FBI in 2019 to label the movement a potential domestic terror threat.

Since 2018, YouTube has taken a series of enforcement actions and made policy changes to reduce QAnon content, relying on existing rules against incitement to violence and revealing private information. These moves led to the removal of tens of thousands of QAnon videos and the termination of hundreds of Q-related channels, YouTube said.

But, as YouTube CEO Susan Wojcicki told CNN this week, “a lot” of QAnon videos have escaped moderation thus far because they are considered “borderline content,” having not violated any specific policy.

“I think with every policy, it has to be defined very clearly. Like what does that exactly mean, a QAnon group exactly?” Wojcicki said. “That’s a kind of thing that we would need to put in terms of the policies and make sure that we were super clear.”

Though the new rules stop short of a blanket ban, YouTube expects them to have a significant impact on the remaining QAnon content.

QAnon followers have targeted celebrities, politicians and companies seemingly at random with accusations of pedophilia in videos posted to YouTube. Those videos are then so widely viewed that they saturate the top of search results with conspiracy theories, drowning out official sources of information and, for celebrities, even movie trailers and clips from TV shows.

QAnon followers have repeatedly made false accusations against the actor Tom Hanks, and searches for the actor on YouTube routinely return fantastical conspiracy theories. Hanks’ son, Chet Hanks, said on Instagram this week that he was “taking a break from social media for a while” because of abuse from QAnon followers.

Becca Lewis, a research affiliate at the University of North Carolina’s Center for Information, Technology, and Public Life who focuses on internet radicalization, said that YouTube’s new QAnon policy is much narrower in scope than Facebook’s outright ban, so “the actual impact is incredibly difficult to assess until we see how it’s enforced in practice.”

“On the one hand, it is certainly more aggressive than their current harassment or conspiracy theory policies,” Lewis said. “On the other hand, by only prohibiting conspiratorial content that specifically targets other individuals or groups, it may leave huge amounts of leeway for QAnon content to continue to thrive.”

A spokesperson for YouTube declined to comment on specific QAnon videos or creators that would be affected by the new policy. Some of the most viral QAnon content on YouTube was taken down Thursday, including a documentary-style QAnon film that had racked up more than 15 million views and entire channels dedicated to the conspiracy theory with millions of collective views and subscribers. The removed videos were replaced with banners explaining the takedowns were “due to multiple or severe violations of YouTube’s policy prohibiting content designed to harass, bully or threaten.” Other viral QAnon videos with millions of views were still accessible.


Brandy Zadrozny is an investigative reporter for NBC News.

Ben Collins

Ben Collins covers disinformation, extremism and the internet for NBC News.
