YouTube Toughens Rules for QAnon Conspiracy Content

Desk Report

Published: 17 Oct 2020, 12:30 pm

On Thursday, YouTube said it was tightening its rules on the dissemination of conspiracy theories, explicitly targeting the QAnon movement, which is already restricted on Twitter and Facebook, reports AFP.

The Google-owned video-sharing service said it was expanding its policies on hate and harassment “to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.”

In practice, this may mean deleting videos that threaten or harass individuals by implying they are involved in a plot such as Pizzagate, the debunked theory of a child sex trafficking ring tied to former Democratic presidential nominee Hillary Clinton and supposedly run out of a Washington pizzeria.

QAnon grew sharply during the pandemic because it served as a unifying force, merging its core anti-Semitic and white supremacist tropes with long-running conspiracy theories about vaccinations and 5G mobile technology, as well as far-right and libertarian politics.

YouTube said that "tens of thousands of QAnon videos" had previously been removed and some channels used by the movement, especially those that directly threaten violence or deny the existence of major violent events, were terminated.

Facebook blocked QAnon-linked accounts on its core social network and on Instagram earlier this month, while Twitter began its own crackdown on QAnon earlier this year.

The latest move by YouTube comes amid heightened tensions over misinformation spreading on social media, while some conservatives have accused platforms of bias in taking down content.

Editor & Publisher: Eliash Uddin Palash

Address: 10/22 Iqbal Road, Block A, Mohammadpur, Dhaka-1207

Design & Developed By Root Soft Bangladesh