
YouTube won’t ban QAnon content, but will remove videos that could promote violence


YouTube is the latest Silicon Valley company to update its moderation policies around the QAnon conspiracy theory, announcing that content targeting or harassing people on the basis of such theories will be removed. YouTube will not issue a blanket ban on QAnon content, though.

The company is trying to curb harassment and hate by “removing more conspiracy theory content used to justify real-world violence,” according to its new blog post. That means videos about QAnon that allege anything that could result in real harm or harassment to a specific person or group will be removed. The blog post did not say whether the accounts behind those videos would also be removed, although YouTube typically operates on a three-strike policy before a channel is taken down.

“As always, context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups may stay up,” the blog post reads. “We will begin enforcing this updated policy today, and will ramp up in the weeks to come.”

YouTube has spent the last couple of years updating its policies to target hateful videos, including some conspiracy theory videos, according to the blog post. Those policies are meant to limit algorithmic recommendations of such videos, and the company says views of QAnon content coming from recommendations to non-subscribers have dropped by more than 80 percent since January 2019.

YouTube is not alone in taking a firmer stance against QAnon conspiracy theories. Facebook banned QAnon-related content just last week, although posts from individual accounts are still allowed; it was the biggest step Facebook has taken in its ongoing fight against misinformation spreading on the platform. Pinterest followed suit, saying it would ban all QAnon content, and Peloton removed hashtags related to the conspiracy theory.

YouTube’s blog post adds that the company has “removed tens of thousands of QAnon videos and terminated hundreds of channels” under its earlier policy updates. The company calls that work “pivotal in curbing the reach of harmful conspiracies,” but acknowledges there’s more to be done.

“There’s even more we can do to address certain conspiracy theories that are used to justify real-world violence, like QAnon,” the post reads.



