YouTube is taking aim at QAnon, the pro-Trump conspiracy-theory cult. The video platform said it is expanding its hate and harassment policies to prohibit content targeting an individual or group with “conspiracy theories that have been used to justify real-world violence.”
YouTube is positioning the move not as a “ban” of QAnon per se. Rather, according to the platform, its policies prohibiting certain kinds of content are based on the nature of what’s in a video rather than who posted it.
But in practice, it’s largely targeted at content posted by QAnon believers and sympathizers. YouTube’s expanded policies, for example, ban videos that threaten or harass someone by suggesting they are complicit in a harmful conspiracy — with YouTube specifically calling out QAnon and “Pizzagate,” a forerunner of QAnon.
Under its existing policies, according to YouTube, it already has deleted “tens of thousands of QAnon videos” and terminated several hundred channels. “All of this work has been pivotal in curbing the reach of harmful conspiracies, but there’s even more we can do to address certain conspiracy theories that are used to justify real-world violence, like QAnon,” YouTube said in a blog post Thursday.
Last week, Facebook announced a blanket ban on QAnon-affiliated content on its platforms, while Twitter has taken steps to purge QAnon accounts.
In the bizarro world of QAnon, the movement’s followers may interpret YouTube’s new crackdown as validation of their conspiratorial worldview: After Facebook’s announcement, QAnon adherents spread the notion that the company’s ban somehow proved the legitimacy of their unfounded ideas.
QAnon, which first emerged in 2017, is based on postings from “Q,” who followers believe is an anonymous U.S. government insider. The movement revolves around President Trump’s supposed secret war on “deep state” enemies and a child sex-trafficking cabal run by satanic pedophiles and cannibals.
YouTube, in announcing the expansion of its hate and harassment policies, noted that “As always, context matters,” meaning that news coverage of QAnon and videos discussing such conspiracy theories are still permitted.
According to YouTube, changes it made to its recommendation algorithms in 2018 to reduce the spread of harmful misinformation resulted in a 70% drop in views coming from its search and discovery systems. For QAnon content, YouTube says, the number of views coming from non-subscribed recommendations to prominent QAnon-related channels has dropped by more than 80% since January 2019.
YouTube added, “Due to the evolving nature and shifting tactics of groups promoting these conspiracy theories, we’ll continue to adapt our policies to stay current and remain committed to taking the steps needed to live up to this responsibility.”
The Google-owned video giant says that it has long maintained a policy of removing content that explicitly threatens someone or doxxes them and that it has always prohibited incitement to violence. Per YouTube, QAnon-related videos that have been removed under this policy include those that directly called for violence against Hillary Clinton.
In addition, last year YouTube adopted a ban on videos promoting the idea that one group is superior to others — e.g., white supremacist and anti-Semitic content — as well as conspiracy theories denying that certain violent events took place, like the Holocaust or the Sandy Hook school shooting. According to YouTube, that resulted in a fivefold increase in the number of videos and channels it removed for violating its guidelines, many of which were related to QAnon.
In 2020, YouTube updated its “harmful and dangerous” content policy to begin removing content that contains COVID-19 misinformation. That has included deleting videos promoting QAnon conspiracy theories alleging 5G causes coronavirus, as well as the “Plandemic” and “Plandemic 2” videos.