As tech companies continue to grapple with how to handle the online conspiracy community QAnon, YouTube CEO Susan Wojcicki said the company is addressing the conspiracy theory but didn’t commit to an outright ban on content.
“We’re looking very closely at QAnon,” Wojcicki said during an interview with CNN. “We already implemented a large number of different policies that have helped to maintain that in a responsible way.”
QAnon is a right-wing conspiracy group that believes, among other things, that a group of powerful politicians is involved in a child sex-trafficking ring and that a deep state exists to thwart President Trump. In 2016, the related “pizzagate” conspiracy theory, a precursor to QAnon, led to a man opening fire in a Washington pizzeria.
In 2019, YouTube launched a series of tweaks to its recommendation algorithm specifically meant to address the spread of misinformation and conspiracy theories on its platform. Most of the content remained live, but the recommendation algorithms were taught to identify conspiratorial content and demote it, according to Wired. Wojcicki told CNN that those algorithmic changes alone have reduced QAnon content viewership by more than 80%.
YouTube also shifted its recommendation approach away from signals such as likes and toward engagement signals, including how long users watched a video, their individual search histories, and their geographic location.
“It’s not that we’re not looking at it,” Wojcicki said. “I think if you look at QAnon, part of the challenge, part of it is that it’s a grassroots movement, and so you can see just lots and lots of different people who are uploading content that has different QAnon theories.”
YouTube’s work focuses on three pillars: removing violative content, raising up authoritative content, and reducing the spread of borderline content.
Despite not having a ban specifically on QAnon content, Wojcicki said that “hundreds of thousands” of videos have been taken down because they “violate other parts of our policies: hate, harassment, COVID information.”
YouTube also provides additional context for searches related to conspiracy theories. For example, a YouTube search for “pizzagate” will take users to a page with a Wikipedia article about the theory. The top 10 results include content from Today, Vox, CBS Evening News, Fox News, and CNN.
Last week, Facebook announced that it would ban all QAnon accounts from its platform. “We will remove any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content,” Facebook said in a statement. “This is an update from the initial policy in August that removed Pages, Groups and Instagram accounts associated with QAnon when they discussed potential violence while imposing a series of restrictions to limit the reach of other Pages, Groups and Instagram accounts associated with the movement.”
The Washington Examiner reached out to YouTube for comment.