Under YouTube's new policy, which goes into effect today, "content that threatens or harasses someone by suggesting they are complicit" in a harmful theory like QAnon or Pizzagate will be banned. News coverage of these theories and videos that discuss the theories without targeting individuals or groups may still be allowed.
The QAnon movement began in 2017, when an anonymous poster under the handle "Q Clearance Patriot," or "Q," began posting cryptic messages on 4chan, the notoriously toxic message board, claiming to possess classified information about a secret battle between President Donald Trump and a global cabal of paedophiles. QAnon believers — known as "bakers" — began discussing and decoding the messages in real time on platforms including Reddit and Twitter. The theory they assembled amounted to a modern rebranding of centuries-old anti-Semitic tropes, falsely accusing prominent Democrats and liberal figures, including Hillary Clinton and the financier George Soros, of pulling the strings on a global sex-trafficking conspiracy.
Few platforms played a bigger role in moving QAnon from the fringes to the mainstream than YouTube. In the movement's early days, QAnon followers produced YouTube documentaries that offered a crash course in its core beliefs. The videos were posted on Facebook and other platforms, and were often used to draw new recruits. Some were viewed millions of times.
QAnon followers also started YouTube talk shows to discuss new developments related to the theory. Some of these channels amassed large audiences and made their owners prominent voices within the movement.
"YouTube has a huge role in the Q mythology," said Mike Rothschild, a conspiracy theory debunker who is writing a book about QAnon. "There are major figures in the Q world who make videos on a daily basis, getting hundreds of thousands of views and packaging their theories in slick clips that are a world away from the straight-to-camera rambles so prominent in conspiracy theory video making."
YouTube has tried for years to curb the spread of misinformation and conspiracy theories on its platform, and to tweak the recommendation algorithm that was sending millions of viewers to what it considered low-quality content. In 2019, the company began to demote what it called "borderline content" — videos that tested its rules without quite breaking them — reducing the visibility of those videos in search results and recommendations.
The company says these changes have reduced the number of views borderline content receives from recommendations by more than 70 per cent, although that figure cannot be independently verified. YouTube also says that among a set of pro-QAnon channels, views coming from recommendations dropped by more than 80 per cent after the 2019 policy change.
Social media platforms have been under scrutiny for their policy decisions in recent weeks, as Democrats accuse them of doing too little to stop the spread of right-wing misinformation, and Republicans, including Trump, paint them as censorious menaces to free speech.
YouTube, which is owned by Google, has thus far stayed mostly out of the political fray despite the platform's enormous popularity — users watch more than 1 billion hours of YouTube videos every day — and the surfeit of misinformation and conspiracy theories on the service. Its chief executive, Susan Wojcicki, has not been personally attacked by Trump or been called to testify before Congress, unlike Jack Dorsey of Twitter and Mark Zuckerberg of Facebook.
Vanita Gupta, the chief executive of the Leadership Conference on Civil and Human Rights, a coalition of civil rights groups, praised YouTube's move to crack down on QAnon content.
"We commend YouTube for banning this harmful and hateful content that targets people with conspiracy theories used to justify violence offline, particularly through efforts like QAnon," Gupta said. "This online content can result in real-world violence, and fosters hate that harms entire communities."
Rothschild, the QAnon researcher, predicted that QAnon believers who were kicked off YouTube would find ways to distribute their videos through smaller platforms. He also cautioned that the movement's followers were known for trying to evade platform bans, and that YouTube would have to remain vigilant to keep them from restarting their channels and trying again.
"YouTube banning Q videos and suspending Q promoters is a good step," he said, "but it won't be the end of Q. Nothing has been so far."
Written by: Kevin Roose
Photographs by: Al Drago
© 2020 THE NEW YORK TIMES