A protester holding a Q sign waits to enter a Trump campaign rally. Photo / AP
Facebook pushed adverts linked to the QAnon conspiracy theory into its users' feeds as many as 2.4 million times from March through August, profiting from the movement and helping it grow even as employees investigated its potential for violence.
The social media giant first removed a network of fake accounts seeking to spread QAnon's beliefs in April this year, and finally banned all QAnon groups, pages and Instagram accounts from its services last week.
However, between those two points, Facebook's online advertising archive shows it approved at least 204 adverts using the hashtags, symbols or slogans of the movement, which has been a key incubator of coronavirus denialism, as well as violent attacks and plots.
The adverts ranged from online merchants seeking to capitalise on the phenomenon with branded T-shirts, through ardent believers organising marches, to a Lithuanian addiction hypnotherapist who described coronavirus as a Satanic plot to control the world. Some of the adverts were bought by candidates running for office in the US, while one advertiser selling QAnon-branded T-shirts, which briefly slipped past Facebook's advertiser verification checks, paid for its promotion in rubles.
The real number of adverts and views is likely to be higher, because the survey counted only adverts using clear terms associated with QAnon, and could not capture any that eluded Facebook's security checks by falsely classifying themselves as non-political.
Nevertheless, the figures shed new light on QAnon's rapid dissemination to potentially millions of people across Facebook's services, propelled by sharing within private groups and the company's own recommendation algorithms.
A spokesman for Facebook declined to comment beyond pointing to the succession of ever-tightening restrictions the company has imposed on the movement since August. He added that the company had made an error by approving the adverts paid for in rubles.
QAnon is a loose, fractious movement devoted to the theory that a huge number of celebrities and politicians belong to a secret Satanist cabal that kidnaps children in order to abuse them and consume their blood in the form of a psychoactive drug.
Adherents believe that US president Donald Trump is engaged in a secret war against this conspiracy, and look forward to an imminent moment of national transformation in which his political opponents are rounded up or executed en masse.
Since it arose in 2017, the movement has been linked to numerous acts of violence, including murders, terror plots and an armed stand-off at the Hoover Dam. As early as 2019, the FBI described it as a potential terror threat.
It has also functioned as a self-appointed online honour guard to Trump, unleashing abuse campaigns against his perceived enemies. In July, Twitter restricted about 150,000 QAnon-related accounts which it said were part of a mass harassment ring.
Most tech companies have been slow to act. Even as the movement exploded across social networks, driven by the loneliness resulting from coronavirus lockdowns and the massive increase in time spent online, Facebook took until August to crack down, finally branding it a "militarised social movement" in September and all but banning it last week.
The adverts Facebook has shown since April are highly eclectic. A huge number hawked QAnon merchandise, sometimes sold by apparent believers and sometimes by non-partisan entrepreneurs who were simultaneously running other online businesses catering to fans of Joe Biden.
Others were from dedicated QAnon groups, podcasts and TV shows, or individual disciples spreading the bad news. Some were from US politicians.
All Facebook adverts must be checked and approved by the company before going live, but much of this work is done by computers. However, adverts often slip through the net and rack up thousands or tens of thousands of views before being removed.
Many of the adverts identified were later taken down, though it was not clear which rules they had broken, while others appeared to have been within the rules when they were posted. Some seem likely to have broken the rules at the time, and a few bypassed Facebook's verification process and ran without a "paid-for" disclaimer before being caught.
Counter-terror specialists within Facebook are deeply worried about QAnon and other radical movements causing violence during and after the US election on November 3, which is likely to be dragged out by postal voting and whose result may be bitterly disputed.
Escalating incidents of "real-world harm" this summer persuaded the company to adopt tactics similar to those it has deployed for years against Islamic fundamentalist and white supremacist terrorists.
However, any comprehensive reform to Facebook's algorithms appears to have been ruled out by the company's grand commercial plan to push as many users as possible into groups.