What is particularly jarring is that this is history repeating itself: once again, short-sightedness from Silicon Valley has allowed extremist thinking to flourish.
In 2018, former YouTube staffer Guillaume Chaslot criticised the video site's recommendations algorithm for pushing some users down a conspiracy-theory rabbit hole. Google-owned YouTube's recommendations generate 70 per cent of views on the video platform. They have been crafted to keep you engaged for as long as possible, allowing more opportunity to serve advertising. This could mean repeatedly showing you similar content, Chaslot argued, deepening existing biases you might have. These effects are blind spots in the business model. The company promised in 2019 to do more to downrank the biggest conspiracy theories, though critics say it is yet to convincingly solve the problem.
So what had warranted Facebook's QAnon advances towards me? The email was linked to my work Facebook page, which I use to monitor posts and live streams from Mark Zuckerberg and other Facebook executives. According to my search history, I had looked up the phrase "QAnon" several days earlier, likely triggering its recommendations algorithm.
By design, Facebook's algorithms seem no less toxic and stubborn today than YouTube's back then. Permitting such dangerous theories to circulate is one thing, but actively contributing to their proliferation is quite another.
Internal Facebook research in 2016 found that 64 per cent of new members of extremist groups had joined due to its recommendation tools. Its QAnon community grew to more than four million followers and members by August, up 34 per cent from around three million in June, according to the Guardian.
Facebook has since made moves to clamp down on QAnon, removing pages from its recommendations algorithms, banning advertising and downranking content in a bid to "restrict their ability to organise on our platform".
Still, it is alarming that Facebook waited three years after the theory was born before taking action, particularly since Zuckerberg has announced a shift from an open, friends-focused social network towards hosting more walled-off, private, interest-based groups.
There is no denying such groups pose unique challenges. Flagging and taking down foreign terrorist groups such as Isis is a fairly unambiguous exercise. But how does one rank conspiracy theories? Can an algorithm assess where collective paranoia ends and a more violent conspiracy theory begins — and what is the appropriate response if it can?
The irony is that companies like Facebook pride themselves on innovating and delivering the future. But they don't seem to be able to escape their past, which dangerously affects our present.
With deep pockets, Facebook should have the expertise to monitor its public and private groups and its recommendations algorithms more fiercely, and to set a lower bar for downranking questionable conspiracy theory content. Perhaps tech companies themselves need to be paranoid about the unintended consequences of their business model. Otherwise, in elections to come, we're going to see history repeating itself.
- Financial Times