The person Reeve found was in many ways a sympathetic character – works in international development aid, articulate, fearful – but nonetheless wrote a post in bullet points that contributed to a conspiracy theory so pernicious that a large percentage of the country had come into contact with it within a matter of days. It got so big that the government addressed it at the opening of a Saturday press conference, deeming it of greater importance than announcing fresh Covid-19 cases in the community.
As the conversation wore on, the source revealed himself to be someone extremely online and shattered by the impact of what he had wrought – the way his ill-considered and unsourced rumours had rocketed the length and breadth of the country.
"I may be hung out to dry, and it may be fully justified," he told Reeve. "I will take the consequences, because honestly it's one of the worst things I've ever done, articulating that." He went on to express a very real and well-founded fear for what might happen were his identity to become known.
"If my name gets near these articles, essentially I won't work in the consulting industry again. To be honest, after whatever happens in the next week or two, I might have to go and work on my parents' farm for the rest of my life, or become a carpenter under some random name."
Reeve purposefully withheld the person's identity from publication, and warned against doxing. This is the kind of judgement call publishers make every day. A similar one was made by every major media organisation in the country when confronted by the viral post – all investigated it, none could substantiate it, so none published it. Hundreds of hours collectively spent by various news organisations, with precisely zero stories to show for it.
No consequences
Yet one publisher broke that line. The vast bulk of distribution seemed to happen on four platforms – Facebook, Instagram, WhatsApp and Messenger – all owned and operated by one company: Facebook.
They are brilliantly engineered and ubiquitous, allowing for an obscure Reddit post to be adapted and weaponised to millions within days if not hours. It was a nightmare scenario for the government, apparently caused immense anxiety to a family already dealing with Covid-19, and could easily have become more dangerous still, had a member of the public taken the law into their own hands.
Yet Facebook has suffered no consequences for its actions. The very real fear felt by the man who created the post is completely absent in the organisation most responsible for its distribution. Why should it be scared?
Facebook is a core part of our information infrastructure, and remains the Labour Party's preferred method of communication, with Stuff reporting the party spent over $100,000 on the platform between May and July of this year. Prime Minister Jacinda Ardern's Facebook live appearances are hugely popular there, and she has been unrepentant about the far larger sums spent by all branches of government to communicate through the platform.
"We need to be present where people are," is Ardern's line on Facebook, and you can see the justification. It's a mass-reach platform where you can speak directly to an audience, with scarcely a breath of moderation. A privilege once reserved only for the media itself (and highly regulated there still), now available to anyone who can develop an audience, or create a piece of content that has the ingredients to spread quickly through feeds.
Yet the very same attributes that make Facebook attractive to politicians also make it prone to situations like the one we saw over the weekend. They also accelerate the spread of all kinds of other conspiracy theories, misinformation and disinformation.
Spend enough time on the giant social platforms and you'll find questions asked about the efficacy of masks or whether the virus is nearly as dangerous as it's being made out to be. They are often particularly widespread in communities that have well-founded suspicions of the government and its communications apparatus, having spent centuries being lied to or discriminated against by it.
This is not an uncomplicated problem. A spokesperson for Facebook provided a statement to The Spinoff which speaks to the scale of the problem.
"We have removed 7 millions pieces of false information about the virus including false cures, claims that Coronavirus doesn't exist, that it's caused by 5G or that social distancing is ineffective. We use several automated detection mechanisms to block violating material on our platform and have removed millions of ads and commerce listings for violating our policies related to Covid-19."
Legislation possible
All that was in place prior to the weekend though, and the rumour still spread. It points to a confounding paradox. Facebook is engineered to facilitate the instantaneous spread of misinformation and then rewarded for it with government communications spending to counter it.
It certainly looks like an extraordinarily good business model: the false information is free, provided and consumed by its users. The accurate information that attempts to clean up the mess is paid for by the same government that relies on the platform to distribute its own messaging, and thus appears loath to regulate it.
This is not because regulation is impossible. While New Zealand lacks the market power to break up Facebook and allow meaningful competition to spring up between it and Instagram or Whatsapp, there are plenty of demands it could make, as Germany has with fines and mandatory police reporting for hate speech. There is a middle ground between do-nothing and the authoritarian excesses of Turkey under Erdogan, or the outright ban in China.
Some curbing of the safe harbour laws that meant it faced no penalty for livestreaming the Christchurch terror attack would be one approach. Another would be to impose an access fee which treated it like a utility – in much the same way as we ask TV channels to pay to broadcast, or demand telcos buy spectrum, we could ask for a percentage of revenue to be devoted to employing local content moderators or creating campaigns which educate people to be more critical consumers of what they encounter on social media.
Essentially to treat Facebook like alcohol – a drug many enjoy, but one which comes with social costs for which we seek compensation, which we then devote to harm reduction and hospitals.
No solution is perfect, but the era of self-regulation is what has brought us here, and surely anything is better than this.
Despite the reluctance of New Zealand's politicians to contemplate any restraints on Facebook, there is a gathering storm of discontent with it. CEO Mark Zuckerberg is regularly hauled before Congress to explain its dominance, and there's a rising tide of large companies joining a spending boycott, including US icons like Disney and Coke. The biggest local entrant is Stuff, whose CEO Sinead Boucher eloquently captured the situation's tensions in a tweet following a recent press conference.
Boycott
Unfortunately the boycott campaign is already losing steam. The vast bulk of Facebook's revenue comes not from big corporates, but from tens of millions of smaller advertisers – local businesses, online retailers, political parties. (Disclosure: The Spinoff remains a Facebook advertiser).
That makes its revenue far more sturdy than other media companies, like TV channels, which are more reliant on national brand campaigns. Indeed, despite the boycott, Facebook's stock hit a record high after it announced a profit of more than $7.5bn in its most recent quarterly earnings (you read that right: it made that much money in just three months, during a pandemic).
Which raises the question – why doesn't Facebook do more to combat the worst excesses of its users? As The Spinoff's editor Toby Manhire pointed out during a recent episode of our Gone By Lunchtime podcast, it has all the systems for perfect contact tracing of viral misinformation. It can see exactly who posted the original rumour, and how it spread. In fact, the page that seems to have launched the rumour onto Facebook is still live. It has 3,000 followers and describes itself as a "newsagent", despite a lengthy history composed almost entirely of racist memes.
A source at Facebook says that the page has been blocked from posting new content while it works to identify its admins – a necessary but hardly sufficient response. Still, the most recent public post dates to Sunday afternoon – 24 hours after the government had to lead a live press conference with a denial of the appalling and baseless rumour which convulsed an anxious nation.
That final post concerns the same subject as the conspiracy theory that went viral. Rather than resiling from it, or feeling remorse, the page's administrators adopt a defiant pose. It says, with an air of menace, that "the rumour mill will work overtime whether you like that or not". At this point, human nature being what it is, that's undeniably true.
But as we stare down a pandemic which is made harder to fight by the wildfires of social media, it's long past time to ask ourselves as a society whether we should continue to accept an epic political shrug in response.