Facebook's parent company, Meta, says it's taking measures to stop misinformation on its platform. Photo / 123rf
A network of anti-vaccine Facebook accounts has surged in popularity since the start of the Delta outbreak, prompting new warnings from scientists about the threat of misinformation to New Zealand's attempts to contain Covid-19.
Analysis of Facebook data by the Herald on Sunday reveals that a cluster of New Zealand-based accounts that are harshly critical of Covid vaccines and public health restrictions has added tens of thousands of followers since the outbreak began in August, generating high rates of shares and comments and racking up millions of video views.
In just a few months, the accounts have built a highly engaged audience that appears receptive to misleading and false claims about the safety and effectiveness of the Covid vaccines.
Disinformation researchers say the Facebook accounts' growth mirrors that of a wider online movement in Aotearoa that is becoming increasingly assertive in opposition to the government's handling of the pandemic.
The Herald on Sunday has opted not to identify the accounts to avoid amplifying dubious claims that could be harmful to public health.
Facebook's parent company, Meta, said it has taken measures to remove false and damaging information about Covid from its platform. It has removed some videos posted by the accounts reviewed by the Herald on Sunday for violating its policies.
But public health experts and information researchers said Meta and other social media companies must do more to stop the flood of misinformation spreading online.
"The social media platforms have to take responsibility as publishers, rather than as just platforms," said epidemiologist Michael Baker.
"They cannot sit back and say they are purely a platform. They are a publisher and they have to take responsibility for that. That's what will transform the environment," said Baker, a professor of public health at the University of Otago Wellington.
The Herald on Sunday used data from CrowdTangle, an analytics tool owned by Facebook, to analyse the growth of anti-vaccine accounts on the platform. The data does not show exactly how many people the accounts have reached through their posts and videos – Meta keeps those figures secret – but it shows the growth and rate of engagement they're getting, which provides an indication of their popularity.
Public health experts say misinformation (false claims spread without necessarily intending to deceive) and disinformation (false claims that are knowingly spread) are a serious risk to the government's attempts to protect the New Zealand population against Covid, because they create confusion and undermine trust in health institutions. Such claims could persuade people who are hesitant about the vaccines to avoid getting immunised, or lead people to refuse to follow public safety measures put in place to contain the virus.
Although misinformation is not as pervasive in New Zealand as in some other countries, researchers say it has mushroomed during the Delta outbreak and reached a "tipping point".
"It's spreading," said Sanjana Hattotuwa, a researcher for The Disinformation Project at the University of Auckland's Te Pūnaha Matatini centre. "It's like a metastasizing cancer in Aotearoa New Zealand and it's spreading across these platforms."
In a report published last month, researchers for The Disinformation Project said they'd been startled by the growth of anti-vaccine content on social media during the Delta outbreak. The false claims are worsening in the volume of posts, the engagement they attract and the "intensity" of their messaging.
The anti-vaccine opposition has evolved into a complex, highly fluid online "ecosystem" spread across various social media platforms that is hard for outsiders like scientists and journalists to track.
Hattotuwa said the most aggressive anti-vaccine activity tracked by the Disinformation Project is on Telegram, a closed messaging platform that he described as "the hellscape of toxicity in Aotearoa".
But Facebook (and to a lesser extent Instagram and TikTok) are still important for the campaigners: they appear to use the bigger platforms to build their audiences and then encourage users to move to smaller platforms with fewer controls, where they will face less scrutiny.
The soaring popularity of the Facebook accounts shows how easily fringe viewpoints can reach a substantial audience in the social media era. At little or no cost, motivated amateurs can quickly generate a following that rivals those of better-resourced media organisations. However, they may feel no obligation or incentive to adhere to the standards of accuracy that journalists have traditionally striven to achieve.
Meta says it is committed to fighting Covid-19 and has taken a range of measures internationally to give its users accurate information about the virus and stop dubious claims circulating.
"We remove Covid-19 misinformation that could contribute to imminent physical harm including false claims about cures, treatments, the availability of essential services, or the location and severity of the outbreak," a spokesperson for Meta said. "We also prohibit false claims that could lead to Covid-19 vaccine rejection. We have removed several videos for violating our policies."
Researchers say Facebook is doing more than some of its rivals to crack down on misinformation, but that has not been enough to stem the avalanche of anti-vaccine content. And when Facebook takes action against an account, the campaigners quickly adapt and move elsewhere.
"Some accounts that are suspended on one product or platform are there on another and you wonder why," Hattotuwa said.
"Some which are clearly violative of Facebook's existing policies and guidelines are allowed to exist and also to propagate and promote content. Some individuals that are clearly leading disinformation producers in Aotearoa New Zealand are suspended but allowed back on the platforms."