Analysis by The Disinformation Project found that just 12 people were behind the bulk of the conspiracy and disinformation posts that inflamed tensions in the build-up to the violent climax of the February and March 2022 protests outside Parliament. (In the world of fake content, “misinformation” is false information shared unwittingly; “disinformation” is fake news that’s planted deliberately and knowingly.)
There’s cross-border incitement, too.
“We know very well that the openness of these tech platforms [is] being exploited by foreign actors to increase polarising rhetoric, hate and harassment,” Donovan said.
Wherever they’re from, creators of fake content have become highly networked, she said. They’re “leaving little breadcrumbs” around the internet, in the form of catchy memes, targeting politicians - or journalists - they know could take the bait.
And the nature of modern news means a false claim can race across news services, and countless repeater sites and accounts, before it is debunked (it should be noted that wire services also play a part in calling out conspiracies, such as AP’s regular “A look at what didn’t happen this week”).
Case studies collected by Donovan range from those involving an inadequate response by social media platforms - and sometimes by more traditional media, as with carefully seeded fake claims that immigrants at the US’s southern border carried Ebola - to cases where social media unfairly copped the blame.
That was the case with a “media-fuelled” social panic about a “slap a teacher” craze that had allegedly been sparked by a TikTok challenge. It was in fact a hoax - and one largely pushed through Facebook accounts.
There was no such craze but, like other fake news chronicled by Donovan’s group, it was used to push various political agendas regardless.
One certainty about this year’s election is that there will be a blizzard of misinformation on social media. How can we sort the wheat from the chaff?
“Disinformation is effective because it shows up looking like authentic source material,” Donovan said.
“Going to rallies and speeches of politicians, what you want to listen for are strange turns of phrase or slogans that the politicians are using and then go back online and search for those slogans and try to understand what’s happening in the digital media environments and how are people being mobilised.”
Is there any government or regulator that Donovan thinks is doing a good job of wrangling misinformation?
She points to a measure recently passed by the EU. “The Digital Services Act is a good step forward in terms of offering up ways of auditing tech platforms.”
The legislation includes an emergency mechanism to force platforms to disclose what steps they are taking to tackle misinformation or propaganda in light of Covid-19 and the war in Ukraine. It also puts tough new rules in place around marketing to children and bans manipulative design techniques, known as dark patterns, that trick people into clicking on content.
Tech companies that break its provisions risk a fine of up to 6 per cent of their global turnover.
British bulldog
Donovan also gives a nod to the UK, where John Edwards, formerly NZ’s Privacy Commissioner, recently slapped TikTok with a £12.7 million penalty for mishandling children’s data in his new role as Britain’s privacy czar. (Edwards’ successor is about to embark on an exercise to see if our Privacy Act is fit for purpose in this area, which will run in parallel to the free-ranging consultation over a possible new super-regulator.)
But she qualifies, “Unfortunately if it doesn’t happen in the country where these companies have their home offices like the US, it’s not going to have the kind of teeth to get these companies to think about how their products are being weaponised.”
One of Donovan’s main points of focus, however, is trying to get governments to enforce laws already in place.
She also wants the social media firms to be more assertive in using the tools they already have in place, and to introduce new measures.
But with Twitter and many of its peers culling their misinformation teams this year, we risk heading in the opposite direction.
As The New York Times reported overnight, Twitter, under Musk, has made a point of lifting restrictions and restoring accounts that had been suspended. YouTube recently announced that it would no longer ban videos that advanced “false claims that widespread fraud, errors or glitches occurred in the 2020 and other past US presidential elections”.
“What’s sad to see is that these companies that are making billions of dollars in profit are extracting those profits and not reinvesting in their workforce and not reinvesting in improving their products,” Donovan said.
The Times also reported that disinformation researchers have become the targets of hearings hosted by Republican lawmakers, and of lawsuits brought by conservative states, which allege that moves to fight disinformation are really attempts to quash conservative voices.
Donovan called these “Slapp suits” - short for “strategic lawsuit against public participation”: a bid to lumber a defendant with legal costs and discourage them from future activism.
Chris Keall is an Auckland-based member of the Herald’s business team. He joined the Herald in 2018 and is technology editor and a senior business writer.