Netsafe's two major funders - the Ministry of Justice and the Ministry of Education - need to take a hard look at the way the internet safety agency operates day-to-day.
And beyond that, it's time for our lawmakers to clarify and reassess Netsafe's powers. The case that led to the $100,000 in damages involved what the agency's new leadership has admitted were missteps.
But it also revealed the limitations of Netsafe's investigative powers as a contracted non-profit attempting to fill a frontline role as the "Approved Agency" for a key piece of legislation: the Harmful Digital Communications Act (HDCA).
Netsafe needs to be fully funded by the government, not dependent, in part, on money from the social media platforms it is supposed to police.
As it stands, things are moving in the opposite direction, with social media companies poised to have a say in the appointment of an administrator for Netsafe's new internet safety code - a position they will also part-fund, if the draft safety code is adopted in its current form.
On one level, it's easy to see the appeal of private sector co-funding, given Netsafe had just $70,000 spare from its $4.1 million 2021 budget - and legal costs associated with the anti-bullying investigation could consume that contingency, and then some, in the current year.
But at a time when everyone from female politicians to pandemic experts to countless everyday people is being abused online, it makes sense to bump up the official budget for enforcing our cyber-bullying legislation.
And while it's part of Netsafe's role to establish a good working relationship with the social media companies, its new safety code, which is voluntary, is simply too vague. Facebook has said it would welcome any new legislation or regulation that clarifies what content it has to police for its New Zealand audience. An update to the HDCA should cater to that, not a voluntary code.
How we got here
The non-profit now known as Netsafe began in 1998 as the Internet Safety Group, a partnership whose members included the NZ Police, Auckland Rape Crisis, the Department of Internal Affairs, and the Ministry of Education.
In 2015, the Harmful Digital Communications Act passed, including a provision for an "Approved Agency" to play an education role; assess and, if necessary, investigate complaints about harm caused to individuals by digital communications; and "provide victims of harmful digital communications with a quick and efficient means of redress". If attempts by the approved agency to work with online content providers to resolve an issue go nowhere, then an issue can be escalated to a District Court.
Then-Justice Minister Amy Adams named Netsafe as the approved agency for the HDCA - as was widely expected; the role was tailored to it.
Netsafe still fills that role today. Its latest annual report says the non-profit group had revenue for the 12 months to June 30, 2021 of $4.1m, made up of funding from the Ministry of Justice and the Ministry of Education, plus industry partners and reimbursement for in-person presentations and workshops (its expenditure closely matched its revenue, with a $70,000 surplus).
Late last year, as it released a draft of its new internet safety code, Netsafe noted that cyber-bullying was worse than ever.
The introduction to the draft says: "Recent research by Netsafe shows one in five adults and twice as many young people in Aotearoa New Zealand received a digital communication that negatively impacted their life in 2020.
"As 2021 has progressed, Netsafe is continuing to record a new 'high' in the number of reports related to harmful digital communication. Experiences like this, directly and indirectly, can cause physical, financial, and psychological harm; decrease user confidence and undermine investment in the digital economy and society."
The draft code's provisions call on social media platforms to work with Netsafe to help prevent objectionable or harmful content appearing online; to take steps against hate speech and bullying; to introduce support programmes to raise awareness; and to improve reporting procedures. But it is big on buzzwords, short on specifics and lacks sanctions.
The code was formulated after consultations between Netsafe's chief executive and representatives from Meta (owner of Facebook and Instagram), Google, Twitter and TikTok.
"It looks to me like a 'tick box' code rather than one with real potential to bring about the change needed to create an internet where everyone feels safe and welcomed," Tohatoha chief executive Mandy Henk told the Herald, soon after the draft was released in December.
Tohatoha advocates for a more equitable internet, and works on initiatives to curb hate speech and misinformation online.
InternetNZ public policy manager and acting chief executive Andrew Cushen said that while the draft had gone to public consultation, submitters could only tweak a code that the industry had had a strong hand in creating. Instead, affected community groups should have been involved from the ground up, Cushen said.
His sentiments were echoed by Tohatoha's Henk, who said, "I'm concerned about the content and structure of the code, as well as the process that led to its creation.
"As it is, Netsafe has created this with the industry - and not with the input of targeted communities and individuals."
'Conflict-of-interest' issues
Henk also saw a number of conflict-of-interest issues.
"I'm also concerned that, as written, the code creates a financial conflict of interest. It's hard to see how the code could be fairly administered for internet users if the administrator is dependent on funding from the industry," she said.
"I also have concerns that if Netsafe were to be the administrator of the code, it could create a conflict with their role as the official conflict resolution body for the Harmful Digital Communications Act."
The Herald put Henk and Cushen's concerns to then Netsafe chief executive Martin Cocker.
The chief executive responded: "The code's purpose remains the same as it was when announced in July, and that is to create a safer online experience for the people of New Zealand. It will become a useful addition to New Zealand's online safety measures – although it is clearly only part of the solution.
"Groups of stakeholders received briefings about the draft code as a lead-in to the submission and consultation processes. Their initial feedback on the process was taken into account."
It will always be part of Netsafe's role to create and maintain relationships with online content providers - indeed the Harmful Digital Communications Act explicitly cites that as one of its responsibilities. And for their part, the social media companies say they welcome constructive engagement and clarification of the rules they should follow.
But to ask the code's signatories - who will include Facebook, TikTok et al - to fund implementation through an administrator "agreed upon and appointed by the signatories" muddies the waters.
Cocker's final day as Netsafe chief executive was December 3 - the same day the draft code was released - after he reportedly filed his intent to resign just three weeks earlier.
Newsroom reported that lawyer and former Employment Relations Authority member Vicki Campbell had been retained by Netsafe to investigate bullying claims against Cocker, who gave three weeks' notice in mid-November, halfway through Campbell's investigation.
$100,000 in damages
And we'll never know Cocker's thoughts - at least in an official capacity - on the July 2021 Human Rights Tribunal hearing that led to the $100,000 damages ruling, released on March 22, for breaches of the Privacy Act.
The public version of the tribunal's ruling leaves out many details, to protect the identity of the three women concerned - two of whom were ex-partners of a protagonist known only as "Mr Z", and one of whom was supporting them.
In a complicated dispute, "Mr Z" tried to turn the tables on the three women (Ms A, B and C) by complaining to Netsafe about comments alleged to have been made in a private Facebook group. The ruling says Netsafe created a case summary after making only limited attempts to verify the information supplied by "Mr Z" - and that after "Mr Z" used that case summary in a District Court hearing, Netsafe did not alert the court that he had left one page out of the version of the case summary that he presented. A three-year legal fight was complicated by Netsafe's efforts to block the three women from accessing the case summary.
"Ms A" said Netsafe did not take all reasonable steps to check information before releasing its assessment Case Summary to Mr Z, "noting he is a convicted criminal". "Mr Z" had Family Court convictions for violating protection orders, according to the tribunal's summary.
The tribunal ordered Netsafe to pay Ms A, Ms B and Ms C $30,000 in damages each for humiliation, with Ms B and Ms C also receiving separate $5000 awards for loss of benefits following what it ruled were Privacy Act 1993 breaches (the incidents took place before the Privacy Act 2020 came into force).
New broom
The agency's general manager, Andrea Leask, was named as interim chief executive officer after Cocker departed in December, and still holds the temporary role today.
Netsafe also has a new chairman following Jon Duffy's departure, also in December. He was replaced by long-time board member Colin James (whose day job is chief information security officer for Fletcher Building).
Duffy was not the subject of any complaints, and his resignation from Netsafe's board came after 10 years in the role - towards the end of which he had taken on two substantial new outside responsibilities: a day job as chief executive of Consumer NZ and a directorship with the Banking Ombudsman Scheme.
Regardless, the arrival of a new chair - and, soon, a new CEO - is a good time to push for change.