Kathy Errington: Social media regulation is needed to confront hate head-on

COMMENT: It took Facebook 29 minutes to respond to the live-streamed assault on innocent worshippers at two Christchurch mosques. It was not until a user alerted Facebook to the video – 29 minutes after livestreaming of the attack started and 12 minutes after it ended – that the company became aware of the footage.
New Zealand's internet service providers took the extraordinary step of blocking access to websites that were hosting the video. While I support their decision, it is not a precedent I would want to see repeated.
Ad-hoc decisions by some (but not all) ISPs to restrict access to certain websites should be a cause for concern - whilst motivated by the best of intentions, these short-term restrictions are incomplete, arbitrary, and lack democratic input.
The Helen Clark Foundation will release a report on Monday in which we propose that a statutory duty of care be imposed on social media companies, and that a new social media regulatory agency be established.
Such an agency could draw on existing models from the Broadcasting Standards Authority and the New Zealand Media Council, which regulate so-called "mainstream media" without, I would argue, any undue impact on freedom of speech.
Social media regulation needs a home in government – at the moment at least five agencies have responsibility for pieces of it, including the Privacy Commissioner, the Department of Internal Affairs, the Ministry of Justice, Netsafe and the Police. Most existing legislation predates social media.
Unfortunately, we can't trust social media companies to effectively regulate themselves. They are, after all, largely reliant on a huge volume of user-generated content, and often unwilling to make the necessary investment in human moderators to effectively review what they give a platform to.
There is also profit in serving advertisements to white supremacists and other peddlers of hate, and this can lessen the will of executives to tackle the issue. Nor can we expect pressure from advertisers alone to force companies to clean up their platforms in a meaningful way.
The sheer size and importance of a very small number of platforms make a boycott more difficult - advertisers simply have few other digital platforms to move to.
Even if social media platforms were motivated to regulate discourse, we would still have cause for concern.
Internal self-regulation implemented by private companies is beyond the reach of the public - we don't get to vote on who should be Facebook's chief executive and board of directors, much less the regulations on speech that they may apply.
The way these internal policies are implemented is usually kept private, and they have been applied unevenly in the past.
The rapid pace of technological advancement makes this task even more urgent. Facebook only launched its livestreaming service to the public in April 2016, and less than three years later it was used to disseminate a massacre.
As the Privacy Commissioner has stated, this was a predictable risk. The proliferation of smartphones, increased speed and coverage of mobile networks, and constant innovation in services will continue to require new responses.
We should never simply leave it to social media companies and ISPs to decide the proper extent of free speech online; that would hand over far too much control. Left to the companies alone, decisions about which speech to regulate are all too often inconsistent, opaque, and undemocratic.
Greater public input is needed as we work towards a new consensus as to how we can manage the harmful impacts of social media.
• Kathy Errington is the executive director of the Helen Clark Foundation.