Facebook founder and boss Mark Zuckerberg. Photo / AP
Media regulation in New Zealand is set to be blown up, with one high-powered Government regulator proposed to oversee the content on news media and social media platforms.
But the proposed new regulations - announced today - appear to have set the Government on a collision course with media companies, at least some of which seem to have been blindsided by the extent of the changes. They say they are already heavily regulated and the new laws threaten press freedom.
Free-speech advocates are also angry, describing the proposed regulation as “another major attempt at silencing Kiwis”. It was “a censor’s greatest dream” and would be “weaponised to suppress unpopular or disliked perspectives and opinions”, said the Free Speech Union.
On the face of it, the new Government-appointed regulator has been designed primarily to rein in harmful content on social media platforms, such as Facebook, but professional news media organisations would also come under its auspices.
The move would also affect the likes of the Broadcasting Standards Authority and the Classification Office, with the new regulator acting as an appeals backstop for media industry self-regulatory bodies such as the Media Council.
A discussion document, released today, also recommends a major cultural competency overhaul: a “significant” Māori presence on the new regulator’s board and Māori involvement in developing codes, including ensuring that complaint processes “are respectful of, and restore mana between parties through tailored remediation processes that are mindful of cultural values”.
The document has been drawn up by Internal Affairs, which says the existing regulatory system is decades old and predates social media. It asserts that New Zealanders “are being exposed to harmful content and its wider impacts more than ever before”.
The report cites specific anecdotes of harmful content – including a suicide video on TikTok – and more general examples of harm including racist content, the promotion of disordered eating, adult content in video games and misogynistic threats.
“Child protection and consumer safety for media and online content is not as strong as it should be in Aotearoa,” says Internal Affairs policy general manager Suzanne Doig.
“It is important to get these proposals right. We want to create safer platforms while preserving essential rights like freedom of expression, freedom of the press and the benefit of media platforms. This is why public feedback on the review is essential.”
The discussion document says the Government would appoint the regulator’s board but that the new body would be “fully independent of ministers”.
It says the new regulator would have “no powers over the editorial decisions of media platforms” but it would have ultimate power to approve codes and rule on complaint appeals. It would have takedown powers and it could issue penalties for “serious failures of compliance”.
The document says traditional media are likely to welcome the moves, as they level the playing field by regulating social media - but some media companies aren’t pleased.
Stuff chief executive Sinead Boucher said the moves had the potential to “significantly impact independent New Zealand media companies and our journalists”.
“Professional independent New Zealand media companies are already heavily regulated. Further regulation of the news media is wasteful and could impact press freedom.
“The Government’s focus should be on regulating the business practices, content and business models of the social platforms which are under-regulated, publish enormous volumes of harmful content, and are not subject to already strict laws governing New Zealand media companies and journalists.”
NZME - publisher of the Herald - said it would review the documents before engaging in formal consultation.
Radio NZ chief executive Paul Thompson said enhancing freedom of expression and editorial independence and freedom “should be central to any reforms”.
Great care should be taken with any changes that put those at risk, he said. RNZ was working through the details “of what could be a big change”.
“Our concerns will always centre around the potential impact on news and current affairs. Freedom of expression remains key to our successful democracy.
“We need to further understand how the proposed powers may differ from those held by the BSA and Media Council, which the documents recognise have been successful. The BSA, for example, already has many of the powers listed. We would be concerned if any new entity was to gain pre-emptive powers or tasked with proactively monitoring news media.”
The discussion document places a lot of focus on regulating social media and clamping down on the highest-risk content (including content that harms children, and terrorist material) but it is clear that traditional, professional media are also in the spotlight.
In places, the document is ambiguous, even contradictory: in one part it says traditional media mechanisms are “generally” effective; in another it states that “media services like TV and radio broadcasters would also need to follow new codes tailored to their industry”.
“During our community engagement, we heard widespread concerns about the harm some content is causing children and young people. Many of these concerns were about social media and other online platforms, but we also heard concerns about other types of platforms such as broadcasters,” says the document.
“This risky content includes age-inappropriate material, bullying and harassment, and promotion of self-harming behaviours. Instances of harmful content on mainstream social media sites, such as influencers promoting dangerous disordered eating to teenage girls, have become too common.”
The new regulator and codes
The regulation and oversight of standards for New Zealand content is currently a mix of full regulation (broadcasting), industry self-regulation (the professional news media) and little to no regulation (social media). In addition, a broad range of laws governs this area.
The discussion document envisages “several improvements” to the current system with a “co-regulatory model that places greater responsibility on platforms to improve consumer protection and child safety”.
It says the new regulator’s final structure, including “detailed design of governance and oversight arrangements” as well as government funding would be decided once its functions and roles were confirmed.
“Under the proposals as they stand, the functions of the Classification Office, Film and Video Labelling Body, and Broadcasting Standards Authority would change considerably,” says the report.
The Media Council – which oversees all digital and print editorial content for the likes of the NZ Herald, Stuff, RNZ and TVNZ – “would still have an ongoing role in functions such as developing codes and running complaints processes”, says the report.
It says voluntary codes would “need to transition into the new framework”.
“Ultimately, the regulator would need to approve them or send them back to industry for necessary changes to bring them into compliance with the new framework. It is likely that some well-established existing codes covering professional content would transition into the new framework with pre-approval, such as existing codes and standards developed by broadcasters and the Broadcasting Standards Authority, and the Media Council and its members.
“Internal professional editorial processes generally support these codes and standards, significantly reducing the risk of harmful content.”
But the report also appears to contradict itself in parts, including this comment: “Media services like TV and radio broadcasters would also need to follow new codes tailored to their industry.”
And it also makes clear the regulator will have the final say: “We anticipate that these codes would be a collaborative effort between industry groups and the new regulator. The regulator would have powers to endorse specific industry groups to develop codes on behalf of their member platforms, and powers to take over leadership of code development should an industry group fail to make progress on an acceptable code.”
The report says the work of the Classification Office – in classifying material as illegal (‘objectionable’) – would likely be transferred to the independent regulator.
“The Film and Video Labelling Body provides consumer information and the physical labels on products that support the current classification system. This function would shift into codes and be delivered by platforms.”
Platforms that would be regulated under the proposed new law are likely to have one of the following, says the document:
an expected audience of 100,000 or more annually; or
25,000 account holders annually in New Zealand.
“Alternatively, the regulator may designate a platform as a Regulated Platform if it is unclear whether the threshold has been met, or the risk of harm from that platform is significant. Regulated Platforms will not have to be a New Zealand-registered company or resident to be in scope. As defined, their inclusion will be determined by their user/audience base here.”
The back story
The discussion document released today says the content landscape is “rapidly evolving”.
“Media used to refer to television and radio broadcasters, publishers, advertisers, and cinemas. This type of content or media was created and distributed by a small number of organisations, so it was a lot easier to see what was being created and put rules in place to keep people safe from unsafe content.
“Now, in addition to the traditional forms of media, we have access to a diverse range of technologies (such as smartphones), social media, artificial intelligence, virtual reality, and live streaming – to name a few.
“These technological advancements have lowered barriers to the creation, distribution of, and access to content. As a result, anyone can create and share content. It is estimated that 2.5 quintillion bytes of data are created daily. While this has many benefits, such as an increased diversity of information, it also means there is a greater risk for New Zealanders to experience unsafe content.”
It says the current regulatory system is difficult to navigate and has “big gaps”.
“New Zealanders must figure out which of five industry complaint bodies to go to if they feel content is unsafe or breaches the conditions of the platform it is on.
“On top of that, not all forms of content are covered by those bodies. The system is also very reactive because it relies mainly on complaints about individual pieces of content. For most forms of content, we do not have the tools and powers to ensure that platforms are doing what they should to manage the risks of harmful content.”
A Māori lens on regulation
The discussion document also says it is important the new regulatory framework “reflects New Zealand’s unique cultural and social perspectives, and that it is grounded in Te Tiriti o Waitangi”.
“The new regulatory framework would aim to achieve outcomes that reflect Māori perspectives, needs, and aspirations,” says the document.
“We expect the legislation to provide for rangatiratanga by requiring a significant Māori presence on the Board of the regulator. For example, more than one member could be required to have knowledge of tikanga Māori, how content risks affect Māori, or both. The Board oversees the regulator’s activities and sets its strategic direction and priorities.
“The regulator may also wish to explore whether it needs a formal Māori advisory structure to support its work at the more operational level. The regulator would also need enough resources to keep in-house capacity and understanding of te ao Māori to inform its operational processes and decision-making.”
The document suggests the regulator could involve Māori in creating codes of conduct.
The functions and powers of the regulator for developing codes could include:
An ability to specify requirements for regulated platforms to involve Māori in developing and implementing codes.
Setting minimum standards to be met in codes for cultural competency in moderation processes.
Verifying that complaints processes under codes are respectful of and restore mana between parties through tailored remediation processes that are mindful of cultural values.
More industry reaction
The discussion document says a risk-based, code-based industry regulation approach “is not novel, domestically or internationally”.
“Traditional media already operate under codes and under multiple regulatory regimes. They are likely to welcome the introduction of a more level playing field, and a simplified approach for multi-platform providers.
“Consumers of traditional media will see little change in their experience. There would be simplified complaints processes and avenues to raise systemic concerns about trends and patterns of potentially unsafe content with the new regulator.”
The public and industry representatives have until July 31 to give feedback.
News Publishers Association general manager Brook Cameron said: “The NPA will be working with its members during the consultation period to provide detailed feedback. Professional independent New Zealand media companies are already heavily regulated and work within NZ legislation. The current model for our members’ editorial content has operated effectively for many years, providing New Zealanders with a robust and efficient complaints process via the Media Council [previously known as the Press Council].”
Radio NZ’s Thompson said it was a “complex area”.
“The unintentional consequences of well-meaning reform will need to be thoroughly considered before any move to proceed further.”
He said news media were well regulated. “At the highest level, these proposals are predicated on a definition of ‘harm’ and ‘minimisation’. The fundamental issue with the definition of harm is that it is inherently a subjective assessment, whereas any standards regime or code of practice needs an objective set of criteria against which any content can be evaluated. If a regime based on subjective tests is used then it will be difficult to establish consistency.”
Free Speech Union chief executive Jonathan Ayling said the proposed legislation would entirely restructure the New Zealand censorship regime, bringing online speech, such as material on social media platforms, under the oversight of a “regulator” and “codes of practice”.
This overreach must be opposed, said Ayling.
“While the intention to address ‘safety’ and online ‘harm’ is arguably laudable, the cure is worse than the disease. This is an inelegant solution to the ‘lawful but awful’ category of speech, which is best addressed through counter-speech. The proposed structure of a Regulator, with a Code drafted away from Parliament and political accountability, is a censor’s greatest dream and will be weaponised to suppress unpopular or disliked perspectives and opinions.
“Frameworks of this kind do nothing to increase mature discourse or community interconnectedness. On the contrary, they breed suspicion and division. Undoubtedly, content online can cause hate and harm. Free speech is the solution to this, as we use our voices to speak up for tolerance, inclusion, and diversity.”
Who’s who right now?
The Advertising Standards Authority deals with complaints about all advertising in any media.
The Broadcasting Standards Authority deals with complaints about programme content broadcast on radio or television, including programme promotions.
The Media Council’s jurisdiction covers editorial content in magazines, newspapers and periodicals in circulation in New Zealand, including their websites, major broadcasters’ online news content and digital news content from Media Council member platforms. The Media Council also accepts complaints about the classification of Video-on-Demand content from major providers in New Zealand.
Source: ASA
* Editor-at-Large Shayne Currie is one of New Zealand’s most experienced senior journalists and media leaders. He has held executive and senior editorial roles at NZME including Managing Editor, NZ Herald Editor and Herald on Sunday Editor and has a small shareholding in NZME.