Using materials from an Australian campaign called #chatsafe, the Government paid for posts to be promoted to users of Facebook, Instagram and Snapchat that urged people to be careful about how they talk about suicide online and to reach out for help if they were feeling distressed.
“Losing someone to suicide can bring up lots of difficult feelings, and it’s okay to talk about these with the people you trust,” said one of the posts.
“Whenever you feel overwhelmed by the content you see on your phone, it’s best to take a break and step away,” said another.
This campaign, and another aimed at people who might be distressed by the Covid-19 lockdowns, reached more than 400,000 young people in August and September 2021, according to the Ministry of Health.
It is an example of the sort of government-led activity that experts in the mental health sector say should be used more frequently to protect vulnerable children and teens from material on social media that could be detrimental to their emotional wellbeing.
Over the past dozen years or so, as rates of anxiety and depression among adolescents have risen steadily, content dedicated to these issues on social networks has exploded.
In recent weeks, an investigation by the Herald has revealed serious concerns among young people, parents, teachers, and clinicians about the prevalence of mental health-related material on platforms such as Instagram and TikTok, and the impact it can have on teens who are struggling with poor mental health.
At their worst, critics say, these posts and videos can lead young people to misdiagnose themselves with serious conditions, encourage self-destructive behaviour, and trap users in a spiral of despair.
The role of social media in the youth mental health crisis has been a subject of increasing debate overseas, prompting media investigations, academic studies, parliamentary inquiries, lawsuits, and calls from politicians for stricter regulation.
Last month, US President Joe Biden promised in his annual State of the Union address to “finally hold social media companies accountable for the experiment they are running on our children for profit”.
But to date, there has been much less activity here, despite promises to prioritise youth mental health and reduce suicide rates.
“New Zealand has been incredibly slow to respond,” says one person who has been involved in government discussions on this issue and asked not to be identified because there could be professional consequences for speaking publicly.
Experts in this field say there needs to be a sustained, co-ordinated effort across communities and government to protect young people online, ranging from better education of teens and parents about how to navigate social media safely to legislative action to impose stricter obligations on the tech companies.
However, the work in this area in New Zealand has been fragmented, reactive, and inadequate to the scale of the problem, the sources say.
While officials in various departments have grappled with the issue to some degree, there has been no high-level leadership or vision translating into a comprehensive, cross-government approach. Where there has been progress, the sources say, it is often the result of passionate individuals who are juggling other responsibilities.
“It’s not very structured and we don’t have clear ownership,” says one public official.
“We urgently need a much more co-ordinated effort to [address] social media,” says another source in the sector.
In the past several months, the Herald can reveal, a handful of officials from organisations including the Suicide Prevention Office, Mental Health Foundation, Classification Office, Netsafe and Ministry of Education have been trying to address some of these gaps.
One of the leaders of this initiative is Sarah Hetrick, a respected suicide prevention researcher at the University of Auckland who also serves as the Suicide Prevention Office’s principal clinical adviser. The SPO is a small department within the Ministry of Health that was formed in late 2019 as part of Labour’s $1.9 billion shake-up of the mental health system.
Social media was not seen as a priority when the office was established, sources say. A national suicide prevention strategy launched three years ago mentioned social media twice in 48 pages. The office produced new guidelines on how traditional media should discuss suicide, but did not do the same for social platforms. It did not have an official meeting with a tech company until last year.
In June, Hetrick and a representative of the Mental Health Foundation had an introductory Zoom call with three policy executives at Meta, the parent company of Facebook, Instagram, and WhatsApp, according to a calendar entry.
Soon after the call, Hetrick sent a follow-up email to the Meta executives, obtained by the Herald, in which she said her “particular and immediate concern is around ensuring a robust, clear, and timely response to ensure that contagion (that can lead to cluster suicide) is contained.
“However, as highlighted there is a lot of work needed in this space more broadly and we are excited about working with you as we develop a road map of the initiatives and activities that will be key in this area,” Hetrick wrote.
She added that there would need to be “joining up” across agencies and “bedded down clear processes”, and that urgency was required to avoid the issue “potentially bubbling away”.
Since then, the cross-agency group has met every few weeks to discuss how to improve suicide prevention online.
According to a strategy memo seen by the Herald, this group is working on plans including the establishment of a protocol for responding quickly when “suicide content erupts online”; development of “response materials” that can be pushed out when risky material is going viral; and establishing better working relationships with the tech companies.
The group sought assistance from a senior policy official at the Department of the Prime Minister and Cabinet (DPMC) who worked on the Christchurch Call, the international collaboration to stop the spread of violent extremist material that Jacinda Ardern launched after the 2019 mosque shootings.
With the help of that official, Hetrick and colleagues arranged contact with officials in the UK, Australia and Europe who are working on regulatory and legislative measures to restrict the spread of self-harm content. They have also sought meetings in the coming weeks with each of the major tech companies.
Separately, researchers at the University of Auckland have received funding to modify the #chatsafe campaign deployed after Podmore’s death to tailor it specifically for audiences in Aotearoa.
The Herald has made multiple requests since November to talk to Hetrick or the SPO’s director Matthew Tukaki about this work, but they were not made available.
A ministry spokesperson responded to fact-checking queries for this article but did not provide a comment.
Insiders in the mental health sector say this cross-agency initiative being led by Hetrick is a promising start but much more will be required across communities, industry and government to protect young people whose mental health may be affected by what they view and post online.
“There is good work going on,” one said, “[but] it takes too long and then it burns good people out and it always ends up falling on just a few people rather than being well-resourced.”
About this series
This article is part of a series about teenagers, mental health and social media. In recent months, we have interviewed young people who use these platforms, parents, researchers, clinicians, health officials, regulators and others; reviewed dozens of academic studies on the subject; and reviewed thousands of posts on platforms such as Instagram and TikTok.
In October, we revealed that the Coroner’s office launched a joint inquiry into suspected suicides by three young women who were connected on Instagram. Among them was Cassandra Fausett, a 17-year-old from South Auckland who endured a “horrendous” two-year spiral into mental illness that resulted in numerous suicide attempts, police callouts and hospital admissions.
In January, we examined the community that they had been part of, talking to a young woman who was once also part of that world but now believes it was damaging to her mental wellbeing. “Anna” told us that she and her peers used private accounts to vent about their experiences with mental illness, sharing intimate details about self-harm incidents, hospital admissions and suicide attempts. It provided an outlet and a sense of validation that they didn’t get offline, but she worries that it also made her feel worse and complicated her recovery. “I just don’t think people understand how graphic it is, and how toxic,” she said.
In February, we reported on concerns about the mental health-related content on TikTok. The Chinese-owned platform has "completely changed the game", in the view of one student, partly because of the power of its algorithm-driven recommendation engine, which learns what users like and uses this data to populate an endlessly scrollable personalised feed of short videos. A Herald reporter started a new account and, within an hour, was being pushed a stream of videos relating to self-harm and depression.
Listen to Investigations Editor Alex Spence discuss the issue of social media and teenagers’ mental health on The Front Page podcast.
If you have information about this topic, please contact alex.spence@nzme.co.nz. Because of the volume of correspondence, we cannot reply to all the responses we receive but we will read all of them. We will not publish your name or identify you as a source unless you want us to.
Where to get help
If it is an emergency and you or someone else is at risk, call 111.
For counselling and support
Lifeline: Call 0800 543 354 or text 4357 (HELP)
Suicide Crisis Helpline: Call 0508 828 865 (0508 TAUTOKO)
Need to talk? Call or text 1737
Depression helpline: Call 0800 111 757 or text 4202
For children and young people
Youthline: Call 0800 376 633 or text 234
What’s Up: Call 0800 942 8787 (11am to 11pm) or webchat (11am to 10.30pm)
For help with specific issues
Alcohol and Drug Helpline: Call 0800 787 797
Anxiety Helpline: Call 0800 269 4389 (0800 ANXIETY)
OutLine: Call 0800 688 5463 (0800 OUTLINE) (6pm-9pm)
Safe to talk (sexual harm): Call 0800 044 334 or text 4334
All services are free and available 24/7 unless otherwise specified.
For more information and support, talk to your local doctor, hauora, community mental health team or counselling service. The Mental Health Foundation has more helplines and service contacts on its website.