TikTok is facing scrutiny over the wellbeing of its youngest users. Photo / 123rf
Warning: This article discusses suicide and could be upsetting.
Some of New Zealand’s best-known brands are reviewing their TikTok advertisements after a Weekend Herald investigation found them appearing alongside “distressing” videos about mental illness, self-harm and suicide.
The brands, including ANZ, Huffer and Puma, said they did not know their ads would appear near disturbing mental health-related videos in personalised feeds on the app and were concerned when it was brought to their attention.
Burger chain Wendy’s NZ said it had paused its ads on the platform while it investigated. Other companies said they had raised the matter with TikTok and wanted to see better controls to protect emotionally vulnerable young users.
Sportswear brand Puma said: “We are disappointed to see our ads played alongside content that is highly distressing and hurtful. We are currently investigating how it was possible for our ads to appear alongside such content, to prevent it from happening in future.”
ANZ Bank said: “It is our understanding and expectation that TikTok have appropriate content filters in place. We’ve raised this concern directly with TikTok and will be reviewing future ad placements.”
TikTok told the Weekend Herald that, in response to the paper’s inquiries, it had taken down dozens of videos that breached its rules against content promoting or glorifying self-harm.
The social media platform said it was also “implementing measures to tackle any remaining content on-platform, and will continue to update and improve our content moderation practices so that this type of content is captured and removed immediately”.
“The safety of the TikTok community is our highest priority, and our recommendation system is designed with this in mind,” said Jed Horner, TikTok’s policy manager for trust and safety in Australia. “We genuinely care about the wellbeing of our community, particularly those struggling with mental health issues.”
It comes after TikTok chief executive Shou Zi Chew was questioned last week at a US Congress hearing about the app’s data security and content moderation.
TikTok, owned by the Chinese company ByteDance, has quickly become one of the world’s most popular social networks, particularly among teenagers, but this has brought intense scrutiny from media, regulators and politicians.
In the US, President Joe Biden’s administration is reportedly considering banning TikTok because of concerns about its Chinese ownership, while in other countries government agencies, including New Zealand’s Parliamentary Service, have blocked its use on official devices.
In addition to facing questions about national security, TikTok has also been forced to defend its safeguarding of teenage users.
TikTok and its American rivals – Facebook, Instagram, Snapchat and YouTube – are under fire from critics who allege social media has been a major contributor to an escalating mental health crisis among children and adolescents in high-income countries, including New Zealand.
In the past few months, a Weekend Herald investigation has examined how the explosion of mental health-related content on social platforms has impacted young Kiwis struggling with depression, anxiety and other psychological conditions.
As part of this investigation, a reporter set up new accounts on TikTok (registered as a 13-year-old, the minimum age) and watched hundreds of user-generated videos relating to mental health.
These videos included coded references to cutting and other forms of self-harm; young people venting about feeling hopeless and not wanting to live; jokes about suicide; intimate images of young women in hospital emergency departments and psychiatric wards; and clips about people’s frustrations with mental health services.
Alongside these posts, the Weekend Herald saw ads for dozens of well-known brands – including retailers, fast-food chains, sportswear manufacturers, media companies, educational institutions, mobile phone companies, festivals and government agencies.
In one example, an ad for a Huffer puffer jacket appeared in the reporter’s personalised feed amid a string of posts about self-harm and young women experiencing mental health crises.
Huffer managing director Kate Berry said the company took mental wellbeing very seriously and was concerned its ads had appeared in that context.
“We are in the process of investigating TikTok’s content monitoring practices and would like to be assured that they have robust community standards in place and are effectively removing any inappropriate content,” she said.
Other ads that were seen alongside disturbing videos included promotions for Auckland’s Pasifika Festival, TVNZ’s online streaming service, Vodafone mobile apps, New Balance football boots and the NZ Census.
Tātaki Auckland Unlimited, Auckland’s economic and cultural agency, which runs the Pasifika Festival, said: “As a public organisation, we condemn the creation or promotion of material that is likely to harm people’s mental health ... The hugely popular cultural festivals we deliver and promote are about community. They enhance connection and wellbeing.
“The images and content you’ve shared are disturbing and we are exploring if there are any additional measures we can take to ensure our advertising is carried out by TikTok in a way that is in line with best practice and sound ethics.”
TVNZ said it was not aware that ads for its online streaming service had appeared next to “very distressing content”.
“We have raised this incident with TikTok and they have assured us that in response they have removed [more than] 50 relevant pieces of content they deemed to be a breach of their community guidelines.”
TVNZ said it had also asked TikTok what more could be done to “ensure safety for users and advertisers alike”.
A Census spokesperson said TikTok’s moderation process should have prevented its ads appearing next to those videos. “The advertising agency that we have partnered with has reached out to TikTok to better understand how this oversight occurred.”
Advertising executives say the brands are facing a dilemma: teenagers are a highly valuable demographic but can be difficult to engage. Brands believe they need to have a presence on TikTok and other social platforms to reach them, but some worry that doing so could mean they are inadvertently associated with user-generated content that damages their reputation.
While digital advertising allows companies to target specific audiences based on demographic characteristics and interests through automated processes, it gives them limited control over, and visibility of, the content their ads appear next to. Companies say they rely on platforms and publishers to stop explicit, harmful, illegal and age-inappropriate content appearing on their sites and their ads running next to it.
Vodafone said: “It is unfortunate to see our ads played alongside distressing content. However, we have no say over someone else’s TikTok algorithm and the content being hosted on this app.
“Vodafone cares deeply about building a better future for the young people of Aotearoa and hope that social media companies understand the duty of care they face when allowing young people on to their platforms.”
In response to Weekend Herald inquiries, TikTok has been reassuring advertisers that it is improving protections in this area.
In a communication with one advertiser obtained by the Weekend Herald, TikTok said it tried to balance its guidelines barring content that depicts or encourages suicide and self-harm with “the right of our users to share their personal experiences with these issues in a safe way, both to raise awareness and to find community support”.
TikTok says its moderation systems combine automated processes with roughly 40,000 human moderators worldwide. It claims that in the most recent quarter it removed about 96 per cent of videos violating its self-harm and suicide guidelines before anyone reported them.
But the company also said that moderating this type of content was difficult, partly because users adopted code words and “algospeak” to avoid detection. “Moderation is a constantly evolving practice, with common challenges across different platforms, and the internet more broadly,” TikTok said.
“We are confident, having taken additional measures in relation to this specific area, that content of this nature will be more effectively controlled in the future.”
Dr Marthinus Bekker, a clinical psychologist and senior lecturer at Massey University, reviewed some of the videos at the Weekend Herald’s request and said they “brought up sadness and distress in me both as a person and as a clinician”.
“I can see how this content likely fills a need for [users] to be understood, to see that they are not alone in their distress and to even voice their personal experiences of not being heard,” Bekker said. But for young people struggling with emotional problems, “these streams of videos seem likely to deepen distress and hopelessness and less likely to help them to find ways of regulating those emotions in effective ways”.
About this series
This article is part of a series about teenagers, mental health and social media. In recent months, we have interviewed young people who use these platforms, parents, researchers, clinicians, health officials, regulators and others; reviewed dozens of academic studies on the subject; and reviewed thousands of posts on platforms such as Instagram and TikTok.
In October, we revealed that the coroner’s office had launched a joint inquiry into suspected suicides by three young women who were connected on Instagram. Among them was Cassandra Fausett, a 17-year-old from South Auckland who endured a “horrendous” two-year spiral into mental illness that resulted in numerous suicide attempts, police callouts and hospital admissions.
In January, we examined the community that they had been part of, talking to a young woman who was once also part of that world but now believes it was damaging to her mental wellbeing. “Anna” told us that she and her peers used private accounts to vent about their experiences with mental illness, sharing intimate details about self-harm incidents, hospital admissions and suicide attempts. It provided an outlet and a sense of validation that they didn’t get offline, but she worries that it also made her feel worse and complicated her recovery. “I just don’t think people understand how graphic it is, and how toxic,” she said.
In February, we reported on concerns about the mental health-related content on TikTok. The Chinese-owned platform had “completely changed the game”, in the view of one student, partly because of the power of its algorithm-driven recommendation engine, which learns what users like and uses this data to populate an endlessly scrollable personalised feed of short videos. A Herald reporter started a new account and, within an hour, was being pushed a stream of videos relating to self-harm and depression.
If you have information about this topic, please contact alex.spence@nzme.co.nz. Because of the volume of correspondence, we cannot reply to all the responses we receive but we will read all of them. We will not publish your name or identify you as a source unless you want us to.
For children and young people
Youthline: Call 0800 376 633 or text 234
What’s Up: Call 0800 942 8787 (11am to 11pm) or webchat (11am to 10.30pm)
Safe to talk (sexual harm): Call 0800 044 334 or text 4334
All services are free and available 24/7 unless otherwise specified.
For more information and support, talk to your local doctor, hauora, community mental health team or counselling service. The Mental Health Foundation has more helplines and service contacts on its website.