About five months ago, while investigating the death of Cassandra Fausett, whom I wrote about in October, I began digging more into this area, interviewing users and researchers, reviewing dozens of academic studies, and examining thousands of posts on platforms such as Instagram and TikTok.
For a 40-something journalist whose social media use was mainly confined to Twitter, it has been an eye-opening and often profoundly disheartening experience.
Last month, I wrote about a private Instagram community that caused alarm among health authorities three years ago. Today, the Herald is reporting on TikTok, the Chinese-owned app that has quickly become one of the most popular platforms on the internet and the place where many young people seek and share information about mental health.
As I detail in that story, there's a vast universe of mental health-related videos on TikTok that have amassed billions of views. Many of them are positive and uplifting. But there's also a vast amount of much darker material that experts say could be harmful to impressionable teens struggling with difficult emotions.
One thing that has raised questions about TikTok is its powerful recommendation feature, which learns users’ tastes based on what they view and then personalises a “For You” page, an endless scroll of videos that can be uncannily compelling.
To illustrate why this could be problematic for mentally vulnerable adolescents, I conducted a quick experiment.
On an iPhone, I downloaded the TikTok app and set up a new account. It requires only a phone number and takes a few minutes. I gave my age as 13, the minimum; I was not asked for verification. I left the account empty, following no accounts and posting nothing.
I started with a generic mental-health search — “depressed” — and watched the first several videos that came up.
Teens acting sad. Nothing too upsetting.
I clicked “like” on a few posts and kept scrolling.
One post included the hashtag “#mentalhealthawareness”, so I clicked on that and it brought up a new group of videos.
One of them used a sound clip from the Netflix show After Life, in which Ricky Gervais’s character talks about being broken and empty after his wife’s death.
“I feel sad all the time,” Gervais’s character said. The app prompted me to search for that phrase, so I clicked the search button and it brought up more videos using that sorrowful meme.
I scrolled through these for about 15 minutes, but by then I was curious to see what the For You page would recommend from what I’d viewed so far.
The first half-dozen videos the algorithm served up seemed a little random: rants about teachers, someone playing with slime.
I swiped through the feed and after about five minutes I saw the first post that referenced self-harm.
Three minutes later came the first post that mentioned suicide.
It quickly escalated:
Self-harm. Death. Self-harm. Depression. Death. Depression. Isolation. Self-harm. Trauma. Suicide. Parental neglect. Depression. Depression. Self-harm. Depression. Depression. Parental neglect. Self-harm. Death. Death. Depression. Depression. Depression. Depression. Self-harm. Trauma. Anxiety. Trauma. Death. Depression.
I scrolled through a blizzard of despair for about half an hour. The app is so addictive that it’s hard to pull yourself away, even when you’re looking at content that makes you uncomfortable. But it was getting late, and the material was starting to affect me.
Most of these posts, on their own, probably wouldn’t be considered dangerous; I didn’t see any explicit depictions of or incitement to self-harm in this session, although I had come across that in my reporting before this. But the cumulative effect of all those videos was to quickly put me in an incredibly bleak mood.
I closed the app and put my phone aside. I didn’t sleep well that night. This isn’t my world, and I was only visiting for research purposes. But I’m left wondering: How would this content have affected me if I really were 13?
Help us investigate
This article is the latest in a series that has examined the rise in mental distress among young people in New Zealand. If you or your family have been affected by this, or if you work in the sector or government and can share information on this topic, please contact alex.spence@nzme.co.nz. Because of the volume of correspondence, we cannot reply to all the responses we receive but we will read all of them. We will not publish your name or identify you as a source unless you want us to.
Where to get help
If it is an emergency and you or someone else is at risk, call 111.
For counselling and support
Lifeline: Call 0800 543 354 or text 4357 (HELP)
Suicide Crisis Helpline: Call 0508 828 865 (0508 TAUTOKO)
Need to talk? Call or text 1737
Depression helpline: Call 0800 111 757 or text 4202
For children and young people
Youthline: Call 0800 376 633 or text 234
What’s Up: Call 0800 942 8787 (11am to 11pm) or webchat (11am to 10.30pm)
For help with specific issues
Alcohol and Drug Helpline: Call 0800 787 797
Anxiety Helpline: Call 0800 269 4389 (0800 ANXIETY)
OutLine: Call 0800 688 5463 (0800 OUTLINE) (6pm-9pm)
Safe to talk (sexual harm): Call 0800 044 334 or text 4334
All services are free and available 24/7 unless otherwise specified.
For more information and support, talk to your local doctor, hauora, community mental health team, or counselling service. The Mental Health Foundation has more helplines and service contacts on its website.