At a time when more teens are experiencing mental health problems, the amount of content dedicated to it on social media has exploded. Investigations Editor ALEX SPENCE talks to one Wellington student who worries that other young people are falling into the same dark hole that she did as a depressed adolescent.
Warning: This article discusses suspected suicide, self-harm, and other mental health problems and could be distressing for some people. If you need help, contact Lifeline on 0800 543 354 or text 4357 (HELP).
Megan Dykes’s TikTok account is a dark place. On a recent Thursday morning, she opened the ‘For You’ page on the app and lingered over the first video that popped up: a clip of a young woman joking about wanting to cut herself.
Dykes swiped her thumb and watched another similarly gloomy post, and then another.
She kept scrolling, and the videos kept coming — a seemingly endless stream of depressed and distressed young women venting about how lonely and worthless they felt.
Dykes, a 20-year-old university student in Wellington, has watched so much of this kind of negative mental illness-related content on social media in the past decade that she is to some extent desensitised to its emotional impact. But as she scrolled, she couldn’t help despairing.
“I just feel angry,” she says. “A whole generation of kids is being raised on this content.”
Dykes has a complicated history with social media. A decade ago, as an adolescent in Auckland, she began experiencing what she describes as a mild case of depression. In her recollection, Dykes was a slightly awkward child who didn’t have a lot of friends and struggled to fit in at school. She went online and found some “very intense” material relating to depression.
Dykes says her parents put few restrictions on her internet use and didn’t realise what she was looking at. Some of the material she found was very explicit, but she was drawn to it. On Google Plus (an early competitor to Facebook that is now defunct), she first came across references to self-harm. Within a year, Dykes says she was regularly self-harming and having suicidal thoughts.
It took Dykes years to get over these experiences; in some ways, she is still dealing with them. Looking back now, she says her depression would not have become as severe or lasted as long, and she might never have started self-harming, had she not fallen into those dark corners of the web.
“It was so much harder to get out of it after becoming stuck in that space,” she says.
Rising rates of distress
In the past dozen or so years, rates of anxiety, depression, self-harm, and other mental health conditions have increased sharply among adolescents in New Zealand and other high-income countries. At the same time, the amount of material dedicated to these problems on social media has exploded.
A generation of youngsters who have never known the world without smartphones or the internet turned to social platforms for information and advice about mental health conditions and treatments, to seek validation and support from others with similar experiences, and to express their anguish.
In many ways this has been beneficial, raising awareness of the importance of mental wellbeing, encouraging people to seek help and inspiring recovery. But there is also an astonishing amount of material on social platforms that users, researchers and clinicians say is potentially harmful to vulnerable adolescents — spreading misinformation about mental health conditions, encouraging self-destructive behaviour and trapping users in a spiral of hopelessness.
As part of a major investigation into the state of mental health in New Zealand, the Herald has spent months interviewing young people who have been immersed in this content, parents, clinicians, researchers, health officials, regulators, and others; reviewed dozens of academic studies on the subject; and examined thousands of posts on several platforms.
Last month, we reported on a network of young women with severe mental illnesses who used private accounts on Instagram to share intimate details about self-harm episodes, hospital admissions and suicide attempts. Three of those young women died by suspected suicide in 2019, prompting an ongoing inquiry by the Coroner’s office.
But while concerns remain about Instagram, serious questions have also been raised about the impact of TikTok, its Chinese-owned competitor (TikTok is controlled by a company called ByteDance). It commands a growing share of young people’s attention and has become, for many teens, the primary platform for viewing and sharing information about mental health.
“I think TikTok has completely changed the game,” Dykes says.
An internet juggernaut
It is hard to overstate the influence that TikTok, in just a few years, has had on how teens communicate and entertain themselves.
Based on a lively, easy-to-use app that showcases short, looping, often humorous videos made with catchy audio clips and visual effects, TikTok makes it possible for users to reach vast audiences with minimal effort. Billions of homemade videos have been uploaded to the platform on a dizzying range of topics, spawning countless memes and a new generation of influencers.
Its most compelling feature is a powerful, algorithm-driven recommendation engine that quickly discerns a user’s interests and uses this to populate a personalised stream of videos. Without even looking for something to watch, a user can stay glued to the app for hours and see dozens, if not hundreds, of clips in that time.
It has made TikTok an internet juggernaut, attracting a global audience of more than a billion users and challenging the dominance of Californian tech giants such as Facebook, Instagram, YouTube and Netflix (in New Zealand, nearly half of 15- to 24-year-olds use the platform every day, according to research by NZ On Air). This rapid ascent has brought TikTok billions of dollars in advertising revenue and forced its competitors to modify their services.
It has also brought increased scrutiny from politicians and regulators, particularly in the US, where TikTok’s ownership has raised national security concerns. In recent months, TikTok has also become embroiled in a growing movement to hold tech platforms accountable for their impact on children.
In January, a school authority in Seattle accused TikTok and other platforms in a lawsuit of “causing a youth mental health crisis”. US President Joe Biden promised in his State of the Union address this month to “finally hold social media companies accountable for the experiment they are running on our children for profit”. Last week, a prominent Republican senator used an op-ed in the Washington Post to call for users under the age of 16 to be banned from social networks.
Social media companies are under the spotlight in Britain, too, after a coroner ruled in September that harmful online content had contributed to the death of a 14-year-old named Molly Russell. The Conservative government is pushing through legislation that, among other things, aims to stop children from accessing content that promotes self-harm and suicide and will compel platforms to introduce stricter age verification measures.
The impact of TikTok and other platforms on the mental wellbeing of young people hasn’t received as much attention in New Zealand, despite the same risks being present here. According to the users and researchers who spoke to the Herald, there’s still a massive gap in understanding among parents, educators, clinicians and government agencies about how the platforms work, what content our children are viewing and the potential dangers.
“We’ve barely even started discussing it,” Dykes says.
‘Everything is awful’
These days, Dykes tries to moderate her social media use. She now uses it mainly for academic purposes — she is about to start a master’s degree in linguistics — and views with grave concern the mental illness-related content that once captivated her. She keeps a close eye on it to monitor trends and collect representative posts, partly because it seemed to her that nobody else in New Zealand was paying attention.
Dykes says she wants to raise awareness of the problem so that “parents can have open conversations with their kids about what content they are viewing” and to stop others from falling down the same rabbit hole that she did.
In the years she’s been active on social media, Dykes says no other platform has given her as much concern as TikTok. Not because the mental health-related videos on the platform are the most graphic — if you know where to look, you can find more explicit material elsewhere on the web — but because there’s so much of it and the app makes it so easy and compelling to consume.
A cursory search gives an indication of the extent of this content: videos with the hashtag #MentalHealth have been viewed 69 billion times, while those tagged #ADHD (Attention-Deficit Hyperactivity Disorder) have received 21.3 billion views and #BPD (Borderline Personality Disorder) 8.5 billion. There are burgeoning subgenres dedicated to self-harm, eating disorders, and psychiatric facilities.
To be sure, there are plenty of positive videos on these topics. Users who spoke to the Herald cited numerous accounts sharing messages of understanding and hope that have helped them feel less alone and want to get better. The most influential of these in New Zealand is Jazz Thornton, the mental health advocate and author, who has more than two million followers on the platform.
But these accounts are drowned out by a deluge of much darker content, the users and researchers say.
One problem vexing clinicians and health officials is the amount of misleading and inaccurate information about mental health conditions circulating on the platform.
Diagnosing a young person with a psychological condition is a complex process that entails more than checking off a list of symptoms, clinicians say. But many teenagers are using social media to diagnose themselves after watching short clips that may provide an inaccurate or oversimplified description (“Here’s a simple test to see if you have ADHD”).
They warn that this is pathologising normal teenage behaviour and could further alienate vulnerable young people if they seek help from health professionals based on a self-diagnosis only to be told that they don’t qualify for specialist treatment.
Dykes worries most about the cumulative impact that watching a never-ending stream of clips about other people’s anguish and self-destructive impulses could have on the state of mind of an already troubled adolescent.
Among the thousands of TikTok posts Dykes has saved, there are many that she believes cross an ethical line: clips in which teens joke about misleading mental health professionals about plans to hurt themselves; videos that promote unhealthy eating and other types of self-harm, often using euphemisms to get around the platform’s content filter; videos that trivialise or romanticise suicide.
There are many others that aren’t so obviously objectionable, but which she believes could be detrimental when aggregated with similar posts and viewed by people who are emotionally vulnerable. A typical clip might show a disconsolate teen lip-synching to a sad song, or an image of an empty streetscape late at night, alongside text expressing that life is not worth living. “Everything is awful and it never gets better” is the prevailing sentiment, as Dykes puts it.
Dykes worries that, in seeking affirmation, some people overshare intimate details about painful events such as self-harm incidents and hospital admissions. This can be distressing for viewers dealing with their own trauma and inspire copycat behaviour.
The content can also contribute to a skewed portrayal of mental wellbeing, present illness as more authentic than recovery and turn suffering into performance. Users share ever more emotionally raw posts to get the validation of views, likes, and sympathetic comments. It can become a competition to be seen as sicker than your peers.
These dynamics are amplified by TikTok’s algorithm, which learns a user’s tastes from what they watch, like and comment on, and uses that data to tailor an endlessly scrollable stream of related posts. A user who tends to view clips about baking or cute cats may have a cheerful ‘For You’ page. One who is drawn to posts about depression and self-harm will have a more troubling experience.
This can have an escalating effect, Dykes says.
“People start off being interested in content that’s just mildly problematic, joking about being sad, and they get pushed down further and further until it’s, like, really, really harmful, toxic content.”
In December, the Center for Countering Digital Hate, an American charity, published a report claiming that TikTok recommended risky eating disorder and self-harm content to new teen accounts within minutes of them being on the platform.
Researchers registered new accounts in the US, UK, Canada, and Australia claiming to be 13 (the platform’s age limit), and briefly viewed and liked videos about body image and mental health.
“Within 2.6 minutes, TikTok recommended suicide content,” the report said. “Within eight minutes, TikTok served content relating to eating disorders.”
Imran Ahmed, the Center for Countering Digital Hate’s chief executive, said: “The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them and their physical and mental health.”
Dykes isn’t seeking to stigmatise the young people who post and view this content. She has been in their position. Many have turned to social media because they don’t get support and validation in the offline world. Some believe that by talking about their problems, they’re helping to bring awareness to mental illness and helping others to feel less alone.
But, she says: “There’s a very fine line with this kind of content between sharing your own experiences and raising awareness and being honest about what you’re going through, and giving ideas to sick people.”
“I learned so many unhealthy behaviours from videos that were supposed to be spreading awareness.”
‘We care deeply’
In a statement, TikTok told the Herald that it takes these issues seriously and has introduced measures to protect its users.
“We care deeply about the health and wellbeing of our community and work with partners in New Zealand and around the globe to better understand how to support them,” a spokesperson said.
“We do not allow content depicting, promoting, normalising, or glorifying activities that could lead to suicide, self-harm or disordered eating on TikTok. Keeping the platform safe is a top priority for us, with more than 40,000 ... professionals engaged around the globe to enforce our strict community guidelines.”
The platform claims to proactively remove most self-harm videos. It blocks searches for certain keywords relating to suicide, self-harm and eating disorders and diverts those users to mental health helplines, including Lifeline in New Zealand. It also lets users filter their feeds to screen out videos they would rather not see.
“Circumstances that involve any instance of a threat of real-world harm to human life that is specific, credible and imminent are reported to law enforcement authorities in New Zealand,” the spokesperson said.
Experts say this is a start, but that more needs to be done to protect vulnerable young users.
In the past, progress in curbing risky behaviour — smoking, dangerous driving — has come as a result of sustained, co-ordinated community and government action. Experts say that a similar response is required in this area.
That necessitates a deeper understanding of the factors driving the youth mental health crisis; harnessing the technologies that young people use to communicate to promote accurate and healthy information about mental wellbeing; providing better support offline for those who need it; and educating teens and parents about how to stay safe online.
But it can’t just be left to individuals and families who may be mentally and emotionally depleted to solve the problem, the experts say. There also needs to be robust legislative and regulatory action to make the tech giants improve safety measures, remove harmful content, implement stricter age controls and provide more transparency about their algorithmic processes.
In New Zealand, however, the Government has been behind the curve in this area, despite promises to improve youth mental health and suicide prevention.
Matthew Tukaki, director of the Ministry of Health’s Suicide Prevention Office, said in a statement: “We are extremely concerned about content on TikTok, and other social media channels, that depicts, promotes or normalises self-harm or suicide, particularly where methods are described or displayed.”
“While many social media companies, including TikTok, have policies that remove, hide or blur harmful content, and promote helplines or services when a person searches for certain harm-related words, these are not always reliably implemented or used. Unfortunately, people can and do manoeuvre around the measures and the safety algorithms that are in place so content can continue to circulate.”
Tukaki says the Suicide Prevention Office and other Government agencies are working together to co-ordinate activity in this area better and “meetings have been organised with each of the platforms within the next month”.
Dykes wants a lot more urgency.
“People need to start talking about it. People need to realise how dire it is. It had consequences for me, and it’s going to keep having consequences for young people until we do something about it.”
About this series
This article is part of a series about teenagers, mental health and social media.
In October, we revealed that the Coroner’s office launched a joint inquiry into suspected suicides by three young women who were connected on Instagram. Among them was Cassandra Fausett, a 17-year-old from South Auckland who endured a “horrendous” two-year spiral into mental illness that resulted in numerous suicide attempts, police callouts and hospital admissions.
Last month, we examined the community that they had been part of, talking to a young woman who was once also part of that world but now believes it was damaging to her mental wellbeing. “Anna” told us that she and her peers used private accounts to vent about their experiences with mental illness, sharing intimate details about self-harm incidents, hospital admissions and suicide attempts. It provided an outlet and a sense of validation that they didn’t get offline, but she worries that it also made her feel worse and complicated her recovery.
“I just don’t think people understand how graphic it is, and how toxic,” she said.
If you have information about this topic, please contact Investigations Editor Alex Spence at alex.spence@nzme.co.nz. Because of the volume of correspondence, we cannot reply to all the responses we receive but we will read all of them. We will not publish your name or identify you as a source unless you want us to.
Where to get help
If it is an emergency and you or someone else is at risk, call 111.
For counselling and support
Lifeline: Call 0800 543 354 or text 4357 (HELP)
Suicide Crisis Helpline: Call 0508 828 865 (0508 TAUTOKO)
Need to talk? Call or text 1737
Depression helpline: Call 0800 111 757 or text 4202
For children and young people
Youthline: Call 0800 376 633 or text 234
What’s Up: Call 0800 942 8787 (11am to 11pm) or webchat (11am to 10.30pm)
For help with specific issues
Alcohol and Drug Helpline: Call 0800 787 797
Anxiety Helpline: Call 0800 269 4389 (0800 ANXIETY)
OutLine: Call 0800 688 5463 (0800 OUTLINE) (6pm-9pm)
Safe to talk (sexual harm): Call 0800 044 334 or text 4334
All services are free and available 24/7 unless otherwise specified.
For more information and support, talk to your local doctor, hauora, community mental health team, or counselling service. The Mental Health Foundation has more helplines and service contacts on its website.