"On Sunday night [US time], clips of a suicide that had been livestreamed on Facebook circulated on other platforms, including TikTok," she said.
"Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide.
"We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who've reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.
"If anyone in our community is struggling with thoughts of suicide or concerned about someone who is, we encourage them to seek support, and we provide access to hotlines directly from our app and in our Safety Centre."
Facebook spokeswoman Gina Murphy said, "We removed the original video from Facebook last month on the day it was streamed and have used automation technology to remove copies and uploads since that time. Our thoughts remain with [the victim's] family and friends during this difficult time."
TikTok Australia-New Zealand GM Lee Hunter later added, "We understand and share the concerns expressed by the Prime Minister, eSafety Commissioner and the wider community that materials like this are made and shared. We are working closely with local policymakers and relevant organisations to keep them informed. We understand the serious responsibility that we have, along with all platforms, to effectively address harmful content and we want to reiterate that the safety of our users is our utmost priority."
Facebook tightened rules around acceptable content and bolstered AI and human filters after the Christchurch mosque shootings were livestreamed, but resisted calls to disable the feature - in contrast to Google-owned YouTube, which turned off livestreaming for most mobile users.
The suicide clip that surfaced on Monday prompted warnings from the NZ Mental Health Foundation and Netsafe, while Chief Censor David Shanks attempted to contact the China-based TikTok over his concerns.
Multiple copies of clip online
This morning, New York-based researcher Eric Feinberg, vice president at Coalition For A Safer Web, told the Herald he had been unable to locate any copies of the clip on TikTok, but he did find nine posts on Facebook-owned Instagram that featured copies of the raw video (that is, the original, unaltered clip), plus one copy on the lower-profile Telegram.
Feinberg previously alerted the Herald to multiple copies of the Christchurch mosque gunman's livestream video that kept appearing on Facebook, Instagram and YouTube up to and beyond the first anniversary of the atrocity. They included "masked" copies, such as one clip on Facebook that at first appeared to be a video game.
There have been reports that the distressing TikTok clip has also been repackaged with innocuous content, so the platform should still be approached with caution.
TikTok recently claimed explosive growth in New Zealand, reaching 1.1 million monthly active users last quarter (78 per cent were between 13 and 17 years old, and 66 per cent were female).
"We've seen more complaints about content on TikTok as it has grown in popularity," Netsafe chief executive Martin Cocker told the Herald this morning.
"Netsafe now has an operational connection with the TikTok Trust and Safety Team and we can help New Zealanders with any issues they experience on their platform."
Cocker says his organisation is the lead agency for enforcing the Harmful Digital Communications Act.
He has been fielding calls from multiple schools, with Netsafe helping them to prepare advisories to parents and caregivers.
A typical example, from Hobsonville Point Secondary School principal Morrie Abraham, reads:
"There is a disturbing video circulating on social media at the moment. This video contains graphic, upsetting content and it has been copied and shared across numerous social media platforms. Social media platforms are working to remove this video but it is expected to still be online in various places for some time.
"Young people are vulnerable to both intentionally and unintentionally coming across upsetting content online. We recommend that you follow Netsafe's guidelines on managing this with the young people in your lives:
• Try not to assign blame about how they came across the material
• Reassure them that it isn't their fault
• Don't trivialise what they have seen by saying that the material may not be real (it is important to deal with their feelings first)
• Provide comfort and assurance
• Normalise their response, e.g., "It's normal to be scared/angry/upset/confused"
• Don't overreact by taking away the technology – this will make them less likely to talk to you if something else happens and it can make them feel like they are to blame. Make sure that they know you are glad that they came to you about it."
Cocker says Netsafe has also posted two advisories that could help parents grapple with the situation: "Understanding TikTok" and "Helping young people exposed to upsetting content."
TikTok has recently been involved in separate allegations that it shares data with the Chinese government (leading to a possible sell-off of its US, Australian and NZ arms), and that it has censored content related to Black Lives Matter and other politically sensitive topics. The company has denied both claims.
Where to get help
If it is an emergency and you feel like you or someone else is at risk, call 111.
• Lifeline: 0800 543 354, available 24/7
• Suicide Crisis Helpline: 0508 828 865 (0508 TAUTOKO), available 24/7
• Youth services: (06) 3555 906
• Youthline: 0800 376 633
• Kidsline: 0800 543 754, available 24/7
• Whatsup: 0800 942 8787, available 1pm to 11pm
• Depression helpline: 0800 111 757, available 24/7
• Rainbow Youth: (09) 376 4155
• Helpline: 1737