Facebook's director of policy for Australia and New Zealand, Mia Garlick, said: "Happily it's a sparingly used product, but it's a really important product because it's such an important moment in time."
Mrs Garlick said Facebook consulted with suicide survivors and prevention groups overseas and in New Zealand to create a system people were likely to respond to.
"We've tried to make sure the flow and the experience is as empathetic as possible and that it resonates with people even if they're going through a difficult time."
The tool includes suggested messages that people can send to friends, either to ask for help or to offer support.
For those looking to reach out, Facebook offers the text: "Hi [friend], I'm having a tough time at the moment, and I'm finding it really hard to talk with anyone. Please message me back."
Users can write their own message, but Mrs Garlick said research showed a blank screen could be too daunting.
The tool also offers people seeking help the chance to "take a deep breath" before moving on to some tips like going outside, relaxing or doing something creative.
"In a product technology design sense, you try to have as few click-throughs as possible to get from point A to point B. But what we've discovered is that when people are going through a difficult time, they prefer more options and they prefer more emotion-rich language because it resonates with how they're feeling."
NetSafe has welcomed the tool. The group's executive director, Martin Cocker, said because it was a community-driven tool it was only as effective as someone's friend group. But he believed Facebook had found a good middle-ground for being responsible for behaviour on its site.
Because there was no limit on how many times someone could be reported, Mr Cocker hoped the tool wouldn't be used against Facebook users.
"If someone uses those tools to bully someone else, then it's offensive on two fronts: those tools are there for a very serious purpose to help people in a difficult place and then it's bullying which is very unnecessary in the first place."
Patrick Walsh, chairman of the Online Safety Advisory Group, said the tool was a step in the right direction and hoped other social media platforms would follow Facebook's lead and create their own tools.
Instagram, owned by Facebook, can send messages to users searching hashtags deemed "at risk", while Twitter has a self-harm and suicide alert feature on its help page and Snapchat has a list of resources on its safety page. None offers a user-prompted alert system.
How to help a friend
• If someone posts something which concerns you, find the "click down" arrow on the top right of the post.
• Click "report post" confidentially. Choose the "I don't think this should be on Facebook" option.
• Choose from the list "It's threatening, violent or suicidal".
• You are then offered a list of options, including chatting to the friend with a suggested text, reaching out to a mutual friend or speaking with professionals and asking Facebook to look at the post before offering the person help.
Where to get help:
• In an emergency: call 111
• Lifeline: 0800 543 354 (available 24/7)
• Suicide Crisis Helpline: 0508 828 865 (0508 TAUTOKO) (available 24/7)
• Youthline: 0800 376 633, or text 234 (available 24/7), email talk@youthline.co.nz or live chat (between 7pm and 11pm) http://livechat.youthline.co.nz/mibew/chat?locale=en&style=youthline
• Kidsline: 0800 543 754 (available 24/7)
• Whatsup: 0800 942 8787 (1pm to 11pm)
• Depression helpline: 0800 111 757 (available 24/7)
• Rainbow Youth: (09) 376 4155 (weekdays 11am to 5pm)
• NetSafe: 0508 NETSAFE (0508 638 723), www.theorb.org.nz