In the wake of the Christchurch terrorist attacks, Facebook's chief operating officer, Sheryl Sandberg, has written a letter to the Herald outlining the actions the social media giant will take.
The terrorist attacks in Christchurch were an act of pure evil.
All of us at Facebook stand with the victims, their families, the Muslim community, and all of New Zealand. It is deeply tragic when people face violence because of who they are and what they believe.
Over the past two weeks, the whole world has seen the compassion, unity and resilience you have shown as a country through your grief.
Many of you have also rightly questioned how online platforms such as Facebook were used to circulate horrific videos of the attack. We are committed to reviewing what happened and have been working closely with the New Zealand Police to support their response.
In the immediate aftermath, we took down the alleged terrorist's Facebook and Instagram accounts, removed the video of the attack, and used artificial intelligence to proactively find and prevent related videos from being posted.
We have heard feedback that we must do more – and we agree. In the wake of the terror attack, we are taking three steps: strengthening the rules for using Facebook Live, taking further steps to address hate on our platforms, and supporting the New Zealand community.
First, we are exploring restrictions on who can go Live depending on factors such as prior Community Standards violations.
We are also investing in research to build better technology to quickly identify edited versions of violent videos and images and prevent people from re-sharing these versions.
While the original video was broadcast on Facebook Live, we know it spread mainly through people re-sharing it and re-editing it to make it harder for our systems to block; we have identified more than 900 different videos showing portions of those horrifying 17 minutes.
People with bad intentions will always try to get around our security measures. That's why we must work to continually stay ahead. In the past week, we have also made changes to our review process to help us improve our response time to videos like this in the future.
Second, we are taking even stronger steps to remove hate on our platforms. We have long had policies against hate groups and hate speech.
We designated both shootings as terror attacks, meaning that any praise, support and representation of the events violates our Community Standards and is not permitted on Facebook.
We are also using our existing artificial intelligence tools to identify and remove a range of hate groups in Australia and New Zealand, including the Lads Society, the United Patriots Front, the Antipodean Resistance, and National Front New Zealand.
These groups will be banned from our services, and we will also remove praise and support of these groups when we become aware of it. And just this week we announced that we have strengthened our policies by banning praise, support and representation of white nationalism and separatism on Facebook and Instagram.
Finally, we are standing by the people of New Zealand and providing support to four local well-being and mental health organizations to raise awareness around their services within the country. We are also working with our existing youth partner to co-design additional education around peer support and resilience.
Through everything, we remain ready to work with the New Zealand Government's Royal Commission to review more widely the role that online services play in these types of attacks. We are also ready to work with the New Zealand Government on future regulatory models for the online industry in areas like content moderation, elections, privacy and data portability.
We know there is more work to do. We are deeply committed to strengthening our policies, improving our technology and working with experts to keep Facebook safe.