The family of Molly Russell, who took her own life at 14, found she had been viewing material on Instagram linked to anxiety. Photo / PA
Instagram is to ban all graphic self-harm images from its platform following the controversy over Molly Russell's suicide, which her father blamed on the site.
Adam Mosseri, head of Instagram, said that in future "no graphic self-harm image" would be allowed on the platform.
With his company facing unprecedented criticism from ministers and charities over its failure to tackle self-harm imagery, Mosseri also disclosed that Instagram would change its search mechanisms so that all self-harm-related but non-graphic content would be harder for users to find.
This would entail self-harm terms and content being removed from "Explore", "Search", hashtagged pages and account recommendations.
"If there is self-harm related content that stays on the platform, even if it's admission orientated - maybe someone has a picture of a scar and says, 'I am 30 days clean' - it's going to be much more difficult to find," he said.
He said Instagram was also developing technology to blur remaining self-harm content and put it behind a privacy screen so people did not accidentally find and view it. The company also plans to increase the help it provides for self-harmers who use Instagram to share their experiences.
The move follows the death of Molly, 14, who took her life after viewing self-harm images. Her father, Ian, said Instagram "helped kill my daughter".
The announcement came as Mosseri this afternoon met with Matt Hancock, the Health Secretary, who last week warned social media firms they could be banned if they failed to remove harmful material.
Earlier, the Instagram boss had met Jeremy Wright, the Culture Secretary, who is due to publish in the next month his plans for new laws to regulate social media.
The legislation is expected to include plans for a regulator with powers to force social media companies to take down illegal material, such as violence and child abuse, within fixed time periods, and to purge harmful but legal content such as cyberbullying and self-harm imagery.
Asked why it had taken Instagram so long to tackle self-harm, and why it had acted only after the publicity surrounding Molly's death, Mosseri said: "We have not been as focused as we should have been on the effects of graphic imagery of anyone looking at content.
"That is something that we are looking to correct and correct quickly. It's unfortunate it took the last few weeks for us to realise that. It's now our responsibility to address that issue as quickly as we can."
He said the ban on graphic self-harm would be introduced as "quickly and responsibly" as the company could.
He pointed out that by allowing people to post content Instagram had helped save lives, by, for example, enabling friends to alert law enforcement agencies if someone was about to seriously harm themselves.
"I do want to be careful because there is a tension between wanting to act, and act quickly, and the need to act responsibly," he said.
"For instance, I don't want to do anything that will unintentionally stigmatise any sort of mental health issues. I don't want to do anything that will put a vulnerable person in a place where they feel unsupported or ashamed if we take that content down."
Where to get help:
If you are worried about your or someone else's mental health, the best place to get help is your GP or local mental health provider.
However, if you or someone else is in danger or endangering others, call police immediately on 111.