Social media companies must “purge” their websites of content that promotes self-harm and suicide, the health secretary says.
The demand comes after a father accused social media websites of playing a part in his daughter taking her own life.
Molly Russell was 14 when she was found dead in her bedroom in November 2017.
Her family said she had shown “no obvious signs” of severe mental health issues but they later found she had been viewing material on social media related to anxiety, self-harm and suicide.
Her father said algorithms used by Instagram had enabled her to view more harmful content, possibly contributing to her death.
Health Secretary Matt Hancock said he had written to a number of internet companies to remind them of their duty to act.
In his letter, he said: “I welcome that you have already taken important steps, and developed some capabilities to remove harmful content. But I know you will agree that more action is urgently needed.
“It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people.
“It is time for internet and social media providers to step up and purge this content once and for all.”
He added that the government is developing a white paper addressing “online harms”, and said it will look at content on suicide and self-harm.
“I want to work with internet and social media providers to ensure the action is as effective as possible.
“However, let me be clear that we will introduce new legislation where needed.”
In an interview with the Sunday Times, Molly’s father Ian Russell criticised both Instagram and the online scrapbooking site Pinterest, saying: “The more I looked [into Molly’s online accounts], the more there was that chill horror that I was getting a glimpse into something that had such profound effects on my lovely daughter.”
He added: “We went to one Molly was following and what we found was just horrendous.
“They seemed to be completely encouraging self-harm, linking depression to self-harm and to suicide, making it seem inevitable, normal, graphically showing things like cutting, biting, burning, bruising, taking pills.
“It was there, hiding in plain sight.
“We only looked at two sites because they were so harrowing… there’s no doubt that Instagram played a part in Molly’s death.”
An Instagram spokesman said it does “not allow content that promotes or glorifies eating disorders, self-harm or suicide and works hard to remove it”.
“However for many young people, discussing their mental health journey or connecting with others who have battled similar issues is an important part of their recovery.
“This is why we don’t remove certain content and instead offer people looking at, or posting it, support when they might need it most.”
An inquest into Molly’s death is expected later this year.
:: If you feel emotionally distressed or suicidal please call Samaritans for help on 116 123 or email email@example.com in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.