UK could ban social media over suicide images
The UK health secretary has warned that social media firms could be banned if they fail to remove harmful content.
Speaking on the BBC's Andrew Marr show, Matt Hancock said: "If we think they need to do things they are refusing to do, then we can and we must legislate."
But he added: "It's not where I'd like to end up."
The minister earlier called on social media giants to "purge" material promoting self-harm and suicide in the wake of links to a teenager's suicide.
Molly Russell, 14, took her own life in 2017 after viewing disturbing content about suicide on social media.
Speaking to the BBC, her father said he believed Instagram "helped kill my daughter".
Russell also criticised the online scrapbook Pinterest, telling the Sunday Times: "Pinterest has a huge amount to answer for."
Instagram responded by saying it works with expert groups who advise it on the "complex and nuanced" issues of mental health and self-harm.
Based on that advice, that sharing stories and connecting with others can be helpful for recovery, Instagram said it does not "remove certain content".
"Instead (we) offer people looking at, or posting it, support messaging that directs them to groups that can help."
But Instagram added it is undertaking a full review of its enforcement policies and technologies.
A Pinterest spokesman said: "We have a policy against harmful content and take numerous proactive measures to try to prevent it from coming onto and spreading on our platform.
"But we know we can do more, which is why we've been working to update our self-harm policy and enforcement guidelines over the last few months."
Facebook, which owns Instagram, said earlier it was "deeply sorry".
The internet giant said graphic content which sensationalises self-harm and suicide "has no place on our platform".
Papyrus, a charity that works to prevent youth suicide, said it has been contacted by around 30 families in the past week who believe social media had a part to play in their children's suicides.
"We've had a spike in calls to our UK helpline since the BBC first reported this six days ago, all saying the same thing," said a spokeswoman for the charity.
Hancock said he was "horrified" to learn of Molly's death and felt "desperately concerned to ensure young people are protected".
In a letter sent to Twitter, Snapchat, Pinterest, Apple, Google and Facebook (which owns Instagram), the minister "welcomed" steps already taken by firms but said "more action is urgently needed".
He wrote: "It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people.
"It is time for internet and social media providers to step up and purge this content once and for all."
He added that the government is developing a white paper addressing "online harms", which will look at content on suicide and self-harm.
Hancock explained: "Lots of parents feel powerless in the face of social media. But we are not powerless. Both government and social media providers have a duty to act.
"I want to make the UK the safest place to be online for everyone - and ensure that no other family has to endure the torment that Molly's parents have had to go through."
Molly was found dead in her bedroom in November 2017 after showing "no obvious signs" of severe mental health issues.
Her family later found she had been viewing material on social media linked to anxiety, depression, self-harm and suicide.
Russell told the BBC: "Some of that content is shocking in that it encourages self harm, it links self-harm to suicide and I have no doubt that Instagram helped kill my daughter."
Solicitor Merry Varney, who represents the Russell family, said Molly's case "and the examples of how algorithms push negative material" show a need to investigate online platforms, and how they could be "contributing to suicides and self-harm".