Instagram has agreed to ban graphic images of self-harm after objections were raised in Britain following the suicide of a teen whose father said the photo-sharing platform had contributed to her decision to take her own life.
Instagram chief Adam Mosseri said on the evening of Feb. 7 that the platform is making a series of changes to its content rules.
He said: “We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable in our community.”
Mosseri said further changes will be made.
“I have a responsibility to get this right,” he said. “We will get better and we are committed to finding and removing this content at scale, and working with experts and the wider industry to find ways to support people when they’re most in need.”

The call for changes was backed by the British government after the family of 14-year-old Molly Russell found material related to depression and suicide on her Instagram account after her death in 2017.
Her father, Ian Russell, said he believes the content Molly viewed on Instagram played a contributing role in her death, a charge that received wide attention in the British press.
The changes were announced after Instagram and other tech firms, including Facebook, Snapchat and Twitter, met with British Health Secretary Matt Hancock and representatives from the Samaritans, a mental health charity that works to prevent suicide.
Instagram is also removing non-graphic images of self-harm from searches.
Facebook, which owns Instagram, said in a statement that independent experts advise that Facebook should “allow people to share admissions of self-harm and suicidal thoughts but should not allow people to share content promoting it.”
Social-Media Bosses Could Be Held Liable for Harmful Content
By Jane Gray

LONDON—U.S. social-media bosses could face arrest if they fail to moderate harmful content on their sites, the UK minister for suicide prevention has suggested.
Jackie Doyle-Price’s comments come as the UK government prepares legislation to regulate social media sites.
Speaking on BBC Radio 4’s “Today Programme,” Doyle-Price said that social media companies should be treated like publishers and be held accountable for the images and videos distributed on their platforms.
“When someone decides to seek that [self-harm] content on the internet and they suddenly find themselves because of the hashtags bombarded with similar material, that behavior of self-harm and suicide becomes normalized,” she told “Today.”
She added, “I firmly believe that once you draw people into those communities that they then return to as they feel comfortable with people of like mind, it’s akin to grooming.”
“If the [social media firms] have the intelligence to bombard me with adverts that reflect the fashion that I like then they have the intelligence to be able to remove this content. There is no excuse,” she said.
When asked if American-owned social media bosses who have broken the law are liable for arrest if they set foot in the UK, she said, “Nothing is off the table.”
There’s growing public concern over the death of Molly Russell, 14, whose father discovered chilling images about suicide on her Instagram account after she took her own life.
Molly was found dead in her bedroom in November 2017. Her father, Ian, said that Instagram and Pinterest contributed to her death.
‘Deeply Moved’
Instagram boss Adam Mosseri said that the platform does not allow posts that encourage suicide or self-harm, but he admitted that more work needed to be done and that not enough harmful images are found before they appear on users’ accounts.

He said Instagram has put measures in place to stop recommending related content, such as images, hashtags, and accounts, and is applying sensitivity screens to hide self-harm content.
“We don’t want to stigmatize mental health by deleting images that reflect the very hard and sensitive issues people are struggling with.”
Mosseri is due to meet the UK’s health secretary, Matt Hancock, on Thursday.
In a letter to social-media giants Twitter, Snapchat, Pinterest, Apple, Google, and Facebook (which owns Instagram) last week, Hancock wrote that the sites should “purge” self-harm content from social media.
“It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people. It is time for internet and social media providers to step up and purge this content once and for all,” he wrote.
“Internet companies have always enjoyed legal protection from liability for user-generated content. This laissez-faire environment has led some companies to pursue growth and profitability with little regard for the security and interests of their users,” Doyle-Price said.
Staying Safe Online
The National Society for the Prevention of Cruelty to Children has these tips to keep your child safe online:

- Talk about staying safe online
- Explore their online world together
- Agree on rules
- Manage your family’s settings and controls