Instagram has agreed to ban graphic images of self-harm after objections were raised in Britain following the suicide of a teen whose father said the photo-sharing platform had contributed to her decision to take her own life.
Instagram chief Adam Mosseri said on the evening of Feb. 7 that the platform is making a series of changes to its content rules.
He said: “We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable in our community.”
Mosseri said further changes will be made.
“I have a responsibility to get this right,” he said. “We will get better and we are committed to finding and removing this content at scale, and working with experts and the wider industry to find ways to support people when they’re most in need.”

The call for changes was backed by the British government after the family of 14-year-old Molly Russell found material related to depression and suicide on her Instagram account after her death in 2017.
Her father, Ian Russell, said he believes the content Molly viewed on Instagram played a contributing role in her death, a charge that received wide attention in the British press.
Dad Ian Russell blames Instagram for his daughter Molly's death. He says parents must talk to their children about the dangers that exist on social media and he's calling for Instagram to take greater action to remove harmful content.
Posted by This Morning on Wednesday, January 30, 2019
The changes were announced after Instagram and other tech firms, including Facebook, Snapchat and Twitter, met with British Health Secretary Matt Hancock and representatives from the Samaritans, a mental health charity that works to prevent suicide.
Instagram is also removing non-graphic images of self-harm from searches.
Facebook, which owns Instagram, said in a statement that independent experts advise that Facebook should “allow people to share admissions of self-harm and suicidal thoughts but should not allow people to share content promoting it.”
Social-Media Bosses Could Be Held Liable for Harmful Content
By Jane Gray
LONDON—U.S. social-media bosses could face arrest if they fail to moderate harmful content on their sites, the UK minister for suicide prevention has suggested.
Jackie Doyle-Price’s comments come as the UK government prepares legislation to regulate social media sites.
Speaking on BBC Radio 4’s “Today Programme,” Doyle-Price said that social media companies should be treated like publishers and be held accountable for the images and videos distributed on their platforms.
“When someone decides to seek that [self-harm] content on the internet and they suddenly find themselves because of the hashtags bombarded with similar material, that behavior of self-harm and suicide becomes normalized,” she told “Today.”
She added, “I firmly believe that once you draw people into those communities that they then return to as they feel comfortable with people of like mind, it’s akin to grooming.”
“If the [social media firms] have the intelligence to bombard me with adverts that reflect the fashion that I like then they have the intelligence to be able to remove this content. There is no excuse,” she said.
When asked whether the bosses of American-owned social media companies that have broken the law could face arrest if they set foot in the UK, she said, “Nothing is off the table.”
There’s growing public concern over the death of Molly Russell, 14, whose father discovered chilling images about suicide on her Instagram account after she took her own life.
Molly was found dead in her bedroom in November 2017. Her father, Ian, said that Instagram and Pinterest contributed to her death.
A recent survey conducted by YouGov on behalf of the Prince's Trust found that in 2009, 9 percent of 16- to 25-year-olds disagreed with the statement that life is “really worth living.” In 2019, that figure doubled to 18 percent.
According to the study, overwhelming pressure from social media is leading to a sense of inadequacy, with 46 percent thinking that comparing themselves with others on social media makes them feel “inadequate.”
Instagram boss Adam Mosseri said that the platform doesn't allow posts that encourage suicide or self-harm, but he admitted that more work needs to be done, and that not enough harmful images are found before they appear on users’ accounts.
Writing in a comment piece for the Telegraph, he said, “I have been deeply moved by the tragic stories that have come to light this past month of Molly Russell and other families affected by suicide and self-harm.”
He said Instagram has put measures in place to stop recommending related content, such as images, hashtags, and accounts, and is applying sensitivity screens to hide self-harm content.
He also said it’s a difficult balance. “Suicide and self-harm are deeply complex and challenging issues that raise difficult questions for experts, governments, and platforms like ours.”
“We don’t want to stigmatize mental health by deleting images that reflect the very hard and sensitive issues people are struggling with.”
Mosseri is due to meet the UK’s health secretary, Matt Hancock, on Thursday.
It is time for internet & social media providers to step up & protect children online. My letter to social media firms: pic.twitter.com/eK6FaKPgKc
— Matt Hancock (@MattHancock) January 28, 2019
In a letter to social-media giants Twitter, Snapchat, Pinterest, Apple, Google, and Facebook (which owns Instagram) last week, Hancock wrote that the sites should “purge” self-harm content from social media.
“It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people. It is time for internet and social media providers to step up and purge this content once and for all,” he wrote.
Digital minister Margot James said that the UK government will publish a white paper, followed by a consultation over the summer, outlining new legal measures for online platforms.
“Internet companies have always enjoyed legal protection from liability for user-generated content. This laissez-faire environment has led some companies to pursue growth and profitability with little regard for the security and interests of their users,” she said.
“There is far too much bullying, abuse, misinformation, and manipulation online as well as serious and organized crime online.”
Staying Safe Online
The National Society for the Prevention of Cruelty to Children has these tips to keep your child safe online:
- Talk about staying safe online
- Explore their online world together
- Agree on rules
- Manage your family’s settings and controls
Visit the NSPCC website for more advice.
In the UK, call Papyrus on 0800 068 4141 if you’re distressed by Molly’s story or you know someone who needs help.
If you’re in America, call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255) or contact the Crisis Text Line by texting TALK to 741741.