Social-Media Bosses Could Be Held Liable for Harmful Content

February 6, 2019 Updated: February 10, 2019

LONDON—U.S. social-media bosses could face arrest if they fail to moderate harmful content on their sites, the UK minister for suicide prevention has suggested.

Jackie Doyle-Price’s comments come as the UK government prepares legislation to regulate social media sites.

Speaking on BBC Radio 4’s “Today Programme,” Doyle-Price said that social media companies should be treated like publishers and be held accountable for the images and videos distributed on their platforms.

“When someone decides to seek that [self-harm] content on the internet and they suddenly find themselves because of the hashtags bombarded with similar material, that behavior of self-harm and suicide becomes normalized,” she told “Today.”

She added, “I firmly believe that once you draw people into those communities that they then return to as they feel comfortable with people of like mind, it’s akin to grooming.”

“If the [social media firms] have the intelligence to bombard me with adverts that reflect the fashion that I like then they have the intelligence to be able to remove this content. There is no excuse,” she said.

When asked whether the bosses of American-owned social media companies that have broken the law would be liable for arrest if they set foot in the UK, she said, “Nothing is off the table.”

Jackie Doyle-Price. (Rob Stothard/Getty Images)

There’s growing public concern over the death of Molly Russell, 14, whose father discovered chilling images about suicide on her Instagram account after she took her own life.

Molly was found dead in her bedroom in November 2017. Her father, Ian, said that Instagram and Pinterest contributed to her death.

A recent survey conducted by YouGov on behalf of the Prince’s Trust found that in 2009, 9 percent of 16- to 25-year-olds disagreed with the statement that life is “really worth living.” In 2019, that figure doubled to 18 percent.

According to the study, overwhelming pressure from social media is leading to a sense of inadequacy, with 46 percent thinking that comparing themselves with others on social media makes them feel “inadequate.”

‘Deeply Moved’

Instagram boss Adam Mosseri said that the platform does not allow posts that encourage suicide or self-harm, but he admitted that more work needed to be done, and that too few harmful images are being found before they appear on users’ accounts.

Writing in a comment piece for the Telegraph, he said, “I have been deeply moved by the tragic stories that have come to light this past month of Molly Russell and other families affected by suicide and self-harm.”

He said Instagram has put measures in place to stop recommending related content, such as images, hashtags, and accounts, and is applying sensitivity screens to hide self-harm content.

The British Health Secretary says social media sites should “purge” self-harm content from social media. (Manan Vatsyayana/AFP/Getty Images)

He also said it’s a difficult balance. “Suicide and self-harm are deeply complex and challenging issues that raise difficult questions for experts, governments, and platforms like ours.”

“We don’t want to stigmatize mental health by deleting images that reflect the very hard and sensitive issues people are struggling with.”

Mosseri is due to meet the UK’s health secretary, Matt Hancock, on Thursday.

In a letter to social-media giants Twitter, Snapchat, Pinterest, Apple, Google, and Facebook (which owns Instagram) last week, Hancock wrote that the sites should “purge” self-harm content from social media.

“It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people. It is time for internet and social media providers to step up and purge this content once and for all,” he wrote.

Digital minister Margot James said that the UK government will publish a white paper, followed by a consultation over the summer, outlining new legal measures for online platforms.

“Internet companies have always enjoyed legal protection from liability for user-generated content. This laissez-faire environment has led some companies to pursue growth and profitability with little regard for the security and interests of their users,” she said.

“There is far too much bullying, abuse, misinformation, and manipulation online as well as serious and organized crime online.”

Staying Safe Online

The National Society for the Prevention of Cruelty to Children has these tips to keep your child safe online:

  • Talk about staying safe online
  • Explore their online world together
  • Agree on rules
  • Manage your family’s settings and controls

Visit the NSPCC website for more advice.

In the UK, call Papyrus on 0800 068 4141 if you’re distressed by Molly’s story or you know someone who needs help.

If you’re in America, call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255) or contact the Crisis Text Line by texting TALK to 741741.

From NTD News

Follow Jane on Twitter: @itsjanewriting