End-to-End Online Encryption Shouldn’t Apply to Children’s Accounts: Report

A teenage girl, who claims to be a victim of sexual abuse and alleged grooming, poses in Rotherham, England, on Sept. 3, 2014. (Christopher Furlong/Getty Images)
Lily Zhou
Dec. 8, 2020 | Updated: Dec. 8, 2020

Applying end-to-end encryption to private online messages could compromise children’s safety, a report said.

Anne Longfield, the children’s commissioner for England, who published the report (pdf) on Tuesday, said that tech firms’ blanket move to apply end-to-end encryption could protect perpetrators of child abuse.

“End-to-end encryption makes it impossible for the platform itself to read the contents of messages, and risks preventing police and prosecutors from gathering the evidence they need to prosecute perpetrators of child sexual exploitation and abuse,” Longfield wrote in the foreword of the report.

A survey included in the report showed that many children using online messaging platforms are much younger than the platforms’ minimum age requirements.

According to the survey, conducted in March, nine out of ten children aged between 8 and 17 use a messaging app or website.

“Usage of messaging platforms increases sharply with age, from 70 [percent] of children aged 8–10, 91 [percent] of children aged 11–13, to 97 [percent] of teenagers aged 14–17,” the report says.

The survey also found that nine in ten 12-year-olds and six in ten 8-year-olds had used a messaging app or site with a minimum age of at least 13.

The percentage of children sharing selfies and locations with friends and family, and the percentage of girls who send messages to strangers, both increase with age.

Almost one in ten children admitted to using a messaging service to talk to strangers, and one in 20 said they had shared photo or video selfies with strangers.

Among children aged 8–17, 38 percent reported having received content on a messaging platform that worried them or made them feel uncomfortable in the four weeks prior to the survey, and 10 percent said they had received such content from strangers.

“An NSPCC investigation found that Facebook, Instagram, and WhatsApp were used in child abuse image and online child sexual offences an average of 11 times a day in 2019,” the report reads.

“The rate of grooming offences committed in the UK appears to have further accelerated over the course of lockdown, with 1,220 offences recorded in just the first three months of national lockdown—Facebook-owned apps (Facebook, Instagram, Whatsapp) accounted for 51 [percent] of these reports and Snapchat a further 20 [percent].”

The report made a number of policy recommendations, including requiring platforms to verify their users’ ages, retain the ability to scan for child sexual abuse material, and not apply end-to-end encryption to children’s accounts.