Governments, Big Tech Team Up Against Online Child Sexual Exploitation

Attorney General William Barr speaks about an initiative to prevent online child sexual exploitation as international politicians look on, at the Justice Department in Washington on March 5, 2020. (Samira Bouaou/The Epoch Times)
Charlotte Cuthbertson
3/8/2020
Updated: 3/13/2020
WASHINGTON—Powerful testimony from the “Phoenix 11,” a group of child sex abuse survivors, kicked off the announcement of a new collaboration between governments and tech companies to stop the online sexual abuse of children. 
“Last year, we all took a bold step to overcome the fears about ourselves, to band together to become a force for change,” the group said in a video presented at the Department of Justice (DOJ) on March 5. 
“We are survivors of sexual torture, child rape, erotic photoshoots, pedophile sleepovers, elementary school sex shows, streaming BDSM, and twisted sexual desires whose digital images were trafficked worldwide to fulfill the endless needs of an evil perverted community which takes pleasure from our pain.”
In an effort to prevent child sexual abuse online, the United States, along with the UK, Canada, Australia, and New Zealand, has created new voluntary principles that tech companies are promoting.
Facebook, Twitter, Google, Microsoft, Roblox, and Snapchat have all endorsed the principles, which ask them to prevent child sexual abuse material from being made available on their platforms and to take action against advertising, the solicitation of children, and the livestreaming of child sexual abuse.
The principles stop short of asking tech companies to address end-to-end encryption, the subject of an ongoing tussle over where the line lies between protecting users' privacy and shielding criminals. The DOJ has said end-to-end encryption without a backdoor for law enforcement stymies criminal investigations. Tech companies say a backdoor presents a security risk for users.
“Predators’ supposed privacy interests should not outweigh our children’s privacy and security,” said Attorney General William Barr on March 5. “Technology has made it easier to produce, conceal, and distribute child sexual abuse materials.” 
Over the last decade, the DOJ has seen a 160 percent increase in cases involving the production of videos and images of children being sexually exploited and abused.
The CyberTipline of the National Center for Missing and Exploited Children (NCMEC) received 1.1 million reports of child online sexual exploitation in 2014.
In 2019, it received 16.9 million reports. Barr credited Facebook for submitting 94 percent of those tips (almost 16 million).
The tipline reports included 69.1 million images, videos, and other files related to child sexual exploitation, according to NCMEC.
“Sexual-abuse imagery can be preserved online for much longer periods of time and disseminated more broadly,” Barr said. “Victims incur not only the initial harm of abuse, but are victimized again and again when those images are recirculated. For example, sexual abuse imagery of one particular victim has been found in almost 21,500 separate U.S. investigations over the last 20 years.” 
UK Security Minister James Brokenshire speaks about an initiative to prevent online child sexual exploitation, as Australia Minister for Home Affairs Peter Dutton (L) and Canada Minister of Public Safety and Emergency Preparedness Bill Blair look on at the Justice Department in Washington on March 5, 2020. (Samira Bouaou/The Epoch Times)
UK Security Minister James Brokenshire said that the problem is vast and the new coalition with the tech industry is an important milestone in addressing it. 
“Child sex offenders exploit technological advances to inflict misery, sharing vile materials and tips on how to target children. They unite to isolate, ensnare, and manipulate our young people and cause pain and suffering that can last a lifetime,” he said. 
“The scale of the global threat is horrifying and we know that it’s getting worse. More than 3.5 million accounts are now registered to the world’s most depraved dark web sites.”
Brokenshire said encryption remains the “elephant in the room,” and pointed to Facebook Messenger. Last year, he said, Facebook submitted around 12 million reports of child sexual exploitation on its Messenger app. 
Facebook now plans to encrypt the app, much as it did with its WhatsApp messaging app.
“End-to-end encryption ensures only you and the person you’re communicating with can read what’s sent, and nobody in between, not even WhatsApp,” states Facebook’s WhatsApp site.  
Facebook added end-to-end encryption to its WhatsApp service in April 2016, but in August of that year, WhatsApp's privacy policy was updated to include “limited data sharing” with Facebook.
In 2019, Facebook CEO Mark Zuckerberg said the company is moving toward encryption. 
“In the last year, I’ve spoken with dissidents who’ve told me encryption is the reason they are free, or even alive,” he said. “When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion. We have a responsibility to work with law enforcement and to help prevent these wherever we can.”
But Brokenshire said encrypting Facebook Messenger would mean Facebook would be blind to the 12 million instances of child sexual exploitation. 
“I’ve got to say that putting our children at risk for what I believe are marginal privacy gains is something I really struggle to believe any of us want,” Brokenshire said.
In Congress, a group of bipartisan senators introduced a bill on March 5 that would create incentives for companies to “earn” liability protection for violations of laws related to online child sexual abuse material.
The Eliminating Abusive and Rampant Neglect of Interactive Technologies Act (EARN IT Act) would amend Section 230 of the Communications Decency Act.
It would also allow individuals to sue tech companies that don’t comply with best practices or establish reasonable practices to prevent online child exploitation.
“Technological advances have allowed the online exploitation of children to become much, much worse over recent years,” said Sen. Dianne Feinstein (D-Calif.). “Companies must do more to combat this growing problem on their online platforms.”
The Facebook and WhatsApp apps are displayed on an iPhone in this file photo illustration. (Justin Sullivan/Getty Images)

The 11 Principles 

Principle 1: Companies seek to prevent known child sexual abuse material from being made available to users or accessible on their platforms and services, take appropriate action under their terms of service, and report to appropriate authorities.
Principle 2: Companies seek to identify and combat the dissemination of new child sexual abuse material via their platforms and services, take appropriate action under their terms of service, and report to appropriate authorities.
Principle 3: Companies seek to identify and combat preparatory child sexual exploitation and abuse activity (such as online grooming for child sexual abuse), take appropriate action under their terms of service, and report to appropriate authorities. 
Principle 4: Companies seek to identify and combat advertising, recruiting, soliciting, or procuring a child for sexual exploitation or abuse, or organizing to do so, take appropriate action under their terms of service, and report to appropriate authorities. 
Principle 5: Companies seek to identify and prevent child sexual exploitation and abuse facilitated or amplified by livestreaming, take appropriate action under their terms of service, and report to appropriate authorities.
Principle 6: Companies seek to prevent search results from surfacing child sexual exploitation and abuse, and seek to prevent automatic suggestions for such activity and material.
Principle 7: Companies seek to adopt enhanced safety measures with the aim of protecting children, in particular from peers or adults seeking to engage in harmful sexual activity with children; such measures may include considering whether users are children.
Principle 8: Companies seek to take appropriate action, including providing reporting options, on material that may not be illegal on its face, but with appropriate context and confirmation may be connected to child sexual exploitation and abuse.
Principle 9: Companies seek to take an informed global approach to combating online child sexual exploitation and abuse and to take into account the evolving threat landscape as part of their design and development processes.
Principle 10: Companies support opportunities to share relevant expertise, helpful practices, data, and tools where appropriate and feasible. 
Principle 11: Companies seek to regularly publish or share meaningful data and insights on their efforts to combat child sexual exploitation and abuse.

For Help or to Submit a Tip

Contact NCMEC
1-800-843-5678