Hawley Rips YouTube for Putting Profits Before Protecting Kids From Pedophiles

July 9, 2019 Updated: July 10, 2019

WASHINGTON—Sen. Josh Hawley (R-Mo.) told a Senate Judiciary Committee hearing on July 9 that he is “sickened” by a report of digital giant YouTube’s “refusal” to change its algorithm to protect children from pedophiles using the site to find and “groom” potential victims.

“YouTube admitted they could do something about it. They could stop auto-referring these videos of minors to pedophiles, but they chose not to do so,” Hawley told the hearing.

“Why not? Because their [business] model is that 70 percent of their business, 70 percent of their traffic, comes from these auto-recommended videos,” Hawley said. “In other words, ad revenues would be lost if they actually took some enforcement steps to stop this exploitation of children.”

The Missouri Republican was referring to a June 3, 2019, New York Times article that says YouTube’s algorithm collects otherwise innocent videos of sometimes partially clothed children and places them in a huge repository, which is then recommended to adults who have viewed sexually oriented materials on the site.

“But YouTube has not put in place the one change that researchers say would prevent this from happening again: turning off its recommendation system on videos of children, though the platform can identify such videos automatically,” the article states.

“The company said that because recommendations are the biggest traffic driver, removing them would hurt ‘creators’ who rely on those clicks. It did say it would limit recommendations on videos that it deems as putting children at risk.”

Hawley told the hearing, “This report was sickening.”

He was so upset that he introduced his “Protecting Children From Online Predators Act of 2019” shortly after the New York Times story was published.

Hawley was Missouri attorney general prior to being elected to the Senate in 2018 and has since become a prominent thorn in Silicon Valley’s side. He has also introduced legislation to force companies such as YouTube, Google, and Facebook to make public how they monetize data about their users.

A third Hawley bill focused on Silicon Valley firms was introduced earlier this year and is entitled “Ending Support for Internet Censorship Act of 2019.”

That measure would end digital firms’ immunity from publisher liability under Section 230 of the Communications Decency Act unless they submit “clear and convincing evidence” that their algorithms and content-removal policies are politically neutral.

During the hearing, Hawley asked a panel of witnesses if “for some of these companies, aspects of their business model actually conflict with protecting the safety of children.”

Witness Christopher McKenna, founder and president of Protect Young Eyes, responded that he “absolutely agree[s] that the business model based on reach and engagements is one that absolutely conflicts with protection” of children.

McKenna’s firm provides an app and other resources designed to help parents protect their children from online dangers, including sexual predators and traffickers.

McKenna said sites push engagement—the process of connecting users with other users—because it is the most effective way of generating ad revenue.

During his testimony prior to Hawley’s questioning, McKenna described his organization’s experience on Instagram.

“In March 2019, CNN reported that Instagram was the leading social media platform for child grooming by sexual predators,” McKenna told the committee.

“Our own test accounts quickly discovered that young people, particularly young girls, can be hunted like prey. We started an Instagram account with two stock photos and tried to mimic the behavior of an average teen girl. We posted two selfies with three hashtags each, searched a few hashtags, and liked a few photos.

“Within a week, we had dozens of men sending us images of their penises, telling us they were horny, and sending us pornography through direct messages—even after we told all of them that we were ‘only 12.’ They were relentless.”

Professor Angela Campbell of the Georgetown University Law Center said in response to Hawley that “YouTube actually has a product intended for children, called ‘YouTube Kids,’ and it’s got some good policies.

“The problem is, again, they’re not really enforcing those policies. There is a lot of content, even on YouTube Kids, that is inappropriate, and we complain to the FTC about this.”

Hawley’s bill, according to his office,

  • Prohibits video-hosting websites from recommending videos that feature minors. Those videos, however, could still appear in search results.
  • Would apply only to videos that primarily feature minors, not videos that simply have minors in the background.
  • Would exempt professionally produced videos, such as prime-time talent-show competitions.
  • Requires the Federal Trade Commission to impose criminal penalties and stiff fines for violations.

“To its credit, YouTube is trying to tweak its algorithm and promises that it will limit some recommendations,” Hawley said in a fact sheet on his proposal.
