Google, Meta Pressed to Ramp up Efforts to Tackle Child Abuse Material

Failure to respond could see services fined up to $782,500 a day.
eSafety Commissioner Julie Inman Grant at a press conference at Parliament House in Canberra, Australia, June 15, 2021. AAP Image/Mick Tsikas
Rebecca Zhu

Google, Meta, Microsoft, and Apple will need to submit a report to the eSafety Commissioner every six months explaining how they are dealing with child abuse material on their platforms.

The eSafety office also requires social media services, including Discord and WhatsApp, to go into more detail and outline how they are tackling AI-generated deepfake material of children, live-streamed abuse, and sexual extortion, on top of child abuse material.
