Google, Meta, Microsoft, and Apple will need to submit a report to the eSafety Commissioner every six months explaining how they are dealing with child abuse material on their platforms.
The eSafety office also requires social media services, including Discord and WhatsApp, to provide more detail, outlining how they are tackling AI-generated deepfake material of children, live-streamed abuse, and sexual extortion on top of child abuse material.