Australia’s eSafety Commissioner has launched enforcement action against a UK-based technology company whose services enabled AI-generated “nudify” images of Australian school children.
The action comes as both the Australian Federal Police (AFP) and the Commissioner report a rise in complaints about online child sexual exploitation (OCSE).