Lawsuit Accuses Apple of Ignoring Child Sexual Abuse Content on iCloud

The company stopped using a tool for scanning child sexual abuse material, saying it posed a risk to user privacy, the complaint noted.
A man holds an Apple iPhone in a mobile phone store in Nantes, France, on Sept. 13, 2023. Stephane Mahe/Reuters
Naveen Athrappully
Apple is not doing enough to stop the spread of child sexual abuse material (CSAM) on its iCloud and iMessage services, a plaintiff alleged in a recently filed lawsuit.

The complaint, filed on Tuesday in the U.S. District Court for the Northern District of California, claimed Apple “knew that it had dire CSAM problem but chose not to address it.”