Apple isn’t doing enough to stop the spread of child sexual abuse material (CSAM) on its iCloud and iMessage offerings, a plaintiff alleged in a recently filed lawsuit.
The complaint, filed Tuesday in the U.S. District Court for the Northern District of California, claimed Apple “knew that it had [a] dire CSAM problem but chose not to address it.”