Apple Announces Limits to Child Sex Abuse Image-Scanning System After Privacy Backlash

A phone in a file photograph. Loic Venance/AFP via Getty Images
Tom Ozimek
Reporter

Apple on Aug. 13 provided new details of how its planned child sexual abuse material (CSAM) detection system would work, outlining a range of privacy-preserving limits following backlash over concerns that the software would introduce a backdoor threatening user privacy protections.

The company addressed concerns triggered by the planned CSAM feature, slated for release in an update for U.S. users later this year, in a 14-page document (pdf) outlining safeguards it says it will implement to prevent the system on Apple devices from erroneously flagging files as child pornography or being exploited for malicious surveillance of users.