Apple details reasons to abandon CSAM-scanning tool, more controversy ensues

In December, Apple said that it was killing an effort to design a privacy-preserving iCloud photo scanning tool for detecting child sexual abuse material (CSAM) on the platform. Originally announced in August 2021, the project had been controversial since its inception. Apple first paused it that September in response to concerns from digital rights groups and researchers that such a tool would inevitably be abused and exploited to compromise the privacy and security of…