Apple details reasons to abandon CSAM-scanning tool, more controversy ensues

Image credit: Leonardo Munoz/Getty

In December 2022, Apple said that it was killing an effort to design a privacy-preserving iCloud photo-scanning tool for detecting child sexual abuse material (CSAM) on the platform. Originally announced in August 2021, the project had been controversial since its inception. Apple first paused it that September in response to concerns from digital rights groups and researchers that such a tool would inevitably be abused and exploited to compromise the privacy and security of all iCloud users. This week, a new child safety group known as Heat Initiative told Apple that it is organizing a campaign to demand that the company detect, report, and remove child sexual abuse material from iCloud.

→ Continue reading at Ars Technica
