Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods.
Apple in August announced a planned suite of new child safety features, including scanning users’ iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
Following the announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook’s former security chief, politicians, policy groups, university researchers, and even some Apple employees.
The majority of criticism was leveled at Apple’s planned on-device CSAM detection, which researchers lambasted for relying on dangerous technology that bordered on surveillance, and derided as ineffective at identifying images of child sexual abuse.
It seems like every big tech company has these scanners, so if pedophiles are still trading images they must not be very effective. Honestly, I think this is just an excuse to collect more data.
I wouldn’t jump to those conclusions. I’d be all right with it if they were more transparent about how such tools are implemented, and if the tools were guaranteed to be limited to materials or circumstances involving real children.
No amount of transparency can redeem this bad idea. Furthermore, any guarantee that surveillance will be “… limited to materials or circumstances involving real children” is a fantasy.
Apple’s proposal is merely the beginning – within 5 years, unless there is major pushback, this will be the norm across all devices capable of being connected to the internet.
Overcollection and overreporting are going to be the new normal. The European Union’s “chat control” proposal will subject everyone’s communications to scrutiny: not just images, but text as well. This will be done automatically, without any evidence or even suspicion. An entire population’s privacy is going to be sacrificed on the altar of allegedly reducing child sexual abuse. This is utterly beyond the pale.