Apple Removes All References to Controversial CSAM Scanning Feature From Its Child Safety Webpage

This means one of two things:

  1. They finally realized how stupid the idea was
  2. They’re going to be doing it secretly

Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods.

Apple in August announced a planned suite of new child safety features, including scanning users’ iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook’s former security chief, politicians, policy groups, university researchers, and even some Apple employees.

The majority of criticism was leveled at Apple’s planned on-device CSAM detection, which was lambasted by researchers for relying on dangerous technology that bordered on surveillance, and derided for being ineffective at identifying images of child sexual abuse.
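For anyone unfamiliar with what was actually proposed: the on-device detection works by reducing each photo to a perceptual fingerprint and comparing it against a database of fingerprints of known abuse imagery, with the account flagged once enough matches accumulate. The Swift sketch below is purely illustrative; the fingerprint function, hash values, and threshold are made up for the example and are not Apple's NeuralHash or its actual threshold and voucher scheme.

```swift
import Foundation

// Illustrative sketch only, NOT Apple's NeuralHash pipeline.
// A perceptual hash reduces an image to a short fingerprint meant to stay
// stable under resizing or recompression; detection then checks each
// fingerprint against a database of hashes of known images.

/// Placeholder fingerprint: a toy 64-bit hash of the raw bytes.
/// A real system would use a neural perceptual hash of the image contents.
func perceptualHash(of imageData: Data) -> UInt64 {
    imageData.reduce(UInt64(5381)) { ($0 << 5) &+ $0 &+ UInt64($1) }
}

/// Hashes of known images, supplied by a third party and opaque to the user.
/// The value here is a made-up example.
let knownHashes: Set<UInt64> = [0x1234_5678_9abc_def0]

/// The device counts photos whose fingerprints match the database and
/// reports only once the number of matches crosses a threshold.
func scan(photos: [Data], threshold: Int) -> Bool {
    let matches = photos.filter { knownHashes.contains(perceptualHash(of: $0)) }
    return matches.count >= threshold
}

// Example: two photos, reporting threshold of one match.
let flagged = scan(photos: [Data([0x01, 0x02]), Data([0x03])], threshold: 1)
print(flagged) // false: neither toy fingerprint is in the database
```

The criticism summarized above targets exactly this design: the match database and threshold are outside the user's control, and a perceptual fingerprint can collide or be repurposed for other content.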


While I’m hoping for the first one, I’d be willing to put money (if I had any) on the second :frowning:

“Let’s search for photos of naked children on our clients’ phones” → former Apple trainee.

It seems like every big tech company has these scanners, so if pedophiles are still trading images, the scanners must not be very effective. Honestly, I think this is just an excuse to collect more data.


I wouldn’t jump to those conclusions. I’d be all right with it if Apple were more transparent about its implementation of such tools, and if the tools were guaranteed to be limited to materials or circumstances involving real children.


No amount of transparency can redeem this bad idea. Furthermore, any guarantee that surveillance will be “limited to materials or circumstances involving real children” is a fantasy.

Apple’s proposal is merely the beginning – within 5 years, unless there is major pushback, this will be the norm across all devices capable of being connected to the internet.

Overcollection and overreporting are going to be the new normal. The European Union’s Chat Control proposal will subject everyone’s communications to scrutiny – not just images, but text as well. This is going to be done automatically, without any evidence or even suspicion. An entire population’s privacy is going to be sacrificed on the altar of allegedly reducing child sexual abuse. This is utterly beyond the pale.

And this will expand to other areas as well. Child protection is a pretext to wield and hold on to power. Power is what they want.