Did you know there is a forensic tool that looks for "animated" "CSAM"?

https://assets-global.website-files.com/603bd9850e8442d1efe81a47/619aea869b20f10905d64de7_VICpoint%20White%20Paper%20File%20Path%20Classifier.pdf See page 3.

Due to the link limit, I have to make multiple posts.

https://safeonline.global/project-vic-international/
https://dev-safe-online-fund.pantheonsite.io/wp-content/uploads/2023/10/Online-Child-Safety-175.pdf Page 3.
By the way, they appear to have received funding from a U.K. Home Office-affiliated group.

I don’t think this is going to be successful. AI- and ML-based CSAM detection already suffers from severe false-positive and false-negative rates. It’s only a matter of time before companies either abandon these tools entirely and revert to tried-and-true perceptual hash-matching, or restrict them to flagging only known CSAM and derivatives of it. Holding these classifiers to a measured false-positive rate is the only effective way to push back; otherwise a war will be waged that cannot be won.
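To illustrate what I mean by hash-matching: here is a rough sketch of a difference hash (dHash) in Python, using Pillow. This is only a toy stand-in for proprietary systems like PhotoDNA, and the `max_dist` threshold is an illustrative assumption, not what any real deployment uses:

```python
# Toy difference-hash (dHash) sketch. Real deployments (e.g. PhotoDNA)
# use proprietary, far more robust hashes, but the matching idea is the same.
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    """Hash an image by comparing adjacent pixel brightness on a small grayscale grid."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A candidate matches only if it is within a small bit distance of a hash
# on a curated known-CSAM list; max_dist=4 is an illustrative assumption.
def matches_known(candidate: int, known_hashes: set[int], max_dist: int = 4) -> bool:
    return any(hamming(candidate, k) <= max_dist for k in known_hashes)
```

The point is that with a fixed hash list, the false-positive rate is a measurable function of the distance threshold, which is exactly the kind of accountability an ML classifier trying to judge novel images can’t offer.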

Petite/youthful adult models shouldn’t have to worry that someone sharing their nudes will get them arrested. The same goes for cartoonists and CGI artists.

Neither should the fans: https://www.fedbar.org/wp-content/uploads/2010/09/sidebar-september2010-pdf-1.pdf