A way to reduce CSAM creation: Deep Unfakes

It’s hard to have an idea that’s both good and original, but people gotta try, right?

NCMEC is, I believe, the one organization authorized to collect, categorize, and process CSAM.

What if they took note of children appearing in CSAM for the very first time? How long would it take for a dozen or so to appear? I have no idea, but let’s say it’s a month. Then, looking over the various shots of each child, they find the one that gives the clearest image of the face and extract that face into a mugshot, with all hint of sexual activity edited out. Put those twelve mugshots together into a set and publicize it to the whole world. People look at it, and if they recognize any of the children, they inform NCMEC of who they have seen. With this identifying information, law enforcement should have an easy time finding the perpetrator and arresting him. Who would be motivated to look at this NCMEC creation? Anyone who is interested in curbing the creation of CSAM – which is, of course, CSA. It would only take seconds.

You might object that we have violated the privacy of the child, advertising to the world that they have been abused. But we have given no specific indication of HOW they have been abused. If the photo never finds its way onto the screen of anyone who knows the child, then there is little harm. If it does, the chances of rescue are high.

And naturally the main benefit here is to make the creators of CSAM very afraid. Unless the child victim is totally isolated from the world, the creators are in danger. Less CSAM would be made.

Pass any legislation needed to enable it. Why wouldn’t this work?

An additional possibility is to send out photos that are a mixture of ordinary kids who have not appeared in CSAM and ones who have. One advantage is that people would have a decent chance of getting some “hits,” instead of (for the vast majority of people) never seeing anyone they recognize. It could also protect against cases of mistaken identity, where a very similar-looking child is recognized, or cases where the makers have in fact used “deep fake” methods to graft a child’s photo onto a sexual situation – though creating such material is also illegal. The next month’s edition could list the children from the past month who were “fillers” (or mistaken identities), and they could be complimented for volunteering to be part of a program to fight CSA (their parents would have to give permission, of course).

The idea for this came to me from observing that at least one person of good will thought society should encourage the wide distribution of CSAM, precisely to maximize the chances of a child being recognized and rescued. That view brings into question the assumption that CSAM producers like to have lots of people see what they create, and that every time someone looks, it encourages the creation of more.

But with the face extracted from the sexual situation, the downsides are largely removed.


They already do something like that for human remains, and I wouldn’t be surprised if the purpose of that program is to perfect the technology for something like what you’re describing.
