Stable Diffusion

Has anyone heard about this?

And if so, what are your thoughts?


Well, if there is no way to separate real from CG then people can be blackmailed for things they never did, photographic evidence will be useless in court, politicians can be smeared by showing them with mistresses and/or gigolos (that might not be so bad :laughing:) and there will be no way to tell CG CSAM from real CSAM. As to the last, the only solution I can see will be to ban it all. Real or CG.

This technology is cool and has a lot of potential, beyond the obvious thing it can be used for that I’m sure everyone here is thinking of: simulated CSAM. The issue is the potential risk that the images produced this way might be indistinguishable from reality, and that poses a problem. However, it’s a problem we’re going to have to deal with one way or another, because once this is out there, it’ll be out there, and there’s no way of removing it. Even if you take down the official download, people will fork it, back it up, and reupload it.


I see this as a huge chance. We could flood the distribution of CSAM with CGI and satisfy any demand for such images without any real human being involved.

I believe that this technology could be, for lack of a better term, superior in almost every way to real CSAM as it could display the exact fantasies of a consumer, including things that are not even possible in real life. Meaning that for many consumers, there would simply be no reason to still search for real CSAM. The only reason to still look for real CSAM would be for people who get off on forbidden things, or for sadists who want to see the suffering of real children.

I think there is a real chance that this technology could eliminate the distribution of real CSAM. We have tried the current approach in vain for decades without noticeable effect; maybe it’s time to try a different one?


As much as I like the idea of flooding the internet with fake, harmless material in the hope of eliminating the demand for harmful CSAM, I would not suggest it, for two reasons:

  1. You would need to “train” the AI using real material, which is obviously illegal to possess.
  2. If the fake material is indistinguishable from a real human child, it’s still illegal (at least according to US law).

That’s true, and these are definitely issues to be discussed.

Although it is noteworthy that by law here in Germany, police officers can already create, use and distribute realistic CGI child pornography to gain access to CSAM distribution platforms. So in legal terms, not much would need to change in order for them to be able to also flood these platforms with generated images.


Exactly, the future is coming whether we want it to or not. Better to start talking about how we handle emerging technologies sooner rather than later. Child sex dolls will give way to realistic androids, and photoshopping and AI tech will become indistinguishable from the real deal. If not in our lifetime, then in the lifetime of our descendants. Scientific progress is unstoppable…


I doubt it. I’ve been paying very close attention to AI in particular, and while models like Stable Diffusion are impressive technological leaps, they’re still held back by limitations that will always allow the common man to distinguish their output from an organic image.

I just worry about the likenesses and faces of real children being potentially implicated by this material.

If not in our lifetime, then the lifetime of our descendants.

Hence why I said this. Currently, the tech is flawed; extremely impressive, but flawed. Maybe I’ll live to see it perfected, maybe not. But I sincerely believe it WILL get there eventually. Science and technology make the impossible possible. American Civil War vets lived to see cannons and muskets give way to airplanes and atom bombs. Technology is rapidly turning sci-fi into reality…

I am starting to appreciate that, at my somewhat advanced age, I will likely not live to see this. Therefore, I gleefully throw this problem into younger laps. May God have mercy on your souls. :smiley: