StyleGAN 2 for virtual CP

StyleGAN 2 is an AI known for synthesizing “near-perfect” human faces (skip to 2:02 in the video). They are NOT real people.

The software can synthesize many other things, like cars, cats, and birds. The program is trained on pictures belonging to the same category: if you want cats, the AI must be given many, many images of cats.

Some people might use the software to create hyper-realistic virtual CP.


One thing I would like to see is someone developing software that gives us the option of ONLY generating images based on photos taken of adults. I plan to create super-realistic 3D models in the future, and I do not want any minors to be involved in their development whatsoever. I know some software lets you choose what the generated face is based on. For example, excluding certain ethnicities, face shapes, and even… ages.

While I’m not into porn or gore, because I just don’t agree with them, I don’t want the people downloading my future 3D models to be restricted to only things I agree with. In other words, I want people to be able to use my 3D models to pose in lewd ways, which is very common in Second Life.

Which is why I want 3D models generated SOLELY from confirmed adult images. Posing a Second Life 3D model that was modeled on a real child in lewd ways probably constitutes creation and/or possession of child pornography, which are crimes. And let’s say someone uses this software to synthesize images based on images of REAL children. Would that constitute a violation of law?

We know morphed images of children, such as combining the face of a child with the body of an adult in porn, are illegal and constitute child pornography. Could the morph-related laws apply to AI images generated from the images of 50 children? That would mean AI-generated images synthesized from images of minors must never be used for pornography (and in some countries, such as New Zealand, gore is also prohibited). Now, I have no idea whether this would be a violation of law or not. Even if it is not, I have personal ethical issues with it.

Even if it were somehow possible to create photo-realistic CP with these AIs without synthesizing from images of real children, virtually indistinguishable CP still violates the law. https://www.law.cornell.edu/uscode/text/18/2256

TL;DR, I want a program that gives us the option of ONLY synthesizing images from adults. And I want the AI developers to really confirm the ages of their models, like checking their birth certificates in person, for example. And when it comes to synthesizing, let us choose the age range. My suggestion is that for anything involving gore or porn, the synthesis should be limited to images taken of people when they were 18+.

  1. It’s very sad that we have to ban photo-realistic CP when the child isn’t real. I guess it’s because the police don’t have the tools/resources to tell whether photo-realistic CP is real or virtual.

  2. We have a similar problem with photo-realistic deepfakes. In the future, it will be very hard to tell if a picture or video is real or a deepfake.

If someone finds a solution for 2), we might have a solution for 1).

  1. Correct, the issue is that it would be very difficult to tell the difference between fake CP and CSAI. It seems to me the justification for prohibiting “virtually indistinguishable fake CP” is that it makes it potentially very difficult to prosecute real CSAI criminals, because the criminal can use the “it’s just virtually indistinguishable fake CP” defense. I believe that was the motivation behind the PROTECT Act.

  2. I’m not really sure what the solution is. But if we find some way to verify whether something is fake or real (like some hidden code in JPG/PNG files that verifies it), it could be a useful tool against deepfakes, which could be used to annoy others or, worse, to put words into a president’s or politician’s mouth that they never said in an attempt to undermine a political faction in the eyes and ears of the public.
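The “hidden code in JPG/PNG files” idea resembles content-provenance schemes, where a capture device or publisher attaches a cryptographic tag to the file bytes so anyone with the key can detect tampering. A minimal sketch, assuming a shared verification key (real systems such as C2PA use asymmetric signatures instead; the key and function names here are hypothetical):

```python
import hashlib
import hmac

# Illustrative only: a shared key provisioned to the device and verifier.
SECRET_KEY = b"vendor-provisioning-key"

def tag_image(image_bytes: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag over the raw image bytes."""
    return hmac.new(SECRET_KEY, image_bytes, hashlib.sha256).digest()

def verify_image(image_bytes: bytes, tag: bytes) -> bool:
    """Return True only if the bytes are unmodified since tagging."""
    return hmac.compare_digest(tag_image(image_bytes), tag)
```

Any edit to the file, even a single bit, makes verification fail, so an untagged or re-tagged fake would not pass as an authentic capture.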

I’ve heard of a method for removing CP that involves a hash database. Maybe we could use this method to identify virtual CP. It could look something like this.

  1. Artist creates photo-realistic virtual CP.
  2. Artist signs a contract with Prostasia to acquire “No Children Harmed” certificate.
  3. Certified artist uploads material to the database.
  4. Material is given a hash.
  5. Material is distributed via the same database.
  6. Anyone can check the material against the database to verify it’s fake.
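The steps above can be sketched as an exact-match hash lookup. This is a minimal sketch with hypothetical names; a production system would use perceptual hashes (as in PhotoDNA-style databases) so that re-encoded or resized copies still match, which plain SHA-256 does not:

```python
import hashlib

def file_hash(data: bytes) -> str:
    """SHA-256 hex digest used as the database key."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical certified-material database: hash -> certificate ID.
certified: dict[str, str] = {}

def register(data: bytes, certificate_id: str) -> str:
    """Steps 3-4: certified artist uploads material, which is hashed."""
    h = file_hash(data)
    certified[h] = certificate_id
    return h

def is_certified(data: bytes) -> bool:
    """Step 6: anyone can check downloaded material against the database."""
    return file_hash(data) in certified
```

Because only the hash is published, the database itself never has to redistribute the material to let third parties verify it.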

From a purely legal perspective, this wouldn’t be any different than simply looking for the real deal, especially when it comes to convincing a jury that it isn’t.

AIs are not gods. They can’t pull things out of a vacuum.
You would have to teach it in one way or another how to create the models, much like how an artist might learn anatomy from legal sources.

If it’s children that look like adults, then this defeats the object of this whole exercise and there is plenty of legal material out there.

There is an easy solution. If everything is fake and so realistic that it is indistinguishable from reality, then people would look at that instead and you would have strong arguments against the current regime of highly draconian laws. You could largely stop enforcing them, except for going after physical abusers and anyone dubious enough to knowingly distribute problematic ones. Self-regulation? I don’t know.

It won’t stop all cases? Laws aren’t made to be absolutely perfect; that kind of thinking leads to EARN IT.

The same applies to an indistinguishable sexbot. The awful and flawed argument against them is that they’re not realistic enough to be useful, well if they are, then it pulls the veil over this censor apologist nonsense.

AI porn is nice. I’m not a MAP, so I will stick with AI adult porn.

I think Americans will have this to deal with:

(B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or

(C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.

(9) “identifiable minor”—
(A) means a person—
(i)(I) who was a minor at the time the visual depiction was created, adapted, or modified; or
(II) whose image as a minor was used in creating, adapting, or modifying the visual depiction; and
(ii) who is recognizable as an actual person by the person’s face, likeness, or other distinguishing characteristic, such as a unique birthmark or other recognizable feature; and

(B) shall not be construed to require proof of the actual identity of the identifiable minor.

https://www.law.cornell.edu/uscode/text/18/2256

I believe AI porn is trained on images of real people. Would AI porn trained on images of real children, or mostly of children, violate these laws? I think it’s best to avoid it. I also think there are ethical problems with using images of children for this purpose. I cannot condone this.

Maybe the admins of this website know?

Just out of curiosity, when you say “real people”, are you only referring to modern-day people, or are you including historical figures in that?


The problem with AI-generated CP is that it is based upon the fallacy that pedophiles become sexually ravenous towards every single child they see. Like all people, pedophiles have individual preferences.

Wouldn’t it be possible for an AI generator like this to accidentally produce content that is considered illegal, even if it wasn’t intended or designed to do so? Computers don’t have a sense of morals per se; they don’t understand that their bit flipping causes obscene output.

This is going to suck very much, considering the AI can be trained on photos of real children. So you would probably need another AI, trained to detect such material, to catch it before the generator produces it. What a mess.