Germany producing artificial CSAM that is indistinguishable from reality

I couldn’t see any posts here regarding this, but last year Germany passed a new law granting police the ability to artificially produce CSAM that, according to the papers, is “indistinguishable from reality”.

The reason for this is that they want to use the material to enter CSA communities where such material is shared. Many lawyers have criticized this approach, not only for the moral issues, but also because the federal government’s reason for allowing its legal production was that “these images are fictional and there is no real victim”. That is the same argument others have made regarding cartoons and stories, only to be ignored.

There are huge ethical issues with this type of material, because it feeds the market. Also, who knows whether they accidentally created something that looks like me as a child back in the day - or that will look like someone who isn’t even born yet? This is a wrong and disgusting approach imo.

What are your thoughts?

3 Likes

The only thing that came to mind was:

  1. Police produce fake CP
  2. Share it
  3. Jail people for downloading CP (even though it isn’t real; but only the police know that)

The person in question would need to prove it’s them, though.

1 Like

So… if I were to create simulated CSAM that is “virtually indistinguishable from reality” I would receive a life sentence, but when the police do it, it’s cool?

This is one of the many reasons why I am so anti-police. I’m sick of their “we commit crimes to catch criminals” bullshit and total lack of accountability.

8 Likes

Can I get some sample materials for, umm, research?

Seriously though, I think they’re probably using a pretty lax definition of “indistinguishable”; there’s no way they can produce images in any significant volume that are going to be convincingly realistic.

3 Likes

Artists can already do that with pen and paper, so it should very much be possible digitally. People in the article debated whether it could be used for therapy - say, a place where you are allowed to view the material without electronics, to prevent sharing. Do you think that would help?

2 Likes

The keyword is “indistinguishable”, and I don’t think they could manage that in the large quantities that would be required in a community that is extremely cautious and skeptical. The chances of them actually getting any significant results from this seem pretty slim to me, and it’ll likely have the opposite effect of what is intended, only adding to the supply.

3D artworks should be legal; these would fall into the category of 3D artworks, and they themselves said “there’s no victim”.

1 Like

I agree. They are not helping anyone with this.

3D artworks should be legal; these would fall into the category of 3D artworks, and they themselves said “there’s no victim”.

The only reason this is banned is to make the lives of law enforcement easier, since 3D artwork can be photorealistic and actual abusers could just argue their material is art. I honestly don’t know what to think about this.

Japan probably has the most rational (or lax, depending on how you view it) CSAM law. If I remember correctly, they actually try to identify the person in the material and only care about their real age (with obvious limits where it’s 100% clear), while the majority of countries also take appearance (physique, demeanour) into account, so legal adults can be illegal if they appear to be a minor to “a neutral viewer”.

This is also why a lot of porn stars in Japan specialize in “lolicon”. One porn company hired 300 performers and put 150 of them to work in this category.

That’s… wayyyy easier said than done.

Do you know how hard it is to do that? To successfully cross the uncanny valley like that?
To this day, it’s not clear whether such a feat will ever be achieved, at least at any significant scale.
It would need to be put through its paces first.

If they’re disseminating this material to try to dilute the market for CSAM and divert focus towards fake stuff, then that’s good.
We can safely allow that content to exist and focus on materials that DO cause harm.

They have algorithms that can detect whether such images are real or fake, should a specific image be too difficult to tell by eye. I feel as though this could be a tremendous breakthrough in both technology and psychology, one that would eliminate the need for actual children to suffer real abuse while drawing a clear line between what’s legal and what isn’t.

2 Likes

Yeah, I get your point, and I agree since it’s effectively just fiction, but I doubt there will ever be a future where photorealistic art like this is legal - even if studies were to suggest it reduces actual abuse. It would be too much of a moral issue for the majority of people. And if it really were “indistinguishable”, as they say, it would also give a lot of people PTSD, tbh.

Imagine coming across these images and having to keep wondering: “Was this real, or was it fictional?”
Even if PhotoDNA knows it’s fictional, the average internet user might not. I myself have seen images I thought were real, but they turned out to be art.

There is also a very interesting new AI that can map already existing video footage from games and process it in such a way that it almost looks photorealistic. And this is all done live.

That is for live footage; now imagine it applied to already rendered, processed images.

I was at a restaurant, and I overheard the friend of a Vietnam war vet complaining that Miss Saigon triggers that veteran’s PTSD. I don’t see any major calls to ban that play. Whether or not it causes PTSD should hardly be a valued measure of anything.

2 Likes

I can’t see this getting very far. Would the government/police spend all that money producing CGI porn? It might now be legal for them to do so (and I suspect that could be challenged in the courts), but I suspect this is a publicity exercise. I’m also far from convinced that any CGI porn wouldn’t be recognisable as such - even big-budget feature films have difficulty making realistic humans.

I’m reminded of the announcement in the UK some years ago that the police would be setting up fake child porn sites, with a “click here for CP” link, followed by a series of “are you sure” options, leading to a statement that the user’s IP address has been logged and they should expect a visit.

It never happened. Partly because we don’t have an “it’s not illegal if the police do it” law, so the police would find themselves in court (it’s illegal here to advertise CP even if you haven’t got any), and partly because any suspect would have the defence that he knew it was a police site and wanted to see where it would lead, being perfectly safe because obviously the police wouldn’t distribute illegal material.

1 Like

(6) Paragraph 1 sentence 1 numbers 1, 2 and 4 and sentence 2 do not apply to official acts in the context of criminal investigations if:

  1. the act relates to child pornographic content that does not reflect actual events and has not been produced using a picture of a child or adolescent, and
  2. the investigation would otherwise be futile or significantly more difficult.

This allows them to produce and use this material.

Hm. As long as nobody IRL is used in its production, I don’t see the moral issues of it. I don’t care what people find disgusting, I just care if people are harmed in making it.

My concerns are that it could be confused with actual CP:

  1. Someone could get arrested over fiction because it’s so detailed.
  2. People who think they’re viewing a harmless simulation might actually be viewing an instance of CSEM, or actual CSEM blended in with the simulations.
  3. The images may resemble a real kid somewhere in the world, which would make prosecuting actual cases a nightmare, as you would need to prove whether the entity in the image/video was a real kid or not.

Though I suppose the obvious way around this (and 1) is to make the nonexistent entity an obviously inhuman character.

Also…

because the federal government’s reason for legally producing this was that “these images are fictional and there is no real victim”. That is the same argument others have made regarding cartoons and stories, only to be ignored.

Assuming I’m understanding this correctly… I was agreeing with the first half until the double standard of, “Realistic simulations of a crime? A-OK. Cartoonish simulations of the same crime, you’ve gone too far.” Not understanding the rationale there if that’s what that meant… Even by the logic of those who oppose lolicons, I can’t figure how that makes any sense…

… Please tell me I read that wrong.

1 Like

Nope. You read that correctly.

Hey, man, this isn’t the first time that Germany started something with really pure intentions and then screwed things up royally.

2 Likes

They are sadly also “fighting paedophilia” instead of sexual abuse.

The new regulation is also intended to send a signal to society that children - even if they are only modelled physically (dolls, drawings) - should not be made an object of sexual behaviour.

  • Page 26 (33 in PDF) of the source below
4 Likes

So they’re trying to ban drawings now too?

Only possession is legal. They said this regarding drawings:

As far as fictitious pornography goes, i.e. content that is recognizably artificial, the newly inserted § 184b paragraph 1 sentence 2 StGB differentiates the punishment for reasons of proportionality, and the new punishment framework will not apply here. The increased punishment should not apply to the distribution of child pornographic comics, drawings, literature or content in virtual worlds, because no real child is involved, and it is also not to be feared in the same way that fictitious representations will lead consumers to imitation.

2 Likes

I say: make up your damn mind.

Either virtual child pornography is harmless, because no real children are involved and therefore no one is hurt (which was how they argued that it should be legal for law enforcement). In this case there is no reason to outlaw it for anyone else, and it should be completely legal, period.

Or in some way or another virtual child pornography is still indirectly harmful (which is how they argued that it should be banned in the first place). In this case law enforcement should also not be able to use it, because then they as well contribute to more harm to children.

The situation right now is completely and disturbingly twisted. On the one hand, virtual child pornography is criminalized so they can throw pedophiles in jail for using it. On the other hand, it is then legalized for law enforcement so they can catch more pedophiles and throw them in jail. It’s just another case showing that the law has long since abandoned actually trying to protect children via constructive, humane and sensible policies, and that it is only about punishing people who are seen as “disgusting”, “perverted” and basically subhuman.

By the way, did you know that in Germany you can now get a lesser punishment for beating a child half to death (minimum sentence: 6 months in jail) than for possessing a single CGI image of a naked child (minimum sentence: 1 year in jail)?

Casablanca gambling? I’m shocked! - YouTube