Playing Hide and Seek: How to Protect Virtual Pornographers and Actual Children on the Internet

The article suggests labeling fictionally generated pornography as virtual. This could make it easier for people who download and share fictionally generated material to stay safe from false prosecution if authorities mistake it for illegal imagery, which in countries like the UK and NZ includes CSAM and bestiality porn.

If this labeling becomes mandated by law, it will probably allow creators to select from a variety of different labels. Also, how would the labeling work in practice? If someone creates a 211-page book of nothing but CGI porn, does only the back of the book need to be labeled, or every single image? What about people who take specific images out of it to share? Perhaps they might be required to “photoshop” the label onto each image?
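
One alternative to visibly “photoshopping” anything onto the picture would be embedding the label in the image file’s metadata. Here is a minimal sketch using Pillow; the key name and label wording are my own hypothetical choices, not anything proposed in the article:

```python
# Minimal sketch: attach a "virtual" label as a PNG text chunk with Pillow.
# The key ("Content-Label") and label text are hypothetical, not a real standard.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def add_virtual_label(src_path: str, dst_path: str) -> None:
    """Save a copy of the image with a text chunk marking it as fully CGI."""
    img = Image.open(src_path)
    meta = PngInfo()
    meta.add_text("Content-Label", "virtual: entirely computer-generated")
    img.save(dst_path, pnginfo=meta)  # dst_path should be a .png file

def read_label(path: str) -> str | None:
    """Return the label if present; PNG text chunks appear in Image.text."""
    return Image.open(path).text.get("Content-Label")
```

The obvious weakness is that metadata like this is trivially stripped or forged, which leads straight into the misuse question below.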

Maybe only have this requirement for extremely realistic CGI images that could easily be confused with an illegal image? I don’t see a need for the label requirement when a CGI avatar looks like an adult engaging in non-bestiality sexual activity. But if the CGI, taking the avatar, the clothes, and the scene as a whole, looks obviously under 18 or obviously involves fictional bestiality, I think a requirement makes much more sense than the current state of the law.

But a question: what about criminals attempting to hide their activity by putting these labels onto ACTUAL CSAM? I know hash lists could rule out any such attempt for known CSAM, but what about unknown material?
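
For context, a hash-list check only ever matches material that is already on the list. A minimal sketch of the idea, with the hash set purely hypothetical:

```python
# Minimal sketch of a hash-list check. KNOWN_HASHES is a hypothetical
# stand-in for a list supplied by a clearinghouse; it is empty here.
import hashlib

KNOWN_HASHES: set[str] = set()

def is_known(path: str) -> bool:
    """True only if this exact file already appears on the hash list."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest in KNOWN_HASHES
```

Real systems use perceptual hashes (e.g., PhotoDNA) so that re-encoded copies still match, but neither approach can flag never-before-seen images, which is exactly the gap the question points at: a label on unknown material has nothing to be checked against.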

https://digitalcommons.law.villanova.edu/cgi/viewcontent.cgi?article=1205&context=vlr

Producers and distributors of virtual pornography should have no objection to such labeling because it would allow them to exercise their First Amendment rights and protect them from unwarranted prosecution.

The government would benefit from the labeling because it would be easier to prove that defendants knew they possessed actual child pornography if the images did not include a virtual pornography label. Moreover, it would ease the evidentiary burden of establishing that an image was of an actual child. Concomitantly, the record-keeping provision would aid the possessors of alleged child pornography in establishing their affirmative defense that the images they possessed were completely virtual. They can buttress this claim by introducing into evidence the label required by 2257 that states that the images are of adults or are completely computer-generated.

The authors of the article criticize the current standard for what qualifies as “indistinguishable,” which is why they prefer the labeling alternative.

In addition to being overbroad, the new “indistinguishable from” language is unconstitutionally vague. Its definition of “indistinguishable” is linked to an “ordinary” person standard. Who is the “ordinary” person? Does it differ from a “reasonable” person? If it is meant to be synonymous with the latter, other problems arise. The mens rea of Section 2252 is “knowingly,” yet the assessment of whether an image is child pornography is whether an ordinary person would so think. Thus, defendants who do not know that they possess child pornography, where it is in the form of a virtual image, can be convicted if an ordinary person would believe the image was of an actual child.

In effect, defendants are being convicted on a negligence standard rather than the “knowingly” mens rea stated in the statute and mandated by the Supreme Court.

No, this doesn’t work.

At best, you could get the artist to post the 3D model to show that the image is produced from a 3D model rather than being a direct photograph. Even then, you would have to consider the possibility that a real person could have been digitized into a 3D model.
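
For what that weaker check is worth, it only establishes a verifiable link between a render and a claimed source file. A hypothetical sketch of that link (the record format here is my own invention, not any real accreditation scheme):

```python
# Hypothetical provenance record: the artist publishes hashes of the 3D model
# and the render together, so a viewer can verify the pairing later.
import hashlib
import json

def _sha256(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def publish_provenance(model_path: str, render_path: str, out_path: str) -> None:
    """Write a JSON record tying a render to its claimed source model."""
    record = {
        "model_sha256": _sha256(model_path),
        "render_sha256": _sha256(render_path),
    }
    with open(out_path, "w") as f:
        json.dump(record, f, indent=2)
```

This proves only that a particular render was published alongside a particular model file; it says nothing about whether that model was sculpted from scratch or digitized from a real person, which is exactly the objection above.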

Most art comes from overseas, not the U.S., so you would have to magically apply the regulation at the scale of the internet, including to hobbyists who may not have access to accreditation facilities and may continue distributing their art anyway. Some artists are also very hesitant to reveal their identities for fear of persecution.

Communities are good at weeding out bad content (artists tend to build a reputation of sorts within communities, and word tends to spread when a particular artist is iffy), so if you stop punishing communities for existing (and for trying to stay somewhat legal), you will naturally get safer content and less harmful content around them.

You can then point people to those communities instead of having them wander around aimlessly, right into the illegal content.
