The article suggests labeling fictionally generated pornography as virtual. This could make it easier for people who download and share fictionally generated material to avoid wrongful prosecution if authorities mistake it for illegal imagery, which in countries like the UK and NZ includes CSAM and bestiality pornography.
If this labeling becomes mandated by law, it will probably allow creators to select from a variety of different labels. But how would the labeling work in practice? If someone creates a 211-page book of nothing but CGI porn, does only the back of the book need the label, or every single image? What about those who take specific images out of the book to share? Perhaps they may be required to “photoshop” the label onto each image?
Maybe the requirement should apply only to extremely realistic CGI images that can easily be confused with an illegal image? I don’t see a need for the label when the CGI avatar looks like an adult engaging in non-bestiality sexual activity. But if the avatar, clothing, and scene, taken as a whole, look obviously under 18, or obviously depict fictional bestiality, I think a requirement makes much more sense than the current state of the law.
But a question: what about criminals attempting to hide their activity by incorporating these labels onto ACTUAL CSAM? I know hash lists could rule out any such attempt for known CSAM. But what about unknown material?
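To see why hash lists only catch known material, here is a minimal sketch (all names and placeholder byte strings are hypothetical): a hash list is just a set of cryptographic digests of previously identified files, and a new file is flagged only if its digest is already in the set. A novel image, or even a known one altered by a single byte, produces a different digest and no match.

```python
import hashlib

# Hypothetical hash list: digests of previously identified files.
# The byte strings here are placeholders for real file contents.
known_hashes = {
    hashlib.sha256(b"known-file-bytes").hexdigest(),
}

def is_known(file_bytes: bytes) -> bool:
    """Return True only if the file's SHA-256 digest is on the hash list."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes

# An exact byte-for-byte copy matches; novel or altered content does not.
print(is_known(b"known-file-bytes"))  # True
print(is_known(b"novel-file-bytes"))  # False
```

Real systems use perceptual hashes that tolerate small edits, but the underlying limitation is the same: matching works only against material that has already been catalogued, so a label fraudulently applied to previously unseen material would not be caught this way.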
Producers and distributors of virtual pornography should have no objection to such labeling because it would allow them to exercise their First Amendment rights and protect them from unwarranted prosecution.
The government would benefit from the labeling because it would be easier to prove that defendants knew they possessed actual child pornography if the images did not include a virtual pornography label. Moreover, it would ease the evidentiary burden of establishing that an image was of an actual child. Concomitantly, the record-keeping provision would aid the possessors of alleged child pornography in establishing their affirmative defense that the images they possessed were completely virtual. They can buttress this claim by introducing into evidence the label required by 2257 that states that the images are of adults or are completely computer-generated.
The authors of the article criticize the current standard for what qualifies as “indistinguishable,” which is why they prefer the labeling alternative.
In addition to being overbroad, the new “indistinguishable from” language is unconstitutionally vague. Its definition of “indistinguishable” is linked to an “ordinary” person standard. Who is the “ordinary” person?
Does it differ from a “reasonable” person? If it is meant to be synonymous with the latter, other problems arise. The mens rea of Section 2252 is “knowingly,” yet the assessment of whether an image is child pornography is whether an ordinary person would so think. Thus, defendants who do not know that they possess child pornography, where it is in the form of a virtual image, can be convicted if an ordinary person would believe the image was of an actual child.
In effect, defendants are being convicted on a negligence standard rather than the “knowingly” mens rea stated in the statute and mandated by the Supreme Court.