I ran across this article today while reading up on how the AG of South Dakota wants the authority to criminally prosecute website owners who do not enforce unconstitutional age verification checks on protected free speech.
It appears this was signed into law by the governor on 2/13/2024. What started out as another law against deep fakes has become a blanket ban on all AI-created porn that resembles children.
“Wheeler’s bill was amended in the committee to include definitions for computer-generated child pornography, which would allow prosecutors to pursue charges for its possession as they would with real imagery or videos.”
As usual, there is no mention of 3D rendering, which can at times rival the quality of AI-produced imagery. I would not trust this attorney general to make any distinction between the two.
It doesn’t include an ‘obscenity’ requirement for any images, so it likely won’t hold up against images that are not of real minors.
It’ll be shot down by a state court, as will the provisions about child sex dolls, because simply being naked and anatomically correct cannot be the basis for an obscenity prosecution.
It’s also my hope that the obscenity doctrine itself will be overturned.
The more I read about AI images overwhelming people’s abilities to triage and prosecute actual CSAM, the stronger the case grows. Images that do not depict a real minor, and are not intrinsic to the sexual exploitation and victimization of real minors, sit on the same legal footing as adult porn (requiring the materials to be ‘obscene’), and obscenity itself is an inherently vague and unjustifiable legal concept.
Eh… two Justices are already on their way off the bench. Thomas and Alito are very old, and Alito flat-out said that he intends to retire, but only after a Trump victory. I have a feeling that it will happen even if Trump doesn’t win. Health problems, old age, a constant unending string of controversies, etc.
I am not so certain that these AI images that are purportedly indistinguishable from real CSAM actually exist. If so, I have never seen them and I am very active in communities where fictional material and role play are popular. I have seen very good quality AI and 3D renders, some that may make a lot of people do a double take, but the telltale signs of fictional material were still easily apparent within a matter of seconds.
I don’t think there is any more validity to these AI claims than those who claim that sex dolls have a “rape” button. Legislators constantly make fantastical claims about things they know nothing about, and they know most people would be fearful to even investigate such topics.
When? These days, calls of “degenerate” and “immoral” from the right and calls of “problematic” and “objectifying women” (you know, assuming that they don’t know about the male versions, which, given that loli is a more banned term than shota, is probably the case) from the left seem to matter more than actual Constitutional theory.
I think lawmakers just want to score a bill that passes, even if the bill cannot be acted on.
It appears this is viewed as a badge.
I find it difficult to believe that lawmakers are as ignorant about the First Amendment as it appears they are.
I asked an AI. This portion of the response corresponds with what I think.
Symbolic value: Some bills are passed primarily for their symbolic importance, sending a message or taking a stance on an issue even if immediate action isn’t possible.
If one reasons that the better artificial images are, the more they reduce demand for images that aren’t artificial, and that the agenda is to reduce demand for images that aren’t artificial, then proscribing artificial images doesn’t make sense.
Proscribing both destroys any legal incentive to choose one over the other.
It would need either an obscenity requirement, or at least an affirmative defense (like in the federal law) requiring defendants to show the image didn’t involve the use of any actual minors.
It’s likely that they’re trying to provoke a challenge so they can get the issue brought back to SCOTUS, but that’s not something even the most pessimistic side of me would bet on. No, the only thing that would bring this back to SCOTUS is if (when) the whole concept of relying on obscenity as a viable exception to the Freedom of Speech or Due Process is itself challenged.
Even California’s bill (which is apparently slated to become law, since it was not tabled like I had hoped) does not seem to target purely virtual depictions: it still relies on a ‘depiction of’ or a ‘depiction of what appears to be’ a ‘person’, with no clarification that such a ‘person’ need not exist.
They are all gonna be overwhelmed eventually. These laws thus favor big surveillance companies, because they will step in with automation systems that scan everyone’s traffic and destroy anonymity to combat the “extreme rise in CSAM”. The broader the definition, the bigger the data pool, and therefore the more “criminals”.