Came across this story while perusing ACLU-related news. I wasn't disappointed by what I read, but I was troubled by these legislators' ignorance, mostly regarding the law and the actual science, and by their eagerness to cast aside civil liberties in the belief that they're helping children by wasting resources going after things that objectively are not children.
It goes to show that these legislators know virtually nothing about the actual nature of virtual/simulated content, nor are they fully informed about the actual effects that CSAM viewing has on offending risk.
While the topic is contentious for some researchers and clinicians, the assumption that merely viewing CSAM will 'drive' a pedohebephilic person 'to commit actual crimes,' 'whet their appetites,' or 'normalize their dysfunctions and deviance' is without merit. Pedophilia isn't something that can be changed; it's something that many people have to struggle with and ultimately make peace with if they're to live comfortable, safe, and offense-free lives. Part of that is making peace with their attractions and emotions by expressing them in a way that reconciles them with the reality that they cannot be acted on in any manner that implicates a real minor.
That isn't possible with CSAM, because for it to exist, an actual child had to be involved in its production. The natural remedy is to allow the use of virtual/fictional 'children' by way of 3DCGI, 2D drawings/art, petite/youthful adult actors, dolls, and text-based stories.
Anyway, here’s the story.
It only showcases why groups such as the ACLU actually know what they're doing and deserve more trust, not less. Perhaps these lawmakers will realize this and quietly table their bill before it actually threatens to harm someone who does not deserve it, or at the very least narrow its scope to depictions of actual, specific minors.
Altschiller Calls Out ACLU-NH Defense of Graphic, AI-Generated Child Porn
Posted to Politics April 10, 2024 by Damien Fisher
New Hampshire’s ACLU is siding with the producers of AI-created child sex abuse images over New Hampshire’s kids, critics say, opposing legislation to ban deepfake child porn in New Hampshire.
And at least one Democratic state senator says siding with criminals and against victims is nothing new for the progressive organization.
“It has been my experience in working for laws that protect crime victims the ACLU has not necessarily been a partner in protecting the rights of the people who have been harmed by criminals so much as protecting the rights of the criminals,” said Sen. Deb Altschiller (D-Stratham). “I have yet to have a criminal justice bill that they have embraced.”
Altschiller is the prime sponsor of SB564, which “expands the definition of ‘child’ under the child sexual abuse images statute to include those images that are portrayed to be a person under the age of 18 and are thus indistinguishable from a child.” She testified before the House Criminal Justice and Public Safety Committee on Wednesday, and that’s when she first learned of the ACLU’s opposition to her legislation.
Gilles Bissonnette, ACLU-NH’s Legal Director, did not testify in person. Instead, he submitted a written statement revealing his organization’s position: AI-generated child sex abuse images are protected speech under the First Amendment.
“These images are protected by the First Amendment and Part I, Article 22 insofar as they are neither produced using minors nor do they appear to depict a specific, identifiable person,” Bissonnette wrote.
Altschiller told the committee this expanded definition is needed as the scourge of child sex abuse image trafficking is colliding with the rise of easily available AI programs that can create new, realistic images, sometimes using the images of real children.
“Once something is out there, you can’t unring the bell,” Altschiller said.
New Hampshire State Police Sgt. Hawley Rae also testified on behalf of Altschiller’s legislation, arguing that people who consume child sex abuse images are statistically more likely to engage in abuse IRL (“In Real Life.”)
New Hampshire already has a problem with people trafficking these types of abusive images, and the potential for abusers using deepfake technology to make new abuse images from the photos of real children should be sobering, Rae said.
“Kids are vulnerable, especially in the social media world, and I can only assume this will be a problem in the AI world as well,” Rae said.
Bissonnette’s objection to the bill is founded on prior court rulings that hold child sex abuse images created without using real children are protected. The 2002 United States Supreme Court decision in Ashcroft v. Free Speech Coalition and the 2008 New Hampshire Supreme Court decision in State v. Zidel both found that child sex abuse images that did not depict real children are allowed.
“SB564 presents serious constitutional concerns under Ashcroft and Zidel because it sweeps within its scope images that are not limited to depictions of an ‘identifiable’ (meaning ‘recognizable as an actual, specific person’) minor who was actually victimized,” Bissonnette wrote.
Rep. Terry Roy (R-Deerfield) said neither the Ashcroft nor Zidel courts were dealing with the reality of the new dangers children face today.
“The Ashcroft court didn’t have to contend with the AI technology at all,” Roy said.
Interestingly, the ACLU’s hardline “free speech” absolutism on child porn doesn’t apply to political speech Bissonnette and his organization find objectionable. The ACLU-NH’s policy today is to decline to defend free speech that “denigrates [marginalized] groups” and “impedes progress toward equality.” That includes refusing to defend the free speech rights of allegedly right-wing groups whose “values are contrary to our values” and whose words might offend the “marginalized.”
The ACLU’s guidelines state, “As an organization equally committed to free speech and equality, we should make every effort to consider the consequences of our actions.”
What about the “consequences” of graphic, violent child porn, critics ask.
Given the advances in technology, Rep. David Meuse (D-Portsmouth) said failing to act now could have dire consequences for New Hampshire’s children sooner rather than later.
“I feel that composite images today are so realistic … they’re virtually indistinguishable from an image of a real child. These images just create a market for more images,” Meuse said. “The very fact that a market for this type of material exists, if we continue to allow that market to exist, real children are going to be harmed.”
The committee voted unanimously to approve the bill, moving it closer to a full House vote.
Even WIRED posted a story discussing the potential for AIG-VCP to be useful in preventing CSA, and even in dismantling the market for CSAM.
Could AI-Generated Porn Help Protect Children? | WIRED
Still, satisfying pedophilic urges without involving a real child is obviously an improvement over satisfying them based on a real child's image. While the research is inconclusive, some pedophiles have revealed that they rely on pornography to redirect their urges and find an outlet that does not involve physically harming a child—suggesting that, for those individuals, AI-generated child pornography actually could stem behavior that would hurt a real child.

As a result, some clinicians and researchers have suggested that AI-generated images can be used to rehabilitate certain pedophiles, by allowing them to gain the sexual catharsis they would otherwise get from watching child pornography from generated images instead, or by practicing impulse management on those images so that they can better control their urges. And with more resources for treatment available and less stigma attached to them, more pedophiles might feel prepared to seek help in the first place.
And as for the argument that realistic images allow the market for actual CSAM to thrive: this also isn't true, nor is it a fresh argument. Hell, the Supreme Court rebutted it in Ashcroft itself.
The Government next argues that its objective of eliminating the market for pornography produced using real children necessitates a prohibition on virtual images as well. Virtual images, the Government contends, are indistinguishable from real ones; they are part of the same market and are often exchanged. In this way, it is said, virtual images promote the trafficking in works produced through the exploitation of real children. The hypothesis is somewhat implausible. If virtual images were identical to illegal child pornography, the illegal images would be driven from the market by the indistinguishable substitutes. Few pornographers would risk prosecution by abusing real children if fictional, computerized images would suffice.
Most research by NGOs that have observed this phenomenon concludes that the majority of AIG material is produced explicitly without real CSAM, and that the availability of such material is actually helping people abstain completely from CSAM, with many groups even coming together to create their own models and datasets from which actual CSAM is completely omitted. @Larry even mentioned the result of this, with most outputs appearing to be petite adults with childish faces.
It's almost poetic how many of these communities are already against the use of real children and see AIG content as a way to finally exercise their rights without implicating real abuse or exploitation material. You cannot legislate or imprison this away. There needs to be an alternative to CSAM consumption, and therapy isn't an option that works for everyone, given the stigma that drives people away from receiving it.
We should not be afraid to present and examine these findings in a court-of-law setting should the need arise.