NH Legislators Ignorant of the Law - Schooled by ACLU Chapter

I came across this story while perusing ACLU-related news. I wasn't disappointed by what I read, but I also wasn't thrilled by these legislators' ignorance of the law and the relevant science, or by their eagerness to cast aside civil liberties in the belief that they're helping children by wasting resources going after things that objectively are not children.

It goes to show that these legislators know virtually nothing about the nature of virtual/simulated content, nor are they fully informed about the actual effects that CSAM viewing has on offending risk.

While the question remains contentious among some researchers and clinicians, the assumption that merely viewing CSAM will 'drive' a pedohebephilic person to commit actual crimes, 'whet their appetites,' or 'normalize their dysfunctions and deviance' is without merit. Pedophilia isn't something that can be changed; it's something that many people have to struggle with and ultimately make peace with if they're to live comfortable, safe, and offense-free lives. Part of that is reconciling their attractions and emotions with the reality that they can never be acted on in a way that implicates a real minor.
That isn't possible with CSAM, because for it to exist, an actual child had to be involved in its production. The natural remedy is to allow the use of virtual/fictional 'children' by way of 3DCGI, 2D drawings/art, petite/youthful adult actors, dolls, and text-based stories.

Anyway, here's the story.
It only showcases why groups such as the ACLU know what they're doing and deserve more trust. Perhaps these lawmakers will realize this and quietly table their bill before it threatens to harm someone who does not deserve it, or at the very least narrow its scope to depictions of actual, specific minors.

Altschiller Calls Out ACLU-NH Defense of Graphic, AI-Generated Child Porn

Posted to Politics April 10, 2024 by Damien Fisher

New Hampshire’s ACLU is siding with the producers of AI-created child sex abuse images over New Hampshire’s kids, critics say, opposing legislation to ban deepfake child porn in New Hampshire.

And at least one Democratic state senator says siding with criminals and against victims is nothing new for the progressive organization.

“It has been my experience in working for laws that protect crime victims the ACLU has not necessarily been a partner in protecting the rights of the people who have been harmed by criminals so much as protecting the rights of the criminals,” said Sen. Deb Altschiller (D-Stratham). “I have yet to have a criminal justice bill that they have embraced.”

Altschiller is the prime sponsor of SB564, which “expands the definition of ‘child’ under the child sexual abuse images statute to include those images that are portrayed to be a person under the age of 18 and are thus indistinguishable from a child.” She testified before the House Criminal Justice and Public Safety Committee on Wednesday, and that’s when she first learned of the ACLU’s opposition to her legislation.

Gilles Bissonnette, ACLU-NH’s Legal Director, did not testify in person. Instead, he submitted a written statement revealing his organization’s position: AI-generated child sex abuse images are protected speech under the First Amendment.

“These images are protected by the First Amendment and Part I, Article 22 insofar as they are neither produced using minors nor do they appear to depict a specific, identifiable person,” Bissonnette wrote.

Altschiller told the committee this expanded definition is needed as the scourge of child sex abuse image trafficking is colliding with the rise of easily available AI programs that can create new, realistic images, sometimes using the images of real children.

“Once something is out there, you can’t unring the bell,” Altschiller said.

New Hampshire State Police Sgt. Hawley Rae also testified on behalf of Altschiller’s legislation, arguing that people who consume child sex abuse images are statistically more likely to engage in abuse IRL (“In Real Life.”)

New Hampshire already has a problem with people trafficking these types of abusive images, and the potential for abusers using deepfake technology to make new abuse images from the photos of real children should be sobering, Rae said.

“Kids are vulnerable, especially in the social media world, and I can only assume this will be a problem in the AI world as well,” Rae said.

Bissonnette’s objection to the bill is founded on prior court rulings that hold child sex abuse images created without using real children are protected. The 2002 United States Supreme Court decision in Ashcroft v. Free Speech Coalition and the 2008 New Hampshire Supreme Court decision in State v. Zidel both found that child sex abuse images that did not depict real children are allowed.

“SB564 presents serious constitutional concerns under Ashcroft and Zidel because it sweeps within its scope images that are not limited to depictions of an ‘identifiable’ (meaning ‘recognizable as an actual, specific person’) minor who was actually victimized,” Bissonnette wrote.

Rep. Terry Roy (R-Deerfield) said neither the Ashcroft nor Zidel courts were dealing with the reality of the new dangers children face today.

“The Ashcroft court didn’t have to contend with the AI technology at all,” Roy said.

Interestingly, the ACLU’s hardline “free speech” absolutism on child porn doesn’t apply to political speech Bissonnette and his organization find objectionable. The ACLU-NH’s policy today is to decline to defend free speech that “denigrates [marginalized] groups” and “impedes progress toward equality.” That includes refusing to defend the free speech rights of allegedly right-wing groups whose “values are contrary to our values” and whose words might offend the “marginalized.”

The ACLU’s guidelines state, “As an organization equally committed to free speech and equality, we should make every effort to consider the consequences of our actions.”

What about the “consequences” of graphic, violent child porn, critics ask.

Given the advances in technology, Rep. David Meuse (D-Portsmouth) said failing to act now could have dire consequences for New Hampshire’s children sooner rather than later.

“I feel that composite images today are so realistic … they’re virtually indistinguishable from an image of a real child. These images just create a market for more images,” Meuse said. “The very fact that a market for this type of material exists, if we continue to allow that market to exist, real children are going to be harmed.”

The committee voted unanimously to approve the bill, moving it closer to a full House vote.

Even WIRED published a story discussing the potential for AIG-VCP to be useful in preventing CSA, and even in dismantling the market for CSAM.

Could AI-Generated Porn Help Protect Children? | WIRED
Still, satisfying pedophilic urges without involving a real child is obviously an improvement over satisfying them based on a real child’s image. While the research is inconclusive, some pedophiles have revealed that they rely on pornography to redirect their urges and find an outlet that does not involve physically harming a child—suggesting that, for those individuals, AI-generated child pornography actually could stem behavior that would hurt a real child.

As a result, some clinicians and researchers have suggested that AI-generated images can be used to rehabilitate certain pedophiles, by allowing them to gain the sexual catharsis they would otherwise get from watching child pornography from generated images instead, or by practicing impulse management on those images so that they can better control their urges. And with more resources for treatment available and less stigma attached to them, more pedophiles might feel prepared to seek help in the first place.

As for the argument that realistic images allow the market for actual CSAM to thrive: this also isn't true, nor is it a fresh argument. Hell, SCOTUS rebutted it in Ashcroft.

The Government next argues that its objective of eliminating the market for pornography produced using real children necessitates a prohibition on virtual images as well. Virtual images, the Government contends, are indistinguishable from real ones; they are part of the same market and are often exchanged. In this way, it is said, virtual images promote the trafficking in works produced through the exploitation of real children. The hypothesis is somewhat implausible. If virtual images were identical to illegal child pornography, the illegal images would be driven from the market by the indistinguishable substitutes. Few pornographers would risk prosecution by abusing real children if fictional, computerized images would suffice.

Most research by NGOs who have observed this phenomenon concludes that the majority of AIG material is produced explicitly without real CSAM, and that its availability is actually helping people abstain from CSAM entirely, with many groups even coming together to create their own models and datasets from which actual CSAM is completely omitted. @Larry even mentioned the result of this, with most outputs appearing to be petite adults with childish faces.
It's almost poetic how many of these communities are already against the use of real children and see AIG content as a way to finally exercise their rights without implicating real abuse or exploitation material. You cannot legislate or imprison this away. There needs to be an alternative to CSAM consumption, and therapy isn't an option that works for everyone, given the stigma that drives people away from seeking it.

We should not be afraid to actually examine these findings in a court-of-law setting should the need arise.

@elliot @Gilian for vis.

4 Likes

Bernstein v. United States - Wikipedia

Code is also protected by the First Amendment.

1 Like

I should also add that the writer Mr. Damien Fisher's biases are as clear as day in this article.

I myself and others owe a great debt to the ACLU. The way Mr. Fisher attempts to character-assassinate the organization in such a crass, uninformed, and baseless manner shows how little integrity these types have. They prefer to boil things down into simple "this vs. that," "us vs. them," but it's not that simple. The ACLU has represented the interests of those types of groups before, and will continue to do so when their liberties are actually threatened. It's just that they rarely are threatened in a way that would be consequentially actionable. One wouldn't even need to dabble much in free-speech or First Amendment law to craft a rock-solid defense for the speech of neo-Nazis and white nationalists, because ultimately that speech is political and non-violent. That's all there is to say about them, and none of it can reasonably be read as condoning their beliefs or actions.

However, trying to conflate the definition of 'child' with images that are not actually of real children, on the basis of appearance or concept alone, is something that is consequentially actionable. There is so much wrong with the NH Legislature's defense of this, from their belief that these materials increase risk in predisposed offenders on down; none of it is based in science or fact.

1 Like

This!

This idea is so obvious that many people arrive at it without needing a reference.

That lawmaker's conflation rests on unexplained leaps.

I prefer the term "imaginary character" to "non-existent beings." I read a decision in which a judge mulled over what a non-existent being is; the judge insisted that there's no such thing as a non-existent person. If I find that ruling again, I'll post a link.

3 Likes

Do you have a source that isn't written by a far-right grifter? It's hard to know what's going on here.

Depicting someone as a normal person might help. No, really, the discourse makes it sound a lot like it's someone who has crawled out of Freakshow Alley, rather than poor Bill, who is just like you and has had his rights trampled on.

It could be worse. It could be the term "virtual"; "virtual" sounds like "online," or like jargon.

This reminds me of a British journalist citing a cop as their "expert." Cops don't know shit about psychology. They often fall for pseudo-scientific nonsense that justifies their biases, and it's not unheard of for them to make things up. They only know how to arrest people and are known for civil rights violations. The IWF has a former cop as a spokesman too.

He sounds like a far-right grifter.

It wouldn't do shit for that. The only thing it does is undermine the rule of law and possibly waste taxpayers' money on a harmful "War on Drugs"-style prohibition. Even if the images were "indistinguishable" (not merely "realistic"), it would still be harmful to put that into a statute as a broad content-based restriction, as opposed to dealing with someone's specific conduct.

It's not even like that. Tools and methods for determining whether something is AIG (aside from simply looking at it) have existed for almost two decades, perhaps since before computer-aided image generation became popular.

In this webinar we will unveil a cutting-edge tool designed to empower investigators to distinguish between authentic and AI-generated material.

The mere existence and popularization of these tools is a strong argument against the type of regime the NH legislators are proposing.
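For a sense of the kind of signal such forensic tools can key on, here is a minimal, purely illustrative sketch. It assumes one commonly reported observation, that many generative models leave atypical energy patterns in an image's frequency spectrum; everything concrete in it (the file name, the band sizes, the 0.35 cutoff) is made up for illustration, and real detectors rely on trained classifiers and calibrated baselines rather than a single hand-picked threshold.

```python
# Toy illustration only -- not any specific investigative tool.
# Many generative models reportedly leave atypical energy distributions in an
# image's Fourier spectrum; this measures one crude proxy for that.
import numpy as np
from PIL import Image

def high_frequency_energy_ratio(path: str) -> float:
    """Fraction of spectral magnitude falling outside the central low-frequency band."""
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    # Treat the central quarter of the shifted spectrum as "low frequency".
    low = spectrum[cy - h // 8 : cy + h // 8, cx - w // 8 : cx + w // 8].sum()
    return 1.0 - low / spectrum.sum()

if __name__ == "__main__":
    ratio = high_frequency_energy_ratio("sample.jpg")  # hypothetical input file
    print(f"High-frequency energy ratio: {ratio:.3f}")
    # A real forensic pipeline compares against calibrated baselines and
    # trained models, not a single hand-picked threshold like this one.
    if ratio > 0.35:
        print("Spectrum looks atypical; flag for closer review.")
```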

1 Like

I also wanted to address this.

Yes, they did.

One of the primary concerns of the Congress that drafted the CPPA was the possibility of people using computer technology, which was described as 'prohibitively expensive' at the time but was neither inaccessible nor non-existent. Tools like Photoshop and GIMP enabled professionals to create modified composites, for example real minors' faces applied to the modified bodies of petite adults, through a process commonly referred to as 'photobashing'.

The main concern wasn't just 3D rendering technology; it was computer-aided graphics in general. It wasn't a non-existent problem, just one that wasn't yet as widespread.

In fact, the defining criterion of CSAM is that it can be proven to be a photograph of a real minor.
I've not yet encountered an image that appeared so realistic it couldn't be distinguished from a real photo, only ones that were trained on real photos of something.

So to act as though there has been a substantial change that necessitates revisiting or invalidating the issue is beyond unreasonable.

3 Likes

The Ashcroft court DID address the issue of indistinguishability. The people who wrote the article in your OP either did not read Ashcroft, or they read it and are purposefully misleading. Ashcroft reasoned that, even if such indistinguishable images could exist, prohibiting fictional content merely because criminals might claim their images were virtual when they weren't would turn the First Amendment on its head. You can't ban something just because it could be criminal activity disguised as protected activity. The court was essentially saying that the government shouldn't be lazy and trample the rights of others in the process; it should do its job of proving that the images were real. Justice Thomas, in his concurring opinion, did suggest that an affirmative defense could be added, under which images would be protected if it were proven that no real minors were involved, which is roughly what we have currently.
So while Ashcroft did not explicitly mention AI, it did address the issue of indistinguishability in general, citing the computer technology of the time. Claiming that the court did not address AI and that the issue therefore needs revisiting is misleading and dishonest.

1 Like

This is true, but Ashcroft only protected depictions that were indistinguishable from minors who did not exist. Depictions that are indistinguishable from a real minor implicate the likeness of a real child.

Rather than creating original images, pornographers can alter innocent pictures of real children so that the children appear to be engaged in sexual activity. Although morphed images may fall within the definition of virtual child pornography, they implicate the interests of real children and are in that sense closer to the images in Ferber. Respondents do not challenge this provision, and we do not consider it.

2 Likes

I am confused. Ashcroft did not proscribe images that were not of real minors, regardless of how close to reality they were; it left only the morphed images proscribed.

I have a question. A depiction can be banned if it looks like some real person, but identifying an actual person is not required. Caricatures, however, can be identified as a particular person even if they are not at all realistic. So if a CGI render, AI output, or painting is accurate enough to look real, how can you tell whether it depicts a "real" person or a made-up one, since you don't have to name said person?

It actually is required to identify the person, since 2256 requires that it depict an ‘identifiable minor’.

(9) “identifiable minor”—

(A) means a person—

(i)(I) who was a minor at the time the visual depiction was created, adapted, or modified; or

(II) whose image as a minor was used in creating, adapting, or modifying the visual depiction; and

(ii) who is recognizable as an actual person by the person’s face, likeness, or other distinguishing characteristic, such as a unique birthmark or other recognizable feature; and

(B) shall not be construed to require proof of the actual identity of the identifiable minor.

“Shall not be construed to require proof of the actual identity of the identifiable minor” admittedly doesn’t make much sense, since those are factual requirements that must be proven, and are provable. All it really requires is that there be a deliberate resemblance to a real, existing child.

All it’s saying is that they don’t have to prove who it is (i.e., provide a name, DOB, etc.), only that it looks like someone.

Call me cynical, but I don’t think that a zealous prosecutor, with the help of a compliant judge, would let that stop them.