FBI - AI CSAM is Illegal if it depicts, or is indistinguishable from, an actual minor

I posted about this in the thread Generative AI is NOT CSAM unless it depicts a real minor, but after thinking about it for a little bit, I figured it deserves its own post.

On Friday, March 29, 2024, the FBI, through its Internet Crime Complaint Center (IC3), put out a PSA warning that AI-generated CSAM is illegal, one that may ultimately mislead some people into thinking that all CGI depictions are illegal under child pornography law.

They’re not, and even the PSA’s own contents corroborate this fact. From its use of the phrase ‘content manipulation technology’, to the two cited court cases, to the references to a long-standing federal definitions statute, to a DOJ article clarifying what it all means, everything supports my contention.

https://www.ic3.gov/Media/Y2024/PSA240329

FBI/IC3 Public Service Announcement

Alert Number: I-032924-PSA

March 29, 2024

Child Sexual Abuse Material Created by Generative AI and Similar Online Tools is Illegal


The FBI is warning the public that child sexual abuse material (CSAM) created with content manipulation technologies, to include generative artificial intelligence (AI), is illegal. Federal law prohibits the production, advertisement, transportation, distribution, receipt, sale, access with intent to view, and possession of any CSAM,[1] including realistic computer-generated images.

Background

Individuals have been known to use content manipulation technologies and services to create sexually explicit photos and videos that appear true-to-life. One such technology is generative AI, which can create content — including text, images, audio, or video — with prompts by a user. Generative AI models create responses using sophisticated machine learning algorithms and statistical models that are trained often on open-source information, such as text and images from the internet. Generative AI models learn patterns and relationships from massive amounts of data, which enables them to generate new content that may be similar, but not identical, to the underlying training data. Recent advances in generative AI have led to expansive research and development as well as widespread accessibility, and now even the least technical users can generate realistic artwork, images, and videos — including CSAM — from text prompts.

Examples

Recent cases involving individuals having altered images into CSAM include a child psychiatrist and a convicted sex offender:

  • In November 2023, a child psychiatrist in Charlotte, North Carolina, was sentenced to 40 years in prison, followed by 30 years of supervised release, for sexual exploitation of a minor and using AI to create CSAM images of minors. Regarding the use of AI, the evidence showed the psychiatrist used a web-based AI application to alter images of actual, clothed minors into CSAM.[2]
  • In November 2023, a federal jury convicted a Pittsburgh, Pennsylvania registered sex offender of possessing modified CSAM of child celebrities. The Pittsburgh man possessed pictures that digitally superimposed the faces of child actors onto nude bodies and bodies engaged in sex acts.[3]

There are also incidents of teenagers using AI technology to create CSAM by altering ordinary clothed pictures of their classmates to make them appear nude.

Recommendations

  • For more information on altered images, see the FBI June 2023 PSA titled “Malicious Actors Manipulating Photos and Videos to Create Explicit Content and Sextortion Schemes” at https://www.ic3.gov/Media/Y2023/PSA230605.
  • If you are aware of CSAM production, including AI generated material, please report it to the following:
    1. National Center for Missing and Exploited Children [1-800-THE-LOST or www.cybertipline.org]
    2. FBI Internet Crime Complaint Center [www.ic3.gov]

References


[1] The term “child pornography” is currently used in federal statutes and is defined as any visual depiction of sexually explicit conduct involving a person less than 18 years old. See 18 U.S.C. § 2256(8). While this phrase still appears in federal law, “child sexual abuse material” is preferred, as it better reflects the abuse that is depicted in the images and videos and the resulting trauma to the child.

[2] See https://www.justice.gov/usao-wdnc/pr/charlotte-child-psychiatrist-sentenced-40-years-prison-sexual-exploitation-minor-and

[3] See https://www.justice.gov/opa/pr/registered-sex-offender-convicted-possessing-child-sexual-abuse-material

As stated in the earlier post, I think it’s rather clear from the wording of the FBI post that it is limited to material made using real images of children, or material which is indistinguishable from them.
It does not seem to impact, nor contradict, the legality of CGI materials which depict fictional ‘children’.

It does not apply to content where the minor does not exist and the depiction remains readily distinguishable from a photograph, or modified photograph, of a real minor.

If anyone thinks there is reason to doubt this, I have included a footnote taken from the FBI’s June 2023 PSA (emphasis added):

Generally, synthetic content may be considered protected speech under the First Amendment; however, the FBI may investigate when associated facts and reporting indicate potential violations of federal criminal statutes. Mobile applications, “deepfake-as-a-service,” and other publicly available tools increasingly make it easier for malicious actors to manipulate existing or create new images or videos. These tools, often freely found online, are used to create highly realistic and customizable deepfake content of targeted victims or to target secondary, associated victims.

To dispel any further doubt, I’ll even quote the DOJ citation taken from the PSA itself.

Images of child pornography are not protected under First Amendment rights, and are illegal contraband under federal law. Section 2256 of Title 18, United States Code, defines child pornography as any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age). Visual depictions include photographs, videos, digital or computer generated images indistinguishable from an actual minor, and images created, adapted, or modified, but appear to depict an identifiable, actual minor.

Computer-generated content that is still perfectly distinguishable from that of an actual minor is, and will continue to be, exempt from the language of the federal statute which defines child pornography/CSAM, 18 U.S.C. § 2256(8).


Pinging @elliot @terminus and @Gilian for visibility

It’s just so weird seeing this post come out. I wish it were worded a little better: ‘realistic computer-generated images’ is too broad and, on its own, incorrect. It isn’t enough that an image is ‘realistic’, and even the PSA’s own body and citations make that clear. Maybe their dictionary just didn’t include ‘photorealistic’? Who knows.

(8) “child pornography” means any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, of sexually explicit conduct, where—

(A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct;

(B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or

(C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.

(9) “identifiable minor”—

(A) means a person—

(i)(I) who was a minor at the time the visual depiction was created, adapted, or modified; or (II) whose image as a minor was used in creating, adapting, or modifying the visual depiction; and

(ii) who is recognizable as an actual person by the person’s face, likeness, or other distinguishing characteristic, such as a unique birthmark or other recognizable feature; and

(B) shall not be construed to require proof of the actual identity of the identifiable minor.

. . .

(11) the term “indistinguishable” used with respect to a depiction, means virtually indistinguishable, in that the depiction is such that an ordinary person viewing the depiction would conclude that the depiction is of an actual minor engaged in sexually explicit conduct. This definition does not apply to depictions that are drawings, cartoons, sculptures, or paintings depicting minors or adults.

‘realistic computer-generated images’

The Tech Coalition uses a similar style: Tech Coalition | The Issue

European tech pundits use a similar style as well: “Reaktion auf Deepfakes: Das will die neue EU-Richtlinie gegen sexuellen Missbrauch” (“Responding to deepfakes: what the new EU directive against sexual abuse aims to do”)

This phrase makes it clear that if AI-generated content can’t be determined to be fake, it’s a crime. Once AI reaches the point of photorealism, you won’t be able to tell it apart from an actual photo.

On a separate but related thought: I have noticed AI ‘children’ look more like small, flat-chested adults than children, except for the faces. If my understanding of AI is correct, training a model to produce a believable child body would require real nude photographs of children. Where are these to come from, since CSAM is not allowed? I don’t think there are that many available.

Well, courts, and even the citations listed by the PSA, have interpreted that to literally mean an actual, real person.
It can be a realistic depiction of a child that doesn’t actually exist and still be legal; that’s legally no different from a petite, youthful adult playing a character who is a minor.

I really wish the FBI had opted for more specific language here, since they’re technically correct, but most people who aren’t versed in the complexities of this won’t realize it.
Actually reading the PSA tells you that they mean altered or manipulated photos of real people, but the headline itself is clickbaity and borderline misleading.

I don’t think it will ever reach the point where an experienced individual or tool cannot detect whether something is AI-generated. There are too many near-systematic flaws in AI models’ output, even the latest ones.
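
For what it’s worth, one of those near-systematic flaws that researchers have documented is the unusual high-frequency ‘fingerprint’ that generative upsampling layers tend to leave in an image’s frequency spectrum. Here is a toy sketch of that idea in Python, assuming numpy and Pillow are installed; the file name is hypothetical, and real detectors are trained classifiers over many such signals, not a single heuristic like this:

# Toy illustration: measure how much of an image's spectral energy sits in
# the outermost frequency band. Generative upsampling often leaves anomalous,
# regular energy there, whereas a photograph's spectrum typically decays
# more smoothly. A teaching sketch, not a reliable detector.
import numpy as np
from PIL import Image

def high_freq_energy_ratio(path: str) -> float:
    """Fraction of total spectral energy in the outer 20% of frequencies."""
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    yy, xx = np.ogrid[:h, :w]
    radius = np.hypot(yy - cy, xx - cx)   # distance from spectrum center
    outer = radius > 0.8 * min(cy, cx)    # outermost frequency band
    return float(spectrum[outer].sum() / spectrum.sum())

# Hypothetical usage: compare the ratio across known-real and known-synthetic
# samples; a suspiciously strong or spiky outer band is one (weak) signal.
# print(high_freq_energy_ratio("sample.png"))

In practice any single signal like this can be defeated with post-processing, which is exactly why detection keeps turning into an arms race.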

I’ve seen AI content that looks true-to-life but is still readily distinguishable after you look at it for long enough. I think the main concern here, with SEM involving ‘minors’, is whether any specific child or likeness was used, or whether it was based off actual CSAM.


That’s very intriguing. That Stanford paper from December 2023 really scared a lot of people, but the reality is that most models now, at least the ones being used by FSM/VCP communities, have been audited and re-trained so as not to include CSEM, or any images of actual minors, with keywords and metadata for 3DCG being used instead.

There’s been a real push by good-faith actors to do right by both the law and a strong adherence to ethics, ensuring that no model is trained on real CSAM. I’m going to do whatever I can to ensure that their work does not go unnoticed in both the legal world and academia.
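
To make the auditing side of that concrete, here is a toy sketch of keyword/metadata filtering over a training-set manifest. The JSONL format, field names, and blocklist tags here are all hypothetical; real cleanup efforts typically pair this kind of tag filtering with hash-list matching and human review rather than relying on tags alone.

# Toy sketch: drop any training record whose metadata tags hit a blocklist
# before the record can ever reach a training run. Manifest format is
# hypothetical: one JSON object per line, each with a "tags" list.
import json

BLOCKLIST = {"banned_tag_example_1", "banned_tag_example_2"}  # placeholders

def audit_manifest(in_path: str, out_path: str) -> int:
    """Copy clean records to out_path; return how many records were dropped."""
    dropped = 0
    with open(in_path) as src, open(out_path, "w") as dst:
        for line in src:
            record = json.loads(line)
            if BLOCKLIST & set(record.get("tags", [])):
                dropped += 1                  # excluded from the new manifest
                continue
            dst.write(json.dumps(record) + "\n")
    return dropped

# Hypothetical usage:
# removed = audit_manifest("dataset.jsonl", "dataset.audited.jsonl")
# print(f"dropped {removed} records")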

It is possible to produce an identifiable likeness without any photography. Paintings, sketches, CGI, etc. have produced images of recognizable people. Since AI is supposed to provide near realistic results, based on an aggregation of real people, and identifying a specific person is not required, what’s the difference? And given the attitudes of the many pearl-clutching, do-gooding, soft-headed, asinine little old ladies, of all ages and genders, will that really matter?

That section was added to address the possibility of people superimposing or ‘splicing’ the faces or identifiable likenesses of actual minors into sexually explicit situations, basically ‘deepfakes’. It is that section, in conjunction with the ‘indistinguishable’ bit, that makes it possible to prosecute such computer-generated materials which implicate real minors.

It doesn’t really make sense that it wouldn’t require proof of the actual identity, since it IS their identity.

who is recognizable as an actual person by the person’s face, likeness, or other distinguishing characteristic, such as a unique birthmark or other recognizable feature

Moral busybodying can be disputed once called out and addressed with reason.

For AI art, I use cartoon screenshots as image-to-image prompts to generate the art, while also using newer models which have had illegal content removed. Image-to-image doesn’t necessarily produce the same person as the one in the source image, though.

You could steal the word “pseudo-photograph” from the British. Doesn’t that cover what you want criminalized? There are obscenity laws there and a law which criminalizes “photographs” and “pseudo-photographs”.

As an example of one you may be missing, the Obscene Publications Act is a mostly unenforced obscenity law from the 1950s which a politician fantasizes about reviving.

I’d blame the pundits who keep saying they’re “not sure” if it is illegal.

Deepfakes and altered images of actual minors are illegal, which is pretty much what the body of the notice alludes to. Purely fictional rendered images of children that do not exist are on the same level of legality as adult pornography.

I’ve seen some people try to use the ambiguity of the word “indistinguishable” to imply that it doesn’t have to be an image of a real minor, just indistinguishable from one, but that interpretation has not been upheld by literally any court, even as recently as last year.

https://www.criminallegalnews.org/news/2023/jun/15/arkansas-supreme-court-reverses-11-counts-possession-child-pornography-because-cgi-images-do-not-depict-image-child/

For purposes of the issues in the instant opinion, a “person” is “[a] natural person.” § 5-1-102(13)(A)(i) (Repl. 2013). The Court explained: “Thus, although § 5-27-602(a)(2) includes possession of CGI, the criminal act is limited to possession of imagery depicting or incorporating the image of a child – a natural person under seventeen years of age – engaging in sexually explicit conduct. Section 5-27-602(a)(2) necessarily excludes CGI that does not depict or incorporate the image of a child under the statutory definition.”

The Court’s reading of § 5-27-602(a)(2) is consistent with Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002), in which the U.S. Supreme Court “declared unconstitutional as violative of the First Amendment § 2256(8)(B) of the Child Pornography Prevention Act of 1996 (“CPPA”), which prohibited ‘any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture that is, or appears to be, of a minor engaging in sexually explicit conduct.’”

While New York v. Ferber, 458 U.S. 747 (1982), held that child pornography is deemed speech outside the protection of the First Amendment, it is based on the fact that child pornography is “intrinsically linked” to the sexual abuse of children, i.e., production of child pornography could be accomplished only by recording the crime of sexually abusing children. But the CPPA’s proscription of computer-generated imagery or “virtual imagery” went too far because it recorded no crime, and it created no victims by its production.

In the present case, the Court agreed with Lewis that the State failed to present evidence that the images identified as CGI by the State’s expert in Counts 1, 15, 16, and 23-30 depicted or incorporated the image of a child. The Court conducted its own independent examination of the image in Count 1 and concluded it did not depict or incorporate the image of a child. The Court concluded “[t]his constitutes a failure of proof sufficient to sustain Lewis’s convictions.”

Accordingly, the Court reversed and dismissed on Counts 1, 15, 16, and 22-30. See: Lewis v. State, 2023 Ark. 12 (Ark. 2023).

You could argue that this is merely one state’s jurisprudence, but it’s certainly not the only one, nor is it one that exists in isolation.


It’s one of David’s papers. He is notorious for stirring up bullshit panics. He speaks to no one relevant and lectures people from the mountaintop despite knowing fuck all. His primary source for whether those links were CSEM appears to be C3P. By the way, is it ethical to ‘gotcha’ a provider with a bullshit scandal, rather than quietly informing them months prior? It reeks.

It needs to be documented somewhere easy to cite.

You… speak as though you know him personally, or are at least familiar with his work. I’d like to initiate a correspondence with him so he doesn’t misuse terminology (like equating fiction to CSAM) or allow his own views to poison the credibility of his work.

Indeed.

I should watch my words more. The censorship might be making me a bit irritable.

I don’t know. I don’t think he is reasonable. You wouldn’t be the first, although you are more sophisticated than a few. He is very opinionated and thinks people should do things his way. That is what is so frustrating about him. He conflates one kind of content with another, and one site with many others. It’s so frustrating.

Another thing is that Jeremy didn’t leave you with the best legacy. You’re far better than he was though, even if imperfect. At the same time, this hell can’t be allowed to continue.

I just want to see people think critically and act accordingly.

Conflating CSAM with materials that do not even involve or depict an actual ‘child’ simply does not meet that criterion; otherwise, anything that “looks like” it could be considered abuse material, including images of adults, drawings, etc.

Fictional drawings, paintings, and CGI are no more CSAM than a picture of an adult who looks young. Adults are not children and neither are the ink, paint, and vertices which make up these fictional characters.

There must be an intrinsic victim for it to be considered CSAM.


…and it’s genuinely relieving to see the FBI echo the sentiment that these types of materials still require an actual victim in order to be prosecuted.
