I posted about this in the thread "Generative AI is NOT CSAM unless it depicts a real minor", but after thinking about it for a little bit, I figured it deserves its own post.
On Friday, March 29, 2024, the FBI's Internet Crime Complaint Center (IC3) put out a PSA warning that AI-generated CSAM is illegal, worded in a way that may mislead some people into thinking that all CGI depictions are illegal under child pornography law.
They're not, and the PSA's own contents corroborate this fact. Everything from its use of the phrase 'content manipulation technology', to the two cited court cases, to the reference to a long-standing federal definitions statute, to the DOJ guide clarifying what it all means, supports my contention.
https://www.ic3.gov/Media/Y2024/PSA240329
Alert Number: I-032924-PSA
March 29, 2024
Child Sexual Abuse Material Created by Generative AI and Similar Online Tools is Illegal
The FBI is warning the public that child sexual abuse material (CSAM) created with content manipulation technologies, to include generative artificial intelligence (AI), is illegal. Federal law prohibits the production, advertisement, transportation, distribution, receipt, sale, access with intent to view, and possession of any CSAM,[1] including realistic computer-generated images.
Background
Individuals have been known to use content manipulation technologies and services to create sexually explicit photos and videos that appear true-to-life. One such technology is generative AI, which can create content — including text, images, audio, or video — with prompts by a user. Generative AI models create responses using sophisticated machine learning algorithms and statistical models that are often trained on open-source information, such as text and images from the internet. Generative AI models learn patterns and relationships from massive amounts of data, which enables them to generate new content that may be similar, but not identical, to the underlying training data. Recent advances in generative AI have led to expansive research and development as well as widespread accessibility, and now even the least technical users can generate realistic artwork, images, and videos — including CSAM — from text prompts.
Examples
Recent cases of individuals altering images into CSAM include a child psychiatrist and a convicted sex offender:
- In November 2023, a child psychiatrist in Charlotte, North Carolina, was sentenced to 40 years in prison, followed by 30 years of supervised release, for sexual exploitation of a minor and using AI to create CSAM images of minors. Regarding the use of AI, the evidence showed the psychiatrist used a web-based AI application to alter images of actual, clothed minors into CSAM.[2]
- In November 2023, a federal jury convicted a Pittsburgh, Pennsylvania registered sex offender of possessing modified CSAM of child celebrities. The Pittsburgh man possessed pictures that digitally superimposed the faces of child actors onto nude bodies and bodies engaged in sex acts.[3]
There are also incidents of teenagers using AI technology to create CSAM by altering ordinary clothed pictures of their classmates to make them appear nude.
Recommendations
- For more information on altered images, see the FBI June 2023 PSA titled “Malicious Actors Manipulating Photos and Videos to Create Explicit Content and Sextortion Schemes” at https://www.ic3.gov/Media/Y2023/PSA230605.
- If you are aware of CSAM production, including AI-generated material, please report it to the following:
- National Center for Missing and Exploited Children [1-800-THE-LOST or www.cybertipline.org]
- FBI Internet Crime Complaint Center [www.ic3.gov]
References
- Website | Government Accountability Office | “SCIENCE & TECH SPOTLIGHT: GENERATIVE AI” | June 2023 | Accessed 26 December 2023 | URL: https://www.gao.gov/assets/830/826491.pdf
- Website | Department of Justice | “Citizen's Guide to U.S. Federal Law on Child Pornography” | August 2023 | Accessed 26 December 2023 | URL: https://www.justice.gov/criminal/criminal-ceos/citizens-guide-us-federal-law-child-pornography
- Website | Stanford Internet Observatory | “Identifying and Eliminating CSAM in Generative ML Training Data and Models” | 21 December 2023 | Accessed 26 December 2023 | URL: https://stacks.stanford.edu/file/druid:kh752sm9123/ml_training_data_csam_report-2023-12-21.pdf
[1] The term “child pornography” is currently used in federal statutes and is defined as any visual depiction of sexually explicit conduct involving a person less than 18 years old. See 18 U.S.C. § 2256(8). While this phrase still appears in federal law, “child sexual abuse material” is preferred, as it better reflects the abuse that is depicted in the images and videos and the resulting trauma to the child.
[2] See https://www.justice.gov/usao-wdnc/pr/charlotte-child-psychiatrist-sentenced-40-years-prison-sexual-exploitation-minor-and
[3] See https://www.justice.gov/opa/pr/registered-sex-offender-convicted-possessing-child-sexual-abuse-material
As stated in the earlier post, I think it's rather clear from the wording of the FBI post that it is limited to material made using real images of children, or material indistinguishable from them.
This does not seem to impact, nor contradict, the legality of CGI material depicting fictional 'children': content in which the minor does not exist and which remains readily distinguishable from a photograph, or a modified photograph, of a real minor.
If anyone thinks there is reason to doubt this, I have included a footnote taken from the FBI's June 2023 PSA; note especially its first sentence.
Generally, synthetic content may be considered protected speech under the First Amendment; however, the FBI may investigate when associated facts and reporting indicate potential violations of federal criminal statutes. Mobile applications, “deepfake-as-a-service,” and other publicly available tools increasingly make it easier for malicious actors to manipulate existing or create new images or videos. These tools, often freely found online, are used to create highly realistic and customizable deepfake content of targeted victims or to target secondary, associated victims.
To assuage any further doubt, I'll even quote the DOJ guide cited in the PSA itself.
Images of child pornography are not protected under First Amendment rights, and are illegal contraband under federal law. Section 2256 of Title 18, United States Code, defines child pornography as any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age). Visual depictions include photographs, videos, digital or computer generated images indistinguishable from an actual minor, and images created, adapted, or modified, but appear to depict an identifiable, actual minor.
Computer-generated content that remains readily distinguishable from a depiction of an actual minor is, and will continue to be, exempt from the language of the federal statute defining child pornography/CSAM, 18 U.S.C. § 2256(8).