Generative AI CSAM is not CSAM unless it involves the use of a real minor or their likeness to create explicit sexual content.
The NCMEC has published a troubling blog post discussing the advent of generative AI content. The present post serves to counter the notions it advances and, hopefully, to correct some of the misinformation outlined therein.
I found the blog post while browsing their homepage to access the CyberTipline in order to report an illegal website attempting to share CSAM on social media.
To the article's credit, it mostly confines itself to issues involving cases that have victims, not mere characters or fictional "children", who by definition do not exist and cannot be the victims of child sexual exploitation.
However, that ends the pleasantries.
As if the creation of this imagery wasn't terrible enough, NCMEC also has received reports where bad actors have tried to use this illegal GAI content to extort a child or their family for financial means. Furthermore, users of the technology to create this material have used the argument that, "At least I didn't hurt a real child" and "It's not actually a child…"
With regard exclusively to that last sentence, they have a point, one that the NCMEC has acknowledged and accepted, both internally and publicly, on more than one occasion.
Though I suspect the quotation was intended to be read in line with the preceding statement, as though claiming that such materials were made from innocuous photos, rather than from photos or videos depicting recorded acts of child sex abuse, somehow makes the inclusion of a real child's likeness less impactful or meaningful.
It is important that @prostasia step forward and act as a meaningful voice in this argument, lest we trivialize and undermine the focus on CSAM prevention, with victims becoming less of a focus and more of a talking point used to suppress allegedly problematic content without due cause or evidence.
As taken from Ashcroft v. Free Speech Coalition:
Ferber recognized that "[t]he Miller standard, like all general definitions of what may be banned as obscene, does not reflect the State's particular and more compelling interest in prosecuting those who promote the sexual exploitation of children." 458 U.S., at 761.
The argument that these fictional materials, on their own, constitute criminality or harm is, and has always been, without merit, especially when considering the NCMEC's own internal and public-facing policies and practices in this field and the continued absence of any scientific evidence supporting a causal relationship with harmful acts or behaviors, all while research continues to point to positive effects of these materials in supporting prevention efforts.
I'm horrified that an organization as innovative and useful as the NCMEC can get caught up in rhetoric drummed up by the British IWF, whose legal system, policies, etc. are not up to par with what most experts in CSE, paraphilia research, and prevention think about the issue.
CSAM has never been interpreted by the majority of the world to include content or materials that do not involve actual children. Fictional "children" are not people, and therefore cannot suffer abuse. The NCMEC has always been good about emphasizing this.
I've seen some people try to argue the contrary, that they are "depictions" of child sex abuse, but even this argument fails because it's still not a real child. It goes from "it is a child" to "it looks like a child".
Under this logic, explicit adult content produced with consenting petite or youthful adult actors who have child-like features and could be misinterpreted as being far younger than they actually are could be CSAM, as could a doll, a drawing, or a painting.
Attempts to undermine or dismiss the very real and necessary requirement that an actual victim be involved do nothing to actually help victims and only serve to conflate the reality of their abuse with an arbitrary matter of viewpoint, trivializing the very focus that is intended to benefit them in order to accommodate insecurity and discomfort with the subject matter, while also turning matters intended to deal with abuse into thought-crimes.
GAI CSAM is CSAM. The creation and circulation of GAI CSAM is harmful and illegal. Even the images that do not depict a real child put a strain on law enforcement resources and impede identification of real child victims.
It is only CSAM if it depicts a real child, as the legal definitions set forth by the US Supreme Court, Congress, and the legal systems of multiple US states make clear.
By prohibiting child pornography that does not depict an actual child, the statute goes beyond New York v. Ferber, 458 U.S. 747 (1982), which distinguished child pornography from other sexually explicit speech because of the State's interest in protecting the children exploited by the production process.
In contrast to the speech in Ferber, speech that itself is the record of sexual abuse, the CPPA prohibits speech that records no crime and creates no victims by its production. Virtual child pornography is not "intrinsically related" to the sexual abuse of children, as were the materials in Ferber. 458 U.S., at 759. While the Government asserts that the images can lead to actual instances of child abuse, see infra, at 13–16, the causal link is contingent and indirect. The harm does not necessarily follow from the speech, but depends upon some unquantified potential for subsequent criminal acts.
Moreover, while it is possible that such materials may add to the workload of investigators tracking down and identifying victims, this concern is greatly overstated; no resource I've been able to find has asserted that these tools would severely cripple or permanently impede efforts to do the good work of identifying and rescuing victims. Sorting through materials that may "appear to be" minors engaged in sexual conduct and materials that actually are is part of the job. Users of websites and platforms are only required to report acts or materials which may constitute actual CSAM or CSAM-related activities, in addition to grooming behaviors.
Investigative tools and methods exist to determine whether an image is a photograph of a real-life event, a synthetically created image, or an altered photograph. Moreover, it is very possible that generative AI models can be queried to determine what is within their training data, and even their own outputs can be studied to determine whether they are based on CSAM imagery.
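To give a rough sense of the kind of output screening alluded to above, here is a minimal illustrative sketch in Python, using the publicly available Pillow and ImageHash libraries, of comparing a perceptual hash of a generated image against a list of hashes of known, previously verified imagery. The file name, hash values, and distance threshold are hypothetical placeholders, not any agency's or platform's actual pipeline; real forensic and provenance analysis is considerably more involved than this.

```python
# Illustrative only: the hash list, threshold, and file name below are
# hypothetical placeholders, not a real detection pipeline.
from PIL import Image
import imagehash

# Hypothetical perceptual hashes of known, previously verified imagery
# (in practice such lists come from vetted hash-sharing programs).
KNOWN_HASHES = [imagehash.hex_to_hash("d1d1b1b1e1c18307")]

MAX_DISTANCE = 8  # assumed Hamming-distance threshold for a "near match"

def appears_derived_from_known_image(path: str) -> bool:
    """Return True if the image at `path` is perceptually close to a known hash."""
    candidate = imagehash.phash(Image.open(path))  # 64-bit perceptual hash
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)

if __name__ == "__main__":
    print(appears_derived_from_known_image("generated_output.png"))
```

Perceptual hashes tolerate small edits and re-encodings, which is why comparisons of this sort are typically only a first pass before any human review.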
It is essential that federal and state laws be updated to clarify that GAI CSAM is illegal and children victimized by sexually exploitative and nude images created by GAI technology have civil remedies to protect themselves from further harm. Additionally, legislation and regulation is needed to ensure that GAI technology is not trained on child sexual exploitation content, is taught not to create such content, and that GAI platforms are required to detect, report, and remove attempts to create child sexual exploitation content and held responsible for creation of this content using their tools.
But the law already does precisely this. 18 USC 2256 provides the federal definitions for child pornography/CSAM, which are:
(8) "child pornography" means any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, of sexually explicit conduct, where—
(A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct;
(B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or
(C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.
(9) "identifiable minor"—
(A) means a person—
(i)(I) who was a minor at the time the visual depiction was created, adapted, or modified; or
(II) whose image as a minor was used in creating, adapting, or modifying the visual depiction; and
(ii) who is recognizable as an actual person by the person's face, likeness, or other distinguishing characteristic, such as a unique birthmark or other recognizable feature; and
(B) shall not be construed to require proof of the actual identity of the identifiable minor.
. . .
(11) the term "indistinguishable" used with respect to a depiction, means virtually indistinguishable, in that the depiction is such that an ordinary person viewing the depiction would conclude that the depiction is of an actual minor engaged in sexually explicit conduct. This definition does not apply to depictions that are drawings, cartoons, sculptures, or paintings depicting minors or adults.
This definition is more than sufficient to outline what types of content or conduct are to be defined as CSAM, requiring that such materials be sufficiently realistic and that they depict a person who actually exists.
While it is possible for materials to be generated that depict no real person, yet appear to depict a realistic, non-existent one, the requirement that a real-life person be used remains necessary, and it is something that can be proven in a court of law.
It also outlines exactly how the use of a real child's likeness can be proscribed, which is precisely how various criminal cases (even more recent ones) have fruitfully brought those who sexually exploit real children to justice.
Splicing a real child's face, likeness, or other identifiable characteristics in such a way that the result would appear pornographic would run afoul of these laws and leave those responsible open to prosecution.