NCMEC pDNA Hash DB audit - 'cartoon/anime' content found, but did not meet federal definition of child pornography

The NCMEC recently published its statistics and figures for 2023, and something that caught my eye was this blurb:

In 2023, NCMEC engaged Concentrix, a customer experience solutions and technology company, to conduct an independent audit of the hashes on our list. They were tasked with verifying that all the unique hashes corresponded to images and videos that were CSAM and met the U.S. federal legal definition of child pornography. The audit, the first of its kind for any hash list, found that 99.99% of the images and videos reviewed were verified as containing CSAM.
Learn more about the audit process and their findings here.

I checked out the PDF, and this part stood out:

KEY FINDINGS
David J. Slavinsky, Concentrix’ primary site director during the audit, made the following conclusions upon completion of the Concentrix audit:

  1. Concentrix moderators completed two independent reviews of each of the 538,922 images and videos on NCMEC’s CSAM Hash List.
  2. Concentrix’s audit concluded that 99.99% of the images and videos met the federal definition of Child Pornography under 18 U.S.C. § 2256(8).
  3. Concentrix moderators identified 59 exceptions during their audit as follows:
    a. 50 images and/or videos were deemed not to meet the federal definition of Child Pornography.
    b. 5 images and/or videos were classified as a cartoon/anime depicting sexually exploitative content involving a child, but not meeting the federal definition of Child Pornography.
    c. 2 videos were damaged and could not be played.
    d. 2 images and/or videos contained sexual content, but the Concentrix moderators could not distinguish whether a child or an adult was depicted in the imagery.
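
To see where the 99.99% headline figure comes from, here is the arithmetic on the counts quoted above (a trivial Python check; the only inputs are the category totals from the KEY FINDINGS, nothing else is assumed):

```python
# Counts quoted from the Concentrix KEY FINDINGS above.
total = 538_922  # images and videos on NCMEC's CSAM Hash List
exceptions = {
    "did not meet 2256(8)": 50,
    "cartoon/anime": 5,
    "damaged/unplayable": 2,
    "age indeterminable": 2,
}

flagged = sum(exceptions.values())      # 59 exceptions
verified = total - flagged              # 538,863 verified items

print(f"verified share:      {verified / total:.4%}")                      # ~99.9891%
print(f"cartoon/anime share: {exceptions['cartoon/anime'] / total:.6%}")   # ~0.000928%
```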

This basically means that, of the 538,922 images and videos whose hashes are included, only 5 were cartoon/anime. The audit specifically checked whether each item fell under 18 USC 2256(8), which cannot be read to include cartoons/anime within its definition:

(8) “child pornography” means any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, of sexually explicit conduct, where—
(A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct;
(B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or
(C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.
. . .
(11) the term “indistinguishable” used with respect to a depiction, means virtually indistinguishable, in that the depiction is such that an ordinary person viewing the depiction would conclude that the depiction is of an actual minor engaged in sexually explicit conduct. This definition does not apply to depictions that are drawings, cartoons, sculptures, or paintings depicting minors or adults.

This also aligns with a December 2023 Stanford study that examined the datasets used to train popular image-generation AI models for the presence of CSAM:

Comprehensiveness of PhotoDNA: PhotoDNA is naturally limited to only known instances of CSAM, and has certain areas upon which it performs poorly. For example, based on keywords, significant amounts of illustrated cartoons depicting CSAM appear to be present in the dataset, but none of these resulted in PhotoDNA matches.
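
To make that limitation concrete: PhotoDNA is a proprietary perceptual hash, but the operating model is just “fingerprint the file, check the fingerprint against a curated list of known items.” Here is a minimal sketch of that list-matching model; it uses a plain SHA-256 digest as a hypothetical stand-in (unlike a real perceptual hash, an exact digest breaks on any re-encoding), and the list itself is an empty placeholder:

```python
import hashlib
from pathlib import Path

# Placeholder for a curated hash list, e.g. one maintained by NCMEC.
# Only fingerprints that someone has already verified and added can ever match.
KNOWN_HASHES: set[str] = set()

def fingerprint(path: Path) -> str:
    """Exact SHA-256 of the file bytes. PhotoDNA uses a robust perceptual
    hash instead, so real matching tolerates resizing and re-encoding."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known(path: Path) -> bool:
    """Membership in the list is the entire test: content that was never
    hashed and added (novel material, drawings, etc.) produces no match."""
    return fingerprint(path) in KNOWN_HASHES
```

Whatever never makes it onto the list, like the illustrated material the Stanford authors surfaced by keyword, is simply invisible to this kind of matching; that is the “limited to only known instances” point in the quote above.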

I think it’s interesting that the NCMEC brought in an outside firm to perform a task they really ought to have been doing themselves. I suppose the external credibility is helpful, but I still would have preferred assurance that their own analysts were doing their jobs and admitting only content depicting known victims, not material that is ambiguous or that deliberately falls outside the scope and definition of 18 USC 2256.

Actually, I think they were right. An audit should be performed by an outside expert. This assures people that the organization is honestly doing its job.

They don’t specifically mention removing any 3D renders, which are, frankly, more likely to be present on the list than cartoons. Doesn’t that tell you all you need to know?

Pinging @elliot for visibility

Worries me a little bit, too. But who’s to say that such content wouldn’t also be considered a ‘cartoon’? 2256 is actually very clear that it has to be a real child, or indistinguishable from one.

Did you know that Microsoft has a non-disclosure agreement preventing tech companies from disclosing false positives?

We know nothing about their methodology. Extraordinary power requires extraordinary scrutiny. This is not that.

It seems reasonable to be confident that the difference turns on whether an image is expression or evidence of exploitation.

I really don’t understand this view of yours. There are tech companies that have, at one point or another, deemed realism in art to be indicative of malice, even though that is practically never the case. Google has done that. Discord has done that. I know of more.

A sketchy NGO with a poor track record doing it wouldn’t come as the slightest surprise. For an audit, this one contains very little information. Whatever your opinion is, I would like to see more transparency.

Yes, @Larry is right. It is good practice to get someone external to do it, although nothing prevents them from doing both. Think of it like this: there is a sketchy NGO doing censorship from behind the scenes and working closely with the government. How are we to know they’re not doing anything “extra”? An external audit isn’t perfect, but it’s something.

I actually agree wholeheartedly with something like PhotoDNA being used only to scan for known, verified CSAM.
The whole prospect of scanning people’s messages is very unsettling to a lot of people, so the least we can do is ensure that it adheres to some objectivity, i.e. that matching is limited to sexually explicit depictions of actual children. This audit does reassure me that this is indeed the case.

Chie has a handle on this.

There is no legitimate state interest in protecting fire-breathing dragons or other things that don’t exist. The legitimate state interest lies in protecting what does.

I understand the proxy idea. It’s interesting to consider that an expression that cannot be proscribed as incitement could be proscribed as evidence.

If condemnation depends on magical thinking, or on the law of similarity that underpins sympathetic magic, then it’s tethered to an ideology rather than to anything concrete.

It’s not evil to think that shape isn’t what makes anything precious. It’s not evil to feel uninhibited by something that cannot harm anyone.

It really makes no sense how you could treat a depiction that is, as a matter of provable, objective fact, not of a real person as though it were one.
The argument that these materials promote CSAM and thereby become ‘CSAM’ is not logically sound, because 1. they do not promote CSAM, and 2. they are not CSAM, because they lack an actual minor victim.

There’s no promotion of this type of material, because the communities that consume it participate in the detection and interdiction of CSAM content and its consumers from within, sharing information about offenders with authorities and making this point clear. Japanese law enforcement works with the NCMEC to report people sharing CSAM on Pixiv, and Pixiv may also work directly with the NCMEC if necessary, same with Pawoo. Moreover, these communities consume these types of materials precisely because they lack a real victim, and they prefer to abstain from criminal content.
Yes, drawn/cartoon/3DCG images may sometimes be consumed alongside CSAM by offenders, but so are adult porn and petite adult content, and the amount of drawn/cartoon/3DCG content found among consumers of CSAM is far lower than the amount of adult content.

It’s no different from conflating snuff content with scenes from action or horror movies, or with violent video games. The Human Centipede and Saw are not snuff films, Lolita is not CSAM, and by the same token, loli/shota content and other forms of entirely fictional, simulated material are not CSAM.

Someone tried to argue with me that material has to ‘lack serious value’ to be considered CSAM, and I was immediately reminded of the kinds of arguments people made prior to Ferber, which is what necessitated the ‘real child’ requirement in what defines child pornography/CSAM as a matter of law and speech.
CSAM, as a matter of (unfortunate) fact, has artistic value by virtue of being pornographic, but we do not support its creation because of how it was created and who it affects. Even setting that aside, should a recording of the sexual abuse I suffered be retained and shared because someone found a way to transform it into something religious persons would find less objectionable? Absolutely not! But you wouldn’t make that argument about material that lacks a real minor, because there is no victim to be re-victimized by its circulation, and if you’re willing to acknowledge that, why not acknowledge the victim you are creating with the legal system by turning fiction into a literal thoughtcrime?

The fact of the matter is, these materials lack the market-demand connection with CSAM or acts of child exploitation, they lack a real victim, and at worst, may scare/offend people.
These are facts that the SCOTUS acknowledged in Ashcroft v. Free Speech Coalition and they have not changed, even with the rise of Generative AI. Deepfakes of real children and materials derived from actual CSAM do, however, have a victim and can be criminalized as such.
