"<animeDrawingVirtualHentai>" found in CyberTipline API

I'm not sure if this was ever brought to light, or if there were already posts talking about it, but here it is.
I found it on Google while researching whether the US-based nonprofit NCMEC, like INHOPE, treats anime/manga as criminal CSAM when it is reported as such.

This is the technical documentation for how the NCMEC CyberTipline functions, what info it includes, how it tags that info, etc.

C.1. File Annotations

Tags to describe a file.


    The file is depicting anime, drawing, cartoon, virtual or hentai.

Obviously, this is a concern if you live in the US, because it means their tagging system is equipped to distinguish fictional/artistic works from actual CSA materials.

To what end, though, is unclear. NCMEC, in both its CyberTipline reporting system and relevant media/blog posts, claims its focus is limited to what can legally be identified as CSA material (18 USC 2256 and 18 USC 2251 et seq.)

Federal law specifically excludes artistic depictions made without the use of minors from this definition, such as drawings, sculptures, paintings, characters played by youthful adults, etc.

So tell me: are they treating drawings of fictional characters, made without the involvement of minors and clearly and explicitly excluded by law, as CSA materials in the US?
Or is this to aid other countries where such materials are illegal?


I took the liberty of archiving the page in case they ever decide to censor it.

I have to admit, though, that angers me. It also calls back to a past thread by Prostasia(?) about whether the UK filter includes drawings in its blocklist.

Good job noticing that.


I’d like some formal transparency from NCMEC. There shouldn’t be a need to clarify that a report pertaining to CSAM isn’t anime because, by default, it is NOT considered CSAM. That should immediately be the end of it.

I just want everything to be okay here in the US. That’s all. I don’t want to see NCMEC’s time wasted by weeaboos on imageboards, and risk real children undergoing actual abuse going without proper attention, because of a miscommunication regarding the difference between obscenity and child pornography.

Besides - lots of anime and manga depict characters who are canonically described as minors engaged in sexually explicit conduct, and these works are sold and shared freely over mainstream services:
Amazon sells To Love-Ru BD
Fakku and J-List both sell lolicon and lolicon-themed merchandise directly to consumers
Steam sells Nekopara and several other X-rated VNs
Jast USA sells pornographic VNs and X-rated patches that restore cut content - some depicting minor characters.
Netflix hosts Nick Kroll’s animated comedy series “Big Mouth” where children explore sex and puberty. Some scenes are graphic and show genitals.

All of this is very concerning, and it comes at the expense of actual child victims as well as artists and purveyors of legal adult content. NCMEC isn’t in the business of judging the artistic merit or patent offensiveness of art that can only ever be an “appearance”, or of accusing anything of being obscene.

I’d like the Prostasia Foundation to take a look at this and say something, or offer some form of clarification or reassurance that they’re not treating anime/manga/fiction as though it were actual CSAM.


Hopefully the option is just there so cartoon porn can quickly be excluded from processing by law enforcement in countries where it’s legal. Processing it is a waste of time when there are hundreds of Tor services full of REAL child abuse. Likewise, bestiality isn’t illegal in every country, so reporting it there would just be a waste of time. This is how I understand these options to work: to prioritize removing the material that is important to remove.


The real problem is that a lot of places haven’t decided yet whether lolicon is legal or not.

Good catch! So, here’s another little scoop for you: although Prostasia Foundation has been denied access to some of the CSAM scanning tools (looking at you, Project Arachnid and Thorn), we do have access to PhotoDNA because we are implementing it for MAP Support Chat, and developing integration software for RocketChat. During the testing phase, we have been able to put it through its paces using a suite of test images, and also using some known-legal images such as hentai, to ensure that misreported images don’t trigger a match.

The good news is, it doesn’t seem that hentai images are included in the NCMEC database, despite the fact that there is a reporting field for it. Since they are notoriously secretive about what is in the database and about how PhotoDNA works, and won’t answer direct questions about it, implementing it ourselves is the only way we can find out. Testing has gone OK (no false positives so far, and we have an idea of how “fuzzy” matches can be), so we will be rolling it out into production. And here’s the exciting part… we’ll also be releasing the source code for our RocketChat integration software so that others can do the same.
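PhotoDNA’s algorithm and hash format are proprietary and not public, so purely as an illustration of the “fuzzy” matching described above: perceptual-hash systems generally count two hashes as a match when their Hamming distance (number of differing bits) falls under a tuned threshold, rather than requiring exact equality. Everything in this sketch (hash values, threshold) is made up.

```python
def hamming_distance(a: bytes, b: bytes) -> int:
    """Count differing bits between two equal-length hash values."""
    if len(a) != len(b):
        raise ValueError("hashes must be the same length")
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def is_match(candidate: bytes, database: list[bytes], threshold: int = 3) -> bool:
    """A 'fuzzy' match: close enough to any known hash, not exactly equal."""
    return any(hamming_distance(candidate, known) <= threshold for known in database)

# Toy 16-bit "hashes" for illustration only; real perceptual hashes are far longer.
known_hashes = [bytes([0b10110100, 0b01011010])]
near_copy = bytes([0b10110101, 0b01011010])  # 1 bit flipped, e.g. slight re-encode
unrelated = bytes([0b00000000, 0b11111111])  # 8 bits away, a different image

print(is_match(near_copy, known_hashes))  # True: within the distance threshold
print(is_match(unrelated, known_hashes))  # False: too far from every known hash
```

This is also why threshold choice matters in testing: too tight and slightly re-encoded copies slip through, too loose and unrelated images (false positives) start matching.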


There are a number of possibilities here:

  1. They know people will send false reports and they want to give them the satisfaction of submitting it only for them to drop the reports.
  2. They’re forwarding the reports to other countries. This was previously confirmed and a letter was sent in response.
  3. They’re forwarding the reports to other countries and states where they know it will be held to be obscene.
  4. They’re archiving the reports and images for later use.

As a libertarian, I oppose such filters no matter what. But that is irrelevant.

PhotoDNA has the unusual property of automatically reporting an incident to the police when a false positive occurs, which could result in some very… unpleasant encounters, especially if the police find anything on the server that constitutes probable cause. Like an admission to committing a crime.

This is one of a multitude of reasons why I dislike this technology. Another is that it gives a false sense of security and can be bypassed by any sufficiently motivated adversary. It is cumbersome, introduces fragility into security contexts, diminishes performance, and, in its most rights-preserving form, requires individual Services to lug around large databases of hashes locally.

Synchronising these databases is also cumbersome. Without local databases, all hashes theoretically end up going through a PRISM partner, Microsoft, which means all uploads to a Service end up going through the National Security Agency, who will use them for undefined purposes and invade individual privacy. This may be a price worth paying here. Or not. Only the Service Operator can make the trade-offs they’re comfortable with.
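The “local database” deployment mode described above can be sketched minimally (the file format and hash values here are invented for illustration): the Service loads the full hash list into memory and checks each upload locally, so no hash ever leaves the server.

```python
import os
import tempfile

def load_hash_db(path: str) -> set[str]:
    """Load one hex-encoded hash per line into an in-memory set."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

def check_upload(file_hash: str, db: set[str]) -> bool:
    # Exact lookup only: a perceptual hash like PhotoDNA would need a
    # distance-based index rather than a plain set, which adds to the
    # bulk and fragility described above.
    return file_hash.lower() in db

# Demo with a throwaway database file (contents are made up).
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("a1b2c3d4\ndeadbeef\n")
    db_path = f.name

db = load_hash_db(db_path)
print(check_upload("A1B2C3D4", db))  # True: hash is in the local list
print(check_upload("cafef00d", db))  # False: unknown file
os.unlink(db_path)
```

The trade-off is exactly the one described: nothing leaves the server, but every Service must fetch, store, and continually re-synchronise the full hash list itself.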


Old thread, but here’s some relevant info.
Seems like NCMEC likely stores hashes for cartoon/anime if this document is anything to go by.
The Child Rescue Coalition document states, on page 27, that they store cartoon hashes. If they do it, how can it be said that NCMEC doesn’t do the same?


Because the NCMEC is only required to store and share hashes that are linked to forms of actual child sex abuse.
From what I was able to deduce after talking with a former analyst, they only use that file annotation for reports of grooming; reports of such material that aren’t grooming-related are labeled as either misreports or spam.

The NCMEC is very clear about what reports should contain: material defined under 18 USC 2256 and 18 USC 2252 et seq.

Aren’t drawn materials still outlawed under the PROTECT Act? I can’t find anything stating that NCMEC specifically follows those two US laws.


That part of the PROTECT Act is an obscenity law, and obscenity is not something the NCMEC’s CyberTipline appears to be set up to account for.

Materials like that are only illegal if they’re “obscene”, a rule that applies to content made with adults as well. And since obscenity is decided state by state, and its definitions are little more than conjecture, the topic is far more complicated; I’d rather not get into the nitty-gritty details, since they’re confusing and inconsistent.

But cartoon child pornography IS legal in the United States.

It’s not just two laws. 18 USC 2256 is what defines “child pornography” at the federal level, and 2252 et seq. are the laws which control it. None of those laws cover images not made with real children, a deliberate exclusion ever since Ashcroft v. Free Speech Coalition.

It’s stated on the CyberTipline itself what types of materials can be reported when you go to file a report.
They don’t triage reports for fiction unless they’re used for grooming or solicitation purposes.

Moreover, Congress actually amended the law to exclude “obscene images” from the reporting scheme back in 2018.

Here are the laws in question.
