"<animeDrawingVirtualHentai>" found in CyberTipline API

Not sure if this was ever brought to light, or if there were any posts already made talking about it but here it is.
I found it on Google while researching whether, like INHOPE, the US-based nonprofit NCMEC treats anime/manga as criminal CSAM if reported as such.

This is the technical documentation for how the NCMEC CyberTipline functions, what info it includes, how it tags that info, etc.

C.1. File Annotations

Tags to describe a file.


    The file is depicting anime, drawing, cartoon, virtual or hentai.

Obviously, this is a concern if you live in the US, because it means their tagging system is equipped to distinguish fictional/artistic works from actual CSA materials.

What purpose that serves, though, is unclear. NCMEC, in both its CyberTip reporting system and relevant media/blog posts, claims its focus is limited to what can legally be identified as CSA materials (18 USC 2256 and 18 USC 2251 et seq.).

Federal law specifically excludes artistic depictions made without the use of minors from this definition, such as drawings, sculptures, paintings, characters played by youthful adults, etc.

So tell me: are they treating drawings of fictional characters, made without the involvement of minors and clearly and explicitly excluded by law, as CSA materials in the US?
Or is this to aid other countries where such materials are illegal?


I took the liberty of archiving the page in case they ever decide to censor it.

I have to admit, though, that angers me. It also calls back to an earlier thread by Prostasia(?) about whether the UK filter includes drawings in its blocklist.

Good job noticing that.


I’d like some formal transparency from NCMEC. There shouldn’t be a need to clarify that a report pertaining to CSAM isn’t anime because, by default, it is NOT considered CSAM. That should immediately be the end of it.

I just want everything to be okay here in the US. That’s all. I don’t want to see NCMEC’s time wasted by weeaboos on imageboards, or real children undergoing actual abuse going without proper attention, because of a miscommunication regarding the difference between obscenity and child pornography.

Besides, lots of anime and manga depict characters who are canonically described as minors engaged in sexually explicit conduct, and these works are sold and shared freely over mainstream services:

- Amazon sells the To Love-Ru BDs
- Fakku and J-List both sell lolicon and lolicon-themed merchandise directly to consumers
- Steam sells Nekopara and several other X-rated VNs
- JAST USA sells pornographic VNs and X-rated patches that restore cut content, some depicting minor characters
- Netflix hosts Nick Kroll’s animated comedy series “Big Mouth”, in which children explore sex and puberty; some scenes are graphic and show genitals

All of this is very concerning, and it comes at the expense of actual child victims as well as artists and purveyors of legal adult content. NCMEC isn’t in the business of judging artistic merit or patent offensiveness in art that can only ever be an “appearance”, or of accusing something of being obscene.

I’d like the Prostasia Foundation to take a look at this and say something: offer up some form of clarification or reassurance that they’re not treating anime/manga/fiction as though it were actual CSAM.


Hopefully the option is just there so cartoon porn can quickly be excluded from processing by law enforcement in countries where it’s legal. It’s a waste of time when there are hundreds of Tor services full of REAL child abuse. Likewise, bestiality isn’t illegal in every country, so reporting it there would just be a waste of time. That’s how I understand these options: they prioritize removing the material that is most important to remove.

The real problem is that a lot of places haven’t decided if lolicon is legal or not yet

Good catch! So, here’s another little scoop for you: although Prostasia Foundation has been denied access to some of the CSAM scanning tools (looking at you, Project Arachnid and Thorn), we do have access to PhotoDNA because we are implementing it for MAP Support Chat, and developing integration software for RocketChat. During the testing phase, we have been able to put it through its paces using a suite of test images, and also using some known-legal images such as hentai, to ensure that misreported images don’t trigger a match.

The good news is, it doesn’t seem that hentai images are included in the NCMEC database, despite the fact that there is a reporting field for it. Since they are notoriously secretive about what is in the database and about how PhotoDNA works, and won’t answer direct questions about it, implementing it ourselves is the only way we can find out. Testing has gone OK (no false positives so far, and we have an idea of how “fuzzy” matches can be), so we will be rolling it out into production. And here’s the exciting part… we’ll also be releasing the source code for our RocketChat integration software so that others can do the same.
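For readers unfamiliar with how this kind of testing works: PhotoDNA itself is proprietary, and its hash format and matching algorithm are not public, so the sketch below is only an illustration of the general idea of “fuzzy” matching. It compares a perceptual hash against a database of known hashes using a bit-distance threshold, so near-duplicates (resizes, recompressions) still match while unrelated images, such as the known-legal hentai test images mentioned above, should not. All names and the threshold value are assumptions for illustration.

```python
# Illustrative sketch only: this is NOT PhotoDNA's actual algorithm, just a
# generic model of threshold-based ("fuzzy") perceptual-hash matching.

def hamming_distance(a: bytes, b: bytes) -> int:
    """Count differing bits between two equal-length hashes."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def is_match(candidate: bytes, database: list[bytes], threshold: int = 10) -> bool:
    """Return True if the candidate hash is within `threshold` bits of any
    known hash. The threshold here is arbitrary; a real system tunes it to
    balance false positives against missed near-duplicates."""
    return any(hamming_distance(candidate, known) <= threshold for known in database)

# A benign image whose hash is far from everything in the database should
# not match, even with fuzzy comparison; a slightly altered copy should.
known_hashes = [bytes([0x00] * 16), bytes([0xFF] * 16)]
unrelated = bytes([0x0F] * 16)              # 64 bits away from each known hash
near_dup = bytes([0x00] * 15 + [0x03])      # 2 bits away from the first hash

print(is_match(unrelated, known_hashes))    # False
print(is_match(near_dup, known_hashes))     # True
```

This is also why a true negative on hentai test images is meaningful: with fuzzy matching, a non-match means the image is genuinely distant from everything in the database, not just a byte-for-byte mismatch.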
