Does the NCMEC report things like hentai, loli, and 3D stuff to the police in the United States?

I’ve heard that they don’t, but I would like a citation that definitively proves it, since their website mentions “visual depictions.” Would it matter how realistic or unrealistic the material looks, as long as it isn’t depicting a real child who exists and is truly fictional? I also don’t want them to report to the police just because of obscenity laws, i.e. because the material could “potentially” be illegal. I’m 99 percent sure they don’t, especially because finding real children is their whole job. I just want clear confirmation, thanks.

Yes, they do. They are legally obligated to forward ANY report they receive from a website to the authorities.

Imgur and Google will report fictional material (drawings, etc.) to the NCMEC. Google has an algorithm tailored to cartoons (they once reported a known artist after detecting hentai on his Google account, but authorities did not press charges).
So if a site says it treats fiction as CSAM, expect a report to the NCMEC, which they will forward (sites can state that they have already reviewed the content, and in those cases the NCMEC does not even view the reported material).

@Chie Hi, I would appreciate your input; I read an old post where you said the opposite.

So, the unfortunate reality is that the NCMEC will forward anything to the police, including images of adults or nonsexual images of real minors, if they have a proper suspect, regardless of whether the material is legally actionable. The NCMEC’s own liability statement says that it is not a law enforcement entity and cannot actually make legal determinations. This covers both actionable and ‘informative’ tips, but ultimately it’s up to the respective ICAC agencies to make that determination.

For fiction specifically, though, things typically aren’t reported unless a real minor is involved or the material is photorealistic AI.

Reports concerning non-US origins or matters are automatically forwarded to their respective countries, regardless of content. This is why authorities in European nations occasionally complain in transparency disclosures about receiving large volumes of non-actionable reports (a high intake of reports, only some of them actionable).

There used to be a legal requirement under 18 U.S.C. § 2258A for websites to report drawings, cartoons, and similar material to the NCMEC, but that language was written out, presumably due to a lack of interest from all parties involved.
Such material is literally not CSAM, and even organizations like Thorn seem disinterested in it.

Some services, like Google and Imgur, do forward reports for it, but others, like Discord and Twitter/X, do not, likely because there is no statutory requirement and because the material is not “apparent child pornography,” which has an actual legal definition. It’s best to stay away from the former services/companies if at all possible and find alternatives.

Based on first-hand interactions with the NCMEC, the general public is broadly discouraged from filing CyberTips for anything that isn’t tied to an actual minor, since those reports are not legally actionable.
Nowhere on the public-facing CyberTipline reporting form or flow is there any suggestion that fiction is to be reported, and the form used by ESPs registered with them does not even include fields for it (at most ‘generative AI’).
Only in the broader API, which has to be manually integrated into a service’s backend, do specific file annotation fields for ‘virtual/drawing/hentai’ appear, and that’s a holdover from obsolete legislation.
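
As a rough illustration of that form-vs-API distinction, here is a hypothetical sketch of what an ESP-side report payload carrying such an annotation might look like. Every field name below is invented for illustration (loosely echoing the ‘virtual/drawing/hentai’ wording above) and is not the actual NCMEC API schema:

```python
# Hypothetical ESP-side CyberTipline report payload.
# All field names are invented for illustration; this is NOT the real NCMEC schema.
report = {
    "reportedBy": "example-esp.com",  # hypothetical ESP identifier
    "files": [
        {
            "fileName": "upload_001.png",
            "md5": "d41d8cd98f00b204e9800998ecf8427e",  # example digest
            "annotations": {
                # The point above: a flag like this exists only in the
                # backend API, never on the public-facing reporting form.
                "virtualDrawingHentai": True,
                "generativeAi": False,
            },
        }
    ],
}
```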

Interestingly enough, there’s a lawsuit brewing that could see the scope of their activities narrowed. Someone is suing precisely because their reporting practices led to an injury despite the contents not being CSAM.

The NCMEC also doesn’t issue takedown notices to websites or platforms for fiction like they do for actual CSAM or exploitative imagery.

Web crawlers like ‘Project Arachnid’ and the caching functions of search engines routinely scan for and report known CSAM, as do CDNs like Cloudflare and Amazon CloudFront (part of AWS). These scans are limited to known instances of real CSAM, identified via hash matching.
Some of these crawlers are operated by groups such as the UK’s IWF (which does actively target fiction), and their takedown notices for that specific content are often ignored, since they don’t come from the NCMEC, despite such reports being shared with it.
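
For context, here is a minimal sketch of how that kind of hash matching works, assuming a plain cryptographic digest rather than a proprietary perceptual hash like PhotoDNA (the blocklist entry below is just an example value):

```python
import hashlib

# Hypothetical blocklist of hex digests of known, verified material.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # example
}

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known(path: str) -> bool:
    """Exact-match lookup: flags only byte-identical copies of listed files."""
    return sha256_of(path) in KNOWN_HASHES
```

Note the trade-off: a cryptographic hash misses any altered copy, which is why production scanners rely on perceptual hashes (e.g. PhotoDNA) that survive re-encoding, resizing, and minor edits.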


That is actually what I wanted to hear, thank you. Their website does mention “visual depictions,” but it doesn’t necessarily say they have to be of a real child; I also saw the last part of 18 U.S.C. § 2256, where it says that cartoons don’t count, which was good. I would have assumed that obscenity laws require them to report everything, but apparently not, as they and law enforcement would be overwhelmed if they did. Can you give me a citation or source that says they don’t care about fictional material? You’ve convinced me, but I’d appreciate it.

Sadly, the most you’ll get is something like this:

https://forum.prostasia.org/t/ncmec-pdna-hash-db-audit-cartoon-anime-content-found-but-not-met-federal-definition-of-child-pornography/

Where they basically admit that it’s not CSAM as a legal matter.

Hope this helps.
