Does the NCMEC report things like hentai, loli, and 3D stuff to the police in the United States?

I’ve heard that they don’t, but I would like a citation that definitively proves they don’t, since their website mentions “visual depictions.” Would it not matter how realistic or unrealistic it looks, as long as it’s not depicting a real child that exists and is truly fictional? I also don’t want them to report to the police just because of obscenity laws, meaning it could “potentially” be illegal. I’m 99 percent sure they don’t, especially because finding real children is their whole job. I just want clear confirmation, thanks.


Yes they do. They are legally obligated to forward ANY report from a website to authorities.

Imgur and Google will report any fictional material (drawings, etc.) to the NCMEC. Google has an algorithm tailored for cartoons (they reported a known artist because they detected hentai on his Google account, but authorities did not press charges).
So if a site says they treat fiction as CSAM, expect a report to the NCMEC, which they forward (sites can state that they already reviewed the content, and in those cases the NCMEC does not even view the reported content).

@Chie hi, I would appreciate your input; I read an old post where you said the opposite.

So the unfortunate reality is that the NCMEC will forward anything to the police, including images of adults or nonsexual images of real minors, if they have a proper suspect, regardless of whether the material is legally actionable, since the NCMEC’s liability statement says they are not a law enforcement entity and can’t actually define anything. This includes both actionable and ‘informative’ tips, but ultimately it’s up to the respective ICAC agencies to make that determination.

But for fiction specifically, things typically aren’t reported unless there’s a real minor involved or it involves photorealistic AI.

Reports from non-US origins or matters are automatically forwarded to their respective countries, regardless of their content. This is why authorities from European nations occasionally complain in transparency disclosures about receiving large numbers of non-actionable reports (high intake of reports, only some of it actionable).

There used to be a legal requirement under 18 USC 2258A for websites to report drawings/cartoons and other things like that to the NCMEC, but that was written out, presumably due to a lack of interest by all parties involved.
They’re literally not CSAM, and even organizations like Thorn seem disinterested in them.

Some services like Google or Imgur do forward reports for it, but others like Discord or Twitter/X do not, likely due to there being no statutory requirement, or not being “apparent child pornography”, which has an actual definition. It’s best to stay away from these services/companies if at all possible and find alternatives.

Based on first-hand interactions with the NCMEC, the general public is broadly discouraged from filing cybertips for anything that isn’t tied to an actual minor, since those reports are not legally actionable.
Nowhere on the public-facing CyberTipline reporting form or flow is there any suggestion that fiction is to be reported, and the form used by ESPs registered with them does not even include fields for it (at most ‘generative AI’).
Only in the broader API, which has to be manually integrated into a service’s backend, do specific file annotation fields for ‘virtual/drawing/hentai’ appear, and that’s a holdover from obsolete legislation.

Interestingly enough, there’s a lawsuit brewing that could see the scope of their activities narrowed. Someone is suing precisely because their reporting practices led to injury even though the contents were not CSAM.

The NCMEC also doesn’t issue takedown notices to websites or platforms for fiction like they do for actual CSAM or exploitative imagery.

Web crawlers like ‘Project Arachnid’ and the caching functions of search engines routinely scan for and report known CSAM content, as do CDNs like Cloudflare or CloudFront (part of AWS). These scans are limited to known instances of real CSAM via hash matching.
But some of these crawlers are operated by groups such as the UK’s IWF (who do actively target fiction), among others, and their takedown notices for that specific content are often ignored, since they don’t come from the NCMEC, despite such reports being shared with them.
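
For anyone curious about the mechanics, the core of that hash matching is just a set lookup against a list of vetted hashes. Below is a minimal, purely illustrative sketch: real systems like PhotoDNA or Project Arachnid use proprietary perceptual hashes that survive resizing and re-encoding, whereas this stand-in uses plain SHA-256, and the file and hash-list names are made up.

```python
# Illustrative sketch of hash-set matching only. Real deployments use
# perceptual hashes (e.g. PhotoDNA) and vetted hash lists distributed by
# NCMEC/IWF; "known_hashes.txt" and "uploads" here are hypothetical.
import hashlib
from pathlib import Path

def sha256_of_file(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_directory(root: Path, known_hashes: set[str]) -> list[Path]:
    """Return files whose digests appear in the known-hash set."""
    matches = []
    for path in root.rglob("*"):
        if path.is_file() and sha256_of_file(path) in known_hashes:
            matches.append(path)
    return matches

if __name__ == "__main__":
    known = set(Path("known_hashes.txt").read_text().split())
    for hit in scan_directory(Path("uploads"), known):
        print(f"match: {hit}")
```

The key point this illustrates is that only content whose hash is already on the list can ever match, which is why such scanning is limited to known real material and never flags novel drawings or fiction.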


That actually is what I wanted to hear, thank you. Their website does mention visual depictions, but it doesn’t necessarily say it has to be of a real child, and I also saw the last part of 18 USC 2256 where it says that cartoons don’t count, which was good. I would have assumed that obscenity laws require them to report everything, but apparently not, as they and law enforcement would be overwhelmed if they did. Can you give me a citation or source that says they don’t care about fictional stuff? You convinced me, but I’d appreciate it.

Sadly, the most you’ll get is something like this:

https://forum.prostasia.org/t/ncmec-pdna-hash-db-audit-cartoon-anime-content-found-but-not-met-federal-definition-of-child-pornography/

Where they basically admit that it’s not CSAM as a legal matter.

Hope this helps.


I may have bad news, guys. The NCMEC most definitely does report drawings, cartoons, and other visual depictions of what they call CSAM, even though there are no children involved in any way, shape, or form.

Their legal overview is all messed up and seems to contradict itself over and over again. Since I was UNABLE to understand how the heck the NCMEC works and what exactly they classify as CSAM, I had to check INHOPE’s legal overview of the various hotlines around the world and the measures they take regarding certain types of content. And the truth seems to be that whether they report it as CSAM using their hashing tools or not is irrelevant: whether or not they consider it CSAM, they will KNOWINGLY report it.

I will show you guys some screen captures and share the URL to the PDF so you can download it and look at it.

Apparently, even though they are aware that drawings/cartoons are NOT ILLEGAL, as they themselves state, they STILL report them to law enforcement agencies (LEAs).

[ EDIT ]

They once again avoid talking about digitally generated CSAM, just like they didn’t mention it in the article @Chie posted earlier this year regarding the audit Concentrix did for them. They seem to refuse to disclose information regarding 3D material.

This is even though their own legal overview states that their concept of CSAM falls strictly under the definition of child pornography.

And for some reason, a lot of their practices also involve reporting things that ARE NOT illegal, such as fictional text depictions of CSAM or “Praise of Pedophilia” (only God knows what the fck that means). They are AWARE these aren’t illegal, but they still report them to LEAs.

https://inhope.org/EN/articles/inhope-global-csam-legislative-overview-2024


Let me step in..

I saw this information earlier and was able to find mistakes in this table. The NCMEC forwards any report they receive, regardless of whether it’s a drawing or an adult, to LEA. It all winds up in a database organized by jurisdiction. On some occasions, reports are forwarded directly by an analyst to LEA, and that never happens with anything that doesn’t involve a real child.

The NCMEC isn’t in the business of determining whether something reported to their hotline is illegal, only whether it constitutes child exploitation. 3DCG, cartoons, and other virtual materials statutorily do not meet the definition of child pornography. People are generally discouraged from submitting cybertips for fiction, and such material is not hashed for things like PhotoDNA.

This INHOPE paper is not entirely accurate. Some of the information about specific countries is incorrect or outdated, as with Sweden.


That overview is wrong for a bunch of countries because the hotlines were clueless or misunderstood the questions. For example:

In Spain, only images that are photorealistic are considered CSAM. The overview has a category for “realistic images”, yet for whatever reason they responded “Illegal (dependent on context)” under Cartoons and said it is only illegal if realistic, even though the first question should be assumed to be about non-realistic material anyway. Other hotlines did not distinguish between possession and distribution either. Of the three replies from the German side, only Jugendschutz.net filled out the table properly. Italy is also wrong, Taiwan is also wrong, and so on.

That list will only really be used to peer-pressure other countries into closing “regulatory holes” to “protect children”.

“Praise of Paedophilia” could mean public approval of a crime related to it, since a lot of people use CSA and that term synonymously. In this case, however, I am also clueless, since it says “or CSA”.

Welcome to the most-hated-people-on-Earth club. It doesn’t matter that you’ve never done anything to anyone; you deserve to be skinned alive and burned at the stake just for thinking bad thoughts, or because you even looked.


Yep, better to learn how to hate people back sooner rather than later.
