Drawings, cartoons, and 3D CGI are not CSAM

Depictions that are CSAM (child pornography) are limited to those involving real children. Drawings, cartoons, text, etc. are not photographic evidence of child sex abuse or exploitation, and therefore cannot meet this definition.

I’ve been seeing a lot of bickering on Twitter over whether fictional media, like cartoons, fictional stories, or sex dolls, count as “child exploitation”, and whether the mere idea of a child being sexually abused, expressed in art or in mere words, exploits real children or is in any way comparable to the harm, abuse, and rape of actual child victims.

They don’t, and I’ll explain why.

When a platform, organization, or person casually asserts that a drawing of a fictional character or some hypothetical entity is equivalent to a living, breathing counterpart, they are committing a fallacy. They are willfully blurring the line between real life and fantasy or fiction by treating the discomfort associated with the mere concept as the same as the actual event, much to the detriment of the critical thinking necessary to actually find and help victims.

Child exploitation and abuse material is a problem because it has a victim, an actual child, a human being. That is the requirement necessary to assert that there was, in some way, exploitation or abuse.

You can’t “exploit” or abuse an idea or a concept without fallaciously broadening, and potentially undermining, the literal sense in which “exploitation” is being used here. It’s simply not possible; rather, it is oxymoronic.
You can’t claim “child exploitation” applies to all children, then apply it to material that, by definition, does not involve or depict actual children.

Here, we see the literal definition of “exploit”, as provided by Dictionary.com.
Under this definition, the employment of child actors for innocent or non-sexual material, like a commercial for children’s toys or a Nickelodeon sitcom aimed at a general audience, could count as “child exploitation” due to the overall objective behind the production of such material, which is profit.

“But we’re talking about SEXUAL exploitation, a narrow category…”
Right. But even then, you can’t exploit an idea in the same sense that you can exploit an actual human being. Ideas can technically be exploited for profit, but to stretch and distort that context so it applies to the “idea” of children, in addition to actual victimization, would be a frivolous and dangerous exercise in ‘reaching’.

“But what about the likenesses and identities of minors?”
That is indeed a valid concern, and one of great interest to the victims of CSAM, an interest they have a right to protect; hence why combating and eliminating CP/CSAM is so important. Such material implicates the privacy rights of the children who were victimized by its existence.
Material that exploits the likenesses of real minors, such as ‘deepfakes’ or digitally altered images that use the face or identity of a real minor, DOES count as CSAM and is illegal. It is not ‘fictional’ when it does this, and for good reason.
Works of fiction, such as drawings, dolls, etc., are plainly excluded from the definition of CSAM/CSEM primarily because the sexual abuse or exploitation of a child is not required for them to exist.

We need to do a better job challenging these nefarious, dishonest, and emotional actors with sound arguments founded in logic and reason. Do not be swayed by their dim-witted appeals to emotion, appeals to morality, slippery slope fallacies, etc.

They know their arguments are not valid, but will scream louder to drown out the truth.
Don’t give up. Challenge them and point out the flaws in their arguments and their attempt to fool people into thinking their emotions and peer pressure on the issue are a valid replacement for a reasonable determination of facts.

They only want to eradicate thoughts or ideas they find unpleasant. There is no harm in allowing fantasy material to exist, as we’ve seen time and time again in its overall harmless effect on consumers and society, and such depictions are presumably protected as free speech (depending on jurisdiction).
They are not interested in the well-being of children, or even in sexual exploitation, really. This whole “moral panic” is nothing more than a fashionable taboo, enforced at the very real expense of the rights of creators, survivors, and consumers. It is, quite literally, nothing more than a frivolous assault on our freedoms simply because these works don’t appeal to the high-school-popularity-contest-style logic I described previously.
They are not right. They are wrong, because the censorship they want is far worse in practice than the unfounded, long-winded prospect that these materials will ‘incite’ or cause some form of harm.


It would be tantamount to saying that killing NPCs in a game is murder.

“No actual, identifiable humans were hurt in the production of this but the IDEA of some anonymous person possibly being injured in a hypothetical future is just as bad!”

Fictional characters can’t be exploited any more than they can be killed. To argue otherwise heavily implies that these characters have legal rights and sentience…


Personally, I started classifying myself as a human trafficker a while back for importing dakimakura covers.

It’s their same logic, after all.


Antis tend to agree that killing NPCs in a video game isn’t murder; even killing child NPCs is OK. But sex stuff? No, that’s bad.


Yeah. It’s honestly confounding how they can say “it’s not real” and understand that, yet invoke all this fear and anxiety about sex that really isn’t justified.

Yeah, pedophilia is bad, but it exists, and it has a right to exist in a safe, fictional outlet. Catharsis theory is valid.


Frankly, playing with ragdoll corpses gets me off good enough. The only regret is nudity/underwear or lack thereof.


Because they’re more comfortable with violence in fiction. Sex stuff is apparently sacred, but torturing and having body counts that would put history’s greatest mass murderers to shame is perfectly ok because “I’m conditioned to accept it as normal.”

Either all crimes are OK in fiction or none are; this picking and choosing isn’t logical in the least and is highly subjective.


This is partially why I find the “normalization” arguments parroted by antis so unconvincing, especially when they assume that simply because something deviates from, or is perceived to be offensive to, the norm, it should be censored to protect those norms. It’s already a strike against you when you think these norms could be corrupted or negatively impacted, while not once stopping to consider the harm that a forced norm of censorship does in turn.

We can all agree that pedophilia is not a good thing. A recurring sexual preference for minors and young children is extremely difficult for the millions of people who experience it.
People don’t need to be reminded of how bad it is when they read about it or see it in media to know that it’s bad, nor are they likely to be convinced that it’s good by how it’s portrayed or talked about in an erotic Japanese cartoon, pedophile or not. They know.


To be honest, yeah, the harmless, if weird, stuff shouldn’t be banned as long as nobody IRL is harmed in its production. Creepy, disturbing, off-putting… But it shouldn’t be illegal unless it actively hurts real people in demonstrable ways.

Fiction (even loli content, to the shock of Puritans) doesn’t really turn people into criminals who act out fantasies. Very rarely is that the case, and even then, some of those cases involved people with severe problems in their lives beyond what type of fiction they enjoyed. If the stuff you mentioned is normalizing sex crimes, then video games and horror movies are also normalizing crimes, but fans of violent fiction (who are also against sexual fiction) usually say something akin to, “Yeah, but I’m personally OK with some of these, so it’s OK. That stuff I dislike is bad, though.”


This comes from a failure to be objective. I’m personally not okay with a lot of stuff. I’m not okay with rape, abuse, anything with overtly ‘mean’ tones, etc. But I simply don’t watch or read anything with that kind of stuff in it, because it doesn’t appeal to me. I’m certainly not going to tell others that they shouldn’t have it, that it shouldn’t exist, or that it shouldn’t be made when there’s no objective justification for why it shouldn’t.
“B-but muh morals…” is not a reason.


And that’s the beauty of it: you’re defending the free speech you disagree with just as much as the speech you do agree with. That’s true free speech. The people who only argue for the stuff they’re personally OK with aren’t for free speech; they can’t be objective. It’s just a giant appeal to emotion and bandwagon logic.

If a piece of fiction bothers you, just don’t watch it lol. Don’t ruin it for everyone else; that’s something a Puritan would do.


I bring this point up, in part, because I’m beginning to see people use the term “VCSAM”, meaning “virtual child sex abuse material”, a literal oxymoron.

How can you call it a form of child sex abuse material if there’s no actual child being abused? A fictitious drawing or CGI model of what appears to be a non-existent child is no more a child than a youthful adult. There is no person being used or exploited. There is no real identity. There is no abuse. There is no underlying crime being committed.
The desire to merge two similar yet distinct concepts is palpable, almost to the point where it feels like a form of subtle rhetoric or propaganda. This is a farce.

So far, the only publications which use this term are half-baked, biased opinion pieces hailing from Australia. But I fear that if such a logical inconsistency isn’t formally addressed, it could become a problem for researchers, advocates, and laypersons alike.

By definition, CSAM is a visual recording of an actual child being sexually exploited or abused. It is distinct from the phrase “child pornography” because CSAM highlights and emphasizes the abusive and exploitive nature of the material in question, as a matter of objective fact.
The idea that something which does not involve or depict a real child being abused or exploited could in any way be grafted onto the definition of CSAM, even superficially, trivializes the term and needlessly complicates matters.

Such a logical inconsistency cannot go unaddressed.

what do you think of this? Would Prostasia be willing to write a piece about it?


Indeed I was thinking of at least starting a thread about terminology. The people who are misusing the term VCSAM would probably never read it, but anything we can do to push back against that term is worthwhile. Another term that has been used, and that I also dislike, is NPAI (non-photographic abuse imagery). Last year Prostasia participated in a study on this topic from Lincoln University to express our concerns about the loaded implications of that term.


The lengths people will go to in order to validate their prejudices and impose their mores where they should not are troubling.

The other day, somebody actually tried to tell me that the key defining factor of child pornography is the “idea of a child”, not whether an actual child was abused or involved in some way.

This moral panic needs to be destroyed. I can only hope that academic institutions can remain neutral on the subject and not fall prey to such rhetorical devices.
This is literally how repression starts.


Aside from maybe deepfakes made with the faces or likenesses of real children, I don’t see how that term can withstand scrutiny.

When I run through these arguments in my head, it seems to me like they’re emphasizing that, as a concept, adult-child sex is always a form of abuse, regardless of the context or whether it’s real or fictional.
They’re basically rejecting the very idea of “consensual” child-adult sexual activity, while also inadvertently admitting that they’re fashioning child abuse terminology into a rhetorical device, one they can then exploit to sweep in material that is not CSAM, without justification or empirical consensus.

While it’s true that adult-child sex is always a form of abuse in real life, such conceptual reasoning simply is not applicable to matters of fiction or fantasy material.
The defining elements of CSAM, both as a concept and in actuality, are contradicted by the lack of a real child victim.
Placing “virtual” in front of the term does not mitigate this gaping logical inconsistency.

To put it in layman’s terms, it’s the functional equivalent of claiming a horror movie is a snuff film.

I hope to see Prostasia write a blog post about it soon, and maybe bring it up with other members of the advisory council.

There’s no hiding the fact that these opinionated, biased actors are trying to corrupt academic discourse.

There was a paper I was skimming through not too long ago where, in the abstract, the authors make the bold claim that it’s “up to educated women” to fix things.
What bothers me so much about that paper is the fact that it’s supposedly a scientific paper invoking language and biases that are political, not scientific.

It’s not up to “educated women” to solve or fix anything.
It’s up to educated, qualified academic scholars and fellows to employ proper scientific procedure to further humanity’s knowledge of and understanding of the topic, regardless of their gender.


It’s very, very tiring to see this kind of rhetoric echoed day in and day out. I’m (thankfully) not a CSA victim, but I cannot imagine some of the worst moments of my life being made comparable to lines on paper or pixels.

It’s bizarre: many of these people have no issue with video game violence or bloody anime series, but heaven forbid a fan creator (often femme, BIPOC, and/or queer) explores sexuality? Suddenly the concern is that they’ll replicate it, despite having no such fears about guro and the like.


I take issue with them calling it pornography to begin with, truth be told. I see a lot of people claim that “CSEM is a pedophile dogwhistle”, since so many folks on cough Twitter have to clarify that the so-called “CP” people are worried about is so often just anime characters having sex.

Pornography involves consenting adult actors. Children cannot consent, so labeling proof that they were sexually abused “pornography” minimizes the egregiousness of such material – not to mention it creates alarm fatigue. I hear someone “is a pedo”, and I have to wonder if they a) are just drawing anime characters, b) are a non-offending pedophile minding their own business, or c) are an actual abuser that people need to be made aware of. It’s very, very rarely C.


As a CSA victim myself, it’s very reassuring to see such a reasonable statement. I feel awful when I invoke my trauma to get a point across, but it seems like that’s the level I have to stoop to in order to get through to some people.


People are supposed to be smart, right? So I guess we should be able to come to some kind of agreement here. I have an idea: what if, when we fantasize about abusing fictional children, people could respond by fantasizing about arresting us and locking us in a fictional jail? What about that? Sounds fair to me :3


Only if I can slaughter all the fantasy guards during the prison break from this fictional jail.