Question about definition of CSAM

Apologies if my question is inappropriate or tasteless. This question just popped into my head and got me thinking about the meaning of words:

Why exactly is CSAM (child sexual abuse material) called CSAM? Is it called this because its creation requires child sexual abuse? If so, then my question is this:

What about so-called “self-generated CSAM”? We’re talking about minors who take nudes or record sexual videos of themselves. I wouldn’t call a teen posing nude or masturbating “child abuse”, unless they’re somehow abusing themselves. So, in this case, is it still called CSAM because the imagery and videos can be abused for someone else’s personal gain (gratification, blackmail, money, etc.), even if there’s no direct adult-minor contact?

Hm, another question popped into my head while typing this: are drawings based on real children, or even outright deepfakes, CSAM if no child was sexually assaulted to create them? I suppose you could still call it child abuse via the misuse of a child’s image without their consent (which they can’t give)…

Hm, I guess the core of my pondering is this: what exactly is the full definition/scope of child sexual abuse? Do self-generated CP and material merely based on real children, etc., fall under the category of child abuse?

Note: I am not trying to justify people seeking out CP/CSAM just because it’s self-generated, or anything like that. I still think minors who make such content of themselves don’t fully understand the consequences of misusing technology. At best, anyone who uses a minor’s self-generated CP is still taking advantage of a minor who doesn’t understand the risks of such things. I’m only asking whether that still technically counts as outright child sexual abuse.

Apologies if I’m rambling, being confusing, etc., or if the subject matter is inappropriate. I just want to understand what counts as what. Precise terms and definitions are important (e.g. the difference between a pedophile and an ephebophile, or the difference between a pedophile and a child molester), right?

Yes, so as not to minimize it. The issue with the definition is that it comes from groups that also include fictional sexual materials of children, and kids who share stuff on YouTube, TikTok, etc., and it doesn’t take the presence or absence of coercion into account.

In some cases, children have been charged for sending selfies, and they are both the victim and the perpetrator of their own crime in the eyes of the law.

That’s included in the definition.

According to the groups who created the definition, yes, that too. Even though it involves no real child. Even though it could be an excellent substitute with few of the ethical issues involved. Even though you could certify the material to not involve real children in any way, the law would still consider realistic imagery child pornography.

Yes, they are, but most of the public thinks that anyone with such an attraction is doomed to hurt kids and will forever pose a risk to children, so before we can really hammer out proper definitions, we need to spread awareness that this isn’t the case.


Very interesting. Thank you!

I only asked because, to me personally, the term CSAM seemed too extreme to include things like self-generated CP, especially considering how many teens get registered as sex offenders for consensual sexting with teens their own age. To include that of all things under an umbrella term meant to designate the worst of the worst seems… off.

Prostasia has used the term “unlawful sexual images of minors” instead when we want to use a value-neutral descriptive term.


Antis have found and screenshotted this reply, so let me drop the link that explains why “unlawful sexual images of minors” is a better non-pejorative term when referring to images that aren’t abusive, such as teen sexting images. Of course, this isn’t to deny that the sharing of such images by others can become abusive, and for that we use the term image-based child sexual abuse to describe such sharing. But even in that case, it’s still inaccurate and stigmatizing to refer to the images themselves as child abuse when nobody was abused in creating them.


Not forgetting dumb kids who take nude selfies of themselves.


Yes, I do think of myself as gross. The shame, the guilt, the fear, etc. have been building up in me for as long as I can remember. I’ve attempted suicide many times; if only I had seen this poster sooner, when I was even younger and dumber!

Seriously, go fuck yourself, you and all your kind. Most peds discover their sexuality at the same time everyone else does: adolescence. There are lots of tween and teenaged pedophiles out there, and here you are encouraging hate and abuse towards those teens. So much for child protection…

This discussion reminds me of a case where the police arrested a 17-year-old girl who took a nude selfie and prosecuted her as an ADULT for pornography depicting a CHILD, i.e., herself. This bit of outrageous prosecutorial hypocrisy and stupidity proves Charles Dickens was right: the law is an ass.


Antis are taking your posts out of context? Color me shocked.

Drawing a line between self-produced pornographic imagery and material that is forced/coerced and a product of abuse is helpful.
I think most can agree that teens consensually sending nudes to one another and a recorded instance of a teen being sexually abused are not identical, while also agreeing that neither should be allowed to circulate or exist.

None of that undermines the legitimacy of regarding such abusive imagery as CP/CSAM, nor should any of it be interpreted to mean that self-produced materials should be decriminalized, legalized, or made available in any way just because they are self-produced.
This isn’t to say that there is no risk or harm in letting minors produce this type of media, or that it’s ‘not as bad’ as a video of a minor being abused; there are valid arguments and reasons why self-produced media is rightfully and justifiably illegal alongside material made through coercion, assault, etc.
Those situations and reasons are different, but acknowledging the area of overlap between them is also important.

A market for self-produced sexually explicit media (SEM), namely in the form of a social media hack/leak (e.g. Snapchat), carries with it the same inherent harms and risks as a market for coerced SEM, whereby minors are deceived or humiliated through the use of such materials, and these markets, by their very nature, put real children and teens at risk.

These types of conversations are difficult for a lot of people to have because of how much paranoia and concern exists over these issues.

One, because of the stigma surrounding the subject matter and the possibility that such dialogs can be hijacked by those with exploitative intentions, as we’ve seen with pro-contact/pro-abuse idealists and the ‘youth liberation’ movement.

And two, because not a lot of people choose to know or familiarize themselves with this subject matter. It’s very easy for someone who is both nervous and afraid to unintentionally read what Mr. Malcolm wrote in bad faith and draw their own conspiratorial conclusions from it due to all of these factors, and that type of paranoia can be infectious.
People are right to be skeptical. People are right to be anxious. But people also need to think rationally and critically.

I believe comprehensive sex education that covers sexual practices ought to be considered in tackling the issue of self-produced SEM, namely by helping teens understand the risks and harms associated with taking and sharing those images.


I thought, when I read this, that ‘antis’ were going to pick up on it. Which is strange when you think about it.

If you had used the word “illegal” instead of “unlawful”, it wouldn’t have seemed quite so injudicious. Which is an irrational point, perhaps, as both words are pretty much directly synonymous: “illegal” = against the law, “unlawful” = against the law.

Yet somehow our brains (well, mine anyway) interpret “unlawful” as being much more ambiguous than “illegal”. Almost as though it were a euphemism.

Of course, the other point, that it implies “sexual images of minors” could be “lawful” (or “legal”), is going to be made regardless, but this brings up another point.

If an image is illegal, then this means the image itself is not permitted to exist. Consequently, any such image should be destroyed the moment it is discovered. The point is that images (e.g. CSAM) can never be illegal in themselves; only the possession, creation, or downloading (“making”) of them can be, and only in some circumstances. If that were not the case, then how would anyone prosecute?