Actually… Are you sure that the UK includes images of cartoons within their “CAID” database?
I’m looking at the link you provided, and it doesn’t even seem like they consider cartoons/fiction as “CSAM”, but rather as “prohibited images”, which carry a maximum sentence of 3 years in prison.
@Maza
Are you ABSOLUTELY SURE that fiction is included?
Right. “Prohibited images”, not “child sex abuse images”. This article seems to be more of an overview of the UK’s laws and how they deal with or categorize such things.
Just because they talk about it here doesn’t necessarily mean they view them as abuse imagery, nor does it mean they include such imagery in their hash databases. They even specify that the materials included in their ‘CAID’ seem to be about abuse.
I’d hold off on stating anything of that nature. Keep in mind, there are still very real reasons why they wouldn’t want to include materials that are, by definition, not CSAM, reasons that even the most zealous of prudes would be hard-pressed to argue against.
The people over at ECPAT, who try to argue otherwise, have no valid counterargument, and luckily, their fallacious reading of the facts is not shared by others. Members of the scientific community agree that lumping materials that are not products of abuse into a definition specifically focused on acts of actual abuse only muddies that definition.
I am fairly certain that they do. Also, what’s funny is the following disclaimer on a reporting site:
Warning! Reporting anything other than child sexual abuse imagery wastes valuable charity resources and prevents our analysts from finding and removing more child sexual abuse content online.
It’s the UK. Those images are presumed illegal. Thus, even if I disagree with them being illegal there, it makes sense that they would accept reports. In the US it is different, as these materials are presumed to be protected until proven otherwise in court, which is why NCMEC does not report them unless a real child is somehow involved.
Yes, but it’s hilarious that they warn about not wasting resources so they can protect kids. They did a study and found that 50-80% of reports couldn’t be acted on because it was hentai hosted outside the UK. So they said on Twitter to make sure it’s hosted in the UK before reporting. Not even trying to question their government’s laws.
Can’t help feeling this might largely be down to a nuanced interpretation of the term ‘CSAM’.
In the US it’s: Child - Sexual Abuse Material.
In the UK it’s: Child Sexual Abuse - Material.
So the emphasis in the US is on whether it depicts an actual child, whereas in the UK it’s simply whether the material depicts an act of sexual abuse, even if it’s fictional make-believe.
(Similar, in some ways, to how Americans tend to emphasise the 1st syllable of certain long words while we Brits emphasise the 2nd.)
In either case, whether or not the child in question actually considers themself abused - either at the time or in later life - is, er… “Immaterial”.
It then remains to be seen whether fictional abuse of a non-existent child actually harms anyone, as opposed to actual abuse of a real child, which certainly can harm that child. It is possible that fictional CSAM could increase CSA; I very much doubt that, and the research suggests otherwise, but the jury is still out.
I think that is a wholly separate issue. Whether or not the child feels abused does not excuse breaking the law. Breaking a silly and useless law, such as the ban on fictional CSAM, is more understandable, but it is still the law. Caveat legisruptor.