Social media a ‘conveyor belt’ for child abuse images, says NSPCC

I really don’t like anything the UK does because it’s more or less “How can we use children as a soapbox to get our foot in the door to censor anything else we don’t like further on?”

But anyway…

Social media is being used as a “conveyor belt” to produce and share child abuse images on an “industrial scale”, the NSPCC has said as it revealed more than 100,000 images had been recorded by police in the last five years.

The child protection charity called on Culture Secretary Nadine Dorries to strengthen the Online Safety Bill to disrupt this offending.

It said the draft Bill fails to do enough to protect children, and has published a five-point plan to bolster it.

2 Likes

The thing is, I don’t know exactly what they’re referring to here, since manga is treated as CSAM as well. I can see how loli art is massive on social media, and I have a feeling that this also plays a role here.

You can really inflate CSAM statistics with drawings and cry about how everything is going downhill, but what if a good portion of the new cases are manga? The UK is a weird place.

2 Likes

The reality is that only a tiny fraction of CSAM is shared on social media. Most of the large numbers that Facebook reports relate to multiple reports of just a few pieces of content that are widely shared for humor value or as callouts, not for sexual gratification.

3 Likes

could that be due to the advent of things like PhotoDNA?

1 Like

Yes and because people have learned that social media companies don’t tolerate it.

2 Likes
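For anyone unfamiliar with how tools like PhotoDNA work: PhotoDNA itself is proprietary, but the general technique is perceptual hashing. A minimal sketch, using a simple "average hash" as a stand-in (the real algorithm is far more robust): each image is reduced to a tiny grayscale grid, each cell becomes one bit depending on whether it is above or below the mean, and two images are treated as matches when the Hamming distance between their bit strings falls below a threshold. Everything below (function names, the toy images, the threshold of 3) is illustrative, not PhotoDNA's actual design.

```python
def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # One bit per cell: brighter than the image's own mean, or not.
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Number of bit positions where the two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def is_match(hash_a, hash_b, threshold=3):
    # A small distance means the images are near-duplicates even after
    # resizing, re-compression, or minor edits.
    return hamming(hash_a, hash_b) <= threshold

# Toy 4x4 "images": the second is a slightly brightened copy of the first.
img1 = [[10, 200, 30, 220],
        [15, 210, 25, 215],
        [240, 20, 230, 10],
        [250, 5, 245, 12]]
img2 = [[value + 5 for value in row] for row in img1]

h1, h2 = average_hash(img1), average_hash(img2)
print(is_match(h1, h2))  # True: brightening shifts the mean too, so the hash survives
```

This is why uniform edits (brightness, compression) don't defeat the match: the comparison is always against the image's own mean, so the bit pattern is stable under such changes. Matching against a database of known hashes is then just a distance check, which is how platforms detect re-uploads of previously identified material without storing the images themselves.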

Our pioneering new laws will be the most comprehensive in the world in protecting children online.

Checks CSAM and CSA statistics in the UK

Yeah, I doubt it. It’s funny how countries with less strict rules and more safe outlets have much lower cases.

3 Likes

It’s about projection and control.

While it’s definitely premature to assume, it would not surprise me in the slightest if the various lawmakers, statesmen, and “experts” (or even laypeople) who argue so staunchly in favor of increased regulation and surveillance, in conjunction with prohibiting non-abusive sexual outlets, turned out to be some of the most prolific CSA offenders themselves, rather than acting out of a legitimate concern or belief, a straightforward “appeal to moralism/populism” political rhetoric, or plain uninformed, rash decision-making.

An easy critique I rarely see made is that such restriction and regulation proposals reveal a gaping lack of restraint on the part of statesmen: like the impulsive, proverbial “pedo menace”, they act in kind, but out of fear and panic rather than rationality.

It’s nauseating to see, and it reflects a failure of the democratic process, because the responsibility of the legislator is usurped.

The ground on which the fallacy of projection stands is a sense of personal familiarity and contempt.

1 Like

I’m afraid I take much of the missives from the NSPCC and their ilk with a large pinch of salt. A lot of it is over-estimation, guesswork, and plain rabble-rousing.

I suspect the increase in prosecutions is down to more investigation rather than any significant increase in cases. The vast majority of CSAM trading is not done via social media; it’s done through private communication. The amount of CSAM I have seen on social media (i.e. Facebook, Twitter, and so on) is tiny. It’s also removed promptly when reported.

I also think that most of what is gleefully called “more child porn” by the press actually isn’t, and the “millions of images of children being raped and tortured” is probably the imagination of some journalist trying to make a name for himself.

2 Likes

IMO also, though I would probably caveat that with “[potential] CSA offenders”.

Instead of a genuine reflection upon their own human weaknesses, and in the face of an honest “there but for the grace of God go I” reflection, they would instead advocate for a regime where any inappropriate thoughts regarding children would be so unfeasible it would be unthinkable… literally. But at least trying to achieve this makes them look good in the public eye.

However, given that such a situation (one where it is impossible to abuse a child) is actually even more unfeasible, it doesn’t seem unreasonable to suggest that people with this mindset are sometimes prone to act on their frustrations: the frustration of their own unwelcome thoughts, and the frustration of living in a world where it’s still possible for those thoughts to become actions.

Unfortunately (in this sense) the advent of the internet and social media has meant that the issues of CSA/CSAM and ‘minor attraction’ have become more widely acknowledged by a greater diversity of people, and hence there is a greater number of people who struggle with this dissonance. If this weren’t an unfortunate side-effect of discussing these issues, I’m sure @prostasia would have a much easier time of it.

I feel it would be a much happier and healthier society if it were understood by everyone that we all grapple with unwelcome desires on occasion - often of a sexual nature (though perhaps not always). There may be some exceptions, but I’m more inclined to believe those people would be deluding themselves.

I often imagine a world where anybody could talk to anybody else about anything at all without fear, prejudice or judgement. Not altogether sure if that’s wholly unfeasible, but sadly, I have to accept it’s completely unrealistic.

Interestingly enough, the person who was in charge of the plans to prevent children from seeing pornography on the Internet here in the UK about seven years ago was actually arrested for “indecent images of children.”

2 Likes

And these are the same people who project their sick desires onto those who just want to consume fiction, dolls, fantasy, etc. and not harm anyone.

As the Brits say… “Bloody hell”.

Evil truly is pervasive.

Two points really. First, from the article:

“It was immediately referred to the National Crime Agency (CEOP).”

I always wonder about all the officers and forensic staff that work at CEOP (‘Child Exploitation and Online Protection’ Command). Sure, their protection from prosecution is covered by the “Memorandum of Understanding Between the Crown Prosecution Service (CPS) and the National Police Chiefs’ Council (NPCC) concerning Section 46 Sexual Offences Act 2003”.

However, how does that prevent said officers and staff from becoming inured to the content of the images they deal with, and how can anyone be sure that they don’t get the same thoughts and feelings that the person being prosecuted for those images had?

There are internal mental health therapy/support services in place that supposedly deal with those who “struggle” with their feelings in this regard, but (as with their approach to CSAM in general) their aim is to repair the damage rather than prevent it. So, the moment any technician/officer starts formulating a personal opinion not in keeping with the accepted requirements of their role, shouldn’t they be, at that moment, culpable in the same way as anyone else viewing CSAM?

Anyway, second point:

I was surprised that Patrick Rock was even investigated to be honest; 20 images of 9 girls which weren’t even nude.

No doubt there may have been political (with a small ‘p’) reasons for bringing this to court; anyone who wasn’t a public figure, let alone on the policy committee mentioned, might have got a caution at most. But the IWF, CEOP, the FBI, or any other similar organisation that investigates the internet for CSAM/CSEM would be unlikely to progress an investigation over 20 images of clothed children, whatever the children’s postures in them… 20 thousand, perhaps.

The immediate thought is: “Were these images any different to those that are criticised in ‘Cuties’, still available on Netflix?” At the time, they were compared to the music video of Britney Spears’ ‘Hit Me Baby One More Time’.

“Sasha Wass QC, told jurors they would have to decide whether the images were worth ‘criminalising’ a man of previous good character over.”

In the end, they did.