Is it harmful to look at CP?

The focus is the act of looking at child pornography. Is it harmful?

I will start by forwarding a post from Ethan Edwards’ blog. Ethan is one of the founders of Virtuous Pedophiles.

Discuss: Looking at CP is not a victimless crime (2016) [Link]

I started debating this question privately with someone recently, and thought I’d address it publicly here.

I first thought about the phrase “victimless crime”. The <first article I looked at on the subject> convinced me that it’s not really a useful concept. Sometimes we are concerned with an uncertain risk of creating a victim (say drunk driving), and other times the victim is the taxpayer. The phrase implies the law is unjust. Some laws are unjust, but it’s better to discuss them on their own merits of harm, risk to others, what freedoms they curtail, and the costs of enforcement and investigation, etc.

I have argued <earlier> that laws against simple possession of CP are an unacceptable assault on civil liberties. Any benefit is not worth giving up the right of people to be left alone, reading and looking at whatever they wish without government interference. In this, I agree with the position of the ACLU. But others do not, and think it is fitting to punish people for their private activities if they can trace harm to others.

Now suppose we (temporarily) turn away from what is legal and what isn’t and address only what is right and wrong. This simplifies things enormously. We no longer deal with investigation, discovery, and determining guilt beyond a reasonable doubt. We don’t have to deal with the stigma that comes from an investigation of innocent people.

Now we can just address what is right and wrong, given any particular situation.

So, what is the harm in watching CP? I have dealt with this topic before. What are possible sources of harm?

One occurs if the person viewing the CP has asked a producer to produce some new CP for him. That is morally egregious.

A victim can be harmed if police seize a person’s computer, discover the CP, and notify the victim that such a video has been found. I think this harm can be laid squarely at the feet of society for having passed the laws that allow or require this.

A victim can be harmed if the downloader finds a way to contact the victim and does so.

Another source of harm (not to the specific person in the CP) would occur if the person viewing the CP was as a result more likely to abuse a child. Research suggests at most a very small effect and the possibility that it actually <protects against hands-on child abuse instead of increasing it>. What’s more, this can be evaluated by an individual himself. If he is certain that he is not going to abuse a child, then it does not affect him. If he has previously abused a child, we and society might see this very differently.

There is harm if the downloader pays for it, and the money makes its way back to the producer, encouraging him to make more.

Another is if he indicates his approval by writing a message of encouragement. And another occurs if a person decides to host the CP or pass it along to others.

But the typical CP downloader does none of these things. All he does is download a file and look at it.

The path of direct harm that is asserted is that the person in the CP is distressed to know that others are out there looking at a record of her abuse for sexual satisfaction. How can the victim come to know this? Several of the ways above could lead to such knowledge, but the moral responsibility rests on the individual people who do those things.

The remaining path of harm is that the download will register in some fashion in web statistics. I can imagine that a victim would be more distressed to know that 1,000 people have downloaded the CP compared to 10. Yet of course 10 downloads from a given site means nothing, since the material could appear anywhere due to reposting and forwarding. This would be very hard to trace. It is safe to assume that the average person downloading CP (as opposed to part of some semi-private network) is going to be seeing files that are at large on the internet and have been seen 1,000 or more times. The incremental harm is extremely small.

In general, we only pass laws to protect people against significant harm. Laughing at someone’s unusual appearance or clothing is quite likely to do harm in making them feel bad. But we don’t make a law against it. The state doesn’t get involved with that level of harm.

“Looking at CP is not a victimless crime” implies not only harm, but sufficient harm to justify it being a crime. It doesn’t qualify. Why would someone think it does? Because they have this gut-level conviction that a pedophile fantasizing about children sexually is in and of itself a horrible thing, and they cling to any justification for making it a crime, however slim or far-fetched.

I have so far been focusing on the minimum morality that we in a diverse society share, which assumes freedom unless we can find harm to others. This is the only morality that should guide the law.

Yet more extensive and rigorous moral codes guide the actions of most of us. This is a good thing.

I suspect that the vast majority of people find looking at child pornography highly offensive morally. Here is a comment on one of my earlier posts, from a pedophile who does not hate his attractions, is not opposed to private fantasizing about children, and has no problem with looking at virtual child pornography (with no real children in it):

“CP is wrong. Anyone who contributes to the harm of a child is evil. Even if it’s the 1000th person watching abusive video and victim doesn’t know. Even if they had been hurt 999 times, they wouldn’t like to be abused for the 1000th time. Even if they didn’t find out…This is a selfish view. People who care for others and don’t want to do anything evil to anyone and are at least slightly altruistic won’t do that.”

This is a moral judgment shared by a great many people, including a great many pedophiles – and including pedophiles who download CP and hate themselves for it. I suspect I share it at a gut level too.

If we replace the initial statement with, “Watching CP is morally wrong because good people don’t benefit from the suffering of others” then it is in the realm of private individuals arguing for their own morality and trying to persuade others, which is totally appropriate. To be consistent, I think such an argument should extend to not enjoying “fail” videos of other people caught in embarrassing situations, and not enjoying a variety of news stories out of a prurient interest.

But “Looking at CP is not a victimless crime” when unpacked in the legal context relies on the minimum shared morality of harm. This is virtually nonexistent in the typical case, and the statement is wrong.

Summarizing Ethan’s post, harm occurs when:

  • The downloader asks a producer to produce some new CP for him.

  • Police seize a person’s computer, discover the CP, and notify the victim that such a video has been found.

  • The downloader finds a way to contact the victim and does so.

  • The downloader pays for it, and the money makes its way back to the producer, encouraging him to make more.

  • The downloader indicates his approval by writing a message of encouragement.

  • The downloader decides to host the CP or pass it along to others.

A typical CP downloader does none of the above. The remaining path of harm is when the download registers in web statistics. Ethan described a worst-case scenario:

I can imagine that a victim would be more distressed to know that 1,000 people have downloaded the CP compared to 10. Yet of course 10 downloads from a given site means nothing, since the material could appear anywhere due to reposting and forwarding. This would be very hard to trace. It is safe to assume that the average person downloading CP (as opposed to part of some semi-private network) is going to be seeing files that are at large on the internet and have been seen 1,000 or more times. The incremental harm is extremely small.

Not every kid porn downloader’s culpability is the same. Still harmful, but to varying degrees. Downloaders on message boards have different culpability than those who purchase directly from the producer.

Either way, it cannot be permitted. We should mandate by law that all browsers install a filter to remove all known kid porn from view. Maybe also mandate, by some law, a filter for all computers imported into the country so it is no longer possible to store known kid porn. I read about this in a piece on how Kazakhstan was planning to do so 7 years ago, or at least a university there is running some tests first. No country has been able to arrest its way out of this problem.

Quoting Ethan in Why CP possession penalties are unjust – a summary (2019):

You could make a case for penalties if someone pays for child pornography, or perhaps even if they give effusive praise to a maker of child pornography. But passive downloading and viewing has but one effect in the real world – increasing a hit count somewhere. It simply does no harm.

In Hysteria vs Analysis Regarding CP Images Online (2019), Ethan responds to a lead article in the New York Times:

Why [is looking at CP] a minor crime? There is no allegation that all this CP is distributed for the money – we used to hear it was a multi-billion-dollar business. Law enforcement has realized that so little money is changing hands that it’s time to quietly stop talking about it. The other common allegation is that people looking at the material encourage others to make more. The article says, “A private section of the [Love Zone] forum was available only to members who shared imagery of children they abused themselves.” It sounds like a small group of detestable people want more children abused so they can see new material, but these producers do not actively want their material seen by as many people as possible. Viewing by the second group, the passive consumers, does not encourage making more. All a passive access of a CP image does is to increase the number of hits, and it turns out that even those hits aggregated by the thousands don’t encourage more production either.

Which would eventually be abused.

1 Like

Which would eventually be abused.

So does any technology. But there is no way to fix the problem without it.

@Space I want to find alternatives to imprisonment wherever possible. Incarceration is expensive. America uses prisons too much while ignoring other options like mandatory outpatient treatment, fines, community service, and suspended sentences. Especially on the federal level, where they are addicted to imprisonment. Look at Europe: in many countries imprisonment is the exception, not the norm.

But I do not believe increasing the view count does “no harm”. There is a risk the creator of the child porn would see it on a message board, and a risk that this would increase the likelihood he will rape again. The odds that a particular bump in the view count from 8291 to 8292 would do that may be minuscule, but they are higher than 0%. Endangerment is a type of harm. Yes, penalties for purchase should be higher. Also, incitement to molest and rape a child needs to be prosecuted more.

Quoting Ethan in Child sex abuse – recorded or not (2016):

Without doubt the vast majority of child sex abuse is never recorded at all. A tiny proportion is recorded and released into the dark web and seen thousands of times. Michael Seto in <“Internet Sex Offenders”> highlights a middle case – abuse that is recorded but never distributed. He says 3/4 of those who make it don’t distribute it, but just keep it for their private collection. A key argument for making CP viewing illegal is that it fuels a market for the creation of more. If 3/4 of the producers never distribute it at all, surely demand has nothing to do with their activity. It makes clear what a tiny portion of child sex abuse could even possibly be committed by the hope that recordings of the act will be viewed a lot. (The evidence that this tiny portion is actually influenced by viewing statistics is also extremely weak.)

CP producers don’t actively want a large audience. CP is evidence of crime. A big number on a view counter doesn’t encourage more production.

CP producers don’t actively want a large audience. CP is evidence of crime. A number on a view counter aggregated by the thousands wouldn’t encourage more production.

Is that really true though? This isn’t what StopitNowUK believes. Do you have a link to a study or something based on empirical evidence?

Either way, we still need to prevent people from storing these images. It’s appalling that there are people masturbating to a child being raped. Thankfully there is a small but growing movement to use censorship tools like PhotoDNA (or however you pronounce it) to prevent kiddie porn from being viewable. I know a university in Kazakhstan was experimenting with it 7 years ago. I also know an NGO in India that wants to see if it can be used on a hard drive to prevent potential offenders from storing these images.

I was told Prostasia strongly supports these tools, and since it is a prevention organisation, I wouldn’t be surprised. I believe that, as a prevention organization, we should seriously consider laws mandating that all computers sold and all browsers made available for download use a specific kind of filter to make it impossible to view, download, or distribute identified child rape/molestation videos. I have little doubt that Prostasia would support this.

I’m interested in how to prevent this from being downloaded or viewed or shared. No one should be enjoying seeing a child being sexually exploited.

1 Like

Viewing CSAM is harmful to victims, which is why we are pushing to censor this crap. How to deal with CSAM possession-only offenders in 2040 will be irrelevant, because they simply will not exist anymore. By then we will have censorship on every medium; even your printer will check whether you are trying to print out CSAM, and if it detects that you are, it simply will not print it. Will there be CSAM offenders in 2040? Yes, but they will generally be producer offenders. You can’t become a possession-only CSAM offender when it is impossible to access this horrific material.

In the future, AI can be developed to auto-detect CSAM and automatically send the suspected material to the authorities. Your browser, everything, will have AI that is better than many experts at determining whether something meets the legal threshold of CSAM, built up from case law. It will become impossible for you to see not just hashlisted CSAM, but even newly produced CSAM. It will be a great world to live in. Trust me.
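To be concrete about what “hashlisted” filtering means today, here is a minimal sketch of the idea: fingerprint a file and refuse to handle it if the fingerprint appears on a known-bad list supplied by a hotline. The file path and the placeholder list entry are hypothetical, and real deployments such as Microsoft’s PhotoDNA match on perceptual fingerprints that survive resizing and re-encoding rather than the exact cryptographic hash used here for illustration.

    import hashlib
    from pathlib import Path

    # Hypothetical hash list. In a real deployment this would come from a
    # hotline such as NCMEC or the IWF; the single placeholder entry here
    # is purely illustrative.
    KNOWN_BAD_SHA256 = {"0" * 64}

    def sha256_of(path: Path) -> str:
        # Exact cryptographic fingerprint of the file's bytes.
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def is_blocked(path: Path) -> bool:
        # True if the file matches an entry on the known-bad list.
        return sha256_of(path) in KNOWN_BAD_SHA256

    if __name__ == "__main__":
        # Hypothetical usage: a browser or download manager could run this
        # check before saving or displaying a file.
        sample = Path("example_download.jpg")
        if sample.exists():
            print("blocked" if is_blocked(sample) else "allowed")

The obvious limit, and the reason perceptual hashing exists at all, is that changing a single byte of a file changes its cryptographic hash, and even perceptual matching can only catch material that has already been identified and added to the list; catching newly produced material is a much harder problem.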

I know a university in Kazakhstan was experimenting with it 7 years ago. I also know an NGO in India that wants to see if it can be used on a hard drive to prevent potential offenders from storing these images.

Can I see the article? Can’t find it. I did find a link to what I thought would be the article but it’s broken…

I couldn’t find any opinions on CP viewing at stopitnow.org.uk. The idea that CP producers want views perhaps made more sense when CP was an industry. The influence of CP viewing statistics on CP production is not well studied: the laws around CP make it hard to research, and there is little interest in doing so.

You can learn something just by talking to CP viewers. Most CP is old and shared freely. Virtually no one is paying for it. There are networks where new CP is shared, but they are reluctant to let you in. Possessing new CP gives you status. Only a small fraction of CSA is recorded, and only 1/4 of CP producers share their product (Seto, M. Internet Sex Offenders. 2013). Punishment is a strong deterrent.

One point which nobody has brought up yet is:

“What about dumb kids who take naked photos of themselves?” Is that encouraging production?

How did so much CP end up on Tumblr without getting detected? Just a flaw in the scanners? Or new CP?

Senator Folmer was recently convicted for uploading content there, but it took years for them to spot it. It wasn’t erotic posing, or a 17-year-old. It was a very young girl performing fellatio. He was a Republican, not a Democrat, and had nothing to do with Hillary Clinton.

A lot of people try to make erotic posing out to be the most common type of CP, but IWF claims more serious types of CP are. However, it is possible they’re simply prioritizing more serious types, or that it is easier to detect serious CP than it is for an algorithm to tell the difference between legal and illegal nudity.

It is also possible that NCMEC and other hotlines have beaten IWF to taking down less serious content.

If we want to treat old CP and new CP differently, we could create separate offenses for possession of old CP and possession of new CP. Or we could determine this by whether the material was old or new when the individual received it. The same could apply to distribution.

Alternatively, we could create an offense for distributing CP as part of a network, like in Germany. Or this could be treated as a factor in new CP offenses.

New CP could be defined as any piece of CP that has a high probability of having been produced within the last five years. Or would a larger window be better? Ten? Fifteen? Twenty?

By the way, that blog has been terminated, which isn’t surprising given that blog post.

1 Like