A Theory And A Suggestion

I was not quite sure where to put this topic-wise, but I think Prostasia and Virped are the best places (staff, feel free to move it). I am mainly writing this for muggles, or adult-attracted people, so Virpeds and MAPs please bear with me a bit.

I was talking with my therapist this month about a theory I have, and I think some of it at least holds water.

Among every pro-contact pedophile I have spoken with or heard from, you see a few very common themes, and for some time I have been asking myself what the root cause might be: why would someone start to justify the idea that children can be sexual? I have also heard a lot from people who used to be pro-contact.

I wonder if the root reason is related to the amount of stigma levied at minor attracted people, in a way that becomes a self-reinforcing cycle. A lot of people, myself included, are wary of much of society because of how society treats us through its laws, attitudes, and so on. I probably do not need to go into too much detail with examples for some, but hear me out.

You belong to a stigmatized minority, and the news constantly uses basically the only concise word you really have to identify yourself to demonize, otherize, hate, and ostracize you. The only visual outlet you have is illegal and difficult to find, and beyond that you have almost equally hard-to-find fictional material. All of society tells you it is wrong to do this, wrong to do that. Is it any wonder that some come to resent that so deeply that they rebel against those rules and push for change?

I think at least part of the solution is to have ethical materials - fictional or non-fictional - that are more easily available. Even with all my connections, I have a difficult time finding places I am comfortable with for a few reasons:

  1. Internet safety: Does it use reputable scripts, are there popups, does it work with the privacy add-ons I have installed?
  2. Ethical: Do I know that this site has nothing illegal going on with images of real children? Is there any way I can verify that?
  3. Searching safely: Given those two, how do I search for fictional material in a way that is safe? How do I make sure the site I am visiting is real?

I think Prostasia’s certification mark is an excellent step in this regard, but I think it would be wise to brainstorm some way to make it easier for artists or even organizations to vet their status as ethical, and to offer material that is both satisfying and ethical. Fictional images, for some, run into the problem that they do not look realistic enough, and real images run into the ethical/legal problems of potentially being connected to illegal operations or of not knowing whether the image was obtained and distributed ethically.

I will not go too much into details, but suffice to say that minor attracted people are presently operating in a rather grey area. If one is not comfortable looking for images, or not comfortable asking people where good sources are, that leaves fantasizing about real children, which generally makes minor attracted people MORE uncomfortable for a variety of reasons (myself included).

I know some minor attracted people might counter me on some points - image searches, Instagram, photo sharing sites - but my main point is that muggles have an easier time obtaining pornographic material than minor attracted people do. I think that needs to change, and changing it can go a long way both in preventing image-based offenses and in not alienating minor attracted people from society in a way that pushes them towards pro-contact ideology.

1 Like

This might be opinionated… but many people probably disdain pedophiles because the very idea runs contrary to child protection and the concept that “children are innocent”, right? Sure, everyone deserves a fair shake, but it comes into question when one acts as a danger not only to themselves, but to many others.

Sorry, when I posted I presumed people were aware of and thinking rationally around minor attraction, not under the assumption that everyone with an attraction goes around raping people or wanting to. If you need a refresher, this podcast is a good place to learn, but maybe next time you could treat people with respect.

1 Like

Sorry if you felt I was being disrespectful. I can’t help but get overly emotional over this controversial topic, but I do not intend to offend others, and I apologize again if it came across that way. It would be beneficial to speak to these people in a civil discussion, allow them to build on their good traits, and help them understand and work on their less savory ones. If you have any ideas, I’d like to hear them.

Are you suggesting a site on which each submitted item is checked by a human, approved, and made available for viewing or for sale?

If so, I believe that’s a very good idea.

It would just have to be run by someone in a (sensible) country where such things were legal.

1 Like

Dontomiton, I sent you a message with some resources to check out, or perhaps a subject for another topic.

Sheila, essentially, yes. I am also suggesting that focusing on ways to reduce image offenses - the majority of which are perpetrated by minor attracted people - such as providing ethical outlets, can reduce sexual offenses and also reduce stigma by making the public more aware of the needs that minor attracted people have.

As I said, I think stigma and pro-contact ideology are related, so reducing stigma can also reduce the number of minor attracted people who fall for such an ideology. Pushing the idea that minor attracted people currently do not have decent places to go for ethical pornographic material might be controversial, but for many rational people, their reaction may very well be, “Well, no wonder they turn to illegal images! We need to fix this!”

2 Likes

I’m co-founder of Virtuous Pedophiles (virped.org), and became aware of this discussion forum through TNF_13’s cross-posting of his idea to VP. As I said in a much longer reply over there, I have my doubts that such materials will affect pro-contact people very much. But I do think that identifying legal and ethical materials is important in itself for the fulfillment of all of us.

With that in mind, I am very interested in what discussions have gone on in this community to formulate what is ethical and what is not, and perhaps participating in them. I myself have blogged quite a bit on the problem of CSEM/child pornography (http://celibatepedos.blogspot.com/2017/03/index-child-pornography.html).

Maybe people have old threads to recommend.

Welcome, Ethan! Part of the problem is that when cartoon imagery and the like are banned and lumped in together with real images of child abuse, they don’t disappear from the Internet altogether. Rather, they are just forced into its darkest corners. It is in these places that people who may have started off with a virtuous disposition will be exposed to unchallenged pro-contact ideology.

The response needs to be twofold: first, drawing a clear line between harmful material and merely explicit but victimless material. This is what we are trying to do with our No Children Harmed certification.

Second, we need staunchly anti-contact voices to colonize the “dark corners” where pro-contact ideology festers. This has already been largely successful on Twitter, where pro-contact views are quickly challenged, and anti-contact views are strongly and effectively reinforced. But on the censorship-resistant dark web (a.k.a. the Tor network), it remains largely uncharted territory. The Priotab project at Sweden’s Karolinska Institute attempted to reach out to dark web forums to establish a foothold there, but it has been met with distrust. We at Prostasia have our own plan to extend support resources to the Tor network, and putting this very forum online there will be our first step. Meanwhile we are also challenging laws and policies that would banish lawful, victimless content to this lawless part of the Internet.

3 Likes

Ethan, I am not sure I was making myself understood. My theory is that lessening the stigma keeps people from subscribing to pro-contact ideas in the first place; it is not about directly influencing people who are already pro-contact.

I appreciate your assistance. I would know about preconceived stigma, considering our previous conversation. The first part of any battle is calm conciseness, along with context and conscience. You can’t solve a problem if you are unconsciously adding to it (this most definitely applies to myself).

1 Like

Those both sound like worthy projects and I hope you make good progress with them.

Cases that interest me are those where the images are offered with no intention of being arousing, but in fact pedophiles do find them arousing (or alluring, or cute…).

Examples are family blogs, or recordings of gymnastics meets. I have felt that as long as the people don’t find out, there certainly is no ethical problem.

Even if they do find out, it is an interesting question to me. Earlier this year YouTube came under attack as people noted that their algorithms were very good at identifying pedophiles’ interests and showing them video after video of girls doing innocent things, though things pedophiles find interesting. The high hit counts were one strong indication of this, and even more was people leaving lewd comments or ones highlighting the exact time when (say) panties were visible. There was outrage, and YouTube disabled most comments and changed their algorithms to not link such videos to each other any more (or so it seems to me).

That was predictable politics, but it doesn’t deal with the ethics. One take is that finding out that pedophiles found their innocent videos alluring is a genuine grievance and pedophiles are obliged to make sure that doesn’t happen. Another is that that reaction is one of prejudice on their part. For a parallel, existing patrons at some formerly all-white organization might be distressed when black people show up, but this does not make their showing up immoral because that distress reaction is due to unjust prejudice. You could argue that society’s horror at pedophiles finding the videos interesting was pure prejudice.

A gray area is pre-teen girls with channels where they have discovered that videos that show off their (clothed) bodies get a lot of hits and they deliberately make more to get more hits. You could argue that it’s immoral for pedophiles to watch those because the girls might later regret their earlier choices, or you could argue that it’s up to the parents to supervise, or that whatever later distress the girls suffer is a mild case of learning how the world works.

To me these cases feel quite different ethically from actual child pornography – and even from the intermediate “erotic posing” sites.

Anyway, I’m interested in whether others have opinions on these questions or whether they have been discussed elsewhere.

(Is there a “preview” button? And just how picky is the quote format that my quote of terminus didn’t work?)

1 Like

Yes, there’s a preview on the right-hand side of the composition pane. But it sometimes gets covered up by a “This post looks similar to…” box, that you have to cancel so that you can see the preview. I fixed up the quote for you.

From Prostasia’s perspective, our main concern arises when moral panic impedes us from making rational choices around abuse prevention, for example by causing us to over-invest in ineffective approaches like banning cartoons or dolls, and to under-invest in things like prevention research.

Beyond that, what one considers to be moral or immoral is a personal matter, and it’s better to leave it out of the equation as much as possible, in order that we can be laser-focused on the more important question of preventing harm, rather than preventing people from having immoral thoughts (ie. thoughtcrime).

To address your example, you can believe that someone privately “getting off” on clothed yoga videos is morally abhorrent, or you can believe that it’s morally neutral, but it’s folly to make that be the deciding factor when considering what should be done about clothed yoga videos—because if we’ve learned anything about sexuality, literally anything can get somebody off, and somebody else will think that it’s immoral for them to do so.

More important factors in deciding what to do about borderline content are: what harms do these videos create, and how could we address or prevent those harms? One of the draft principles that we have developed to guide Internet platforms in making these decisions is context:

Note the phrase “or to promote it.” Google may have made a problem worse by allowing its algorithms to be used to create a new catalogue of soft-exploitative material that didn’t exist before outside of sketchy chan forums. Even though we (probably) shouldn’t be censoring this material, there are also good reasons why major platforms shouldn’t be collecting it together and highlighting it in an inappropriate context.

1 Like

In theory, omission sounds right, but when policymakers are considering whether to accept recommendations, their personal emotional responses are a big part of it, including their sense of morality, so “as much as possible” may not extend that far all the time.

Also, in a group like Virtuous Pedophiles, individual members are sometimes trying to figure out what things they should feel OK looking at and what things they shouldn’t, so when people share their own personal morality it helps others see if it applies to them. But perhaps you’re saying that is outside the scope of what Prostasia worries about.

I am ignorant of the whole history of Prostasia and the cases that led it to formulate policies, so I might benefit from some education. But this one is interesting. No celebrity or politician would really want to consent to having their image or quotes used in a hostile context, but only in a sympathetic one. Of course they have chosen the limelight so the rules may be different – though those same rules might apply to child actors or models or contestants in competitions. The negative versus positive is also interesting… You seem to presume lack of consent until consent is proven. Civil liberties tends to the opposite presumption.

Who is viewing the collection also seems important. If it’s just me, I don’t figure I need consent for what pictures I put in the same directory. Probably not if I just share it with 10 friends. If it’s publicly available, then those issues come into play more.

I don’t think it’s reasonable to blame them initially for things that their algorithms create and thereby promote – what’s been created is only apparent when people see the resulting pattern. The same would go for racist or harassing videos feeding on each other based on people’s interests. Any remedy has to come after the fact. Their only choices are to suppress or to not suppress.

I’m not convinced that there was justification for their suppressing what their algorithm created in this case, other than to dampen public outrage. The outrage was really about the thought crime of finding children sexually attractive. I’d need to learn more about how something earns your designation “soft exploitation”.

I took it to be a shorthand for, “What makes you think that a child would want to engage in sexual activity with an adult, while also being informed enough to understand what it means?” It doesn’t mean kids don’t masturbate, or play doctor with each other, or have curiosity about sex. If you’re referring to kids who have started puberty (say 12 or older) then they might be interested and informed, but there will be misunderstanding and harm far more often than a healthy relationship when it’s with a much older partner.

TNF13, that is impossible to know, for the very reason that people are oppressed. It is pushed to the fringes, into dodgier and dodgier places, and it is unregulated by its very nature so as to avoid being taken down.

There is no real cooperation between sites. There might be a few here and there whose staff talk, but they are often quite ignorant of what happens on each other’s sites.

It is also impossible to know whether anyone has exchanged links or images privately without invading people’s privacy, which in theory works, but creates a hostile environment (as can be seen on one large site) that just pushes people to more extreme sites anyway, where they can unwind with more like-minded individuals. Complete failure.

There are a few important things which could be done. A site shouldn’t be held liable if something happens without its awareness. The societal obsession with completely stamping out CP defeats that.

An accredited site of sorts should be able to use tools that automatically detect and block problematic content (hash matching seems to be the popular approach mentioned here) without itself being held liable for the presence of the content that gets blocked.
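To illustrate, here is a minimal, purely hypothetical sketch of what hash-based blocking could look like; the hash-list variable and function names are made up, and production systems (PhotoDNA, for example) use perceptual rather than cryptographic hashes so that resized or re-encoded copies still match:

```python
import hashlib

# Hypothetical hash list, supplied by a trusted hash-sharing programme.
KNOWN_BAD_HASHES = set()

def is_blocked(image_bytes: bytes) -> bool:
    """Return True if the upload's hash appears on the shared block list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

def handle_upload(image_bytes: bytes) -> str:
    """Reject matching uploads before they are ever published."""
    return "rejected" if is_blocked(image_bytes) else "accepted"
```

The point is only that matching and blocking can happen automatically, before anyone at the site ever sees the content, which is what would make a liability shield for accredited sites workable.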

This takes a certain level of legitimacy which in itself means not trying to stamp these sorts of sites out like annoying rodents eating away at your food.

I don’t know if that is possible. Society actively rejects it, with the exception of Japan, and you would still end up relying on a site’s own claim that it doesn’t have any dirty business going on. I have a few personal heuristics that I follow, but they tend to vary from site to site.

For instance, the more secretive a site is and the more it hides sections, the more I think that there is something funny going on. Transparency is very important, although it is sometimes important to hide 3D sections as some search engines immediately jump to conclusions they shouldn’t.

The whole idea of accruing “reputation” or posting a certain number of “images” feels very wrong, although it isn’t impossible that it could be legitimate. I just wouldn’t take the risk on such a site.

I won’t name names in particular on here, as that would bring them negative attention, but most tend to lack illicit content. That is not to say that I haven’t run into it. Prostasia is aware of a fairly safe-ish one, as they seem to keep linking to it, perhaps unintentionally. Someone has been arrested for having a big mouth on it however, so be very quiet and just acquire images.

There was another problem after I dug around a bit more. Some people linked timestamps of specific moments when the children were in compromising positions, and some took screenshots of those moments to save on their computers.

The video itself may not have been considered pornographic, but those moments may have been, taken out of the surrounding context of the video.

There are also allegations from the media that they uploaded unlisted videos to YouTube to share with each other, although it is unknown what they contained. The media did not like it, however. Unlisted videos are videos whose settings are set so they don’t show up in search results and can only be accessed by someone who knows the direct URL.

This is a purely informative post and not intended to make a point.

It is very possible to save screen shots and also to download entire videos. I don’t see what that has to do with anything.

No one claimed the videos were illegal and pornographic, but they were against YouTube’s community standards. You can consider it poor taste to make comments like that, or the more obvious, “Wow that’s so hot!” And the solution to that was to disable comments on most videos of this type. But the decision to disable the “show me similar videos” algorithm in this case was separate.

There is a basic question of values here. If you as a girl post an innocent video where your underwear shows occasionally, is it a violation if someone makes a private copy and focuses on those moments? I say no… I say that’s like defining a thought crime.

I’m not so sure about reposting it publicly either. Why not laugh it off, as long as the source video had no hint of exploitation? Perhaps someone with an elbow fetish will focus on the moments when your elbows show, or someone gets off on your British accent. (I’m not sure of this and am open to counter-arguments).

Unlisted videos are a great way of sharing ANY videos among a small group – for instance, pictures of a class event that you DON’T want pedophiles finding by search. It doesn’t seem fair to assume the videos being reposted are problematic just because it’s pedophiles who are doing it.

How do you ethically analyze internet content without breaking human rights laws? An automated checking system, perhaps? In theory, it could have its practical purposes, but it might contain its own fatal flaws. I’m honestly completely unsure of what to do.

You make rules. You enforce those rules. You let everyone know what the rules are. On the backend, you have systems designed to help enforce those rules, such as algorithms that detect when a picture is suspected to be of a real child, and flag it for human review. You operate in a jurisdiction that cannot come after you as long as you follow those rules.
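As a purely illustrative sketch (not an actual Prostasia or platform implementation, and with every name below made up), the “detect, then flag for human review” flow could be as simple as:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Upload:
    upload_id: str
    image_bytes: bytes

@dataclass
class ReviewQueue:
    """Uploads held back from publication until a human moderator decides."""
    items: List[Upload] = field(default_factory=list)

    def flag(self, upload: Upload) -> None:
        self.items.append(upload)

def moderate(upload: Upload,
             suspicion_score: Callable[[bytes], float],
             queue: ReviewQueue,
             threshold: float = 0.5) -> str:
    """Publish low-risk uploads; route suspected real-child imagery to humans."""
    if suspicion_score(upload.image_bytes) >= threshold:
        queue.flag(upload)
        return "held for human review"
    return "published"
```

The detection model itself is a stand-in here; the structural point is simply that enforcement of the rules ends with a human decision rather than an automatic ban.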

Prostasia might have other ideas, but those are the ones already outlined.

I do need to point something important out. The Overton window is shrinking, which means that as time goes by, these ideas will become less and less politically viable.

For instance, not that long ago, sharing the idea that relationships with teenagers could be viable (as was not too uncommon several decades ago) was not the end of the world, but now platforms are working very hard to purge this speech off the web. They are also purging lolicon, anything which could appeal to pedophiles, etc.

Ultimately, this shows a shift on society’s part as a whole which perhaps has not been visible in a very long time. More classes of content are also being treated as “child pornography”, with draconian penalties as a result, and politicians respond by pointing to how much illicit content there is after conjuring more of it out of thin air by widening the definitions. Not to mention, some of it is likely stupid things like sexting, but algorithms can never tell the difference.

They are also putting pressure on other countries, such as Japan, where simple possession of real child pornography was legal until recently, and they continue to push to get these definitions widened.

Realistically speaking, and Prostasia will not like this, completely stopping people from doing undesirable things is impossible; attempting it would necessitate measures such as psychological profiling and even genocide, which may well constitute crimes against humanity. Extremely low-level content also reduces the likelihood of crime and psychological suffering, and public conceptions / cultural norms may be a factor in how someone feels, as well as whether such content pops up in really obvious places (e.g. searching someone’s name and having it appear in a very controversial context).

Finally, spreading information to identify pedophiles, as some are doing to combat crime (e.g. TNF13’s own actions), even if well-intentioned, will likely be used to discriminate against innocent and lonely or depressed people who are unlikely to ever commit a crime. Many real abusers are already known to the authorities, but for one reason or another the police opt not to pursue a report, or authorities in a school may not escalate it. Family members and friends also tend to cover for each other.

That is illegal and a violation of child pornography legislation in many countries.

That is one of the reasons that the media got mad, aside from the comments.
The biggest reason it became such a big deal, however, is likely SESTA / FOSTA.

As an additional note, one thing mentioned by pro-contact people (in a way as a sort of whataboutism when someone criticizes them) is society’s very lax attitude towards childhood neglect, which is extremely damaging. This should be tackled.