Something I’ve been thinking about a lot more than usual lately is whether there is a palpable, empirically observable and measurable detrimental effect that forms of simulated child pornography (3DCG, lolicon/shotacon hentai, text-based stories) have on the market for CP/CSAM.
In Ashcroft v. Free Speech Coalition, it was successfully argued that such materials wouldn't contribute to the market for CSAM, since the appeal of virtual child pornography, as well as its now-established legality and accessibility, would divert demand for materials produced using real children away from that exploitative market, which thrives on abuse. In this sense, the argument went, such materials would actually protect children. Supporters of legalization have also argued that such materials serve to scratch the itch, satisfying the needs of pedophilic individuals who need an outlet for their sexual impulses without the risk of actual abuse or the consumption of actual CSAM.
(I’d be remiss if I didn’t admit that I’m a supporter of legalization.)
Proponents of censorship/criminalization have argued that such materials have no such effects, that they instead exacerbate the market and demand for actual abuse material, and that they are commonly traded alongside it, since fictional material acts as a catalyst for pedophiles to gather and share abuse material among themselves.
Proponents of censorship/criminalization will also argue that fictional material acts as a sort of 'stepping stone' to real abuse material, and will try to fold this argument into the previous one. They claim that such materials eventually lose their appeal to pedophilic consumers, who then want something more 'real'. This argument is, in essence, an example of the content progression hypothesis, an invocation of the slippery slope fallacy.
I find the claims made by proponents of censorship hard to believe, since there is very little evidence to support them.
From what I’ve seen, literally ALL of the communities and websites which focus on simulated/virtual child pornography, whether they are mere ‘lolicon’ forums or pedophilic communities, seem more than content with fictional material rather than embracing actual abuse material, and these communities openly embrace and enforce that line.
Websites such as Gelbooru maintain an active NCMEC contact, and others that aren’t even based in the US go out of their way to report CSAM or CSA-related offenses to US authorities should they become aware of them.
This also applies to communities that allow ‘pro-contact’ idealists to promulgate and debate their views: they still draw the line at sharing CSAM or discussing ways to acquire it or commit acts of CSA.
I have no doubt in my mind that loli/shota or 3DCG materials have shown up alongside actual CSAM in some instances, but I have an extremely hard time believing that such materials are as ubiquitous among this underground market for CSAM as censorship proponents claim.
There have been busts on CSAM trading websites/communities that have expressly disallowed the sharing of fictional material, and such requirements have been emphasized by prosecutors and investigators.
Not to mention that the overwhelming majority of CSAM offenders possess no fictional material.
Another argument worth considering is the ‘community argument’, whereby proponents argue simulated/virtual pedophilic material should be banned because it serves as a social catalyst for pedophilic individuals to congregate and share CSAM.
This argument is poorly substantiated, illogical, and simply wrong. There is zero evidence to back up the claim that such materials and communities are misused by those with criminal intentions to engage in criminal activities.
Those familiar with Gelbooru might be quick to point out that ‘toddlercon’ material was allowed at one point but was removed and disallowed, not because of opposition to the content itself, but because the administration noticed various types of unsavory conversation being posted in the comment sections of these posts, as well as users requesting or posting links to actual CSAM.
A little bit of research will reveal that Gelbooru’s incident is an outlier, and was largely a consequence of their refusal to properly moderate that section of their community; such content was hidden by default. Their decision to remove the content from the site, rather than reconsider their moderation practices, was poorly reasoned. Other imageboards and communities, such as Danbooru, still allow it, and such unsavory or criminal communication is non-existent there.
Pixiv, the popular Japanese art platform, also allows such content, and a cursory glance at the comments on such posts reveals a lack of unsavory or criminal comments; where they do exist, they’re promptly removed thanks to user reports.
Moreover, according to various users, Gelbooru still inadvertently hosts such content; it’s just not tagged as such.
Even considering Gelbooru’s incident as a talking point, such an occurrence is not representative of the bigger picture. It was one platform out of a dozen, with a small but persistent minority of its users misusing the platform (creating new accounts after being banned) to engage in criminal discussion, rallying around a very specific, niche topic that the site administrators had deliberately tried to make as inaccessible to the average user as possible. Gelbooru’s response was to simply remove such content from their platform.
Taking this into consideration, the community argument still doesn’t hold up: it can be deduced that the majority of such consumers were not viewing the content with criminal intentions, and just about any service, or part of one, can be misused. Whether prohibition is justified hinges on whether there exists an inexorably high, if not intrinsic, degree of risk of subsequent criminal activity, not on whether there is some risk. I believe the facts I’ve presented show that such a risk of misuse is not present.
As for whether such fictional materials may act as a ‘stepping stone’ towards actual CSAM consumption, this claim has very little evidence to support it. The claim that pornography acts as a ‘gateway drug’ to actual criminal activity or materials has been regurgitated throughout history, with very little (if any) evidence to back it up.
While some studies have documented pedophilic individuals who abstain from indulging in paraphilic fantasies or impulses out of fear that doing so would ‘exacerbate’ their risk of CSA perpetration, such individuals are consistently a very small share of the samples studied, if present at all.
In this study, they say this:
Masturbating to child fantasies
Masturbation to fantasies of children was a hotly debated area among forum users. We coded 39 extracts relevant to this subtheme, including posts by 29 unique individuals. Some users suggested that masturbation to fantasy helped with sexual urges, whereas others argued it intensified their attractions. Forum users exclusively interested in children were likely to endorse masturbation to fantasies involving children as an effective strategy to manage their interests in the absence of other outlets. Some users advised using masturbation prior to interaction with children to relieve sexual tension. Others, however, stated they avoided it completely.
A number of users were concerned that masturbating to fantasies of children would reinforce that behavior, making their attractions more intense. This appeared to be a minority view (seven extracts), although others had mixed feelings or acknowledged the possibility that masturbation to child fantasies would reinforce their interests. One user described using this mechanism of reinforcement to try to develop greater sexual interest in adults. The majority of users appeared to be of the opinion that masturbation to child fantasy was either harmless or decreased tension or arousal that, unchecked, might lead to problematic situations. Approximately two thirds of extracts contributing to this theme reflected this view, although some differentiated between fantasies involving known versus unknown children.
As well as this:
Potentially Maladaptive Strategies
A broader question is whether using pornographic material that appears to function as a proxy for indecent images of children and masturbating to pedohebephilic fantasies are strategies that influence the likelihood of offending. Meta-analysis suggests that approximately half of individuals identified as having used indecent images of children have also committed contact sexual offenses (Seto et al., 2010). For the purposes of diagnosing paraphilias, neither Blanchard (2010) nor Seto (2010) distinguished between pornographic materials depicting real and fictitious children. If this real/fictitious distinction is trivial in terms of risk of contact offending, using some of the legal outlets discussed by forum users may be a risky strategy. However, given the lack of empirical research on this question, the opposite may be true, whereby seeking out legal forms of pornographic material, even where they are not quite a perfect fit to the individual’s sexual interests, may reflect protective factors that function to reduce the possibility of offending.
There appears to be a degree of fear or anxiety over this, but such intensification/exacerbation concerns seem overstated, given that pornography, even ‘problematic’ pornography, has not been shown to have a causal effect on offending, even among users predisposed to CSA perpetration.
Studies have found that CSA offenders and CSAM offenders are meaningfully different, with CSAM offenders typically being at low risk of CSA perpetration, and that roughly half of those predisposed to CSA perpetration do not even have a primary or secondary pedophilic sexual interest (let alone pedophilic disorder). Moreover, studies have found that the overlap represented by mixed offenders (MOs) can be partly explained by CSA offenders transitioning to exclusive CSAM offending.
There are a lot of questions that need to be asked with regard to these associations, but the fact of the matter is that a causal connection between CSA perpetration and pornography consumption has NOT been found.
Anyway.
The reason why I’m making this thread is to share my thoughts on the preventative value of virtual child pornography with respect to the markets for CSAM. CSAM itself, even if it might keep some of its consumers from directly harming real children, is NOT a valid solution to the issue of CSA perpetration. An actual child has to have been sexually exploited or abused for that material to exist, and consuming it sustains the market for such material; we, as a society, owe it to our children to stamp out a market that thrives on abuse.
But like I said, virtual child pornography serves as a valid, safe, and legal alternative to actual abuse material. As the SCOTUS correctly reasoned, it does not contribute to the market for abuse material: the intrinsic attributes of virtual child pornography, as well as its legality, both motivate the possession, creation, and consumption of material not made with actual children, and place it in a very different position within the market.
The legality of virtual/simulated child pornography is interesting, as it effectively creates a space for that market to flourish outside of the underground CSAM trade, even though a market for virtual/simulated material existed well before its legality was settled.
This is, like I said, due to the intrinsic attributes of virtual child pornography, as such materials are not contingent on the existence of an actual minor. Artists can create such materials as fast as they can draw/model/rig/render/animate them, and such mediums may even have qualities that make them preferable to actual CSAM for pedophilic individuals.
Allowing such virtual materials to exist in a space where they can be safely and freely created, distributed, and consumed, without the risk of legal retribution, has amplified these effects, as well as opened people’s eyes to the harms that CSAM causes.
So, in drawing parallels to the drug market, it seems that the legality of virtual child pornography, in and of itself, may have a greater influence on its benefits than most would consider. If simulated/virtual child pornography carried the same degree of legal risk as actual CSAM, then, outside of empathy for child victims or its intrinsic appeal, why would pedophilic consumers choose it?
It’s sort of like how cannabis was considered a ‘gateway drug’. Studies in the mid-to-late 20th century on marijuana, its effects, and its users noted associations between cannabis markets and consumption and ‘harder’ drugs, and used this association as a springboard to support a literal slippery slope fallacy. What these early studies failed to consider was whether these associations were organic, or what may have influenced them.
Drug dealers who sold hard drugs also dealt cannabis, a fact noted in busts on both dealers and consumers. Nobody at the time thought to consider whether the legal environment played a role in these associations: dealers in hard drugs were the ones commonly peddling cannabis precisely because both had to be bought on the same illicit market. It raised the question: if cannabis were legal, would the ‘gateway drug’ argument have any merit?
Later research would see that question answered.
(THIS IS NOT TO COMPARE PORNOGRAPHY TO SUBSTANCE USE! IT WAS MERELY A COMPARISON TO SHOW HOW LEGAL ENVIRONMENTS SHAPE OR INFLUENCE EMPIRICAL/SOCIAL/LEGAL ASSOCIATIONS! PORNOGRAPHY DOES NOT AFFECT THE BRAIN SIMILARLY TO DRUGS!)
The same argument exists for virtual child pornography. If the law treats VCP as though it were actual CP, the implications are far more drastic and far-reaching than the creation of an unjustified thoughtcrime, which is already egregious enough on its own. Under this treatment, the law may actually run the risk of EXACERBATING the market for CSAM consumption.
There’s no beating around the bush when it comes to pedophilia and the impulses of those who are pedophilic. It exists, and it will continue to exist. Denying pedophilic individuals a safe, legal outlet for these impulses will only cause more harm, both to children and to those who pose no threat to them.