Child psychiatrist jailed after making pornographic AI deep-fakes of kids

Now if only we could convince anti-AI fanatics to focus their efforts on actually harmful shit like this


What’s interesting is that he was charged with CSAM production and possession, as well as the creation of deepfake material, under child pornography laws, which basically confirms that we don’t need obscenity laws to target this type of direct, real harm.

Keeping criminal prohibitions limited to materials that implicate the rights of actual, real, living children is precisely what SCOTUS had in mind in Ashcroft v. Free Speech Coalition.

By prohibiting child pornography that does not depict an actual child, the statute goes beyond New York v. Ferber, 458 U.S. 747 (1982), which distinguished child pornography from other sexually explicit speech because of the State’s interest in protecting the children exploited by the production process. See id., at 758. As a general rule, pornography can be banned only if obscene, but under Ferber, pornography showing minors can be proscribed whether or not the images are obscene under the definition set forth in Miller v. California, 413 U.S. 15 (1973). Ferber recognized that “[t]he Miller standard, like all general definitions of what may be banned as obscene, does not reflect the State’s particular and more compelling interest in prosecuting those who promote the sexual exploitation of children.” 458 U.S., at 761.

Section 2256(8)(C) prohibits a more common and lower tech means of creating virtual images, known as computer morphing. Rather than creating original images, pornographers can alter innocent pictures of real children so that the children appear to be engaged in sexual activity. Although morphed images may fall within the definition of virtual child pornography, they implicate the interests of real children and are in that sense closer to the images in Ferber. Respondents do not challenge this provision, and we do not consider it.


We all know that it would not have mattered if the images were alterations or fictitious. They would have got his ass either way using obscenity.

Also, AI stuff is still pretty questionable, because how can someone truly prove it is not an actual child? What if it is something like this here on a broad scale? As time goes on, this stuff will get better and thus harder to spot.

That’s not how burden of proof works. The prosecutor should be responsible for proving it is based on an actual child.


I’m surprised by the levelheadedness of the comments in that news article.

Especially since it’s a British news website.


Which is a very trivial task, given how easy it is for training images to be extracted. All you would need to do is query it, and once you find a match, you could do an analysis that would rule out look-alikes and meet the burden of proof.

If these were about targeting images of minors that did not exist (fictional characters), then I would be more worried.


This isn’t necessarily true. All cases brought by the DoJ under Garland involving obscenity seem to be limited to people who already have criminal records/histories relating to contact child sexual abuse or CSAM possession, or to materials that constitute CSAM anyway.

I think it’s safe to say that the DoJ under Biden is not interested in pursuing these types of matters unless it can be somehow linked to a real child, or if the suspect is a known sexual offender.


Still incredibly dangerous that the government even has the option to prosecute you based on the fiction you like, though. Governments have proven time and time again that they can’t be trusted to just “not enforce” harmful laws


Well, that’s sort of the state that pornography in general is in.

But a silver lining is that nothing is obscene unless declared so in a court of law, on a case-by-case, state-by-state basis, meaning the same material can be found obscene in one court and not obscene in another.

There also exists a presumption of innocence. Matters of obscenity are not matters of fact, merely opinion and conjecture, so that presumption goes a long way in more ways than one could imagine.
This is why CSAM is its own category and is not required to be found ‘obscene’. Even pornography, for its own sake, has artistic value, a fact that academics are finally being more vocal about.


This criminal would probably have received the same sentence regardless of whether AI was involved. The criminal had real illegal material of kids; it was not just deepfakes. He also distributed the stuff, correct? I use a negative prompt for AI art to block out illegal content, as well as a RAM disk. With Stable Diffusion 1.5, you can tell that the images look like video game renders. They look fake. I also use “fictional” in my regular prompts to tell the AI to generate a fictional person, even if it is based on an actor.

Stable Diffusion was also only released in 2022, so AI art of that nature probably was not involved here; the criminal committed his crimes in 2021. He still harmed real people regardless.

This article hypes up the AI so much that it’s difficult to tell whether the CSAM mentioned is real or not, almost as if it’s just thrown in as a side note. Either way, I find his 40-year sentence plus 30 years’ probation to be extremely cruel and unusual punishment. The guy may as well have been an actual child abuser to merit such a sentence, and he would likely get less time for murder.
I wish Prostasia would clarify its Twitter/X post on this because it really sounds like they are attacking AI and equating it with real sex abuse.
It may not be a good thing and can definitely be grounds for legal action, but pasting a face on a generated body is not sex abuse.


The underlying rationale is that it’s a real child’s likeness. It may not be as severe as actual child sexual abuse, but it places them in a position that they did not (or could not) otherwise consent to by way of (mis)appropriation of their likeness.

Contrast that with images of fictional/virtual ‘children’ (characters who do not exist): there is some harm and risk to be observed when it’s a real child. Virtual/fictional children are not ‘children’; they are not persons, and cannot be afforded any of the same rights or legal protections that are afforded to actual children.

They are also not conceptualized the same way: these virtual/fictional depictions do not promote the market/demand for materials involving real children, because the demand for them is grounded in a set of preferences and ideas that does not carry over to the real thing. That is why virtual/fictional child pornography sits at the same level of legality as adult pornography.


Prostasia’s post doesn’t even mention AI, so I’m not entirely sure where you feel clarification is needed.

Using someone’s image in porn without their consent is abuse, imo. It’s obviously not equivalent to rape, and our legal system desperately needs reforms in sentencing especially to reflect that, but it’s still involving someone in a sexual context without their consent

It’s wrong for the same reason that AI revenge porn is wrong

The article is about using AI to create deep fakes.
The reason revenge porn is bad is the “revenge” part: it is being used for nefarious purposes.

People should be free to manipulate legal data in whatever means they wish. How this person got caught isn’t mentioned, but I highly doubt he was blackmailing anyone whose “likeness” he used.


The criminal had real illegal content at first before AI was used.


Including someone in porn requires their consent. If you don’t have that, it’s abusive

Where I live, the average prison sentence for MURDER is 15 years. What this man did was definitely not okay; he certainly shouldn’t be allowed around children, and perhaps he should lose his medical license entirely. Maybe he should do some prison time too. But 40 years?? That’s absolutely insane!

This sets an incredibly dangerous precedent which ought to alarm MAPs in general and Prostasia in particular. Supposing someone takes photos of fully clothed children without their knowledge, uses AI to manipulate the images to make them appear nude and engaged in sexually explicit conduct, then further programs the AI to redo the images to look like pen-and-ink anime and then sets up a website with those images. Then MAPs download the images imagining them to be legal. But when their homes get raided and the police seize their computers, the prosecution is able to prove that those “anime” images are AI-alterations of real children.

Since these hypothetical defendants didn’t actually create those images themselves, but nevertheless received and possessed them, presumably they won’t get 40 years in prison. Perhaps the State will be merciful and let them off with a mere 20 years, with the possibility of parole after 10 years, or some such.

We’re not there just yet, but with this conviction, we’re only a precedent or two away.


I think Prostasia opposes the direct use of children’s images in the creation of sexual content


One person’s mainstream G-rated image is another person’s “sexual context.”

For example, I provided an AI with the following G-rated instructions (in my search for an “illustration” for a fictional story I wrote):

“Photo of brown-eyed, 8-year-old girl in a living room standing facing the camera. Pouting. Angry. Frowning. Arms folded across her chest. Long, straight, light-brown hair. Hair tied back with a white hair band. Long-sleeve sweater. Blue & white plaid, pleated skirt which reaches to her knees. Pale blue knee-high socks.”

And it responded with the following G-rated image:

As one can see from my instructions, this image was not based on a real child. But suppose hypothetically that it were? >99% of people would likely find nothing sexual about such a hypothetical image, any more than the one above. However, for me personally the above image is hot porno. I lo-ove to see pullover sweaters and plaid pleated skirts on girls and young women - it’s just how my fetish works.

So… should >99% of people be permitted to possess an image similar to the above (but a hypothetical version based on the facial features of an actual child star or model from the web), which they don’t find erotic, while I get 40 years in prison for possessing the same image because I do?


I personally have a LOT of very mixed opinions when it comes to the subject of using a real-life reference for NSFW art. It’s a subject I struggle to come to any concrete answer on, and many whom I ask have their own opinions and admit they don’t have all the answers.

Now, it should be noted that (unless I am misunderstanding) this man had non-AI CSAM. Pure photographs of naked children that he had taken without their knowledge. I fully agree that this is immoral.

But let’s move away from this case and into hypotheticals. For a first example, let us take a duo of heavily stylized fictional characters who happen to be based on some real people: Dipper and Mabel Pines from “Gravity Falls”.


The Pines twins are unrealistically designed cartoon characters. However, they are loosely adapted from real people: the creator and his twin sister. Alex Hirsch based Gravity Falls on his own childhood experiences and interests. The main protagonists are based on himself and his actual family. A popular ship amongst some sections of the fandom is “Pinecest”. Three guesses as to who the ship refers to. IIRC, Alex has stated that he doesn’t want to tell anybody what they can or can’t do with his characters in their fanworks, but a ship indirectly based on himself and his own sister as children obviously skeeves him out.

Is hentai based on the Pines twins immoral because they’re loosely based on two real-life people? Said people having been vocally disgusted by those who ship them?

How about a character who’s less cartoony, but still stylized? Clementine from Telltale Games’ “The Walking Dead”.


The comic-inspired art style of Telltale’s adaptation is not going to be mistaken for reality. But the overall main protagonist of the series, Clementine, is physically based on a real little girl. Derek Sakai is the art director for this game, and he based Clem on his own daughter. If you couldn’t tell, Sakai is of Japanese descent. Clem, despite supposedly being African American, is often mistaken for East Asian. If you were to alter Clem’s design and make her look more realistic, she would prolly resemble Sakai’s real daughter at that age.

Is it ethical to rip Clem’s model from the game and use it for porn animations when she wears the stylized face of a real child?

Speaking of video games, there’re many games now that use motion capture. Many actual child actors are brought in to do physical acting that is then made into an in-game model. Sarah from “The Last of Us”, Alice from “Detroit: Become Human”, etc. These are real people whose likenesses got converted into video game characters, characters whose models can be ripped from the game and used for NSFW purposes.

Because these are the digitized bodies of actual children, is this skirting a line far too close to reality? Is animating a realistic model comparable to deepfaking?

Even without 3D, many would still object to making sexual art of real child actors. 2D hentai that is clearly based on, say, young Emma Watson as Hermione Granger; the late Heather O’Rourke as Carol Anne Freeling; young Dafne Keen as X-23, etc. If it’s a stylized drawing that only barely resembles the real actors, should that get a pass? I recall an infamous porn artist making brutal rape porn of X-23. It was heavily stylized, but obviously based on how she appeared in the live-action movie “Logan”. Keen’s lawyers dropped the hammer HARD on this guy, and scared him out of making loli altogether, even when based on fiction. Was it moral to stop him? Or should his freedom of expression have been protected, as it wasn’t a deepfake?

I feel like I’m dipping into philosophy at this point. Questions like this quite literally keep me up all night, obsessed with the morals and ethics of this gray area. I’ve heard opinions ranging from “all loli encourages the sexualization of children and must be banned” to “anything short of actually raping people should be allowed per the freedom of expression; even deepfakes are art”.

I myself wonder if the situation is comparable to political cartoons and political deepfakes. Unless I’m mistaken, there’re no laws against such things. If I use these media (print and video) to depict a certain politician in a particularly unflattering manner, should it be actionable as libel/slander? Or should it be protected as artistic expression of my opinions/beliefs? Remember when Kathy Griffin held up the (fake) bloodied severed head of Donald Trump? I heard folks express opinions ranging from “that is a threat of violence against Trump and should be prosecuted as such” to “it’s an artistic way of expressing her disapproval of Trump and should be protected”. Is this situation comparable in any way to the plight of deepfakes, hentai, etc.?

The “beheaded Trump” image in question:

I hope I’ve at least expressed why I feel so ambivalent about this whole thing. Again, the man in the above article had unambiguous CSAM. But had that not been the case, should he have still been arrested? If he had simply swiped some fully-clothed stock photos off the Internet and used them to make sexual images, should that be prosecutable? Or if he had recorded himself masturbating to fully-clothed stock photos of children, should that be illegal? What’s the right thing to do here? What’s the correct course of action?

My apologies if my questions cause harm/are offensive. I’m just trying to make sense of what I view as a complex topic. Also, my apologies for this miniature essay of a comment. Just tryna get all my thoughts out there, y’know?
