Now if only we could convince anti-AI fanatics to focus their efforts on actually harmful shit like this
What's interesting is that he was charged for CSAM production and possession, as well as the creation of deepfake material, under child pornography laws, basically confirming we don't need obscenity laws to be able to target this type of direct, real harm.
Keeping criminal prohibitions limited to materials where the rights of actual, real, living children are implicated is precisely what SCOTUS had in mind in Ashcroft v. Free Speech Coalition.
https://www.law.cornell.edu/supct/html/00-795.ZO.html
By prohibiting child pornography that does not depict an actual child, the statute goes beyond New York v. Ferber, 458 U.S. 747 (1982), which distinguished child pornography from other sexually explicit speech because of the State's interest in protecting the children exploited by the production process. See id., at 758. As a general rule, pornography can be banned only if obscene, but under Ferber, pornography showing minors can be proscribed whether or not the images are obscene under the definition set forth in Miller v. California, 413 U.S. 15 (1973). Ferber recognized that "[t]he Miller standard, like all general definitions of what may be banned as obscene, does not reflect the State's particular and more compelling interest in prosecuting those who promote the sexual exploitation of children." 458 U.S., at 761.
Section 2256(8)(C) prohibits a more common and lower tech means of creating virtual images, known as computer morphing. Rather than creating original images, pornographers can alter innocent pictures of real children so that the children appear to be engaged in sexual activity. Although morphed images may fall within the definition of virtual child pornography, they implicate the interests of real children and are in that sense closer to the images in Ferber. Respondents do not challenge this provision, and we do not consider it.
We all know that it would not have mattered if the images were alterations or fictitious. They would have gotten his ass either way using obscenity laws.
Also, AI stuff is still pretty questionable, because how can someone truly prove it is not an actual child? What if it is something like this on a broad scale? As time goes on this stuff will get better and thus harder to spot.
That's not how burden of proof works. The prosecutor should be responsible for proving it is based on an actual child.
I'm surprised by the levelheadedness of the comments in that news article.
Especially since it's a British news website.
Which is a very trivial task, given how easy it is for training images to be extracted. All you would need to do is query it, and once you find a match, you could do an analysis that would rule out look-alikes and meet the burden of proof.
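To illustrate what that matching analysis might look like: here is a minimal sketch of a near-duplicate check, assuming you already have a known reference photo and a folder of images sampled from the model. The file names and distance threshold are placeholders, and perceptual hashing only flags candidates; ruling out coincidental look-alikes would still require closer forensic comparison.

```python
# A minimal sketch of a near-duplicate check, assuming a known reference
# photo and a directory of images queried from the model. Uses the
# `imagehash` perceptual-hashing library; paths and threshold are
# placeholders, not a forensic standard.
from pathlib import Path

from PIL import Image
import imagehash

REFERENCE = "reference_photo.png"  # hypothetical known image
SAMPLES_DIR = "model_samples"      # hypothetical images sampled from the model
MAX_DISTANCE = 8                   # Hamming-distance cutoff; tune per use case

ref_hash = imagehash.phash(Image.open(REFERENCE))

for sample in Path(SAMPLES_DIR).glob("*.png"):
    # Subtracting two perceptual hashes gives their Hamming distance.
    dist = ref_hash - imagehash.phash(Image.open(sample))
    if dist <= MAX_DISTANCE:
        # A low distance flags a near-duplicate worth closer review;
        # it does not by itself rule out look-alikes.
        print(f"possible match: {sample} (distance {dist})")
```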
If this case were about targeting images of minors who do not exist (fictional characters), then I would be more worried.
This isn't necessarily true. All obscenity cases brought by the DoJ under Garland seem to be limited to people who already have criminal records/histories relating to contact child sexual abuse or CSAM possession, or to materials that constitute CSAM anyway.
I think it's safe to say that the DoJ under Biden is not interested in pursuing these types of matters unless they can somehow be linked to a real child, or the suspect is a known sexual offender.
Still incredibly dangerous that the government even has the option to prosecute you based on the fiction you like, though. Governments have proven time and time again that they can't be trusted to just "not enforce" harmful laws.
Well, that's sort of the state that pornography in general is in.
But a silver lining is that nothing is obscene unless declared so in a court of law, and on a case-by-case, state-by-state basis, meaning that the same thing can be brought in and found obscene in one court, and not obscene in another court.
There also exists a presumption of innocence. Matters of obscenity are not matters of fact, merely opinion and conjecture, so that presumption goes a long way in more ways than one could imagine.
This is why CSAM is its own category and is not required to be found "obscene". Even pornography, for its own sake, has artistic value, a fact that academics are finally being more vocal about.
This criminal would probably have received the same sentence regardless of whether AI was involved. The criminal had real illegal material of kids; it was not just deepfakes. He also distributed the stuff, correct? I use a negative prompt for AI art to block out illegal content, as well as a RAM disk. With Stable Diffusion 1.5, you can tell that the images look like video game renders. They look fake. I also use "fictional" in my regular prompts to tell the AI to generate a fictional person, even if it is based on an actor.
Stable Diffusion was also not released until 2022, so AI art of that nature probably was not involved here; the criminal committed his crimes in 2021. He still harmed real people regardless.
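For anyone unfamiliar with negative prompts, here is a minimal sketch of how one is passed to Stable Diffusion 1.5 through the Hugging Face diffusers library. The prompt strings are placeholders, and a negative prompt is a soft steering mechanism, not a hard content filter.

```python
# Minimal sketch of negative prompting with Stable Diffusion 1.5 via
# Hugging Face diffusers. The negative prompt steers sampling away from
# the listed concepts; it is a soft constraint, not a guarantee.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # the SD 1.5 checkpoint
    torch_dtype=torch.float16,
).to("cuda")

image = pipe(
    prompt="portrait of a fictional adult character, oil painting",
    negative_prompt="photorealistic, nsfw",  # concepts to push away from
    num_inference_steps=30,
    guidance_scale=7.5,
).images[0]

image.save("output.png")
```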
This article hypes up the AI so much that it's difficult to tell whether the CSAM mentioned is real or not, almost as if it's just thrown in as a side note. Either way, I find his 40-year sentence plus 30 years of probation to be extremely cruel and unusual punishment. The guy may as well have been an actual child abuser to merit such a sentence, and would likely have gotten less time for murder.
I wish Prostasia would clarify its Twitter/X post on this because it really sounds like they are attacking AI and equating it with real sex abuse.
It may not be a good thing and can definitely be grounds for legal action, but pasting a face on a generated body is not sex abuse.
The underlying rationale is that it's a real child's likeness. It may not be as severe as actual child sexual abuse, but it places them in a position that they did not (or could not) otherwise consent to by way of (mis)appropriation of their likeness.
Contrasted against images of fictional/virtual "children" (fictional characters who do not exist), there is some harm and risk to be observed when it's a real child. Virtual/fictional children are not "children"; they are not persons, and cannot be afforded any of the same rights or legal protections that are afforded to actual children.
They are also not conceptualized the same way: these virtual/fictional depictions do not promote the market/demand for materials involving real children, because the demand for them is grounded in a set of preferences or ideas that does not carry over to the latter. Hence virtual/fictional child pornography sits on the same legal footing as adult pornography.
Prostasia's post doesn't even mention AI, so I'm not entirely sure where you feel clarification is needed.
Using someone's image in porn without their consent is abuse, imo. It's obviously not equivalent to rape, and our legal system desperately needs reforms in sentencing especially to reflect that, but it's still involving someone in a sexual context without their consent.
It's wrong for the same reason that AI revenge porn is wrong.
The article is about using AI to create deep fakes.
The reason revenge porn is bad is the "revenge" part, where it is used for nefarious purposes.
People should be free to manipulate legal data by whatever means they wish. How this person got caught isn't mentioned, but I highly doubt he was blackmailing anyone whose "likeness" he used.
The criminal had real illegal content at first before AI was used.
Including someone in porn requires their consent. If you don't have that, it's abusive.
Where I live, the average prison sentence for MURDER is 15 years. What this man did was definitely not okay; he certainly shouldn't be allowed around children, and perhaps he should lose his medical license entirely. Maybe he should do some prison time too. But 40 years?? That's absolutely insane!
This sets an incredibly dangerous precedent which ought to alarm MAPs in general and Prostasia in particular. Suppose someone takes photos of fully clothed children without their knowledge, uses AI to manipulate the images to make them appear nude and engaged in sexually explicit conduct, then further uses the AI to redo the images in a pen-and-ink anime style and sets up a website with those images. MAPs then download the images imagining them to be legal. But when their homes get raided and the police seize their computers, the prosecution is able to prove that those "anime" images are AI alterations of real children.
Since these hypothetical defendants didn't actually create those images themselves, but nevertheless received and possessed them, presumably they won't get 40 years in prison. Perhaps the State will be merciful and let them off with a mere 20 years, with the possibility of parole after 10, or some such.
We're not there just yet, but with this conviction, we're only a precedent or two away.
I think Prostasia opposes the direct use of children's images in the creation of sexual content.
One person's mainstream G-rated image is another person's "sexual context."
For example, I provided an AI with the following G-rated instructions (in my search for an "illustration" for a fictional story I wrote):
"Photo of a brown-eyed, 8-year-old girl in a living room, standing facing the camera. Pouting. Angry. Frowning. Arms folded across her chest. Long, straight, light-brown hair. Hair tied back with a white hair band. Long-sleeve sweater. Blue & white plaid, pleated skirt which reaches to her knees. Pale blue knee-high socks."
And it responded with the following G-rated image:
As one can see from my instructions, this image was not based on a real child. But suppose hypothetically that it were. >99% of people would likely find nothing sexual about such a hypothetical image, any more than the one above. For me personally, however, the above image is hot porno. I lo-ove to see pullover sweaters and plaid pleated skirts on girls and young women - it's just how my fetish works.
So... should >99% of people be permitted to possess an image similar to the above (but a hypothetical version based on the facial features of an actual child star or model from the web), which they don't find erotic, while I get 40 years in prison for possessing the same image because I do?
I personally have a LOT of very mixed opinions when it comes to the subject of using a real-life reference for NSFW art. It's a subject I struggle to come to any concrete answer on, and many of the people I ask have their own opinions and admit they don't have all the answers.
Now, it should be noted that (unless I am misunderstanding) this man had non-AI CSAM. Pure photographs of naked children that he had taken without their knowledge. I fully agree that this is immoral.
But let's move away from this case and into hypotheticals. For a first example, let us take a duo of heavily stylized fictional characters who happen to be based on some real people: Dipper and Mabel Pines from "Gravity Falls".
The Pines twins are unrealistically designed cartoon characters. However, they are loosely adapted from real people: the creator and his twin sister. Alex Hirsch based Gravity Falls on his own childhood experiences and interests, and the main protagonists are based on himself and his actual family. A popular ship among some sects of the fandom is "Pinecest". Three guesses as to who the ship refers to. IIRC, Alex has stated that he doesn't want to tell anybody what they can or can't do with his characters in their fanworks, but a ship indirectly based on himself and his own sister as children obviously skeeves him out.
Is hentai based on the Pines twins immoral because they're loosely based on two real-life people? Said people having been vocally disgusted by those who ship them?
How about a character who's less cartoony, but still stylized? Clementine from Telltale Games' "The Walking Dead".
The comic-inspired art style of Telltale's adaptation is not going to be mistaken for reality. But the overall main protagonist of the series, Clementine, is physically based on a real little girl. Derek Sakai was the art director for this game, and he based Clem on his own daughter. If you couldn't tell, Sakai is of Japanese descent, and Clem, despite supposedly being African American, is often mistaken for East Asian. If you were to alter Clem's design and make her look more realistic, she would prolly resemble Sakai's real daughter at that age.
Is it ethical to rip Clem's model from the game and use it for porn animations when she wears the stylized face of a real child?
Speaking of video games, there're many games now that use motion capture. Many actual child actors are brought in to do physical acting that is then made into an in-game model: Sarah from "The Last of Us", Alice from "Detroit: Become Human", etc. These are real people whose likenesses got converted into video game characters, characters whose models can be ripped from the game and used for NSFW purposes.
Because these are the digitized bodies of actual children, is this skirting a line far too close to reality? Is animating a realistic model comparable to deepfaking?
Even without 3D, many would still object to making sexual art of real child actors. 2D hentai that is clearly based on, say, young Emma Watson as Hermione Granger; the late Heather O'Rourke as Carol Anne Freeling; young Dafne Keen as X-23, etc. If it's a stylized drawing that only barely resembles the real actors, should that get a pass? I recall an infamous porn artist making brutal rape porn of X-23. It was heavily stylized, but obviously based on how she appeared in the live-action movie "Logan". Keen's lawyers dropped the hammer HARD on this guy and scared him into stopping making loli altogether, even when based on fiction. Was it moral to stop him? Or should his freedom of expression have been protected, as it wasn't a deepfake?
I feel like I'm dipping into philosophy at this point. Questions like this quite literally keep me up all night, obsessed with the morals and ethics of this gray area. I've heard opinions ranging from "all loli encourages the sexualization of children and must be banned" to "anything short of actually raping people should be allowed per the freedom of expression; even deepfakes are art".
I myself wonder if the situation is comparable to political cartoons and political deepfakes. Unless I'm mistaken, there're no laws against such things. If I use these media (print and video) to depict a certain politician in a particularly unflattering manner, should it be censured as libel/slander? Or should it be protected as artistic expression of my opinions/beliefs? Remember when Kathy Griffin held up the (fake) bloodied severed head of Donald Trump? I heard folks express opinions ranging from "that is a threat of violence against Trump and should be prosecuted as such" to "it's an artistic way of expressing her disapproval of Trump and should be protected". Is this situation comparable in any way to the plight of deepfakes, hentai, etc.?
The "beheaded Trump" image in question:
I hope I've at least expressed why I feel so ambivalent about this whole thing. Again, the man in the above article had unambiguous CSAM. But had that not been the case, should he have still been arrested? If he had simply swiped some fully-clothed stock photos off the Internet and used them to make sexual images, should that be prosecutable? Or if he had recorded himself masturbating to fully-clothed stock photos of children, should that be illegal? What's the right thing to do here? What's the correct course of action?
My apologies if my questions cause harm/are offensive. I'm just trying to make sense of what I view as a complex topic. Also, my apologies for this miniature essay of a comment. Just tryna get all my thoughts out there, y'know?