Yeah, no… this story stinks. And I won’t believe a word of it until the author provides some objective proof of the claim that Google actually scans for and files reports on fictitious cartoon/drawn child pornography. Such material has the same legal status as adult pornography, and the idea that specific works can be judged objectively to meet the bar of obscene criminality is beyond senseless.
It’s not clear which of those two technologies was used in late 2020 in Kansas, when Google detected “digital art or cartoons depicting children engaged in sexually explicit conduct or engaged in sexual intercourse” within a Drive account, according to the warrant. It goes on to detail the graphic images, which included what appeared to be sexually explicit cartoons of underage boys.
What warrant?? Please provide the warrant in question if it exists. I want to see it. I want to make sure that it exists and that drawings are the focus of the inquiry.
I would even accept a REDACTED warrant, wherein the names and identifying characteristics of the person in question are withheld to protect their privacy. I just want proof that it exists.
As per its legal requirements, Google handed information on what it found, as well as the IP addresses used to access the images, to the National Center for Missing and Exploited Children (NCMEC), which then passed on the findings to the DHS Homeland Security Investigations unit. Investigators used the IP addresses provided by Google to identify the suspect as the alleged owner of the cartoons, and searched his Google account, receiving back information on emails to and from the defendant.
There is no legal requirement for individuals or electronic service providers (ESPs) to report drawings, cartoons, or other forms of noncriminal virtual child pornography. That reporting duty is spelled out in 18 U.S.C. § 2258A, and it is very clear in its scope, which is limited to depictions made using real children:
(2) Facts or circumstances.—
(A) Apparent violations.—
The facts or circumstances described in this subparagraph are any facts or circumstances from which there is an apparent violation of section 2251, 2251A, 2252, 2252A, 2252B, or 2260 that involves child pornography.
(B) Imminent violations.—
The facts or circumstances described in this subparagraph are any facts or circumstances which indicate a violation of any of the sections described in subparagraph (A) involving child pornography may be planned or imminent.
As shown here, the duty is triggered only by apparent or imminent violations of the enumerated sections involving child pornography; it’s very clear that drawings are not included.
It appears the suspect may actually be a known artist. As no charges have been filed, Forbes isn’t publishing his name, but the man identified in the warrant had won several small Midwest art competitions, and one artwork from the 1990s had been mentioned in a major West Coast newspaper.
Well, I would hope no charges would be filed! Should this warrant exist, it was executed in late 2020, and there is still no arrest, indictment, arraignment, or docket to be had. It’s safe to assume that no charges will be filed after this long, right??
And even then, we still don’t have a warrant or any proof to go off of!
This may concern some artists who draw or depict nudes. But the law around cartoon imagery is worded so as to provide some protection for anyone sharing animations of children engaging in sexual conduct for the sake of art or science. A prosecutor trying to convict anyone possessing such material would have to prove that the relevant images were “obscene” or lacked “serious literary, artistic, political, or scientific value.”
Right. And obscenity is precisely the kind of judgment a computer system cannot adjudicate or compensate for; a scanner has no way to weigh artistic merit or community standards. What happens in 5 or 6 years when such materials are unlikely to be considered obscene, even in the most prudish, conservative state or locale?? Do we just remove them from the hash table??
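To see why that question has no good answer, here’s a minimal sketch of how hash-list matching works in principle. This is an illustration only, not Google’s actual pipeline (which reportedly also uses proprietary machine-learning classifiers); the hash value and the function name are hypothetical:

```python
import hashlib

# Hypothetical list of hashes of known, previously reviewed files.
# Real systems use curated databases (e.g., NCMEC hash lists) and
# perceptual hashes that survive resizing; plain SHA-256 is used
# here only to keep the sketch self-contained.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c7"
    "35d117c42d1c1835420b6b9942dd4f1b",  # made-up entry
}

def scan_file(path: str) -> bool:
    """Return True if the file matches an entry on the list.

    Note what this function cannot do: it has no concept of
    obscenity, artistic merit, or "serious literary, artistic,
    political, or scientific value". It answers exactly one
    question: does this file match something already listed?
    """
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest in KNOWN_HASHES
```

The only lever such a system has is adding or removing entries. Every legal judgment (obscene or not? serious value or not? under whose community standards?) has to be made by a human before a hash ever goes on the list, and the list silently goes stale the moment those standards shift.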
What about the BILLIONS of artworks that cater to these interests?? Wouldn’t flooding the match lists with fictional material dilute the efficacy of a CSAM scanning regime?? There’s a reason NCMEC does not include drawings in its list of reportable criteria.
There are a lot of things wrong with lumping in fictional materials alongside actual CSAM, and I’ve only begun to touch on why that is.
It TRIVIALIZES a regime where the rights of actual, real children are supposed to be priority number 1!
If you’re going to lump in cartoons, you might as well lump in depictions of petite/youthful adults, since legally the two stand on identical ground as far as child pornography law is concerned.