TLDR: I may have made a grave error with a Google search in a country where cartoon/anime depictions of minors are illegal. Looking for insight into what danger I face after the fact.
Like other search engines, Google flags uploads of CSAM if the upload’s hash matches one from whichever database(s) they use. It also filters search results to not include CSAM to some degree.
One example of flagging content: There have been cases where people have been jailed for uploading real photos of CSAM to reverse image search.
If something like loli hentai were uploaded to a reverse image search, would that generate a report? Loli hentai seems to turn up readily in image searches for non-banned keywords, so the filtering of results looks lax. Still, I'm not sure how relevant that is to searches involving uploads.
There was a topic last year about PhotoDNA apparently not flagging hentai, but it was unclear whether that included characters depicted as underage. I've read that groups like the Child Rescue Coalition (US) do keep cartoon hashes in their database, though it's unclear whether other databases do the same.
Does anybody have info on how Google approaches the flagging of cartoons? I know I should not have risked this. It was a very small number of searches, but I remember a case in my country where someone was flagged over just 2 searches, both of real photos. I had a pretty reckless addiction that I'm now getting counselling for. All I can hope for is that the searched images were obscure enough not to be in a database somewhere.
If it is reported, how likely is it to be followed up? I had a VPN, but its no-log policy is unproven.