Harnessing PhotoDNA for use in a browser, for good

In an exchange I had with a prominent developer of PhotoDNA, he pointed out that the tool could be deployed in a browser relatively easily, and that it would not be used to report anyone. If you came across CSAM that is hashlisted, the browser would simply block it.

His rationale for why an auto-report system would be ill advised is that you simply cannot determine the intent of users.

He also pointed out that many browsers, such as Firefox, are skeptical of this kind of deployment because of privacy concerns.

I believe that in principle, PhotoDNA SHOULD be deployed to all browsers at some point in the future and could be deployed to operating systems to make it extremely difficult to collect or see CSAM. However, I do believe some concerns pointed out in this group and elsewhere need to be addressed first.

  1. There are rumors, or perhaps actual cases, of legal images having been hashlisted. If that is true, it needs to be resolved.

  2. What are the privacy concerns? As he describes it, a browser deployment would not be some reporting arm of the government; it would simply be a blocking tool. But there is a moral hazard that it could eventually be used to censor things that have absolutely nothing to do with CSAM, for example by repressive regimes. So there is a valid concern that it may start out as an innocent anti-CSAM filter but end up being used for evil by despots. What kind of safeguards could exist to mitigate this?

I believe that once all the privacy and ethical issues are worked out, it should be deployed, and deployed to Tor especially. But in order for the Tor community to accept this, I believe they will need proof that it won’t be used as just another arm of the government, but simply as a censorship tool against CSAM that is about two things: protecting people from being exposed to it, and preventing the revictimization of CSAM victims.

The general consensus within the Tor community is that CSAM is a blight on OUR community: ON BEHALF OF THE TOR COMMUNITY, WE WANT NOTHING to do with CSAM. If the issues were worked out, I for one would love to make it extraordinarily difficult for anyone to access this contraband. IF PhotoDNA could be deployed in Tor in an ethical manner, WE WOULD WELCOME IT WITH OPEN ARMS!

When Firefox adopted DNS over HTTPS they specified conditions for resolvers that Firefox would trust, and one of them was no blacklisting of websites (even for CSAM) unless there was a transparent process for maintaining the list. I suggest that a similar standard would be required if Tor/Firefox were ever to adopt the proposal you suggest. It might be that the IWF hash lists would meet the standard but the NCMEC lists wouldn’t. Prostasia would like to be a part of any discussions on this topic.

1 Like

I’m still not convinced you’re not an elaborate troll. Are you making fun of us?

Distributing a database is stupid, as you would simply be giving clients all the hashes they need to dodge in order to get around your censorship regime. Someone could also fork the browser (by modifying one line of code) and redistribute it.

America, Australia, the UK, New Zealand and Canada are thought-control repressive regimes. The Netherlands is better, although it’s getting hit by populists. Germany is okay, as is Japan. The ones who care more about privacy and civil liberties don’t mention “CSAM” much.

“CSAM” is mostly a Western weapon to censor the web: exaggerating the proliferation of the content (how many duplicates do the reported counts include? how many hits are someone mutating an image to try to get around a filter? how many are backups? how many are bulk transfers?), exaggerating the number of people impacted, and exaggerating individual impacts by cherry-picking the worst cases (finding the worst-impacted victims and getting them to appear at a hearing, as opposed to the ones who are completely unaware). Most of it is borderline too, according to official research papers.

The content is bad, but the West obsesses over it too much. Pushing this hard is also going to get you crucified by every single civil liberties group. Rather than become everyone’s enemy, a more practical idea is to create alternatives, promote them, and get countries to repeal really regressive laws, like throwing someone in prison for a drawing.

It would really help if people didn’t end up driven underground into a child-rape echo chamber haven. This isn’t the meme version of what people on Twitter call advocates of that, either; anyone likely to be doing that on Twitter is a troll.

3 Likes

Viewing/possessing CP is a victimless crime. No harm is done as long as the activity is kept secret. Measuring the activity and making the numbers public is bad. Victims are better off being unaware.

Stopping the distribution of CP makes sense, but it’s a huge challenge. Will we ever get to a point where a victim can be confident that the internet has forgotten them? Perhaps it’d be more effective to reduce the shame and internalized stigma among victims.

Last time I checked there was zero empirical evidence that the distribution of CP caused harm to the victims. This may have changed.

1 Like

I too would hope to see a serious discussion around this topic.

they specified conditions for resolvers that Firefox would trust, and one of them was no blacklisting of websites (even for CSAM) unless there was a transparent process for maintaining the list.

I think that’s a decent balance: we obviously shouldn’t trust blacklists that are opaque about how the list is built (did they intentionally lump images deemed legal in with illegal ones, for example?), but we should be open to the use of such lists if the process is deemed transparent.

This could be a plugin at first, couldn’t it? That would make a lot more sense. People who wanted to use the plugin could use it, and—let’s face it—it could really only be for the people who want to use it. As @LoliShadow points out, those who don’t want to use it would just fork the code if it was built directly into the browser. But a lot of people would feel safer using Tor if they knew that a random link wasn’t going to unexpectedly take them to illegal images (sure, the risk is low but the perception of risk is there).

The main difficulty with the plugin approach is that the hashlist vendors will never give up their lists, so there would have to be an API call to a remote server, and that call would have to be anonymized. This too might be something the hashlist vendors wouldn’t agree to. But if there is interest in the idea, I could reach out to my contacts and find out.
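For concreteness, here is a very rough sketch of what a blocking-only plugin along those lines might look like, written as a Firefox WebExtension background script. This is not anyone's actual implementation: the PhotoDNA algorithm itself is proprietary, so `computePerceptualHash` below is only a stand-in (an exact SHA-256 digest, which a real deployment would replace with the licensed perceptual hash), and `hashlist.example.org` is a hypothetical anonymized lookup endpoint, not a real service. The point of the sketch is the policy described above: check the image, drop it on a match, and never report or log anything.

```typescript
// Rough sketch only. Assumes a Firefox WebExtension with the
// "webRequest", "webRequestBlocking" and "<all_urls>" permissions.
declare const browser: any; // provided by the WebExtension runtime

// Hypothetical anonymized lookup service; not a real endpoint.
const LOOKUP_ENDPOINT = "https://hashlist.example.org/v1/lookup";

// Stand-in for the real (proprietary, licensed) PhotoDNA perceptual hash.
// SHA-256 is an exact hash and would miss re-encoded copies; it is used
// here only so the sketch is self-contained and runnable.
async function computePerceptualHash(image: ArrayBuffer): Promise<string> {
  const digest = await crypto.subtle.digest("SHA-256", image);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// Ask the remote list whether this hash is on it. Nothing identifying is
// sent and nothing is reported; on any error we fail open and show the image.
async function isBlocked(hash: string): Promise<boolean> {
  try {
    const res = await fetch(LOOKUP_ENDPOINT, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ hash }),
    });
    if (!res.ok) return false;
    const { match } = await res.json();
    return match === true;
  } catch {
    return false;
  }
}

// Buffer each image response, hash it, and either drop it (block) or pass
// it through unchanged. filterResponseData is a Firefox-only API.
browser.webRequest.onBeforeRequest.addListener(
  (details: { requestId: string }) => {
    const filter = browser.webRequest.filterResponseData(details.requestId);
    const chunks: ArrayBuffer[] = [];
    filter.ondata = (event: { data: ArrayBuffer }) => chunks.push(event.data);
    filter.onstop = async () => {
      const bytes = await new Blob(chunks).arrayBuffer();
      if (await isBlocked(await computePerceptualHash(bytes))) {
        filter.close(); // block: the page simply never receives the image
      } else {
        for (const chunk of chunks) filter.write(chunk);
        filter.disconnect();
      }
    };
  },
  { urls: ["<all_urls>"], types: ["image"] },
  ["blocking"]
);
```

The open question raised above is how the lookup itself stays anonymous; a real design would presumably route the query over Tor or use a hashed-prefix scheme (similar in spirit to Safe Browsing lookups) so the server never learns exactly which image was checked, assuming the hashlist vendors would agree to expose such an API at all.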

1 Like

Having it start out as a plugin that makes anonymized API calls to a remote server would be nice. I would install it in a heartbeat if it became available.

I believe there is a developer by the name of Hany Farid whom you mentioned in the article. I wonder what his take on this idea would be. I believe he has a contact email that you could reach out to.

The road to hell is paved with good intentions.

Honestly, I just see this being abused.

2 Likes

Risks need to be taken for the greater good. I’m a utilitarian; if censorship achieves my ends, I will support it. Victims of CSAM have a much harder time recovering because people keep accessing and sharing their images. I believe I read that the suicide attempt rate for this group is 80%, if I recall correctly.

Perpetrators caught for possession end up screwing over not just themselves but their families. CSAM possession offenders have an extremely high suicide attempt rate. They are not just doing real harm to victims of CSAM; they are harming themselves.

We could eliminate quite a lot of suffering if we can pull this censorship off properly: 1) reducing harm to CSAM victims, and 2) preventing potential perpetrators from crossing the line in a way that harms victims and themselves.

Yeah, no. You and I are not going to see eye-to-eye on this.

2 Likes

I find non-utilitarians to be ultimately irrational cowards, unable to do the necessary dirty work. Furthermore, I do not advocate for privacy violations; what I have advocated for is needed censorship to protect vulnerable people and, through prevention, reduce depression and suicide among victims and pre-offenders.

I have also advocated for research into the links between CSAM possession offenses and contact offenses, in order to determine whether fictional CP (like certain types of manga and 3DCG fiction) would increase or decrease the likelihood of offending, before making any substantial policy recommendations.

If the research indicates that censorship via PhotoDNA scanners would serve the greater good, and requires that such scanners be built into all browsers even if they snag a few legal images, so be it. Utilitarianism does not need to rely on emotional knee-jerk reactions; we rely strictly on evidence. Some solutions I have proposed have horrified people, but that is because they don’t think like I do.

Why is it harmful to view and possess CP? Is there any empirical evidence that distribution of CP is harmful?

1 Like
  1. The children depicted don’t want anyone looking at it. In my country, once a victim turns 18, the government notifies them every time someone is caught with or sharing their images. So until the images are taken out of circulation, it will be difficult for the victims to fully heal, since they will always know about it. One of their biggest desires is that people stop accessing those images, and I believe we owe it to them.

  2. It also harms the person viewing it. In the UK at least, from what I’ve heard and remember, CSAM viewing/possession offenders have the highest attempted suicide rate of any demographic. The UK’s justice system for CSAM offenders who are not producers or distributors is much more focused on rehabilitation and less on punishment than the USA’s (usually a suspended sentence of some kind measured in months to a couple of years, plus required counseling and electronic monitoring of phones and computers). But despite the relative leniency, they still have extremely high depression and attempted-suicide risks. This tells me there is something about this ghastly material that fucks people up psychologically.

From a utilitarian point of view, it must all be destroyed. The two points above show that CSAM is nothing but trouble. Even if there is no evidence that viewing or distributing CSAM increases demand for the rape and molestation of children, the harms I described above should be enough to warrant aggressively censoring said content.

If there is compelling evidence that fictional CP diverts people from accessing CSAM, then a strong effort must be made to repeal laws prohibiting fiction ASAP. But that wouldn’t go far enough: once it is legalized in the US, regulated promotion of said fiction should be part of the solution alongside censorship of hashlisted CSAM. If the demand can be satisfied by harmless alternatives in the form of sexualized depictions of non-existent NPCs and characters, I say why not.

  1. Stop reporting incidents to the victim. That would solve the problem.

  2. I’m pretty sure the high suicide rates among CP possession offenders are explained by the criminal justice system’s and society’s response to the crime. I’ve heard of people being addicted to hardcore CP, which can be traumatizing to see, but I believe they are few. According to Michael Seto, it’s hard to find CP where the child isn’t smiling.

1 Like

They have an extremely high suicide attempt rate because of the fear of being thrown in prison, being put on the sex offenders register (make no mistake, the register is hardly different from a prison sentence), being unable to get employed, and being socially ostracized. A lot of the suicides in particular come from people being thrown in prison, or from imminently being arrested or having just been arrested.

Getting better at detecting them (something I doubt is even possible) would only make this happen more frequently.

It would be impossible to completely “pull the wool over their eyes” so to speak with this, but avoiding notifying them would be a good idea, particularly for those who don’t want to be notified.

Only in a third of cases and I would imagine that many of these are for cartoons / looking at legal images with sexual intent / not really a pedophile.

Being spied on isn’t a good experience either when anything you think, say or do could be twisted against you. Or when the “counseling” does little but make the problem worse (according to people from the U.K. who have been through it) as it is little more than conversion therapy.

The most persuasive theory here is that they don’t want people looking at it because it reminds them of their trauma. That is very unfortunate. If you can mitigate the impact by reducing its transmission (without infringing on civil liberties in a harmful way), creating alternatives, or avoiding provoking them by mentioning it unnecessarily, that may be good.

1 Like

They have an extremely high suicide attempt rate because of the fear of being thrown in prison, being put on the sex offenders register (make no mistake, the register is hardly different from a prison sentence), being unable to get employed, and being socially ostracized. A lot of the suicides in particular come from people being thrown in prison, or from imminently being arrested or having just been arrested.

Getting better at detecting them (something I doubt is even possible) would only make this happen more frequently.

Terrorists, killers, rapists and subhuman chomos regularly get significantly harsher sentences, yet they have far less suicide risk. I doubt the judicial system’s punitiveness is the main cause. Also, in the UK the sex offenders register is not made public, and there is no pointless “2,500 feet away from public parks” rule. Possession offenders often spend limited time on it: often 2 years (if cautioned and nothing more), or 5, 7, or 10 years, and you provide the information they want once a year, unlike in the US where it’s often every 90 days. The UK register is really nothing compared to Florida or the rest of the USA.

It would be impossible to completely “pull the wool over their eyes” so to speak with this, but avoiding notifying them would be a good idea, particularly for those who don’t want to be notified.

That’s the main reason I want the idea of PhotoDNA censorship of hashlisted CSAM and CSAM websites, via the end user’s browser, operating system, and storage systems, to be discussed and explored more. Whether notifying them or not makes their situation worse, they still suffer knowing these heinous images are out there. LESS THAN 1% OF THESE CRIMINALS GET CAUGHT; THE CRIMINAL JUSTICE SYSTEM CANNOT RESOLVE THIS ISSUE BY ITSELF. WE NEED PREVENTION, WE NEED TECHNOLOGY.

Only in a third of cases and I would imagine that many of these are for cartoons / looking at legal images with sexual intent / not really a pedophile.

Statistics show that less than a quarter of the 2,528 people sentenced for making, distributing or publishing child sex abuse images in 2017 were jailed, with almost half handed suspended sentences, and one in five given community orders.

“Making”, by the way, in the UK means affirmatively downloading. And while it’s true that almost half are handed prison sentences, those are suspended sentences, meaning they don’t go to prison; they are on probation as long as they don’t fuck up the conditions.

Or when the “counseling” does little but make the problem worse (according to people from the U.K. who have been through it) as it is little more than conversion therapy.

Re-offense risk for CSAM possession offenders is actually fairly low, with total sexual recidivism at 5.4%:
in the most recent meta-analysis including a combined sample of 2,630 online offenders, Seto, Hanson, and Babchishin (2011) reported that 3.4% of online offenders were found to reoffend with another CP offence, while only 2% reoffended with a contact sex offence.

While 5.4% sexual recidivism isn’t exactly great, it isn’t terrible either. If the counseling didn’t work, recidivism among this group of criminals should have been far higher. Other countries also have similar recidivism rates for this group. Sadly, they are stubbornly in the 3-6% range, but I’m not convinced the counseling does not work.

Terrorists, killers, rapists and [child molesters] regularly get significantly harsher sentences yet they have far less suicide risk [than CP viewers].

I’m surprised that child molesters have a much lower suicide risk than CP viewers. Would you like to provide us some evidence?

Whether notifying them or not makes their situation worse, they still suffer knowing these heinous images are out there.

I can see three things that cause harm to CP victims.

  1. fear that people have seen their abuse or will see it.
  2. fear of being judged.
  3. the belief that CP viewers approve of their abuse.

(1) Removing CP permanently from the internet is a huge challenge. We cannot do it with PhotoDNA or any other modern surveillance technology. I think it’s more reasonable if we

  • stop notifying victims when someone sees their images.
  • limit CP distribution.

If a person can’t tell the magnitude of a problem, they can choose to believe it’s small.

(2) A victim may experience undeserved guilt over their behavior and reactions. We need to hear a variety of victims’ stories, not just those society expects to hear. In other words, we should reduce the stigma. This could help CP victims feel less shame for how they appeared on camera.

(3) There is a misunderstanding that CP viewers approve of children being harmed or take pleasure in the fact. In a survey on Virtuous Pedophiles regarding the attitudes of CP viewers, 50% of the participants chose “I feel really bad about getting sexual enjoyment from a child’s abuse”.

I’ll end with a story on CP viewing from the perspective of the offender (below).
https://celibatepedos.blogspot.com/2015/05/compassion-for-cp-viewers.html

2 Likes

Well, I took a look at that post. What a strange site: a site on which someone is OPENLY TELLING EVERYONE ELSE they are a pedo? WTF. While I am sympathetic to ex-criminals who reform themselves, that website gives me the creeps. But it does not surprise me that CSAM possession offenders tend to feel guilty if only 3.4% re-offend with another CSAM offense.

Yes, victims of CSAM should absolutely NOT be judged by what happened to them; they were forced into the situation by sexual predators. Sadly, it is true that >30% do get recognized. This is not just emotionally harmful, but physically dangerous.

The last one “3” is interesting. If they think many do not approve of what’s happening, does that impact their well being in any way?

All that post does is make it even more necessary from a utilitarian perspective to censor the fuck out of CSAM.

I’m not convinced it can’t be done, though. Maybe you are right, but that does not mean we shouldn’t try to figure out how to preserve privacy while protecting victims and the general public from CSAM. With future advances in technology, I believe they may very well figure out a way to make it essentially impossible not just to view hashlisted CSAM in a web browser, but also to affirmatively store it on one’s own hard drive or other storage medium. It may not happen in ten years, or even twenty, but it would be a surprise if they have not figured out how to do so by the turn of the century.

Aside from the obvious psychological harm that the continued distribution and downloading of such heinous images brings to victims, these images are essentially a radioactive, legally lethal needle on the floor, waiting for someone with a limited understanding of the law or the issues to step on it. A person may stumble across such imagery and then, without thinking it through, go back to it, something we might call “rubbernecking”. I do not condone doing so whatsoever, but some people who have no interest in CSAM may find themselves doing it on impulse. That person is now liable for up to 20 years in prison with lifetime supervision. These “rubberneckers” could have been spared from stepping on such a needle, and the CSAM victim spared another re-victimization, if only PhotoDNA were deployed as an aggressive censor against the problem.

The general trend for “reforms” regarding CSAM sentencing is towards harsher sentences. By 2040, I predict there will be a mandatory minimum for possession, something along the lines of 5-10 years, with a maximum of life for access with intent to view or for possession. This is going to happen. In that case, our theoretical rubbernecker would be looking at centuries in prison. What if someone innocent gets convicted?! Well… an innocent person who was convicted would be just as screwed.

Considering the harms CSAM does to victims and the dangers it poses to the general public, as I described above, I consider it perhaps more dangerous than radiological weapons. As with radiological weapons, the harm to health, whether to victims, to innocent people convicted despite not being guilty, or to otherwise normal but foolish people who act rashly around material they stumble into, can indeed be lifelong.

I believe we must take the issue of CSAM as seriously as we take other extremely dangerous substances, and I call it what it really is: a psychologically and legally deadly substance that must be purged. As this material circulates more, and as sentences become harsher, I only expect it to become ever more dangerous for everyone. The longer it persists, the more psychologically and legally dangerous it becomes (the OPPOSITE of radiological weapons, thus making it FAR MORE DANGEROUS).

I think society and companies in general need to take CSAM MUCH MORE SERIOUSLY than they do now. This DOES NOT MEAN I support harsher sentencing for possession, which is ineffective; it means we need to really explore methods that can come very close to eradicating what is essentially an extremely toxic psychological and legal substance. We need to look outside the criminal justice system for solutions and answers.

Sadly, it is true that >30% do get recognized. This is not just emotionally harmful, but physically dangerous.

Can you provide evidence that >30% of CP victims know someone recognized them from CP? I can see how that would be emotionally harmful, but how is it physically dangerous?

You would have to be a total jackass to tell a victim you saw them in CP.

If they think many do not approve of what’s happening, does that impact their well being in any way?

Yes I’m sure it can benefit victims.

1 Like

It only takes one total jackass to do it, but yes, they shouldn’t. The NYTimes reported an incident where this happened, although I would imagine it happens less frequently than people make it out to.

In one case, the court case against the offender was televised, and CP offenders learnt of the victim’s existence through that. Precautions may need to be taken in future cases to protect the victim’s identity and safety.

I very much think society takes it seriously enough, thank you very much. They take it so seriously that they are contemplating dismantling free speech to address it. It is disingenuous to pretend that they do not take it seriously; that is very much not the case.

What you may have confused with “not caring” is the fact that the world, and everything society does, does not revolve around CP. People do not wake up in the morning and think for every waking moment about what they will do about CP. To them, it is simply another category of crime; no one says murderers aren’t taken seriously, or demands that everyone make plans for how they will reduce the murder rate.

If someone who is innocent gets convicted, then that is a sign of an unjust system. There is no use in apologizing for a government that does not follow basic due process.

I am not going to look for sources right now, but I have seen sources that aren’t taken from tabloid media. This also refers to 2017, when there was a sudden influx of cases they needed to handle; politicians have since pushed back.

WE NEED TO STOP TALKING LIKE THIS.

https://www.msn.com/en-gb/news/uknews/child-abuser-banned-from-devon-and-cornwall-after-10-years-in-jail/ar-AAHcUXU
It’s a woman, which helps break up the stereotype, but you can be subjected to exclusion zones if they deign to impose them.

They also get fewer opportunities to commit suicide in prison.

It isn’t the justice system’s punitiveness either, but the destruction of someone’s life. You’re not going to get a job. You’re going to be on a list. You’re going to be watched. You’re going to be discriminated against. You’re going to be in the newspapers and treated the same as if you had molested someone. That information doesn’t magically disappear from people’s minds overnight.

I’m sorry that you have such a hard time coming to terms with the fact that we live in a democracy.

This is often attributed to people being scared by the justice system rather than to the counseling. They would also know to take better precautions next time. It should also be noted that a lot of offenders who are caught don’t fit the profile of the person you are imagining.

Wait rofl

The Guardian said in 2017 that there were no services and that people were going overseas to Germany for the Dunkelfeld project. No psychologist wanted to fall into disrepute by dealing with these characters. The services in the U.K. may be more recent than that.

4 Likes