Harnessing PhotoDNA for use in a browser, for good

I wonder if Ethical AI knows that the British Police are planning to move their CP database to Amazon AWS for easier access, and that law enforcement uses a lot of the same technologies everyone else does, not a parallel-universe law-enforcement set.

The U.S. Senate uses email and Signal.

The British Cabinet once used Zoom, which was stupid.

FB Moderators have to be able to moderate the content on their servers.

You tend not to realize how similar the government is to you. Many consumer technologies tend to be well built, and the further you distance yourself from them, the fewer eyes will have scanned whatever you use instead for bugs and security issues.

Lastly, I'm not sure we even want to know how much CP is lying out there on people's hard drives, or who it depicts. It isn't a Pandora's box worth opening.

4 Likes

I wonder if Ethical AI knows that the British Police are planning to move their CP database to Amazon AWS for easier access, and that law enforcement uses a lot of the same technologies everyone else does, not a parallel-universe law-enforcement set.

I'm very well aware of this.

Lastly, I'm not sure we even want to know how much CP is lying out there on people's hard drives, or who it depicts. It isn't a Pandora's box worth opening.

We do, because we need to identify victims and we need to destroy these sick fucks' collections. That is why I have continuously proposed some method of using PhotoDNA to prevent hash-listed CSAM from being transferred onto any hard drive or SD card. These sick fucks would no longer be able to build up their horrific collections.
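
Roughly, the kind of check I have in mind looks like the sketch below. PhotoDNA itself is proprietary and only licensed to vetted organisations, so a simple difference hash (dHash) stands in for it here, and the block-list value and file name are placeholders rather than real data.

```python
# Illustrative stand-in for PhotoDNA-style matching: compare a perceptual hash of an
# incoming file against a list of known hashes and refuse to store it on a match.
from PIL import Image  # pip install Pillow

def dhash(path: str, hash_size: int = 8) -> int:
    """Perceptual difference hash: compares adjacent pixel brightness in a downscaled image."""
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def matches_blocklist(candidate: int, blocklist: set, max_distance: int = 4) -> bool:
    """True if the candidate hash is within a small Hamming distance of any known hash."""
    return any(bin(candidate ^ known).count("1") <= max_distance for known in blocklist)

KNOWN_HASHES = {0x0F0F0F0F0F0F0F0F}  # placeholder; a real list would come from an authority such as NCMEC

if matches_blocklist(dhash("incoming_file.jpg"), KNOWN_HASHES):
    print("Matched a known hash; refusing to store the file.")
```

As far as I know, this kind of matching today runs server-side at large platforms and hotlines; building it into drive or card firmware, as proposed here, would be a much bigger step.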

Hopefully one day it will be an industry standard for hard drives and SD cards to be built to refuse to store identified CSAM. I do not want criminals attempting to sneak CSAM onto an innocent person's computer, which has in the past landed innocent people in serious trouble. And I do not want these sick fucks to be able to keep their horrible collections of child abuse.

Why do we need to destroy someone's collection? Identifying victims is a somewhat compelling interest, although it could equally be done via counseling in places like schools.

It should be possible for a trained psychologist to tell if someone is suffering some sort of psychological anguish, and we may even be able to identify likely victims from their behavior. A lot of people aren't silly enough to record themselves committing crimes.

A hash can't really identify anything that isn't already known, so it wouldn't be much good for new victims, and police are surprisingly effective at infiltrating dens (although they have had a huge funding problem lately that no one in government cares to solve, even though they do care about passing police-state laws).
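
To be concrete about that limitation: a hash list only recognises material someone has already catalogued and hashed, so anything new sails straight past it. A minimal sketch, using an exact-match digest for brevity (the same limitation applies to perceptual hashes like PhotoDNA); the digests here are placeholders.

```python
# A hash list can only flag content that is already in the list.
import hashlib
import os

KNOWN_DIGESTS = {hashlib.sha256(b"previously catalogued file bytes").hexdigest()}

new_material = os.urandom(1024)  # stands in for an image nobody has hashed yet
digest = hashlib.sha256(new_material).hexdigest()

print(digest in KNOWN_DIGESTS)  # False: the filter has nothing to match it against
```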

If you up the funding by millions or tens of millions, they should be more effective: you will arrest more people, and you don't get a police state. The people most likely to be caught are the ones doing the most damage by spreading images far and wide.

3 Likes

Why do we need to destroy someone's collection?

The idea of them possessing such disgusting material as images of REAL child sexual abuse is beyond appalling. THEIR COLLECTIONS MUST BE UTTERLY CRUSHED by any means necessary. It will be impossible for victims of CSAM to heal until all depictions of them have been rendered utterly and completely extinct. Furthermore, these depictions NORMALIZE abuse in the minds of offenders, making them MORE LIKELY to perpetrate abuse themselves. There is no evidence that fictional CP does this, because viewers know it is fake. But real depictions do indeed normalize abuse!

Identifying victims is a somewhat compelling interest, although it could equally be done via counseling in places like schools.

Yes, it could be done by looking for warning signs, but identified CSAM offers near-absolute proof and can be part of the solution.

A hash can't really identify anything that isn't already known, so it wouldn't be much good for new victims, and police are surprisingly effective at infiltrating dens

But it can be used to stop the spread of identified images. CSAM survivors will one day be able to live a normal fucking life, if only these sick, degenerate fucks would stop possessing or sharing this horrifying material. I honestly do not know how else to put it. Every time these images are viewed, the survivors suffer. If someone watches a video of a victim being raped as a child, that victim, even as an adult now, suffers as a result of it being watched, regardless of whether they know about that specific viewer or not. PhotoDNA has the potential to transform lives if only it were used extremely aggressively.

Also, as non-degenerates, you, I, and others on this forum have an inalienable right not to be involuntarily exposed to CSAM. As you said earlier, even the fake CGI stuff terrifies you; you would be far more terrified if you stumbled across real CSAM. I don't know if you are aware of this, but at the federal level the average prison sentence across all crimes is about 4.5 years, and CSAM possession without intent to distribute also runs around 4.5 years. Why the fuck should we be involuntarily exposed to such horrific material? Law enforcement is not perfect, so stumbling across it could still lead to a false conviction if they wrongly believed you accessed it intentionally. THERE HAVE ALSO BEEN CASES WHERE INNOCENT PEOPLE WERE FRAMED BY CRIMINALS, AND ACCIDENTS HAPPEN; COURTS ARE NOT PERFECT. Why not prevent this horrifying scenario by censoring the fuck out of CSAM? Make no mistake: CSAM is extremely dangerous, to the victims and to the general public.

It's mostly for these reasons that I want PhotoDNA censorship to be deployed with extreme aggressiveness. I want censors deployed not just in browsers like Tor and Chrome, but in every fucking hard drive, every fucking SD card, every fucking thumb drive. It should even be deployed in printers, because we know some sick fucks are going to try to preserve their degenerate material by printing it out. Their sick collections will be eradicated, the victims will be able to heal, and the non-degenerates, like those on this forum, me, and Prostasia, will be safe.

If you up the funding by millions or tens of millions, they should be more effective: you will arrest more people, and you don't get a police state. The people most likely to be caught are the ones doing the most damage by spreading images far and wide.

Yes, no one is arguing against this. Find those who run child porn sites and bring these subhuman creatures to justice. Castrate and mutilate them if you must, it does not matter what justice is done so long as they are stopped.

What about the FBI who're hosting them themselves?

1 Like

What about the FBI who're hosting them themselves?

The FBI needs to rescue children. So they have to break some moral codes to achieve their end. This is part of the reason why torture of terrorist suspects is completely justified by the CIA. It's not pretty, but it needs to happen.

Do you believe for a moment that this won't also result in the copyright industry requiring that you can't save copyrighted content on your hard drive unless it is licensed? And that governments won't include hashes of official documents so that whistleblowers can be immediately identified before they are able to share leaked documents?
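
The matching machinery is entirely content-agnostic; whoever controls the list controls what gets blocked. A hypothetical sketch to make that concrete (the list names and digests are placeholders, not real databases):

```python
# The lookup doesn't care what the hashes represent; the list owner decides what is blocked.
import hashlib
from pathlib import Path

BLOCK_LISTS = {
    "csam": {"<digest supplied by a child-protection hotline>"},
    "copyrighted works": {"<digest supplied by a rights holder>"},
    "leaked documents": {"<digest supplied by a government agency>"},
}

def blocked_as(path: str) -> str | None:
    """Return the name of the block list the file matches, or None."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    for name, digests in BLOCK_LISTS.items():
        if digest in digests:
            return name
    return None
```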

1 Like

Do you believe for a moment that this won't also result in the copyright industry requiring that you can't save copyrighted content on your hard drive unless it is licensed? And that governments won't include hashes of official documents so that whistleblowers can be immediately identified before they are able to share leaked documents?

Of course they will do that. There is no question that will happen. But how important is it for people to be able to violate copyright law? I'm more than willing to sacrifice that for the genuine, utter annihilation of known CSAM. I've read articles on how CSAM could be genuinely destroyed. Katya PMed me an article from a university in Kazakhstan about a hypothetical plan to install PhotoDNA onto all imported computers to prevent criminals from storing CSAM. Hany said something about browser-level censorship of this degenerate material.

From what we know, there are over 10,000 identified CSAM victims, and research from protectchildren.ca suggests 85% need continued therapy, probably as a result of the sick fucks who keep accessing images of their abuse. I believe that if we could eradicate CSAM, we could vastly improve the lives of these individuals. I would love nothing more than to utterly annihilate the sick fucks' collections of horrors.

As for the whistleblower issue, that IS a valid concern, and I'm not sure how to resolve it. But as a utilitarian, I look at both pros and cons. Cure perhaps more than 10,000 people of this curse so that they can one day live normal lives like the rest of us, or let the issue go without proper intervention? There is no way to fully stamp out this horrible problem: criminal justice, therapy, and rehab can only get so far with offenders. More needs to be done to STAMP IT OUT before offenders can even begin.

1 Like

Freedom is important, but it needs to be balanced with the rights of the child. I support a ban on all computers, microchips, and browsers that don't have these specific filters! Too much suffering is caused by this crime! We can prevent the re-harming of CP victims, and we can prevent people from committing those crimes and getting into trouble in the first place. It needs to be done.

I certainly appreciate him mentioning that. I would hate for someone to unknowingly or unintentionally have CSAM show up (say, on their social media timeline) and get in trouble even though they weren't "seeking it out".

2 Likes

Ideally these filters would protect not only victims but also the general public.