I have looked at that paper, and again, no.
I don’t doubt that they work with law enforcement, but if it’s not part of the NCMEC’s list of hashes, then it’s likely not actionable unless verified against their list. It’s far more likely that they’re just taking the hash database given to them by the NCMEC, adding their own hashes, and forwarding that to ECPAT, which is both legally and ethically dubious, since many of these images have to be triaged and followed up on by qualified, authorized federal investigators so that identities can be matched.
Under federal law, no private entity that hasn’t been vetted by the government is allowed to handle child pornography, and to my knowledge, the CRC has not been granted that privilege. I have no doubt that they’ve created their own tools, but I doubt that other companies, let alone Google, have used their tools or hashes to detect and remove CSAM.
Google has developed its own regime to detect and report CSAM, and, like Microsoft and everyone else, it uses what was provided by the NCMEC.
Also, I don’t trust the CRC any further than I can throw them. They’re on record spreading misinformation about the supposed likeness of a woman’s daughter being used to market a child sex doll, a claim that was easily debunked by simply looking at the doll itself and how far back it went.
They’re liars.
You can find all that information here:
Because the likelihood of false positives increases drastically with the simplicity of the image, given how malleable simple images can be and the fact that artists usually create these images in sets. There’s a reason why this type of technology is restricted to CSAM detection, and why existing regimes utilized by both public and private interests do not include them.
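To make the false-positive point concrete: perceptual hashes (the broad family of techniques behind tools like PhotoDNA) map visually similar images to nearby bit strings, and the simpler an image is, the fewer distinguishing features the hash has to work with. Here’s a toy sketch using a basic "average hash" over an 8×8 grid — purely illustrative, not the algorithm of any real detection system:

```python
# Toy perceptual "average hash": threshold each pixel against the mean,
# producing a 64-bit signature. Similar images -> similar bit patterns.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes (0 = identical)."""
    return bin(a ^ b).count("1")

# Two *different* flat, simple drawings: a dark half and a light half,
# just with different shades. Visually distinct files, same structure.
simple_a = [[30 if x < 4 else 220 for x in range(8)] for _ in range(8)]
simple_b = [[45 if x < 4 else 200 for x in range(8)] for _ in range(8)]

# A detailed image with varied texture across the grid.
detailed = [[(x * 37 + y * 91) % 256 for x in range(8)] for y in range(8)]

print(hamming(average_hash(simple_a), average_hash(simple_b)))  # 0 -> collision
print(hamming(average_hash(simple_a), average_hash(detailed)))  # nonzero distance
```

The two distinct simple drawings collapse to the exact same hash, while the textured image lands at a real distance — which gives a sense of why matching thresholds tuned for detailed photographs become unreliable on simple artwork, and why drawn images in near-identical sets would be especially collision-prone.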
The CRC is playing with fire by doing this, and given how their own tools are not even used as often as they claim, I’d say they’re not worth worrying about.
What you SHOULD be worried about are the tools used by Cybertip.ca.