The Trichan takedown: Lessons in the governance and regulation of child sexual abuse material

This could also be posted under ‘CSA prevention’ or ‘Criminal justice’.

I am keen to see others’ takes on this before adding my own opinion… other than to say I think this is an initiative that Prostasia can get behind.

Abstract, taken from the paper:

Amidst renewed concern about the prevalence of online child sexual abuse material, the global technology sector is refocusing on models of multistakeholder governance and the development of new technological solutions. This paper argues that the language of multistakeholderism and technological solutionism obscures the administrative and commercial practices that facilitate the widespread distribution of abuse material. To illustrate this point, the paper describes the 2019 intervention of the Canadian Centre for Child Protection in the operations of “Trichan”, three websites that were amongst the largest purveyors of abuse material on the open web for 7 years. The case study underscores the materiality of the Internet and the role of commercial relations within the infrastructure stack in the provision of illegal content. While identifying opportunities for the mass removal of abuse material, the paper questions the discretion granted to technology companies under laissez faire regulation, and troubles characterizations of Internet infrastructure as neutral and instrumental factors in the epidemic availability of abuse material.

I tend to agree. When CSAM is presented like this, it is treated not as the commoditization of a child’s sexual abuse and the end result of a supply-and-demand economic model that thrives on the sexual exploitation and abuse of children, but rather as a kind of “annoyance”: an arbitrarily decided nuisance that merely ‘inconveniences’ global tech companies with the legal requirements that governments place on them to detect, report, and remove CSAM wherever it appears.

At least… that’s the overall point I’m getting from the abstract. Framing and treating it like this does no one justice, and I’m afraid people may begin to look at CSAM as though it were “just a photograph” that has to be destroyed, sort of like how Chinese tech companies have to treat speech or material critical of, or unfriendly to, the ruling political regime.

It’s getting to the point where society ought to be reminded of exactly why CSAM is, and should be, prohibited. I feel that conflating art/fiction and child sex dolls with the actualization of child sexual abuse does little justice to victims of CSA.

@prostasia @terminus does this seem on-point to you or am I getting it wrong? I haven’t been sleeping very well lately.


We will have some commentary on this in our next newsletter. It was actually Prostasia Foundation that originally had the gateway to the Trichan websites removed from Bing and DuckDuckGo, so we obviously applaud this outcome. However, we do not agree that the Canadian Centre’s tactic of pressuring hosting providers to cut off service without a court order is a good one, because it has been misused to take down lawful websites without accountability or review. Once again, full details will follow in our next newsletter.



Yeah, I still don’t think the Canadian Centre for Child Protection is doing a good job, considering that its own bots purposely uploaded CSAM to a website dedicated to image-searching anime/manga artwork, then reported the very CSAM they had uploaded to the site’s host, which promptly cut the site off.


Now that you mention it, I did note this at the time but forgot it related to Trichan. I did try to search the forum for any Trichan-related post before posting, but nothing came up.

Regarding the tactics you mention: I guess that’s what was nagging at my subconscious as I read the article. I felt a little uneasy about whether the “just do it” approach was a bit reckless. Even though it sort of echoes the storyline of some Hollywood justice warrior… I guess that’s exactly why it might be a problem.

Another consideration was that they didn’t seem to have a game plan for actually finding, apprehending, or prosecuting the person(s) running these sites. Somebody must be writing these emails, after all, unless it’s an AI. (There’s a thought: a “paedophilic” artificial intelligence… :thinking:?)

Page 19 in particular gives a partial clue as to why law enforcement (and the governments that provide the legislation it interprets) is much keener to go after the ‘low-hanging fruit’ of people who download CSAM than to face the frustration of preventing the abuse material from being made available in the first place.

“In the case of the technology industry, there are no legal obligations to deny service to customers who collude or facilitate in child sexual exploitation, although such commercial arrangements are fundamental to the ongoing CSAM epidemic. Nor has the private sector implemented comprehensive voluntary child protection measures that would screen clients engaged in the provision of illegal content.”

Despite (or because of) this, there is a widely held, if not altogether conscious, belief that people who can empirically be seen to want to view CSAM are more culpable in this online transaction than those who are willing to supply it: ‘supply is driven by demand’. All the Trichan boards are doing, after all, is taking advantage of an enterprising opportunity.

In a way it’s a bit like telling a pharmacy (or their suppliers at least) that they can provide anything to anyone, and it’s up to the end customer to not purchase anything illegal.

Anyway, 4am and I’m going to bed. 'Night.