Article: Family photos land at the Federal Police

This article with quotes from Prostasia’s Jeremy Malcolm was just published in the Swiss newspaper SonntagsZeitung. This is an English translation.

Family photos land at the Federal Police

To protect children, Google and Facebook monitor all users - and the authorities are flooded with reports of suspicious posts

Philippe Stalder

Bern. On a desk at the Federal Police lies a holiday photo that a Mr. Huber sent to his colleagues via Facebook chat. It shows Huber in a deck chair, with two children jumping into the water in the background. The question to be answered: harmless holiday photo or child pornography? The verdict: a false alarm, once again.

“Last year we received around 9,000 reports,” says Florian Näf, spokesman for the Federal Office of Police (Fedpol), who described the scene above. “Just under ten percent of them were criminally relevant.” The rest were reports with no criminal relevance.

Most people on the Internet use the services of American operators such as Gmail, Facebook, or WhatsApp. These companies are required by law to report illegal content - such as pornography involving children or animals, as well as extreme violence - to the private non-profit National Center for Missing and Exploited Children (NCMEC). NCMEC performs a triage and forwards the reports, unprocessed, to Fedpol whenever the sender or recipient is in Switzerland.

Algorithms search for naked skin and children

According to Näf, the operators not only investigate specific suspicious cases but also scan all uploaded content - even in private messages - using algorithms that recognize, among other things, naked skin and children. Fedpol employees must therefore review every report and decide whether it shows a harmless holiday or family photo or child pornography. The former are deleted along with the associated personal data; the latter are forwarded to the cantonal law enforcement agencies.
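How such scanning works in detail is not public, but its basic building block is widely understood to be perceptual hashing: every upload is reduced to a compact fingerprint and compared against fingerprints of already-known illegal images. The sketch below illustrates the idea using the open-source imagehash library as a stand-in; the hash value, the distance threshold, and the database are invented for illustration, and production systems such as Microsoft's PhotoDNA are far more robust.

```python
# A minimal, illustrative sketch of perceptual-hash matching, using the
# open-source imagehash library as a stand-in for proprietary systems
# such as Microsoft's PhotoDNA. The hash value, the distance threshold,
# and the database below are all invented for illustration.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of already-known images.
KNOWN_HASHES = {imagehash.hex_to_hash("d1c4a0b2e8f09317")}

MAX_DISTANCE = 5  # assumed Hamming-distance threshold for a "match"

def matches_known_content(path: str) -> bool:
    """Return True if an upload is perceptually close to a known image."""
    upload_hash = imagehash.phash(Image.open(path))
    # A perceptual hash changes little under re-encoding, resizing, or
    # small edits, so a small Hamming distance flags a near-duplicate.
    return any(upload_hash - known <= MAX_DISTANCE for known in KNOWN_HASHES)
```

Because hash matching can only catch near-duplicates of known material, genuinely new images - like Mr. Huber's holiday photo - can only be flagged by separate classifiers, which is where the false alarms described above originate.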

It is work that takes a toll. Time spent reviewing child pornography material is therefore limited to a certain number of hours, says Fedpol spokesman Näf. In addition, the investigators meet with a psychologist twice a year.

The operators' duty to screen all content is also highly questionable for data protection reasons, warns Martin Steiger, a lawyer specializing in digital law. “It amounts to private mass surveillance that places all users under general suspicion without cause,” says Steiger. A society must take firm action against the spread of child pornography, he says. Given the low hit rate, however, Steiger raises the question of whether the operators' current approach is really proportionate. “The end does not justify all means.”

“The system lacks any transparency”

Jeremy Malcolm is the executive director of the Prostasia Foundation, a US organization for evidence-based child protection. He too criticizes the current procedure: “The system lacks any transparency and accountability.” Malcolm sees the danger that, in the name of child protection, broader censorship of artistic freedom and of minorities is being committed.

Research by a journalist writing under the pseudonym Violet Blue has shown that many child pornography reports in the United States come from Protestant child protection groups, which often report digital educational services for adolescents, pedophilia prevention services, and even controversial art projects. In April 2018, a strict law against the sex trade came into force in the US. Since then, according to Malcolm, websites are increasingly being blocked preventively - which, in the case of counseling services, harms children more than it helps them.

The American child protection organization NCMEC declined to comment on its practices. However, an internal document reveals that 99 percent of the reports come from private operators such as Google and Facebook. Moreover, after the operators began monitoring their users algorithmically, the number of reports skyrocketed: from 1.1 million in 2014 to 8.2 million in 2016. NCMEC attributes the sharp increase mainly to “the use of new technologies.”

The operators keep their algorithms secret. When asked, Facebook says only that each upload is matched against a database of already-known child pornography content, and that algorithms that recognize nudity are used in addition. In the first quarter of 2019, Facebook discovered 5.4 million such posts, closed the accounts, and reported the users behind them to NCMEC. How high the proportion of false reports is, however, even Facebook does not say.
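Taken together, Facebook's description suggests a two-stage pipeline: a hash match against known content, then a nudity classifier for everything else, with flagged items escalated to human review and reporting. The sketch below is an assumption-laden illustration of that flow; all names, scores, and thresholds are hypothetical, since the real systems are proprietary.

```python
# An assumption-laden sketch of the two-stage screening Facebook
# describes: (1) hash match against known content, (2) a nudity
# classifier for new content, (3) escalation to human review.
# All names, scores, and thresholds here are hypothetical.
from dataclasses import dataclass
from enum import Enum, auto

class Verdict(Enum):
    CLEARED = auto()       # no signal; nothing is retained
    HUMAN_REVIEW = auto()  # classifier flagged; a person must decide
    KNOWN_MATCH = auto()   # hash match; reported automatically

@dataclass
class Upload:
    phash: int           # perceptual hash of the uploaded image
    nudity_score: float  # output of a hypothetical classifier, 0..1

def screen(upload: Upload, known_hashes: set[int],
           nudity_threshold: float = 0.9) -> Verdict:
    if upload.phash in known_hashes:
        return Verdict.KNOWN_MATCH
    if upload.nudity_score >= nudity_threshold:
        # False positives at this stage are what flood Fedpol with
        # harmless holiday photos, as the article describes.
        return Verdict.HUMAN_REVIEW
    return Verdict.CLEARED
```

In a pipeline like this, the proportion of false reports would be determined almost entirely by where the classifier threshold is set - which is exactly the figure Facebook does not disclose.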

Illegal content on the phone - what to do?

Sometimes you cannot avoid them: people who find it particularly funny to share inappropriate content on WhatsApp or another service. Anyone who views such content, whether with disgust or perhaps amusement, automatically downloads it onto their device - and thereby makes themselves liable to prosecution. Martin Steiger, the lawyer for digital law, advises deleting such videos immediately and protesting clearly in the chat, for the benefit of any investigators who may later read it.

Photo caption: Harmless holiday photo or child pornography? 90 percent of the material assessed in Switzerland is not criminally relevant.
