Facebook Messenger?

Do you think Facebook Messenger’s plan to go encrypted is linked to the unreasonable liabilities FOSTA imposes on platforms? The tech industry has a tendency to route around unreasonable laws and policies in ways that end up worse than whatever benefit the law was supposed to deliver.

For instance, Facebook can currently see images as they flow through its platform, but if it goes encrypted, a significant amount of content will go dark, and it will be impossible to tackle that without breaching people’s fundamental right to privacy.


The U.K. strongly opposes it, which is not surprising considering that the U.K. resembles a modern-day 1984 that hates accountability, privacy, and security.

At a certain point, with all the overzealous AIs, government officials snooping on private affairs for giggles (LOVEINT at the NSA), bans on arbitrary things, backdoors everywhere so Russia, the US, and script kiddies can wreak havoc (it looks like we all want to go back to the good old days of the 2000s, when everyone got infected by a virus and had to spend $200 to get it fixed), and random communities and people banned arbitrarily so you don’t get sued (FOSTA), you might as well just shut down the internet and call it a day.

Do people even understand how AI works? It is a big box of weights. Numbers. You basically have a box full of random numbers; you feed it data and run it millions of times over a certain period, nudging the weights toward curves that happen to categorize images slightly better, until it happens to do what you want it to do, and you have your AI. It has less intelligence than a gerbil and a crazy number of false positives, especially at scale.
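To make the "box of weights" picture concrete, here is a toy sketch: a single weight and bias, started at random, repeatedly nudged by gradient descent until they happen to separate two clusters of fake 1-D "images." Every name and number here is illustrative; real systems have millions of weights, not two, but the loop is the same idea.

```python
import math
import random

random.seed(0)

# Fake "images": points near 0.0 are class 0, points near 1.0 are class 1.
data = [(random.gauss(0.0, 0.2), 0) for _ in range(50)] + \
       [(random.gauss(1.0, 0.2), 1) for _ in range(50)]

# The "box of random numbers" we start with.
w, b = random.random(), random.random()

def predict(x):
    # Squash the linear score into a 0..1 probability.
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

lr = 0.5
for _ in range(5000):  # real systems run this millions of times
    x, y = random.choice(data)
    p = predict(x)
    # Nudge the weights slightly toward a curve that classifies better
    # (stochastic gradient descent on the logistic loss).
    w -= lr * (p - y) * x
    b -= lr * (p - y)

accuracy = sum((predict(x) > 0.5) == bool(y) for x, y in data) / len(data)
print(f"accuracy on its own data: {accuracy:.2f}")
```

Note that "cracking it open" afterward just shows you two floats, `w` and `b`; nothing about them says "cluster detector" until you test them on inputs.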

If you crack it open, you just see a bunch of random numbers, and you have no clue what it even does without testing it on images to see if it happens to work there. That means you can be using the site normally, only for some innocuous image, or something silly like beach nudity or breast-feeding, to trigger an alert leading to a conversation with a moderator (if they ever come), who might decide you’re an eyesore and get rid of you anyway for some arbitrary reason.
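The false-positive problem at scale is just base-rate arithmetic. A quick sketch with made-up but plausible numbers (these are assumptions for illustration, not Facebook's actual figures): even a classifier that is right 99% of the time buries moderators in false alarms when the content it hunts for is rare.

```python
# Back-of-the-envelope base-rate calculation (all numbers assumed):
uploads_per_day = 350_000_000   # assumed daily photo volume, order of magnitude
bad_rate = 1 / 100_000          # assume 1 in 100k uploads is actually bad
true_positive_rate = 0.99       # assume the model catches 99% of bad images
false_positive_rate = 0.01      # assume it wrongly flags 1% of clean images

bad = uploads_per_day * bad_rate
clean = uploads_per_day - bad

true_alarms = bad * true_positive_rate
false_alarms = clean * false_positive_rate

precision = true_alarms / (true_alarms + false_alarms)
print(f"flags per day: {true_alarms + false_alarms:,.0f}")
print(f"share of flags that are actually bad: {precision:.2%}")
```

With these assumptions, well under 1% of the images that get flagged are actually the thing the model was hunting for; the other 99%+ are ordinary users tripping the alarm.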

You also do all of that, put in all that work, anger all those people, and the press just ends up twisting the work against you anyway, looking for every little thing to construe as proof that you’re evil. That is one reason why moderation of any sort of abuse is so abysmal: if you don’t try, the worst you get is a tut-tut; if you try, you get punished hard.

Reddit is even running around smacking down anime communities which don’t even have anything to do with porn, apparently. Who knows who else will be smacked down once they fall out of favor with the media.

https://news.ycombinator.com/item?id=21149744
People on Hacker News weigh in on the latest bit of news; their discussions are always enlightening.

It’s hard to say whether Facebook’s plan to add encryption (hopefully end-to-end, though I don’t know the details) is related to FOSTA/SESTA.

I would suspect it’s more likely Facebook’s attempt at telling people, “Hey, look! We care about your privacy.”

Those in power want privileged access to personal communications and are using public safety as a reason. It’s not unreasonable to speculate that much of what there is to gain is unrelated to public safety. Also, privileged access leaks could mean great losses.

Perhaps, although the number of privacy scandals they have been involved in is simply legendary, and they never really did anything about them. Just look at what they manage to get away with: https://www.forbes.com/sites/kashmirhill/2014/06/28/facebook-manipulated-689003-users-emotions-for-science/#75369864197c

I do know they are having a lot of problems with false positives from the AIs that are supposed to tackle things like illicit images (I believe it is a nudity detector of sorts), and that may have a connection to FOSTA. I do slightly scratch my head at the notion that the mere existence of images is grounds for a lawsuit under a sex-trafficking law, but people do love to sue.