Section 230 of the CDA is at risk

Section 230 is at risk again, this time from a bill being pushed by Sen. Graham that would sunset the law's provisions.
President Donald Trump has been very problematic regarding this law, as have most Republican members of Congress, largely because he wants to punish social media companies for moderating their platforms appropriately.

I’m hoping we can see Sec. 230 safeguarded and unfettered.

We cannot allow the legal groundwork that has preserved and enabled the Internet as we know it to be weakened or jeopardized.
These right-wing ideologues each have their own ideas about why Sec. 230 should be repealed, all of which seem to have absolutely no relevance to why it exists, or are deliberate attempts to undermine our free, open Internet.

I used to believe that Section 230 protects MAP-supporting groups from being censored, but now I think differently. With the recent censorship of topics by platform providers (COVID-19 posts opposing the WHO’s positions while the WHO is the one accused of wrongdoing, election fraud accusations with evidence, the Nanjing Massacre), I see that providers can manipulate, flag, or silence anything they dislike without legal liability. It seems their violations of free speech are protected by Section 230 as well, under the name of “moderation.” If they come after MAP-support topics next, I cannot imagine what the rest would look like.
Even if Section 230 is not bad enough to be abolished, it needs some changes regarding the responsibilities of platform providers. Not only politicians but also big provider companies may abuse this power.

I genuinely don’t see the benefits of this law. It looks like a free-to-be-negligent statute. But I suppose Americans are fine with big business breaking laws. Thanks to Sec. 230, Facebook cannot be held accountable for the most heinous crime, genocide, which is taking place in Myanmar. A proper response would be for Facebook to shut down all genocide promoters.

Aside from the obvious elephant in the room of genocide, another issue is that social media can simply turn a blind eye to child rape imagery and remain free of civil liability. We need social media platforms to be forced to find all the illegal content and genocide promoters and keep them off their sites.

First of all, you’re wrong. Completely and wholly wrong. Sec. 230 means that websites aren’t held liable for how users may misuse or abuse their platforms. This is a good thing, because it protects a website from being shut down simply because it was used for committing crimes.
Without it, anybody with the means to do so could very easily turn a website’s capability to operate in a robust, user-friendly manner into a legal liability, thereby putting websites at the arbitrary mercy of criminals and limiting what types of services could reasonably or feasibly exist.
The Internet and its sphere of user-generated content is a consequence of freedom; it was not designed to be limited in such a way, nor was it designed to be a curated “walled garden” of content like we see on cable TV networks. There are already limitations on Sec. 230 protections, such as when a website does not fulfill its requirements to remove and report CSAM to authorities, or acts in “bad faith” in matters pertaining to such subject matter.

As for the “genocide” comment, I doubt you’re correct. Facebook has some of the strictest “community standards” and rules I’ve ever seen on a social media platform. Anything encouraging or abetting genocide of any kind would more than likely be removed from its platform, as threats, hate speech, and the encouragement of violent acts are expressly against its ToS.

If websites are not doing enough to remove illegal content, they have zero right to exist. It’s that simple. Ending Section 230 does not mean websites will automatically shut down. It means they will now be required to search for all illegal content, and if they refuse, they will be shut down. It appears I know more about the laws in your country than your organization does. Most countries do not have Section 230 and they do fine.

It’s firmly apparent that you don’t.

Because most websites, even foreign ones, host their content on hardware and networking infrastructure that’s physically located in the United States precisely for those Section 230 protections.
An example would be the Japanese imageboard 2channel. 2channel is considered a predecessor to websites like 4chan and 8chan, but is wholly and identifiably Japanese in both function and audience.
Uncensored hardcore pornography is illegal in Japan, and websites hosted there have to employ stringent, unusual content restrictions on what they can allow on their sites if they wish to maintain an adult audience.
This means that platforms must dedicate a significant amount of time, money, and resources to moderation, which can be prohibitive if you’re not a large company or don’t have deep pockets.

By hosting the infrastructure in the US and having content delivered securely and efficiently via CDNs such as Cloudflare, such sites can host whatever they want so long as it doesn’t violate US law.

There are many other examples I could cite, including Australian communities that are based in Australia, operated from Australia, but hosted out of the US in order to function. Section 230 was and is a piece of valuable foresight into the nature of what the Internet would become and is necessary in order to maintain and actively protect not only our freedoms, but perhaps yours as well.

Every site is obligated to remove illegal content the moment it is spotted, and that happens pretty much immediately once someone spots such content and reports it. If a site decides not to remove such content immediately, it can already be sued.

You don’t even know how extremely rare such situations are, yet you are uncritically willing to accept a change that will drastically affect every person on this planet, on the blind belief that it’s rational. Are you really gullible enough to believe that any change the government makes is always the proper one, or that it’s always for the betterment of society? What happened to the belief that the elites participate in a cult that wants to legalize the sexual abuse of children? Did they suddenly start caring about the interests of society instead of their own?

Eliminating Section 230 doesn’t mean that companies will now be more responsible for removing illegal content. You live in a fantasy land where there are infinite money, people, and time, and any idea is possible. In the real world, things cost: not only money, but also the people who need to be hired, the work they have to do, the time they can spend doing that work, the devices that require maintenance, and much more. Vast amounts of data are sent on those platforms daily, from all over the world. It’s impossible for humans to verify everything, and AI makes mistakes, which in this case could cost the companies too much for them to even consider taking such risks.
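To put the scale argument in perspective, here is a rough back-of-envelope sketch of what purely human pre-screening would cost. Every figure in it (upload volume, review speed, wages) is a hypothetical assumption chosen for illustration, not a sourced statistic:

```python
# Back-of-envelope estimate of what purely human pre-screening would cost.
# Every number below is an illustrative assumption, not a sourced statistic.

uploads_per_day = 500_000_000          # assumed daily uploads on a large platform
seconds_per_review = 10                # assumed time for a human to judge one item
shift_seconds = 8 * 3600               # one 8-hour moderator shift
hourly_wage = 15.0                     # assumed moderator wage in USD

reviews_per_moderator = shift_seconds // seconds_per_review   # 2,880 items per day
moderators_needed = uploads_per_day / reviews_per_moderator   # ~173,600 people
daily_wage_bill = moderators_needed * 8 * hourly_wage         # ~$20.8M per day

print(f"Moderators needed: {moderators_needed:,.0f}")
print(f"Daily wage bill:   ${daily_wage_bill:,.0f}")
print(f"Yearly wage bill:  ${daily_wage_bill * 365:,.0f}")    # ~$7.6B per year
```

Even with these conservative guesses, the wage bill alone lands in the billions per year, before infrastructure, training, or moderator turnover.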

Sites that willingly host illegal material already exist on the darknet and will remain there for years to come. They are already breaking the law, so they don’t care about Section 230, and never will.

Therefore this change ultimately affects regular people, law-abiding citizens like you and me.

In reality, most platforms will simply disable any means of communication, and those that don’t will approve your posts and content only after verification, which might take a long time. There will be strict limits on how many posts you can make daily, and most likely on messages as well.

Imagine communicating with your friends and family with a maximum of 10 messages per day total, across all contacts. Or imagine you are in a life-threatening situation and want to ask for help through a messaging app, but you have already exceeded your limit and have no other way of communicating with people.
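To make the scenario concrete, a hard daily quota like that would be trivial for a platform to implement. A minimal sketch, assuming a hypothetical in-memory store and the 10-message cap from the scenario above:

```python
from collections import defaultdict
from datetime import date

DAILY_LIMIT = 10  # the hypothetical platform-wide cap from the scenario above

_sent_today = defaultdict(int)  # (user, day) -> messages sent; in-memory for the sketch

def deliver(sender: str, recipient: str, text: str) -> None:
    """Stand-in for the platform's real delivery pipeline."""
    print(f"{sender} -> {recipient}: {text}")

def try_send(sender: str, recipient: str, text: str) -> bool:
    """Deliver a message only if the sender is still under today's quota."""
    key = (sender, date.today())
    if _sent_today[key] >= DAILY_LIMIT:
        return False  # quota exhausted: even an emergency message is dropped
    _sent_today[key] += 1
    deliver(sender, recipient, text)
    return True

# The 11th message of the day never goes out, whatever its content:
for i in range(11):
    if not try_send("alice", "bob", f"message {i + 1}"):
        print(f"message {i + 1} blocked by daily quota")
```

The point of the sketch is the failure mode: the check has no notion of urgency, so the eleventh message is dropped no matter what it says.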

But I guess a couple of thousand people dying per month is worth making sure that material regular people almost never see will now be completely unseen by them, while those who want to see it will lose nothing and will continue exchanging it on the darknet.

Every site is obligated to remove illegal content the moment it is spotted, and that happens pretty much immediately once someone spots such content and reports it. If a site decides not to remove such content immediately, it can already be sued.

IF they become aware of it. Right now, there is no legal obligation for these companies to search their own platforms for illegal images and criminal communication. There need to be LAWS MANDATING that they search their own platforms for illegal content and isolate it from view for reporting to authorities! THIS IS THE GOAL OF THE REPEAL. I looked at the reporting functions on various social media. They make it very difficult to report things. I then thought about how someone would report an illegal image of a minor, and it does not appear to be very straightforward. THIS IS BY FUCKING DESIGN. They don’t want to become aware of it so they aren’t fucking legally mandated to take action, which costs them money. This is the problem I have with 230. IT NEEDS TO BE AMENDED or destroyed and rebuilt from the ground up.

Imagine communicating with your friends and family with a maximum of 10 messages per day total, across all contacts. Or imagine you are in a life-threatening situation and want to ask for help through a messaging app, but you have already exceeded your limit and have no other way of communicating with people.

We will allow them protection on the grounds that social media companies actively take reasonable steps to seek out, isolate, and report illegal content on their platforms.

More needs to be done to target and take down those on the dark web who produce and trade this material. However, even on mainstream social media sites, there is a huge problem with illegal imagery. The dark web having a worse problem is no excuse.

There is an obligation for them to search their own platforms for such materials. This is why you barely ever see them. Again, you expect the impossible to happen: for a company to have every single message or media file verified by a human being and accepted as appropriate across hundreds of categories, checking whether it breaks some law or not. And what about legal grey areas? What if something is a matter of interpretation? This is why we have courts and judges instead of just the police: to decide ambiguous matters. A company will not risk getting a fine; if it concludes that what you post could in any way be considered illegal, not merely actually be illegal, it will censor you.
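The mistake problem also compounds at scale, which a quick calculation shows. Both figures below are hypothetical assumptions chosen only to illustrate the shape of the problem:

```python
# Even a highly accurate automated filter makes a huge number of mistakes at scale.
# Both figures below are illustrative assumptions, not measured statistics.

items_per_day = 1_000_000_000    # assumed daily posts/messages screened automatically
false_positive_rate = 0.001      # assumed: 0.1% of perfectly legal content gets flagged

wrongly_censored = items_per_day * false_positive_rate
print(f"Legal items wrongly censored per day: {wrongly_censored:,.0f}")
# => 1,000,000 legal posts silenced every day, before any appeal is heard.
```

Flip the assumption around and the same arithmetic explains the over-censorship incentive: the cheapest way to cut false negatives is to flag more aggressively, which multiplies exactly these false positives.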

The problem isn’t that something illegal is hosted; the problem is that it’s being seen by people. And when the very first person sees such content, it’s removed pretty much immediately once they report it; if not, the company gets sued. In my entire life, I’ve seen illegal content on social media, with the exception of copyrighted material (which is uploaded quite frequently), exactly once. In my entire life. Seeing it once more would be far better than living with permanent limits on what I can and can’t say. You literally hope that all the evil in the world will magically disappear with laws. But that is sadly not how reality works.

Advocating for improved reporting systems is certainly a good thing. I’ve never had problems finding the buttons to report material, but I might be more tech-savvy than the majority of people, so that’s understandable.

As long as there is an option to report something, you can learn where it is and be prepared for such a situation. You only need to learn how to report content once, so I don’t really see how they could have designed this functionality to achieve the effect you describe. Do they change the position of the report button every time you reload the page?

Well, that is the thing: how do you want to protect people’s ability to communicate with each other freely when the law being introduced pushes companies in a direction where they have to filter every single attempt at communication to avoid legal action? You can’t do both unless you introduce a new law, which will most likely take effect only after a couple of years, during which the problem I mentioned might arise.

But even then, this creates a loophole around the initial reason the first law was introduced. If you allow companies to go unpunished when it comes to private communication, then illegal materials will be distributed through private communication. I mean, I bet most governments would be happy with that outcome; they seem to care about not seeing problems rather than fixing them, but I don’t think either you or I would be satisfied with it.

You cannot destroy a problem; you can only solve it. There is another option that you aren’t taking into consideration. If your problem with 230 is that companies aren’t motivated to filter materials like CSEM that they host, and you aren’t satisfied with a user-reporting-based approach, you can create a third-party company that specializes in filtering CSEM and integrates with the social media platforms to instantly hide any flagged material, showing users a notice that the material was hidden at the government’s direction. The government then attaches legal penalties: if the third-party company hides anything other than legitimate CSEM, it can be fined, and if a social media platform purposefully shows hidden material that legitimately was CSEM, it can be fined.

This way, you have two companies with a self-censoring attitude (to avoid fines, they must cooperate toward the common goal of removing CSEM), and the risk of censoring legal things still exists but is limited to the lowest possible level.
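As a sketch of how that integration might look in practice: the platform passes each new upload (or, more realistically, a hash of it) to the third-party filter and hides it if flagged. Everything here, the class names, the hash-based matching, the notice text, is a hypothetical illustration of the proposal above, not a description of any existing system:

```python
import hashlib

class ThirdPartyFilter:
    """Hypothetical government-licensed filtering company.

    It is fined if it hides anything other than legitimate CSEM,
    so its blocklist has to be curated conservatively."""

    def __init__(self, known_illegal_hashes: set[str]) -> None:
        self._blocklist = known_illegal_hashes

    def is_flagged(self, content: bytes) -> bool:
        return hashlib.sha256(content).hexdigest() in self._blocklist


class Platform:
    """Hypothetical social platform integrating the filter.

    It is fined if it purposefully shows material the filter legitimately
    flagged, but it keeps the final say in case of malicious flagging."""

    def __init__(self, filter_service: ThirdPartyFilter) -> None:
        self._filter = filter_service

    def publish(self, content: bytes) -> str:
        if self._filter.is_flagged(content):
            return "[this material was hidden at the government's direction]"
        return content.decode("utf-8", errors="replace")
```

A real deployment would use perceptual hashing rather than exact SHA-256 matching, since trivial re-encoding defeats exact hashes; the sketch only shows how the two liability regimes meet at a single API boundary.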

It’s a win-win-win-win-win situation:
Win for the social platforms, since they get help dealing with illegal materials while still having the final say in case of malicious activity from the third-party company.
Win for the third-party company, since such a law creates a market for such companies, which increases economic growth.
Win for society, since people see far less illegal material, their freedoms aren’t completely removed, and there is now an additional market that creates more jobs.
Win for the government, since it isn’t responsible for the backlash resulting from restricting people’s speech.
Win for new social platforms, which will have an easier time starting their businesses when they can’t afford to filter through all the content themselves. Because there is a third-party company whose services they can hire, they don’t have to set up any systems of their own; instead, they can hire it for less money, since they have less content to moderate.

Basically, the social platforms’ lack of incentive can be used to create a new market that specializes in supplementing that lack of initiative and solving the issue in exchange for money from the platforms, the government, or both, using legal liability to balance the motivations of both companies.

It’s not the best solution, and there are still risks, but it achieves the same effect with much, much lighter restrictions on people’s freedoms. There is always a multitude of ways to solve a problem, not just “amend” or “destroy.”
