As politically conscious internet users and tech-savvy jurists may already be aware, Section 230 of the Communications Decency Act has been dragged into the limelight of controversy as of late.
President Donald Trump, the Justice Department, and a disturbingly high number of GOP Congressmen have become vocal about their contentions with the law and the (justified and necessary) immunity it provides web platforms, in both civil and criminal proceedings, for how their services are used by third parties such as users, provided the platforms act in “good faith”.
These immunity provisions are justified and absolutely necessary to preserve the Internet as we know it today. Without them, social media platforms, search engines, and the like would be nowhere near as effective as they are now: holding a website responsible for any crime or civil tort committed by a user or third party would significantly deter innovation in how we function as a society, and would block new or up-and-coming web platforms and services from ever existing.
Congress ought to concede to the reality on which our world is contingent and preserve Section 230 of the CDA for the good of our rights, freedoms, and continued prosperity as we move forward in a data-driven millennium.
Well…
McConnell wants to do away with the very thing that holds the modern Internet together. This has been no secret. Trump vetoed the NDAA (National Defense Authorization Act), a bill Congress routinely passes to fund our military among other things, because it did not include a provision repealing Section 230. The House voted to override the veto, of course, but Trump also railed against the $600 stimulus checks, saying they weren’t enough and calling for $2,000.
McConnell, of course, blocked that bill from reaching the Senate floor, triggering outrage and ire from all sides. But now McConnell has drafted legislation that would increase stimulus payments to $2,000, bundled with a Section 230 repeal and a resolution recognizing “voter fraud”.
It is unclear to me whether this is 4D or 5D chess: a ploy to kill off the Democratic-controlled House’s push for larger stimulus payments, or a means to achieve a goal the GOP has long pursued, the end of Internet free speech and platform liability protections.
What is clear, though, is that the push to repeal Section 230 goes beyond the flagrantly false allegations of “national security” threats and accusations of platform-based censorship. Repeal would destroy the Internet as we know it and harm sex workers.
I just don’t understand how or why people could look at Section 230 like it’s a bad thing… It’s quite a simple idea, really.
Internet platforms are open for anybody to use, so it is understandable and inevitable that some people will abuse them to commit crimes. It makes sense that the platforms themselves remain free from liability for those crimes.
It drives me insane with paranoia and fear that Republicans and conservatives can look at rational, common-sense policies that value freedom and innovation as “bad things” that need to be reined in like a rabid dog.
You’re not American if you don’t support our First Amendment or Section 230. The law contains “good faith” provisions and exceptions to hold websites accountable for knowingly allowing their services to be misused, and websites and platforms are already required by law to report CSAM/CP if and when they see it.
There is no plausible rationale to repeal or harm Section 230 of the CDA.
It is my hope that the Democrats can see through this ruse and retain Section 230 protections.
And that, should they cave in, the courts reinstate the law in some manner or fashion.
Sec 230 is basically a get-out-of-liability card for Big Tech. They should be liable for any revenge porn, CP, and murder/rape threats posted on their sites.
If a site willingly hosts such content and declines to remove it once notified, it is already held liable for it. Section 230 doesn’t protect any site; it protects you, from having your opinions and ideas perpetually censored on the internet over a company’s fear of being held liable for your actions.
Didn’t Bing have image-detecting software but simply refuse to deploy it? That led to a lot of illegal imagery that could have been removed but simply wasn’t.
You have to provide more information about this story, since this is the first I’m hearing of it.
But even without it, there are some things that can be said, even assuming it’s a hypothetical scenario.
First, some questions:
If they had such technology, why would they refuse to use it? Are you implying they purposefully wanted illegal material to be distributed? To risk prosecution for distributing CSEM? Why would they do that? The most reasonable answer is that the technology wasn’t working properly, which is why they didn’t use it. Removing Section 230 wouldn’t change that; if anything, it shows that such protection is still required, since Big Tech still doesn’t have automatic filtering technology reliable enough to avoid resorting to complete censorship in a world where Section 230 was removed.
Did the person who saw this illegal imagery report it? If they did, and Bing declined to remove it, they could sue Bing on child pornography distribution charges. As you can see, there are already laws prohibiting such material from being distributed, so removing Section 230 is pointless.
Do you think removing Section 230 would actually make companies work harder at removing such material? What makes you think it would? When most companies encounter a law that paralyzes their business model, they change the country in which they operate. There have been cases of Google simply disabling its services in countries that introduced such laws, as a protest, and it won. What makes you think Facebook won’t simply relocate and block any traffic coming from the US? It has been investing heavily in developing internet infrastructure in countries that lack it, so it could offset the customer base it would lose by exiting the US. And hell, maybe even China would be willing to give Facebook a helping hand, with special rules just for it, so it could gather information on Facebook users. It would be a bad outcome, but still better for Facebook than what removing Section 230 would bring.
There are a lot of issues with this scenario. But if your main idea is that we need laws to motivate companies to do a better job of removing illegal material, well, we already have them. It’s not the fault of the social media platforms that the government is incompetent at using them.
And here is a radical idea. What if, instead of going after social media platforms, hurting the millions of innocent users who rely on them to talk with friends, find long-lost family members, and stay in touch with loved ones over long distances (as during a pandemic), just because 10 to 100 evil people sometimes upload illegal material, we made the government actually go after those 10 to 100 evil individuals?
I really don’t get this mentality of going after the things that normal, law-abiding citizens use without hurting anybody, instead of going after the people actually responsible for creating the problem in the first place.
They already are, hence the “good faith” provisions.
It’s already illegal for websites to knowingly host or fail to act on CSAM or other criminal material, and websites are taken down as a result.
You have to provide more information about this story, since this is the first I’m hearing of it.
It’s pretty big news. And don’t forget Tumblr also had similar issues. It’s completely inexcusable for them to have had so much of this horrible illegal content available for so long; in an article I read about two years ago, the author mentioned that a user could literally stumble onto it days, weeks, or even months after it was originally uploaded. How the hell does something this horrible remain available on Tumblr for so long?
If they had such technology, why would they refuse to use it? Are you implying they purposefully wanted illegal material to be distributed? To risk prosecution for distributing CSEM? Why would they do that? The most reasonable answer is that the technology wasn’t working properly, which is why they didn’t use it.
There are a lot of issues with this scenario. But if your main idea is that we need laws to motivate companies to do a better job of removing illegal material, well, we already have them. It’s not the fault of the social media platforms that the government is incompetent at using them.
I guess it’s possible the scanning technology just wasn’t there yet. But they were using a program pretty similar to the one Facebook uses; I think it’s PhotoDNA or some variant. FB has been using it successfully, so why would they turn it off for Bing?
Yes, it’s good that the law forces companies to report when they become aware of the material. The issue I have with Section 230 is that it does not require companies to search for these horrible images on their platforms. This is why I want 230 changed to require them to search for it in order to avoid liability for damages.
Maybe they turned off PhotoDNA so they could feign ignorance? It takes resources to report, and to store the data they are required to store, when they become aware of illegal images of children. If someone reports it, yeah, they would take action. But the issue is that there isn’t anything forcing these businesses to proactively look for this material on their platforms and remove it. If there were, maybe they would have used their scanning tools earlier.
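For what it’s worth, the core mechanism behind these scanning tools is just matching each upload against a list of hashes of known illegal images. Here’s a minimal sketch of the idea in Python, using an exact SHA-256 match as a stand-in (the real PhotoDNA is a proprietary perceptual hash that survives resizing and re-encoding, and the hash list below is purely hypothetical):

```python
import hashlib

# Hypothetical hash list of known-bad images. In a real deployment this
# would come from a clearinghouse such as NCMEC, as perceptual hashes.
KNOWN_BAD_HASHES = {
    # SHA-256 of the empty byte string, used here only so the demo matches.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_known_bad(image_bytes: bytes) -> bool:
    """Hash the upload and check it against the known-bad list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

# Run on every upload: matches go to human review and mandatory
# reporting, not silent deletion.
if matches_known_bad(b""):
    print("match -> queue for review and report")
```

The point is that the matching itself is cheap; the cost sits in maintaining the hash list, reviewing matches, and filing the legally required reports.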
just because 10 to 100 evil people sometimes upload illegal material, we made the government actually go after those 10 to 100 evil individuals?
Requiring businesses to proactively seek out illegal material on their own platforms makes it easier to terminate illegal user accounts and to arrest and prosecute the people uploading and producing this stuff.
If they don’t remove such material, they can already be sued on distribution charges; removing Section 230 would have the same effect. Nothing would change to solve the problem; it would only hurt innocent people who use these sites for legal purposes. The law isn’t a perfect system: there are mistakes, a lot is left to interpretation, and if there is one thing companies avoid like fire, it’s being held liable in court.
You assume they had such technology without actually knowing that. Once again, if they had it and it worked, it would be risky for them not to use it. You don’t provide a logical explanation for why it wasn’t used, nor any links to the story, even though you said it was “big news”. It looks as though you’re making things up as you go. The idea that they did it so they “can feign ignorance” could have some merit for minor things, like a person using racial slurs to insult someone else. But not when it comes to CSEM. Such a discovery would not only risk a legal battle, it would completely destroy the company’s brand name and risk them losing in court. Look at what Pornhub has to deal with now, even though it had significantly less illegal material than Facebook sees every month. Once again, if a user reports such material, the company can’t feign ignorance, because it has the report. It has logs of when it came in, which moderator saw it, what they did about it, and much more. Companies gather such data exactly for the purposes of such legal battles, to defend themselves.
The scale is the problem. If you have 10 billion pieces of material uploaded every day, how do you expect to filter all of them in search of illegal content? Even with software solutions, it would take more than a day, so each day the backlog of images waiting to be filtered grows, and it won’t shrink without additional human and machine power, or without periods when less material is uploaded.
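To put rough numbers on that (taking the 10-billion-a-day figure above as a hypothetical), here is the back-of-envelope arithmetic:

```python
# Back-of-envelope arithmetic for the scale problem described above.
uploads_per_day = 10_000_000_000          # hypothetical figure from above
seconds_per_day = 24 * 60 * 60            # 86,400

per_second = uploads_per_day / seconds_per_day
print(f"{per_second:,.0f} uploads/second")          # ~115,741, sustained

# Assuming ~10 ms of compute to hash and look up one item, keeping pace
# needs on the order of a thousand CPU cores running around the clock,
# before counting a single second of human review for flagged items.
cores_needed = per_second * 0.010
print(f"~{cores_needed:,.0f} cores at 10 ms/item")  # ~1,157
```

And that is only the automated pass; every flagged item still needs a human decision.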
This is why user reports are the most effective way of handling this. But the problem is that a lot of people report things that aren’t illegal, bloating the system and making the filtering slower.
There are physical constraints on the amount of data sent and its validation, and changing the law doesn’t change those constraints. If you eliminate Section 230, nothing will change; people will still upload such material to sites. Maybe Big Tech itself will even send such material to its smaller competitors, to make the government completely destroy their business in court. Do you want to deepen the monopolization of the Big Tech companies? Because that is what you propose.
They might have the resources to handle such content fairly effectively, although I doubt it. But a smaller site will never have the money, time, or people to hire; millions of individuals would be required to filter uploaded content in search of something illegal.
So you do admit that Big Tech companies simply don’t have enough resources to handle this issue, and your solution is to force them to use resources they don’t have, hoping this will eliminate the problem. You contradict yourself. It’s odd that you are clearly capable of analyzing the situation, yet still push for a change that contradicts the conclusions of your own analysis. I don’t really think you truly believe what you say.
In this article, they were using it and still came under fire for spreading illegal content. Once again, while it’s comforting to believe that a single text change in legal papers will save the world, reality doesn’t care about such human constructs and doesn’t adjust itself to such demands. Nothing will be solved by the removal, or even adjustment, of Section 230, because such a change doesn’t address the core of the issue. It only bullies regular users to make a small minority of gullible people feel good about themselves, giving them a false sense of the issue being solved, while the number of minors sexually exploited to produce such material stays the same.
They are already required to do that. Section 230 doesn’t allow social media sites to engage in illegal activities, and hosting CSEM is illegal. It only gives them protection if they actually try to solve the issue, and all of Big Tech tries to do that. This is why Microsoft developed PhotoDNA, why Facebook employs more people to filter out such content, and why Tumblr blocked all pornography on its site. None of these companies wants to see CSEM spread on its platform, because it simply costs them a lot, be it in legal fees or in having their brand name ruined.
Your entire line of argument rests on the single notion that all illegal material must be removed before anyone sees it, at all costs, even at the cost of innocent people’s wellbeing. And I can’t help but notice that you don’t seem like an unintelligent person, which is odd, and suggests you care more about performative opposition to such material than about actually developing effective solutions to the core problem, which is children being hurt through sexual exploitation.
You seem not to care about that, and instead care more about whether people see that such a problem exists. You see, most regular people, the majority of our society, don’t actually care whether someone sees such material or not. They are concerned that such material exists in the first place, because that means some child has been sexually abused, may still be abused to this day, and needs our help. They want to find that child and rescue them.
You seem to do the opposite: you don’t care about the children being exploited; you care about making sure people never see such material, keeping them unaware of the scale of the problem that is child abuse. You want to implement drastic measures to guarantee that people will not have that awareness, even if it means their freedom of speech is impaired permanently.
What motivates a human being like you to be so zealous about keeping the problem hidden?
And I’m asking this question because in recent days there has been an influx of people with exactly the same attitude and rhetoric as you who turned out to be horrible human beings.
One person made three accounts, admitted to being willing to rape people, and defended a child rapist against prosecution in the UK, all while supposedly claiming to be against the sexual exploitation of children and holding views on this problem similar to yours.
There have also been four or five people from some Facebook group, most likely one for child predators, since one of the members got very defensive when I asked them to invite me there for a discussion. They had a similar line of reasoning to yours but displayed a lot of psychopathic and narcissistic traits, traits that are extremely common among sexual predators, both those preying on adults and especially those preying on children. Which is telling when you consider that they were all members of the same group. What kind of group would pull together people who lack empathy, or have an impaired ability to feel it, among other horrible traits? Definitely not a fan club for people who enjoy gardening.
Either way, as I said, the ideas and reasoning you propose have already been discussed on this forum, and I doubt there is anything of substance you can add to what has already been raised and answered. So I don’t think further engagement with you will be productive.
From how I see it, the GOP has lost its attack on Section 230, especially with Trump’s term about to end. But the idea that the Democrats aren’t for repealing it too is silly: Biden was calling for it to be repealed before Trump was, and his most trusted tech advisor recently released an absurd article saying Section 230 is hurting our kids. We are 100% going to get a Section 230 repeal or reform under Biden, and it’s going to pass much more easily thanks to Biden’s ability to be bipartisan, unlike crazed Trump.