Twitter loli artist ban wave

Twitter has been banning loli artists en masse this week. I found out after clicking on a user whom a JP artist had tagged. I went onto some art forums and people there have started to notice it as well.

They did not change their policy, so why are they suddenly banning them all?

Could it somehow be related to Germany suing Twitter for not deleting criminal content? The timing seems to fit perfectly.

That case only really talks about antisemitic posts, though.

2 Likes

Any examples of artists?

2 Likes

I came across only a

filler filler

1 Like

I’d like to see confirmation of this, too.

This piece of the Twitter ToS doesn’t mention foreign entities.

However, things are likely different for foreign entities; I’d have to read through the parts that don’t apply to me.

2 Likes

The Twitter ToS means nothing lmao. Many countries have laws that force foreign entities to obey and enforce their law, regardless of whether the content is legal in the company’s own nation. Not obeying means fines in the millions. Go tell your boss they should refuse to delete controversial content and instead pay millions every now and then.

2 Likes


Twitter was the only mainstream social media these guys had, so I genuinely wonder what will happen to them now. I saw some loli fans go over to Mastodon and make their own instance, but they all got nuked, and the admin keeps a list of all banned instances with the reason listed as “Loli”. Mastodon bans all porn anyway, so it is awful.

Honestly, it’s really sad. Is there any other group being treated more like hardcore criminals over fiction than lolicons?

What would a world look like where all loli and any discussion of it are banned? Sounds like a dream for the dark web, and then the likelihood of encountering CSAM is infinitely higher.

2 Likes

Baraag’s still up, though?

3 Likes

No, it doesn’t. Because of the way Mastodon works (anyone can create their own “instance”, think “server”), each server can make its own rules. But Mastodon itself (the software that people choose to run their servers on) cannot “ban” anything.

edit: That being said, the largest sex-worker-friendly instance, switter.at (run by AssemblyFour, which, disclaimer, was founded, or possibly co-founded, by Lola Hunt, who sits on Prostasia’s Advisory Council), had to shut down due to puritanical anti-sex-worker/anti-porn Australian censorship laws. Upside-down land doesn’t have the same free speech protections the US First Amendment provides; seriously, thank god the founding fathers wrote the First Amendment. There is another sex-worker-friendly Mastodon instance at ynotnetwork.com

4 Likes

Tangential, but related.

That was tweeted here.

2 Likes

Twitter’s trend of ignoring CSA prevention experts continues

6 Likes

At the same time, Musk brought back and welcomed racists and literal Nazis whose accounts are all about worshipping Hitler on his platform. But well done for “not tolerating” people who have a divergent sexual orientation; clearly that crosses the line of what should be acceptable.

6 Likes

Not to mention the reported downsizing of the team tasked with addressing CSAM on the platform

But no, flags are the problem

4 Likes

Then, there’s this.

Image.

Conflating cartoons and dolls with abuse looks delusional.

3 Likes

It’s not about doing actual good, it’s about doing something that most people see as good (even if it is actually harmful) so that you can distract them from your massive failures.

6 Likes

Exactly. On paper, the idea of indulging in child sexual fantasy (lolicon, dolls, etc.) SOUNDS dangerous, like it could lead to a “normalization” of IRL CSA. A “gateway drug” type of thing.

But in reality, statistics show that this isn’t the case. People have been arguing for decades that pornography drives an increase in sexual violence against women, but the statistics indicate that the more available porn becomes, the LESS sexual violence against women occurs.

The same is true for “simulated” CSA and CSAM (sex dolls and cartoon pornography). The more available these things become, the fewer actual sex crimes occur. Even those who view CSAM are less likely to commit a hands-on offense themselves. Obviously, CSAM requires the abuse of children to create, so it shouldn’t exist. But the fact that CSAM viewers typically don’t abuse children themselves speaks volumes. If even viewing CSAM doesn’t make a person commit CSA, how in the world does viewing lolicon do so?

Allow me to reiterate what I posted here: CSAM in Germany: "Hardly any time left to pursue real crimes" - #9 by Giacobbe

3 Likes

We do have big noses and crooked teeth, though. :stuck_out_tongue:

2 Likes

Recently, a link to a Twitter competitor has been shared.

And

2 Likes

I thought the competition was Mastodon, which, let’s face it, is much better, since anybody can make their own censorship-free instance?

3 Likes

Should I take that as an indication that Tribel fears the cunny?

2 Likes