Twitter loli artist ban wave

Twitter’s trend of ignoring CSA prevention experts continues

6 Likes

At the same time, Musk brought back and welcomed onto his platform racists and literal Nazis whose accounts are all about worshiping Hitler. But well done for “not tolerating” people who have a divergent sexual orientation; clearly that crosses the line of what should be acceptable.

6 Likes

Not to mention the reported downsizing of the team tasked with addressing CSAM on the platform.

But no, flags are the problem.

4 Likes

Then, there’s this.

[image]

Conflating cartoons and dolls with abuse looks delusional.

3 Likes

It’s not about doing actual good; it’s about doing something that most people see as good (even if it is actually harmful) so that you can distract them from your massive failures.

6 Likes

Exactly. On paper, the idea of indulging in child sexual fantasy (lolicon, dolls, etc.) SOUNDS dangerous, like it could lead to a “normalization” of IRL CSA. A “gateway drug” type of thing.

But in reality, the statistics indicate otherwise. People have been arguing for decades that pornography leads to an increase in sexual violence against women, yet the data show that the more available porn becomes, the LESS often women are sexually assaulted.

The same is true for “simulated” CSA and CSAM (sex dolls and cartoon pornography): the more available these things become, the fewer actual sex crimes occur. Even those who view CSAM are less likely to commit a hands-on offense themselves. Obviously, CSAM requires the abuse of children to create, so it shouldn’t exist. But the fact that CSAM viewers typically don’t abuse children themselves speaks volumes. If even viewing CSAM doesn’t make a person commit CSA, how in the world does viewing lolicon do so?

Allow me to reiterate what I posted here: CSAM in Germany: "Hardly any time left to pursue real crimes" - #9 by Giacobbe

3 Likes

We do have big noses and crooked teeth, though. :stuck_out_tongue:

2 Likes

Recently, a link to a Twitter competitor was shared.

And

2 Likes

I thought the competition was Mastodon, which, let’s face it, is much better since anybody can make their own censorship-free instance?

3 Likes

Should I take that as an indication that Tribel fears the cunny?

2 Likes

I don’t know. I don’t know enough about it.

Personally I’m just going to assume “Maybe.”

2 Likes

If so, then its existence is of no value. In that case, they’re worse than Twitter, since they won’t shut up about how they’re better. But again, I say this tentatively, waiting for confirmation one way or another.

3 Likes

That is the problem with centralized social media in the hands of a single (profit-oriented) organization. In the end, there are just a handful of people deciding what is acceptable and what is not, often by completely arbitrary standards.

I’ve never heard of Tribel before, but I assume they fall into that category as well.

Even if they are tolerant today, there is no guarantee that they won’t change their mind in the future and marginalize stigmatized minorities by excluding them from the public discourse, like Twitter did. After all, that is exactly what happened with Twitter, which used to be a fairly tolerant and even somewhat safe space for MAPs all over the world. Reddit and Tumblr would be two more examples.

That’s why I believe the way to go is decentralized networks like Mastodon, where there is no single instance that can decide who is and who isn’t worthy of having their voice heard.

4 Likes

There’s a story going around involving Elon Musk. Some people are suggesting that he has a “secret alt account” on Twitter. No one really knows if it actually belongs to Musk, but it’s still kinda weird either way.

People saw the profile pic of a little child next to Musk’s pfp and found that it belonged to this one:

That’s apparently supposed to be Musk’s nearly three-year-old son.

Some more Tweets made by the account:

[image]

[image]

Again, no one knows if the account belongs to Musk or not. But, if it does, it would be really weird for a 51-year-old man to be doing this.

Not only that, but he apparently has another account:

What a weird world we live in!

1 Like

Just pix


[image: 20230501_040642]

2 Likes

Looks like the Lolicon Defense Task Force was suspended.

3 Likes

Might be interesting to see how this goes.

I imagine this might mean sites that don’t ban will see an increase in traffic.

I think it’s a big mistake to send folks to the dark web to find material whose creation is no more harmful than drawing a circle.

4 Likes

Pixiv continues to shoot itself in the foot.

Looks like DLsite is doing the same as well.

All because “they’re made too fast.”

I guess those who’re applauding AI art being banned on Fanbox won’t be singing the same tune when the payment overlords demand Pixiv ban certain things… again.

3 Likes

Even a broken clock is right twice a day. One of Elon’s new features actually gets things right:

1 Like

The German public-service broadcaster “ARD/ZDF” just put out a new video about Twitter. One of the points was that Twitter refuses to delete “Child Sexual Abuse Material”, and they showed a quote from a lawyer who stated:

Twitter now refuses to delete CSAM: 9 reports; 0% taken down. Also, the actual age of the depicted or animated person does not matter, only the context and suggested role [the appearance alone is enough, but ok].

In the show, they only read up to the 0%, which makes it seem like it refers to actual CSAM. Nobody knows what got reported here, but the fact that this was mentioned leads me to believe it was not actual CSAM.

The show and the lawyer then cooperated and sent a letter to the Federal Ministry of Justice saying:

We have reported 9 posts containing child or youth pornography, but Twitter refused to delete any of them. We believe that they are at the very least youth pornography, recognizable by either context or description [They then linked all the posts].

It ended with a song to the Federal Minister of Justice saying “Do your fucking job, do your fucking job”.

This shows once again how harmful it is to use the same terminology for things that are light years apart in terms of damage. What actually got reported here? What is the message now? Combat CSAM, or combat CSAM and stick figures? Does it not matter whether it actually depicts children or not?

This might actually lead to Twitter caving and banning loli for good, since a proceeding against them has already been started.

2 Likes