@ska @dalias @futurebird @mattmcirvin Deplatforming is really important to reducing the availability of bad information. Big impacts around Twitter decisions, for example: https://www.washingtonpost.com/technology/2021/01/16/misinformation-trump-twitter/
@Emilyy @ska @dalias @futurebird @mattmcirvin
Deplatforming is great until it happens to you
"Deplatforming is great until it happens to you"
This assumes you are speaking to a group of people who have never been deplatformed. But this is far from the case.
There isn't any politics-free, abstract solution. Deplatforming a fun video game forum because "video games cause violence" is bad.
Deplatforming nazis is good.
It's very messy and we will need to have discussions about the boundary forever.
"Just allow everything" isn't a solution and it doesn't even really do what it claims
@futurebird I agree that “just allow everything” isn’t a good sentiment, but individuals should be allowed to sort out a majority of information for themselves and be able to openly debate why someone is incorrect.
You lose both of those things when you remove a person’s ability to speak.
Additionally, you encourage others to move into echo chambers which I personally think makes things far worse.
Deplatforming doesn't "remove a person's ability to speak".
What it does is change their audience: initially usually to be smaller, sometimes indirectly in the long run to be bigger, depending on what happens next.
(Some people get a lot of publicity for their claims of being "cancelled".)
@Unit @unchartedworlds @futurebird It's okay because you agreed to it when you joined Twitter. These are private spaces that exist by invite only, and those invites can be rescinded.
And that rule applies to every private space, even your residence. You can see why it would be especially useful in, for example, a women's shelter or an alcoholics anonymous meeting.
Also, "telling others why [the user] is wrong" just *elevates* their speech, while deplatforming them makes their message less accessible. When you think about trolls and spammers, the benefits of that route become obvious; few bad actors argue in good faith.
A group choosing to disassociate with certain individuals is one of the ways a group defines itself, and is itself a form of speech.
@Clementulus It sounds like you feel a degree of ownership over the social spaces provided by the social media servers... but that's an illusion.
You're in somebody else's house, and they can ask you to leave if you offend them, whether or not they think that's good for the rest of the guests at their house.
Sure, sometimes moderation decisions are made with the community in mind, but it's much easier to think of it as "Ew, I don't want to pay to host this crap on my server." With that context, the right to do so seems perfectly fair.
As users, if we don't agree with the decisions of the server's owners/administrators, we're free to find greener pastures from someone who more closely aligns with our values.
@LouisIngenthron I do understand why deplatforming happens in a lot of cases in the real world. I was just saying that I could not think of a *principled* reason to do so. Clearly I am not the one making these decisions at Twitter or anywhere else. But also I would not think it correct to participate in protest deplatforming, where some people go to public lectures or presentations and make a ruckus to prevent the genuine guests from hearing what the speaker has to say; or petition public institutions to refuse spaces to certain people based on their ideas (and usually very skewed views of their ideas at that).
I think the mistake is looking for "first principles" in the first place-- a social media site is a kind of community, and there are probably some shared values. You hold open discussion and find a consensus.
Companies running social media should think about their image: what sort of place are they creating? Who would want to be there?
Now universities and other public institutions probably have the hardest questions. But again: these places have mission statements. What are your values?
What should reddit do?
Like you they initially tried to reason this out in an abstract way.
This meant that some very toxic people took over parts of the site. Other people left.
That's the thing about community standards: it's not just what you are excluding, but it's also about who you choose to value.
As for whether people running their own social media can decide these things for adults? Obviously we can. And we do.
@futurebird reddit is a decent example of an open structure that in some ways mirrors reality: you can try to find communities that interest you and then participate in those communities, and yes, they have shared values that are reflected in their moderation, but the platform itself is neutral. Did I ever choose to go on rpolitics? No. But when I wanted a good chuckle at goofballs throwing all their money into gamestop stock, rwsb was the place to be. I think that I just use social media differently, where I mostly look at OPs and rarely the comments, which I put zero stock in since it's mostly just people trying to be obnoxious.
@LouisIngenthron @Unit @unchartedworlds @futurebird
The deplatforming question is an interesting one to me. If we set aside legitimate targeted harassment, intimidation, and threats, I cannot work out good first principles for deplatforming on social media. For one, if you don't like what someone is saying, you can just ignore them, block them, not follow them, or whatever. Deplatforming is only done to limit other people's access to a certain individual's ideas. In the case of children, I could understand having parental controls on what ideas they come into contact with, since they are not yet mature in their thinking. But for adults, I can't come up with a non-paternalistic reason why I should be allowed to limit what other people see online, and I also don't trust other people to make these types of decisions for me.