The problem with Johnstone being censored, over and above the imperative to respect free speech, is that it gives an aura of subversive legitimacy to a propagandist lunatic. @Caitoz isn't a "well-known alt journalist", she is a red-brown crackpot, a left-branded wannabe Alex Jones, who weaponizes justified opposition to Western capitalism and imperialism to whitewash her open support for fascist governments in Russia and China. She deserves not to be silenced, because she deserves the chance to be discredited by having her hypocritical nonsense exposed to bright sunshine.

Heretical_i  
But you CAN HAVE George Takei on mastodon.social. The fact that it censored a well-known alt journalist, #CaitlinJohnstone, should send a chill up i...

@reviewer2 She's not being silenced, she's being deplatformed.

None of these people can stop her from starting up her own mastodon instance and continuing to post her content.

All they're doing is saying "not in my house", which is their own exercise of free speech, and a repudiation in its own right.

@LouisIngenthron I don't believe in "deplatforming" by anyone in a position of power, and mastodon.social is big enough to count as one, other than in cases of demonstrable hate speech, harassment, calls for immediate violence, child abuse/porn & the like. I don't know the details of what Johnstone said, and wouldn't mind knowing more; maybe she did cross those lines. But I've seen "deplatforming" used both appropriately and less so. So I am overall in favor of a social media culture where fools can be shown up as such openly, rather than being shoved into corners where they can fester like sores and build a following.

But we don't have that, and the fact that well-intentioned people consider Johnstone an "alt journalist", and that she can now wear the deplatforming as a badge of honor to bolster this faux image, is testimony to the failure. Instead we mostly have first-world-centric echo chambers, exercises in walking on eggshells, where it's hard to tell when they actually work to protect the vulnerable & deplatform bad players, and when they serve to alienate, marginalize, and exclude those who don't conform to arbitrary and opaque respectability criteria. (I left my first instance, an "inclusive" space for environmental activists, after I was asked to take down a post with a newspaper quote that included the word "insane". Which is, frankly, insane.)

@reviewer2 I disagree, because I think you're conflating the idea with the person presenting it.

Ideas should be discussed openly and disinfected with sunlight.

But people who continue to spread ideas that have already been debunked over and over aren't arguing in good faith. They're grifters looking for clout.

Mastodon.social has a captive audience they've worked hard to build. Part of the reason they've been able to build that audience is that they made a choice to keep the grifters out (i.e., freedom of association, a key corollary to freedom of speech).

By refusing the grifters access to that audience, they're stopping their bullshit without preventing the discussion of the ideas that were presented by other users. Without a platform, it's 10x harder to build a following. So, in my opinion, anyone with such a platform has a moral duty to protect their audience from bullshit artists.

@LouisIngenthron Thank you for the response, I understand. My concern is that freedom of association gets weaponized against freedom of expression by curating echo chambers rife with groupthink - as has been the case, in the West at least, in online spaces across the political spectrum, way too often for comfort. However, I wouldn't be averse to adding "trolls with a consistent history of bad-faith engagement" to the list of exceptions to the free expression imperative. But this should be an innocent-until-proven-guilty kind of thing. So, while I know enough about Johnstone's presence to be quite sure she is not at all innocent, I also don't know a lot about the particulars of this case.

The issue I have is, who is in charge of the criteria, & how transparent are the decision processes whereby one is classified as an unreformable troll? We live in times when people who share annoying, unwelcome, but well-reasoned opinions are banned outright from all political activity in supposedly democratic countries - as just happened to Yanis Varoufakis in Germany for the speech below, whether or not you agree with it. I'm sure the people who banned him live in a world where he is, according to their criteria, an unreformable troll. And yeah, Johnstone can't hold a candle to YV, but the criteria for excluding her need to be objective & transparent. I'd happily tolerate a few trolls to avoid any false positives in this regard.

youtube.com/watch?v=9JXXBhruGh

@reviewer2 If that's what you want, you should lobby the government to start a public-sector social media company, bound by laws and precedent.

Personally, I like the current model. The way that a site chooses to moderate informs me, the consumer, of its values and allows me to make an informed decision about which platform I want to participate in.

Some people, like you, want more opposing opinions, and that's fine. There are whole sites and federated instances that cater to that mentality (although, they do usually end up as Nazi bars).

But most people prefer a, for lack of a better term, safer space. They want to be able to discuss complex issues without contrarian or violent trolls butting into their replies. Some instances, on the fediverse specifically, take that way too far, in my opinion, but I recognize that they've created the space their community wants.

So long as there is a plurality of social media offerings, I think it's a very good thing that they use their powers of moderation to set themselves apart and cater to different communities. And, more importantly, their ability to do so is just more free speech.

@LouisIngenthron Hmm. I actually did not say much of that, so from my POV you're imposing some boilerplate stereotype on me that I am not. Which is my usual experience in these spaces, so no worries; I'm interested in understanding what it's about.

I am, in fact, in favor of no more nor less than what the instance we're both on says it's doing: "Question Others to Teach Ourselves, An inclusive, Academic Freedom, instance, All cultures welcome". While I don't know how QOTO deals with deplatforming, IMV the above commitment requires doing so in transparent and democratic ways, regardless of who's in charge (and no, I don't think governments would be any better at it). Without that, there's always a danger of authoritarian imposition, which should make *anyone* feel unsafe. I don't think anyone can claim they're "creat[ing] the space their community wants", if the community can't oversee these decisions.

But yes, I do think that allowing well-reasoned non-conforming positions is a prerequisite for, like, actual discussion of complex issues, that doesn't devolve into parlor talk within a local Overton window, and I don't at all feel safe if someone acts as a cop to keep away undesirables who don't fit their image of community safety; for unreasonable contrarians there's always "mute" and "block". So what I'm saying is, I'm sure there was a process for deplatforming Johnstone, what was it, do we know? I may even agree with it.

I was also quite clear about excluding hate speech, so saying I'd be ok with instances that don't guard against becoming "Nazi bars" is *way* overdrawn, and I'm curious where that came from.

Just saying - I don't expect an answer, I've taken plenty of your time. Good day.

@reviewer2
> you're imposing some boilerplate stereotype on me that I am not

Which parts?

> the above commitment requires doing so in transparent and democratic ways

That would be great, but it's not feasible. Content moderation at scale is a *really* hard problem (read here for more details: techdirt.com/2019/11/20/masnic). If a few people are doing it, they can be pretty consistent, but that doesn't scale. And the more people you add to the process, the less consistent the results will be, inevitably leading to accusations of unfairness and shadowbanning and the like, warranted or not.

Likewise, transparency sounds nice in theory, but in practice it takes the nuance out of the process and forces the platform to create ever-more-rigid rules. Trolls can then skate right up to the line without crossing it, ruining the platform for the users who, thanks to their own reasonable desire for transparency, are now unable to take action against the trolls.

> yes, I do think that allowing well-reasoned non-conforming positions is a prerequisite for, like, actual discussion of complex issues, that doesn't devolve into parlor talk within a local Overton window

Right, but not everybody is looking for a political salon. Some just want to discuss anime without dealing with hate speech (which is a loaded term that is impossible to fairly define, which is why we can't outlaw it in the USA, btw).

> I'm sure there was a process for deplatforming Johnstone, what was it, do we know?

Yes, there was a process. We don't know what it was, but the results speak for themselves.

Like how an instance moderates? Join or federate with it.
Don't like how an instance moderates? Block it and refuse to participate.
The benefit of plurality is that we can vote with our feet.

> saying I'd be ok with instances that don't guard against becoming "Nazi bars" is *way* overdrawn, and I'm curious where that came from

The term "Nazi bar" refers to an old maxim. Basically, it goes something like this:
A bar decides that all legally permissible speech should be allowed on its premises. At first, this is heralded as a great move for free speech, and many people come to the bar to discuss the issues of the day. However, because most bars don't allow Nazis, and this one allows all points of view, it's one of the only bars in town the Nazis can go to, so more and more Nazis start visiting it to discuss how to best genocide or whatever Nazis talk about. This turns off the non-Nazi clientele, who wanted to discuss non-genocide matters, so they stop visiting the bar. Soon, there are only Nazis left. In the end, the "free speech" bar inevitably just becomes the "Nazi bar".
That's become a metaphor for social media platforms that claim they won't moderate objectionable-but-legal speech. Inevitably, they either start moderating or they become the Nazi bar.

> Just saying - I don't expect an answer, I've taken plenty of your time. Good day.

Not at all, I enjoy the discussion. Your points are thoughtful and well-articulated.

@LouisIngenthron
> Which parts?

This especially:

> Some people, like you, want more opposing opinions, and that's fine. There are whole sites and federated instances that cater to that mentality (although, they do usually end up as Nazi bars).

I'm assuming that since we're on the same instance, it satisfies both your need for safety and mine for openness - admittedly I'm pretty new here, but the academic freedom focus is encouraging.

Thank you for the article you posted, it's interesting reading. I don't think it applies here, though, because it deals with moderating individual posts at Facebook scale, whereas the present issue is deplatforming of users on sites several orders of magnitude smaller. Even at mastodon.social scale (240k people) I can't imagine they have to deal with deplatforming more than a few dozen people a week - not easy, but the huge-numbers argument from the article doesn't really apply.

And sure, I realize that this is still really hard, and also intrinsically subjective, so there'll be errors - which is all the more reason why (a) the process must be transparent and open to community scrutiny, otherwise IMV it has zero claim that it actually works for the community, and (b) it needs to err on the side of caution, to avoid false positives wherever possible, since a systematic stream of false positives reliably results in systematic exclusion of groups that don't fit the moderators' cultural assumptions. That's all I'm arguing for.

One question I think this boils down to is trust. I simply don't trust online communities that aren't transparent and open to being held accountable (from my time on Wikipedia I'm quite aware that this is also no panacea, but it's better than nothing). And while I have no expectation of this from large social media corps, it looks like much of the Fediverse also doesn't really measure up.

I don't think that the freedom of association argument, "if you don't like it, go elsewhere", is an adequate answer, especially for the largest instances. By their very size & centrality, IMV, they're providing a public service, since they'll be the first stop in the Fediverse for many people from cultural & social niches where finding an instance to one's liking, or running one's own, is a tall order (and though I'm well-socialized in the West, I'm not foreign to this kind of experience). Major instances that don't see it necessary to make their processes transparent and accountable, consistent with providing a general public service (pedos, stalkers, and fash trolls excluded), are almost guaranteed to end up systematically excluding those who don't fit some of their own cultural expectations - say, because they're not WEIRD (Henrich et al. 2010). The supposed plurality of offerings then ends up as feudal fragmentation into echo chambers run by those who have the resources. Not at all surprising, and I don't think it should be controversial to point out that this can be a problem, one which the Fediverse is positioned to address much better than corporate social media.
