People are weird.

Guy gets upset when I suggest that the world needs both those who'll state the truth boldly, and those who seek to change minds with a more "softly, softly" approach. (*)

Said guy complains about how much time he's wasted on well-structured rebuttals that don't work.

Yes, that's the point. Logical rebuttals work poorly against illogical beliefs. That's why people use other approaches.

(*) More precisely, the approach to take is situational.


@sgf

Hm~ I don't know the exact conversation you had, and you've summarized your opinion here very briefly, so what I say might be obvious/beside the point/unsurprising.

Non-argument-based methods usually give true claims less (or even no) comparative advantage over false ones. The more trusted and popular such methods become in a given group of people, the worse (all else being equal) that group usually becomes at agreeing on true statements, and the worse individuals in it become at distinguishing truth from falsity on their own.

Now, one might have different reactions to that. One is to never use non-argument-based methods, because using them even to argue for true things makes people trust conclusions arrived at via those methods more. Another, similar one is to object to others doing so.

I don't think these approaches are obviously wrong (i.e. I don't think the assumptions that would make them sensible are self-contradictory, or very obviously incompatible with how the world could look under a different but stable culture). They might be totally impractical (I'm not good at predicting what people do around forming beliefs), but that in itself is a nontrivial question.

@robryk Where does your opinion about effective ways to change the minds of people with nonsense views come from?

Bear in mind that the "non-argument-based methods" don't do away with facts; they just use them differently.

@sgf

I don't really have one; it's one of the things I can't effectively do unless the person cooperates quite a bit (by being at least somewhat curious).

> Bear in mind that the "non-argument-based methods" don't do away with facts, they just use them differently.

Hm~ I'm not sure exactly which methods you mean. For example, claiming that the existence of an anecdote proves the general statement it would be an example of kinda relies on facts, but is nevertheless an extremely bad way to change one's beliefs to more truthful ones.

If you know of methods that don't "teach"/reinforce false implications, are more strongly "truth-aligned" than the anecdote example above, and yet are not argument-based, I'd really like to know about them.

@robryk I think there's some interesting stuff here where it's not super-clear where the dividing line is between getting people to believe sensible things and getting them to form their beliefs in a rational way.

My model is something like this:

Most people are not rational beings. The ones who believe conspiracy theories certainly are not. Trying to convince them with rational arguments, knowing they're not rational, is itself irrational! Yet "rational" people try this...

@robryk as they are dealing with the world as they want it to be, rather than as it is.

Yet for the world as it is, those with conspiratorial/cultish views hold emotional, tribal and largely internally self-consistent beliefs. This is why what I've read from those who do a lot of work on this stuff suggests building empathy, understanding their position and a whole bunch of other stuff that wouldn't be necessary with rational beings.

Is using this kind of technique bad?

@robryk My view is no, it's fine.

Empirically, it's way more effective than trying to use pure logic. Would I rather have a person who isn't a rigorous thinker and believes in conspiracies, or one who isn't a rigorous thinker and doesn't? I'd certainly go for the latter.

Second, using empathy and listening isn't a rejection of facts. Instead, it's a way to help find the right facts to loosen their views, and get them past their emotional defenses. It's still useful to have truth on your side!

...

@robryk And thirdly, if you do want to help someone improve their critical thinking, it's surely going to be a lot easier to do so once you've helped them out of conspiratorial thinking!

From another angle, I'm not sure which techniques you think should help defend against conspiratorial thinking, but ironically I've found "rationalist" techniques to be very poor for this. Humans aren't rational, but they're extremely good rationalisers.

...

@robryk What this means is that I've met "rationalists" with huge mental blind spots which they were utterly (irrationally) convinced they didn't have, because they knew themselves to be rational.

As such, I would be extremely wary of getting a conspiracy theorist too deep into theories of reasoning, as they may use them to rationalize their position!

In general, I'd much rather deal with someone who has a decent handle on Occam's razor and a healthy dose of realistic introspection.

...

@robryk To TL;DR it, my personal theory is that it's fine to use empathy etc. to deprogram conspiracists. If you want to prevent people going down mental rabbit holes, it's better to teach them the ways their brain can trick them than to teach them how to reason and let them delude themselves that they are now immune to bad ideas.

Anyway, that's a great big brain-dump of a thread, and I have no idea if it really relates to what you were saying. Does any of that make sense? Not make sense?

End of thread, anyway.

@sgf

It does make sense and is at least somewhat related. I'll probably say something more detailed in a day or a few, once I've thought about it some more and when it's not late at night :)

@sgf

Let me start by saying that I will have blind spots here, because this touches on things close to one of the few really virtue-based/deontological rules that I have: I will not try to affect people's thinking/beliefs without being explicit about what I'm doing. (I have it partially for "good" reasons, partially because doing otherwise feels very uncomfortable, so it's easier for me to just not do that.)

I think that trying to be empathetic and to understand your interlocutor's position are good approaches in general (not because they allow you to convince people, but because they sometimes lead to you discovering something new[^]). I think that doing so _specifically because you want to convince them_ would not work if you told them you were doing it in order to be more likely to convince them. If I were to imagine not obeying the no-underhandedness rule, I'd expect the interlocutor to find my attention to them artificial, but I'm very unconfident about this, given that I have little experience with it.

I think that when trying to understand your interlocutor's position is combined with a willingness to change your own mind if that understanding provides convincing evidence, the result is a ~technique that is actually biased towards truth. If it's not combined with that, I don't think one can employ it both convincingly and without being underhanded, and I'm not sure that version is biased toward truth.

-

I think that we have been thinking of different things as "argument-based methods". I would've included Socratic questioning[^^] in those -- because it relies on (implicit if done well) creation of arguments, but I now expect you wouldn't (because there's no argument being provided to the person being convinced). This makes me think that we largely disagreed in phrasing and not substance, so I'll address what I think are disagreements in substance below, but they are smaller than I expected.

(The rough reason for me ~equating truth-biased with argument-based is that -- in areas other than maths where everything is usually argument based anyway -- ISTM that any truth-biased approach must include a way to notice that it's misapplied, and that seems to require exchanging reasoning and pointing out flaws in it.)

> If someone is into conspiracy theories, I'd rather get rid of the conspiracy theory than start with improving critical thinking. To my mind, that can come later.

Intuitively I'd expect that doing the former is not very useful, because the mechanism that caused them to believe in this falsehood will probably cause them to believe in multiple others. The only(?) way I can see of achieving the former in a durable fashion without the latter is if the person gets convinced that their reasoning is faulty and that they should pick an authority and follow them, which I'm not sure is actually better (because it feeds noise into the positive feedback loops of people being considered authorities).

> From another angle, I'm not sure which techniques you think should help defend against conspiratorial thinking, but ironically I've found "rationalist" techniques to be very poor for this.

Examples:
- explicitly thinking about things that a particular belief rests on (so, if any of them turned out false, the belief should be reevaluated),
- "if the argument doesn't rely on property X, it should hold for situations that differ only in property X",
- the concept of bisecting arguments to find the implication you disagree with (and the concept of formulating arguments in a way that makes that easier)[^^^]; see the sketch below.
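
To make the bisection idea concrete, here's a minimal sketch in Python (purely illustrative and my own simplification: it assumes an argument can be modeled as a linear chain of claims, each meant to follow from the previous one, that agreement is monotone along the chain, and all the names are made up):

```python
from typing import Callable, List

def first_disputed_step(steps: List[str], accepts: Callable[[str], bool]) -> int:
    """Binary-search a chain of claims for the first one the interlocutor
    rejects; the implication leading into that claim is the one actually
    in dispute. Assumes agreement is monotone: once a claim is rejected,
    every later claim (which builds on it) is rejected too. Returns
    len(steps) if every claim is accepted."""
    lo, hi = 0, len(steps)
    while lo < hi:
        mid = (lo + hi) // 2
        if accepts(steps[mid]):
            lo = mid + 1  # agreed so far; the dispute lies later
        else:
            hi = mid      # rejected; the dispute lies here or earlier
    return lo

# Toy example: claims 0 and 1 are accepted, 2 and 3 rejected,
# so the disputed implication is the step from claim 1 to claim 2.
chain = ["claim 0", "claim 1", "claim 2", "claim 3"]
print(first_disputed_step(chain, lambda c: c in ("claim 0", "claim 1")))  # -> 2
```

With n steps this pins down the disputed implication in ~log2(n) questions rather than walking the whole chain, which is part of why formulating arguments as explicit chains makes the technique easier to apply.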

I'm not sure that I've addressed everything you've said, and I made some assumptions above about what you think, so please do point out where I overlooked something or erred.

[^] Sadly, when that happens too rarely, one is conditioned away from those approaches.

[^^] I'm not proposing it's reasonable in context; I'm just providing an example for the definitions.

[^^^] I'm at a total loss when someone disbelieves in modus ponens, though.

@robryk There's a lot here, so I'll try to pick out some illustrative stuff.

First, a difference: "Intuitively I'd expect [...]". Don't! :) There are people with a lot of experience deprogramming cult members, helping conspiracy theorists, etc. They understand what works, through practical experience, and I think it's really worth listening to them.

Second, an overlap: Socratic questioning is pretty close to the approaches that these experts discuss.

...

@robryk

And yes, you won't do well unless you go in with honest curiosity - even if you'll never think the moon landings were faked, you want to understand why someone else does, and what they see differently.

Pulling back to the context of my original post, the guy was frustrated that his "well structured and pointed rebuttals" didn't work. Clearly not a Socratic approach.

He called his counterparts "trumpanzee subhuman filth". No empathy, no curiosity. "Argument" as in argumentative, I guess.

@robryk (And I'm ok with people hating Trump fans, just: a) dehumanisation is never ok; b) if you hate them, don't engage - you're not going to convince them, you're just going to stoke the hate. I figured the guy was having a bad day and wasn't going to be receptive, so I just left him to it.)

@sgf

One thing I perhaps didn't make clear: I think that convincing a person of one particular fact is nearly always ~unimportant compared to improving the likelihood that the convictions they acquire in the future are true.

@robryk Yeah, I think this is probably a big gap in our viewpoints. If someone is into conspiracy theories, I'd rather get rid of the conspiracy theory than start with improving critical thinking. To my mind, that can come later.
