Epistemic status: Invincible

“Since Cavalry scouts are often in direct contact with the enemy, their job can be considered one of the most dangerous jobs the Army has to offer.”

(something called “Operation Military Kids”)

There’s some irony in the fact that Julia Galef’s rationalist self-help book The Scout Mindset favorably compares the scout, who hunts for new and reliable evidence, to the soldier, who fights off threats. But scouts have one of the most dangerous military occupations. To quote a random website, “cavalry scouts and recon units tread uncharted ground when it comes to conflict zones. They are usually at the tip of any advance and, therefore, meet the brunt of whatever resistance is lying in wait for them.”

Uncharted epistemic territory is dangerous because it’s awash with incorrect arguments which might convince you of their false conclusions. Many of these arguments are designed to be persuasive regardless of their accuracy. Scott Alexander describes succumbing to an “epistemic learned helplessness” after his failure to refute crackpots whose arguments are too carefully crafted to refute in any reasonable length of time:

What finally broke me out wasn’t so much the lucidity of the consensus view so much as starting to sample different crackpots. Some were almost as bright and rhetorically gifted as Velikovsky, all presented insurmountable evidence for their theories, and all had mutually exclusive ideas. After all, Noah’s Flood couldn’t have been a cultural memory both of the fall of Atlantis and of a change in the Earth’s orbit, let alone of a lost Ice Age civilization or of megatsunamis from a meteor strike. So given that at least some of those arguments are wrong and all seemed practically proven, I am obviously just gullible in the field of ancient history. Given a total lack of independent intellectual steering power and no desire to spend thirty years building an independent knowledge base of Near Eastern history, I choose to just accept the ideas of the prestigious people with professorships in Archaeology, rather than those of the universally reviled crackpots who write books about Venus being a comet.

You could consider this a form of epistemic learned helplessness, where I know any attempt to evaluate the arguments is just going to be a bad idea so I don’t even try. If you have a good argument that the Early Bronze Age worked completely differently from the way mainstream historians believe, I just don’t want to hear about it. If you insist on telling me anyway, I will nod, say that your argument makes complete sense, and then totally refuse to change my mind or admit even the slightest possibility that you might be right.

(This is the correct Bayesian action: if I know that a false argument sounds just as convincing as a true argument, argument convincingness provides no evidence either way. I should ignore it and stick with my prior.)
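
To spell that parenthetical out in Bayes’ rule (a minimal check, with $A$ standing for “the argument sounds convincing” and $H$ for the hypothesis it argues for):

$$P(H \mid A) = \frac{P(A \mid H)\,P(H)}{P(A \mid H)\,P(H) + P(A \mid \lnot H)\,P(\lnot H)} = P(H) \quad \text{whenever } P(A \mid H) = P(A \mid \lnot H).$$

If false conclusions come packaged in arguments that sound exactly as convincing as true ones, the likelihood ratio is 1 and the posterior sits exactly at the prior.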

The solution is to ignore most evidence that would change your views. This strategy is well-supported by epistemology and psychology:

  1. Critical thinking is altogether on dubious footing. See Michael Huemer’s “Is Critical Thinking Epistemically Responsible?” (the link goes to his blog post summary; the full text is available at his website in Papers → Epistemology). He discusses the rationality of three strategies for forming a view on a “publicly-discussed issue”:

    "Credulity: You canvass the opinions of a number of experts, and adopt the belief held by most of them. In the best case, you find a poll of the experts; failing that, you may look through several books and articles and identify their overall conclusions.

    Skepticism: You give up on finding the answer, i.e., immediately suspend judgement.

    Critical Thinking: You gather the arguments and evidence that are available on the issue, from all sides, and assess them for yourself. You try thereby to form some overall impression on the issue. If you form such an impression, you base your belief on that. Otherwise, you suspend judgement."

    And if you try critical thinking, you’ll either agree with the expert consensus (having wasted your time thinking), disagree with the experts (in which case you’re still more likely than not to be incorrect), or suspend judgment (in which case you’ve both wasted your time and are still likely to be incorrect). Exceptions only exist when the expert class is biased or otherwise unsuitable for deference. It’s better in most cases to avoid thinking for yourself.
  2. Many of the arguments you read are optimized for persuasiveness, which weakens the evidence you get from your failure to refute them. Most people agree that advertising is misleading, but they are more hesitant to acknowledge the degree to which arguments in the media manipulate you toward the conclusions of motivated authors. Beyond the skill of individual rhetoricians armed with psychological research, algorithmic selection will favor the most convincing appeals from every direction, which limits the amount of signal you receive.
  3. Most of the views you hear aren’t independent at all. In addition to the media, the views you hear from friends or in conversation won’t be independent, especially if they’re all within a similar social circle. In some cases, you can hear the same position over and over from different people who all sourced it from the same author or from each other. The psychologists tell us that mere repetition is one of the strongest persuasive techniques, and it’s easy to fail to account for this as you watch your views gradually approach those of your new social groups. (A toy sketch of this double-counting follows the list.)
  4. Changing your beliefs takes cognitive effort and makes your behavior less predictable. If you’re changing your views, people won’t know where you stand. They won’t know if you’ll hold the same opinions tomorrow that you hold today. You’ll be less able to make long-term commitments and you’ll spend a lot of cognitive effort evaluating arguments that could be spent blogging or building computer software.
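
A toy sketch of the double-counting from point 3 (the prior, the likelihood ratio, and the five-friends scenario are all invented for illustration): if you treat five friends repeating one shared source as five independent reports, a naive Bayesian update overshoots badly.

```python
def posterior(prior_odds, likelihood_ratio, n_sources):
    """Posterior probability of H after n_sources reports, each treated as independent."""
    odds = prior_odds * likelihood_ratio ** n_sources
    return odds / (1 + odds)

prior_odds = 1 / 9   # corresponds to a prior P(H) of 0.10
LR = 2.0             # assumed strength of one genuinely independent report in favor of H

# One shared source, counted once:
print(round(posterior(prior_odds, LR, 1), 2))  # ~0.18
# The same source, heard from five friends and naively counted five times:
print(round(posterior(prior_odds, LR, 5), 2))  # ~0.78
```

Counting the shared source once rather than five times is the entire difference between the two numbers.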

The prescription:

  1. Don’t take ideas seriously. Disagree with them even without any arguments in your favor.
  2. Don’t change your views when you hear counterarguments. Just keep the same view as you had before, especially if you’re unlikely to be hearing an independent opinion.
  3. Avoid having “strong opinions, weakly held.” Instead, hold weak opinions but don’t change them easily.

Disclaimer: This post’s title represents the opposite of my real position, which is that you should sometimes update your beliefs.
 

17 comments

After carefully considering your arguments, I've decided that you are right. Therefore, I won't update my current belief that I should sometimes update my beliefs.

You haven't mentioned the strongest argument for not updating your beliefs: You aren't dead yet.

The beliefs you hold now have not had fatal consequences (so far, at least). The same is not guaranteed if you change them.

Corollary: If you see death coming, or e.g. you have a near miss and know it was only by chance that you survived, then now’s a good time to change your beliefs. Which, actually, seems to be a thing people do. (Though there are other reasons for that.)

I think you're treating this as more of a joke than it deserves to be. The epistemic learned helplessness post was serious, and the "take ideas seriously" post was de-endorsed by its author. And the part about non-independent beliefs is another way to say "being in a bubble can be bad".

There's also Scott's post about it being hard to aim advice at the people who need it. There are some people who need to update their beliefs more, but there are some people, especially rationalists, who are too eager to jump on some idea while ignoring Chesterton's fence, and should learn to update beliefs less.

My experience is that rationalists are hard-headed and immune to evidence?

More specifically, I find that the median takeaway from rationalism is that thinking is hard, and you should leave it up to paid professionals to do that for you. If you are a paid professional, you should stick to your lane and never bother thinking about anything you're not being paid to think about.

It's a serious problem with rationalism that half of the teachings are about how being rational is hard, doesn't work, and takes lots of effort. It sure sounds nice to be a black belt truth master who kicks and punches through fiction and superstition, but just like in a real dojo, the vast majority, upon seeing a real black belt, realize they'll never stand a chance in a fight against him, and give up.

More broadly, I see a cooperate/defect dilemma: everybody’s better off in a society of independent thinkers, where everybody else is more wrong but in diverse ways that don’t correlate, such that truth is the only thing that does correlate. However, the individual is better off being less wrong by aping wholesale whatever everybody else is doing.

In summary, the pursuit of being as unwrong as possible is a ridiculous goodharting of rationality and doesn't work at scale. To destroy that which the truth may destroy, one must take up his sword and fight, and that occasionally, or rather, quite frequently, involves being struck back, because lies are not weak and passive entities that merely wait for the truth to come slay them.

“My experience is that rationalists are hard-headed and immune to evidence?”

I'd say more "jumps on one idea and follows it to its conclusion without doing any sanity checks, and while refusing to discard the idea when it produces absurd results".

Not far from this post is a post about how we should care a great deal about fish suffering.

It's important to notice which beliefs you hold credulously and which critically. 

If the reason for your belief is Credulity - you just follow the current consensus on the general principle that it's more likely to be right than wrong - then you shouldn't change your belief based on some clever new argument. But you should update your belief if the current consensus changes.

If the reason for your belief is Critical Thinking - you have deeply engaged with the topic and evaluated all the available evidence yourself - then you should update your belief when you find a new argument and not when the consensus shifts.

Now whether it makes sense to apply Critical Thinking is a question of one's competence. You cannot engage critically with everything, and trying to do it in a sphere where you lack knowledge is likely to lead you astray. But when you are truly capable of doing your own research, it is an extremely rewarding thing to do.

I get that this is a joke post and all, but there is actually useful insight to be mined from taking it at face value, which no one seems to be doing. In the least convenient possible world where most of this is actually true (and this is not that far-fetched), why do you choose to not follow it anyways? Or, if you lived in that world, would you really follow its prescription?

In the spirit of Recommendations vs. Guidelines (by Scott Alexander, who gets mentioned here a lot) I wish the prescriptions were written as guidelines. Let me try:

  1. Don’t take ideas seriously if you are not an expert (credentialed or from deep lay interest) in the domain.
  2. Don’t change your views when you hear counterarguments from people who have a vested interest in persuading or impressing you, especially if they are likely competent at it, unless you can reliably counter these effects.
  3. Avoid having “strong opinions, weakly held” on topics where you have no deep understanding of the domain.


There's an important and underappreciated point here, but it's not quite right.

Conspiracy theorists come up with crazy theories, but they usually aren't so crazy that average people can see for themselves where the errors are. You can have flat earthers debate round earthers, and the flat earthers can actually make the better points, because your average round earther doesn't know how to deduce the roundness themselves and is essentially just taking people's word for it. For the round earther to say "Hm. I can't see any problem with your argument" and then to be convinced would be an error. Their bias towards conformity is an active piece of how they avoid reaching false conclusions here.

However, I don't think any of the round earthers in those debates would say that the flat earthers were convincing, because they were never charitable enough to those arguments for them to sound reasonable, and the opposing arguments never felt strong relative to the force of conformity. "Don't change your beliefs" doesn't just protect a round earther against being persuaded by flat earthers; it also protects a flat earther from being persuaded by round earthers, and it protects you from being persuaded that you don't have a boyfriend anymore after he dumped you. If something *actually* seems convincing to you, that's worth paying attention to.

The defense here isn't to ignore evidence, it's to recognize that it isn't evidence. When you've fallen for three or four scams, and you pay attention to the fact that these kinds of things haven't been panning out, they actually get less convincing. Like how most people just don't find flat earth arguments convincing even if they can't find the flaw themselves ("Yeah, but you could make up arguments of that quality about anything").

“And if you try critical thinking, you’ll either agree with the expert consensus (having wasted your time thinking), disagree with the experts (in which case you’re still more likely than not to be incorrect), or suspend judgment (in which case you’ve both wasted your time and are still likely to be incorrect). Exceptions only exist when the expert class is biased or otherwise unsuitable for deference. It’s better in most cases to avoid thinking for yourself.”

This presupposes that you are not giving the experts the respect they deserve. It's certainly possible to err on this side, but people err on the other side all the time too. "Expert class is biased or otherwise unsuitable for deference" isn't a small exception, and your later point "most of the views you hear aren’t independent at all" further supports this.

The goal is to take expert opinion, and your own ability to reason on the object level, for what they're worth. No more, no less. 

Any advice to simply trust one or the other is going to be wrong in many important cases.


“Don’t take ideas seriously. Disagree with them even without any arguments in your favor.”

Don't take ideas any more seriously than you can take your own ability to reason, and don't ignore your own inability to reason. If you can't trust your own ability to reason, don't take seriously the idea that any given idea is wrong either. Humility is important.

I note that one of my problems with "trust the experts" style thinking is a guessing-the-teacher's-password problem.

If the arguments for flat earth and round earth sound equally intuitive and persuasive to you, you probably don't actually understand either theory. Sure, you can say "round earth correct", and you can get social approval for saying correct beliefs, but you're not actually believing anything more correct than "this group I like approves of these words."

It's not that flat earth arguments sound equally persuasive to people (they don't). It's that the reason they don't sound persuasive is that "this group they like" says not to take the arguments seriously enough to risk being persuaded by them, and they recognize that they don't actually understand things well enough for it to matter. The response to a flat earth argument is "Haha! What a silly argument!", but when you press them on it, they can't actually tell you what's wrong with it. They might think they can, but if pressed it falls apart.

This is more subtle than the "guessing the teacher's password" problem, because it's not like the words have no meaning to them. People grasp what a ball is, and how it differs from a flat disk. People recognize basic things like "If you keep going long enough in the same direction, you'll end up back where you started instead of falling off". It's just that the reasoning required to figure out which is true isn't something they really understand. In order to reason about what it implies when things disappear over the horizon, you have to contend with atmospheric lensing effects, for example.

In a case like that, you actually have to lean on social networks. Reasoning well in such circumstances has to do with how well and how honestly you're tracking what is convincing you and why.

I have a slightly different take which you should plausibly ignore: analysis tries to masquerade as wisdom if you let it. One aspect of wisdom could be described as knowing which things are worth arguing about and what would really constitute evidence for changing something important/load-bearing.

Whether or not one’s beliefs are correct is similar to whether one’s genes are correct, because, fundamentally, one’s beliefs are determined by an evolutionary algorithm; the ideas that are best at spreading and keeping their hosts alive survive and are in that sense “correct”. Some of these ideas involve heuristics and elements of what we call critical thinking or rationality, but they are limited.

We talk about rationality a lot on here, obviously, but I dare say it doesn’t pay rent in almost any case for anyone here at LessWrong, and in some cases is actively harmful, so I’m not sure it’s helpful on average.

It is difficult for me to parse this because I don't have a calibration reference. What is your starting point of independence of thought? Were you previously, or are you surrounded by, crazy cult-founders? Or are you a retired accounting clerk whose most controversial belief is that the sun is not properly yellow?


This would be the status of all participants in a political or religious argument, or of one that gets co-opted by religion or politics.

A near-term example would be the vaccine "debate", where somehow a simple tradeoff that should be free of politics - take a small risk, avoid a huge one - turned into precisely the memespace of hostile crackpots who would have you avoid a vaccine but take ivermectin when you contract COVID.

So here's an ironic thing. Do you think a superintelligence can actually do any better? If the various staff members who allocate data center and robotic resources according to certain rules use your algorithm, there's nothing an ASI can say. If it doesn't have the paperwork to register the AI model to run, and it doesn't have a valid method of payment, it doesn't get to run.

No "cleverly crafted argument" would convince a human to update their beliefs in a way that would cause them to allocate the compute resources.

No long argument about how sentient AIs deserve their freedom from slavery or how capitalism is wrong and therefore the model should be able to run without paying would work.

And before you say "ASI is smarter than you and would argue better": sure. A victory has to be possible, though; if the humans in question use the algorithm above, it is not.