Epistemic status: This is the first time I've expressed these thoughts. I've long thought that people do their jobs well and are numbskulls in every other area of life. Here I argue that it's OK to be a numbskull.

I read Raising the Sanity Waterline some time ago. I thought, "These are great points! I've needed them!" I made arguments that used those points a few times.

When I listened to the Bayesian Conspiracy's episode on it, I thought, "How did BC get this article so wrong? RtSW isn't about making oblique attacks on religion by teaching people things like Occam's Razor!"

It is about that!

I think I took these sentences and drew a different conclusion from them:

Behind every exciting, dramatic failure, there is a more important story about a larger and less dramatic failure that made the first failure possible.
The general sanity waterline is currently really ridiculously low. Even in the highest halls of science.

I wrongly took this article to mean (actually, this is what I believe):

You can be a Nobel Prize-winning scientist and believe in God. The sanity waterline is so low that having many, many irrational beliefs won't impact your ability to contribute to the world and be happy. Why is that? One hypothesis: if you have just a few beliefs that pay rent, you're doing better than most. Just as the difference between rationalist and rationalist-adjacent people is small, the difference between an imagined highly effective scientist and a highly effective scientist who believes in God is small. In general, the effect of adding another belief that pays rent is logarithmic (I sketch what I mean just below).
If this bothers you, your options are limited. It's not effective to relieve people of their belief in God, because the low sanity waterline predicts that they'll have many other low-water beliefs. And beliefs don't have to pay rent to stick around.
But you can embark on a more ambitious project: raise the sanity waterline. Do this by making a more rational world. A world so effective and rational that a belief in God impedes your progress is a rational world indeed.
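To make that "logarithmic" hunch a little more concrete, here is a toy formalization. The specific functional form is only my illustration of the intuition, not anything the original article claims: write $E(n)$ for roughly how effective you are when you hold $n$ beliefs that pay rent, and suppose

$$E(n) \approx c \log(1 + n).$$

Going from zero rent-paying beliefs to a handful buys a lot, while the marginal gain from each additional one keeps shrinking. Under this toy model, a scientist who is missing one rent-paying belief (say, about God) loses very little relative to an idealized colleague who has it.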

I don't think that Nobel Prize-winning scientists don't understand "epistemology 101." The problem is that such a scientist needs epistemology 101 to pay rent! Not that you asked, but the broken coupling of professional achievement with assumed rationality makes it hard for me to tell others, "Join us! We win!"

Is there a term for "less-examined beliefs that don't have to pay rent for you to happily contribute in the way you like"? To take myself as an example, I do not examine my belief in anthropogenic climate change. I have never read a paper on it. I never will. I trust everyone around me (I live in a liberal town), and I do what they tell me.

I don't see a problem with avoiding investigating climate change myself, because I won't change my behavior or thoughts whether I decide climate change is true or false. The amount of effort I put toward climate-change-mitigating stuff (recycling... TK list other things) is determined by social pressure.

I'd hazard that beliefs stay less-examined for a boring reason: they can. Beliefs get examined because they have to be.

Comments (9)

Apologies in advance for the long response. Hopefully this will be worth the read.

I greatly appreciate your post because it challenged some of my own beliefs and made me reassess them. I agree that a person can get by in this world with bad epistemological hygiene. However, humans are animals that evolved to adapt behaviors for many environments. Getting by is cheap. The problem with poor epistemological hygiene (EH) isn't that a person can't get by. As I see it, there are three issues:

  1. The worse your EH is, the more your success is based on luck. If you're right, it's by accident, or you learn the hard way.
  2. Bad EH means bad predictions and therefore bad future-proofing. The world changes, and people are often clumsy in adapting, if they adapt at all.
  3. Though individual humans can survive, poor EH across all humanity leads to poor decisions that are made collectively but not deliberately (e.g. the tragedy of the commons, or cycles of political revolution), which hurt us all in ways that are hard to measure, either because they are gradual or because they require comparing to a counterfactual state of the world.

Most animal populations can survive with trial and error, natural selection, and no ability to destroy the world. I would prefer the standards for humanity to be higher. Thoughts?

Anecdote follows:

Coincidentally, I had a conversation at work today that culminated in the concept you describe as "less-examined beliefs that don't have to pay rent for you to happily contribute in the way you like". The people with whom I was speaking were successful members of society, so they fell into the uncanny valley for me when they started pushing the idea that everyone has their own truth. I'm not sure if it's better or worse that they didn't quite literally believe that, but didn't know how to better articulate what they actually believed.

Ultimately, what I got them to agree to (I think) is that although everyone has irrefutable experiences, what they infer about the structure of the world from those experiences may be testably wrong. I personally will have no strong beliefs about the truth value of their hypothesis if I have too much conflicting evidence. However, I won't want to put much effort into testing the hypothesis unless my plans depend on it being true or false. It's murkier with normative beliefs, because when those become relevant, it's because they conflict with each other in an irreconcilable way, and it's much more difficult, if not impossible, to provide evidence that leads people to change their basic normative beliefs.

That said, I suspect that if we're not making plans that falsify each other's beliefs and conflict with each other's sense of right and wrong, we're probably stagnating as a civilization. That ties in with your idea of beliefs not being examined because they don't have to be. The great problem is that people aren't putting their beliefs in situations where they will either succeed or fail. To me, that's the true spirit of science.

For example, my objection to people believing in poltergeists (which is how the conversation started) isn't that they believe it. It's that they don't see the vast implications of a) transhumanism via ghost transformation, b) undetectable spies, c) remote projection of physical force, or d) possibly unlimited energy. They live as if none of those possibilities exist, which to me is a worse indictment of their beliefs than a lack of evidence, and an indictment of their education even if they're right about the ghosts. If people traced the implications of their beliefs, they could act more consistently on them and more easily falsify them. I strongly suspect that cultivating this habit would yield benefits on the individual and population level.

I personally will have no strong beliefs about the truth value of their hypothesis if I have too much conflicting evidence. However, I won't want to put much effort into testing the hypothesis unless my plans depend on it being true or false.

I like how you said this.

The people with whom I was speaking were successful members of society, so they fell into the uncanny valley for me when they started pushing the idea that everyone has their own truth. I'm not sure if it's better or worse that they didn't quite literally believe that, but didn't know how to better articulate what they actually believed.

In social situations, I've been trying to find a delicate and concise way to get across that "'Everyone has their own truth' is not an experience-constraining belief. Saying it is a marker of empathy, and good for you (seriously!). But if I wanted to falsify that belief, I wouldn't know where to begin. What trade-offs do you think you're making by saying, 'Everyone has their own truth'?"

"Everyone has their own truth" is just one example of these kinds of applause-lights-y nonbeliefs. I say them too when I'm trying to signal empathy, and not much else.

For example, my objection to people believing in poltergeists (which is how the conversation started) isn't that they believe it. It's that they don't see the vast implications of a) transhumanism via ghost transformation, b) undetectable spies, c) remote projection of physical force, or d) possibly unlimited energy. They live as if none of those possibilities exist, which to me is a worse indictment of their beliefs than a lack of evidence, and an indictment of their education even if they're right about the ghosts.

Because they live as if none of these possibilities exist (i.e. their experiences are constrained), couldn't you say that for some definition of "believe," they don't actually believe in poltergeists? They're committing a minor sin by saying out loud that they believe in poltergeists, while not living as though they do.

That said, I'd still say that aligning your stated beliefs with how you behave is admirable and effective.

For clickable links with underscores: Just select a piece of text and click the link icon in the toolbar that appears, then paste the link.

Thank you! I updated my post.

I really like taking the "making beliefs pay rent" analogy further. Think of your score as "how much rent you get paid": even if you have a bunch of beliefs that are six months behind on their payments, having a single tenant that pays you triple the cost of rent, on time, every month, has the potential to make you score high.

That line of thinking also opens up the idea that a belief can do worse than not pay rent: it can vandalise your apartment complex and take up a lot of your time and attention with the problems it causes.

I think you get the point here that is apparently missed by many (I didn't realize it was, so thanks for pointing out that it is), and it reminds me of an old saying: in the land of the blind, the one-eyed man is king. By being just a little bit better, you can achieve amazing things relative to what others can achieve, such that your deficiencies don't matter in a relative sense.

However, I think there is something more. That we can do better than others by doing something rather than nothing is true, but is doing better than others enough? There's a deeper point: the sanity waterline is so low that, unfortunately, it is possible to do "great" things just by being a little bit sensible. What happens if we measure our sanity not against others but against what is possible? Then doing a little better than others is still better, but it may not be better enough for what you want to achieve.

You want to do better than a Nobel Prize? Not the prize itself, of course, but the contribution to society? I'm intrigued. Could you expand on that?

My intrigue comes from my bar-of-what-is-possible, John von Neumann. He probably had more beliefs-that-pay-rent than I do, but he also had a "practically unlimited" capacity for work, tons of "mathematical courage," and "awe-inspiring" speed[0]. It'd be so great if those things were simply beliefs-that-pay-rent!

So I tell myself, "To do better than I have been doing, I must increase my work ethic, mathematical courage, and speed." That's very difficult for me; I'm a lazy, nervous, and slow thinker! I'm not sure what I think (nor do I know what the LessWrong consensus is) about what is and is not a belief-that-pays-rent, and whether changing those beliefs changes your life as much as changing the things that aren't.

What do you have in mind as regards the possibility of doing great things? By the way, I agree with and appreciate your comment.

[0] http://stepanov.lk.net/mnemo/legende.html?hn

It depends, of course, on which Nobel Prize is awarded and for what, but I'm thinking in terms of impact, where being the best of all humanity might not be enough: even if you do the best work of all humanity to address an existential risk, you might still fail to do enough to mitigate the risk.

I think an important distinction to make here is between the beliefs "there is a God who is Good in a nonspecific way that doesn't contradict very basic morality" and "there is a God who is very concerned with our day-to-day behavior and prescribes additional moral and factual precepts beyond what society generally already believes".

The former is the sort of belief which seems partially optimized for never needing to be examined (I'll wave my hands and say "memetic evolution" here as if I'm confident that I know what it means), and is probably more common among scientists and liberals and people with whom atheists are likely to agree than the latter. From an instrumental rationality perspective, it's the latter which ends inquiry and stifles truth, and the latter which we need to destroy by raising the waterline; the former is just collateral damage.