[ Question ]

What is our evidence that Bayesian Rationality makes people's lives significantly better?

by Bae's Theorem · 1 min read · 28th Jul 2019 · 19 comments


Personal Blog

Anecdotally, it has dramatically streamlined my thinking process and made me more self-aware. However, most proponents of most belief systems will make similar claims.

Our general technique of using science to arrive at truth is inarguably valuable, but that's not unique to us.

What evidence can I show to a non-Rationalist that our particular movement (i.e. our particular techniques for overcoming biases, studying decision theory, applying Bayesianism, learning CBT techniques, etc.) is valuable for making their lives significantly better?


2 Answers

First of all, the community around LW2.0 can only be loosely associated with a movement: I don't think there's anyone who explicitly endorses *every* technique or theory that has appeared here. LW is not CFAR, is not the Alignment Forum, etc. So I would caution against enticing someone into LW by saying that the community supports this or that technique.

The main advantage of rationality, in its present stage, is defensive: if you're aspiring to be rational, you wouldn't waste time attending religious gatherings that you despise; you wouldn't waste money buying ineffective treatments (sugar pills, crystals, etc.); you wouldn't waste resources following people who mistake fiction for facts. At the moment, rationality is just a very good filter for every product, piece of knowledge, and praxis that society presents to you (hint: 99% of those things are crap).

On the other hand, what you can or should do with all the resources you're not wasting, is something rationality cannot answer in full today. Metaethics and akrasia are, after all, the greatest unsolved problems of our community.

There have been notable attempts (e.g. Torture vs. Dust Specks or the Basilisk), but nothing has emerged with the clarity and effectiveness of Bayesian reasoning. Effective Altruism and MIRI are perhaps the most famous examples of trying to solve the most pressing problems. A definitive framework, though, still eludes us.

There is no obvious evidence lying around. I am unconvinced. Promising that Bayesian rationality will better your life in expectation would be an unfounded promise.

Take mathematics. I think mathematics is valuable and mathematical results are cool. And you can be a successful mathematician. But I would not say that you will be successful for choosing mathematics over other fields. You need to have certain individual traits for mathematics to be a good fit for you. It is not an automatic win button for all psychologies. Mathematics is far from "fictional", but it doesn't rise to the level of an automatic recommendation.

Rationality can deliver very potent results that other approaches have trouble delivering. But there are some hazards and some requirements. Having a low emotional tolerance for reflection would make it very hard to live with. You are in danger of being part of a minority, and your relevance to the population at large can be questionable, small, or channeled through very particular avenues.

I would totally give an "ask your doctor whether rationality is right for you", but not a "mortgage your house to order yours today".

17 comments

What evidence can I show to a non-Rationalist that our particular movement (i.e. our particular techniques for overcoming biases, studying decision theory, applying Bayesianism, learning CBT techniques, etc.) is valuable for making their lives significantly better?

The question you need to answer first is, rather:

Why do you believe that “our particular movement (i.e. our particular techniques for overcoming biases, studying decision theory, applying Bayesianism, learning CBT techniques, etc.)” is valuable for making your life (or our lives) “significantly better”?

Before asking how to convince someone else, first ask why you are convinced. If you can answer that to your own satisfaction, that is a good first step; if you can answer that to the satisfaction of a third party, that is progress; and then the question of “how to convince others” should be easy.

I'm convinced mostly due to its effects on my own life, as stated in the opening paragraph. But I'm unsure of how to test and demonstrate that claim. My question is for my benefit as well as others.

Right, but how do you know? Are there specific stories of how you were going to make a decision X but then you used a rationality tool Y and it saved the day?

Yes, but they could all be explained by the fact I just sat down and bothered to think about the problem, which wouldn't exactly be an amazing endorsement of rationality as a whole.

I also don't look at rationality as merely a set of tools; it's an entire worldview that emphasizes curiosity and a desire to know the truth. If it does improve lives, it might very well simply be making our thinking more robust and streamlined. If so, I wouldn't know how to falsify or quantify that.

I don't understand how you are getting so many questions about your post instead of sensible replies to it. Did someone really tell you to change the question? Why would you ever do that, if what you really want to know is how people have benefited from this way of thinking?

What if you say to that guy: "No, no... how about you tell me how you have benefited from Bayesian thinking, since that's what I'm interested in knowing?"

The questions are being asked (at least on my part) because I believe the best way to “convince” someone is to show them with the example of your own life.

No, the best way to convince me is to show me data. Evidence I can actually update on, instead of self-reporting on results that may be poisoned by motivated reasoning, or any number of other biases. Data I can show to people who know what they are talking about, that they will take seriously.

Your question was: “What evidence can I show to a non-Rationalist that our particular movement...”

I’m saying that for non-rationalists, that’s one of the better ways to do it. They don’t need the kind of data you seem to require. But if you talk about your life in a friendly, open way, that will get you far.

Additionally, “example of your own life” is data. And some people know how to process that pretty remarkably.

Here's some evidence that "Bayesian Rationality" doesn't work: The fact that you have written the bottom line first with this question, instead of asking a question like "What evidence do we have about impacts of being part of the rationality community?" or some similar question that doesn't get you filtered info :)

I ask the question this way to hopefully avoid stepping on toes. I'm fully open to the idea that the answer is "we have none". Also, I am primarily addressing the people who are making a claim. I am not necessarily making a claim myself.

Fair enough.

CFAR has some data about participants in their workshops: https://rationality.org/studies/2015-longitudinal-study BTW, I think the inventor of Cohen's d said 0.2 is a "small" effect size.
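For reference, Cohen's d is just the difference between two group means measured in units of their pooled standard deviation, which is why a d of 0.2 reads as "small". A minimal sketch with made-up numbers (the scores below are hypothetical, not CFAR's data):

```python
import statistics

def cohens_d(group_a, group_b):
    """Standardized mean difference, using the pooled sample standard deviation."""
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    n_a, n_b = len(group_a), len(group_b)
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd

# Hypothetical before/after scores on some well-being measure.
before = [5.0, 6.0, 5.5, 6.5, 5.0]
after = [5.5, 6.5, 5.5, 7.0, 5.5]
print(round(cohens_d(after, before), 2))
```

On this toy data the effect comes out around 0.6, which Cohen's rough benchmarks would call "medium"; the point is only that the statistic normalizes the raw difference by how spread out the scores are.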

I think some LW surveys have collected data on the amount people have read LW and checked to see if that was predictive of e.g. being well-calibrated on things (IIRC it wasn't.) You could search for "survey [year]" on LW to find that data, and you could analyze it yourself if you want. Of course, it's hard to infer causality.

I think LW is one of the best online communities. But if reading a great online community is like reading a great book, even the best books are unlikely to produce consistent measurable changes in the life outcomes of most readers, I would guess.

Supposedly education research has shown that transfer learning isn't really a thing, which could imply, for example, that reading about Bayesianism won't make you better calibrated. Specifically practicing the skill of calibration could make you better calibrated, but we don't spend a lot of time doing that.

I think Bryan Caplan discusses transfer learning in his book The Case Against Education, which also talks about the uselessness of education in general. LW could be better for your human capital than a university degree and still be pretty useless.

The usefulness of reading LW has long been a debate topic on LW. Here are some related posts:
You can also do keyword searches for replies people have made, e.g.


What evidence can I show to a non-Rationalist that our particular movement (i.e. our particular techniques for overcoming biases, studying decision theory, applying Bayesianism, learning CBT techniques, etc.) is valuable for making their lives significantly better?

Notice the typical mind fallacy/generalizing from one example: you assume that if your life got significantly better from learning the Gospel of Bayes, then so would everyone else's. That is emphatically not so: there are many, many happy religious people who derive happiness from living their lives according to their understanding of their God's laws.

Maybe consider starting by identifying a subset of currently not very happy people who might benefit from learning about LW/CFAR-style rationality and focus on those.

Personally, I have enjoyed reading and learning what Eliezer wrote between 2009 and 2015 or so (fiction and non-fiction), and what Scott A has been writing, and an occasional other post, but I would be hard pressed to say that any of that made my life significantly better. If anything, learning to understand people's feelings, including my own, has had a far larger impact on my life.

I think it's easier to test in advance, as an experiment. (The trick might be getting a control group.)

Are you using calculations, or something more hand wavey?

I see a strong correlation between adopting the virtues and established methods of rationality and an increased quality of life, but yeah, it's more hand-wavey than that. I don't even know what calculations could be made. That's sort of why I'm here.

If you're not doing calculations then you are not doing "Bayesian Rationality". Therefore, you very likely cannot explain to someone how "Bayesian Rationality" has worked out for you.

I see Bayesian Rationality as a methodology as much as it is a calculation. It's being aware of our own prior beliefs, the confidence intervals of those beliefs, keeping those priors as close to the base rates as possible, being cognizant of how our biases can influence our perception of all this, trying to mitigate the effects of those biases, and updating based on the strength of evidence.

I'm trying to get better at math so I can do better calculations. It's a major flaw in my technique I acknowledge and am trying to change.

But as you noted earlier, none of this answers my question. If I am not currently practicing your art, and you believe your art is good, what evidence do you have to support that claim?