Have any of you found that being rational (as LW promotes it with its unique lingo) genuinely helps your well-being? Your emotional life?


5 Answers

Viliam

Apr 22, 2022

120

Emotionally, redefining the concept of "luck" to "probability distribution" (and further to "Everett branches") helped me deal with randomness in real life. Also the idea that you should judge your decisions by the information you had available at the time, not in light of random events that happen later.

For example, every week someone wins a lottery. Yet I do not regret not having bought the ticket, because there was no way for me to buy a ticket that wins predictably. That allows me to remain calm even if a lottery winner (e.g. Nassim Taleb) is laughing at me.

Another thing is politics. I still care about some political issues, but I no longer care about people debating politics, as I see the debate as an expression of our tribal instincts and signalling loyalty, rather than an attempt to find truth. The fact that someone has a wrong political opinion is now just a fact about human nature.

Many concepts I found in the Sequences were actually things that I had already intuitively half-discovered for myself. That brought me the relief that I am not imagining things; that at least someone on the opposite side of the planet sees things similarly, and OMG it is actually more than just one person, it is actually a large group of people (sadly not so large in my country).

I probably also made an actual rational decision once or twice, but it is hard to say, because I don't know what decisions I would have made without reading LW; maybe different, maybe quite similar.

nim

Apr 22, 2022

60

Yes.

A particularly common instance of this in my life is that the tools of thought which I learned from the Sequences cause me to actually use spreadsheets more often. It goes something like this:

  • I think that I want a thing.

  • I shop for the thing, and find that there are far too many options, all of which have some people claiming they're the worst thing ever (one-star reviews). I feel worried, intimidated, and afraid of actually getting the thing, because I'll get the wrong one, be stuck with it, and it'll be my own fault.

  • I step back, and think harder about what I actually want the thing to do. I attempt to formalize a framework for comparing the different options. I feel gently annoyed by my own uncertainty about what I actually want, but this annoyance transforms into confidence or even pride in my own thoroughness as I proceed through this step.

  • Surprisingly often, this more-intentional framing of the problem causes me to realize that I can actually solve the problem with stuff I have on hand. For instance, a home-row letter keycap on my laptop keyboard recently broke. Intentionally attempting to think rationally about the problem caused me to realize that I could move an infrequently-used symbol keycap to the home row and continue typing comfortably. When this happens, I feel brilliant, like I've discovered a tiny exploit in the interface between my expectations and the world.

  • When I still want to go get the thing, I attempt to quantify the relevant aspects of the thing into columns on a spreadsheet, and my options for getting the thing into the rows. By filling out each cell, I can compare, score, and sort the different options (see the sketch after this list), and better visualize what information is omitted by advertisements which otherwise look highly tempting. I often feel surprised and annoyed that an option which looked like it'd probably be the best is actually a non-starter due to some essential trait being wrong or undocumented.

  • I then get the thing which appears to represent the best compromise between cost and features. I feel confident that I have gotten the best thing I could find, and that if it turns out to be inadequate, the problem will be due to factors outside of my control.
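For readers who prefer code to spreadsheets, here is a minimal sketch of the compare-score-sort step described above, written in Python. The option names, columns, and weights are hypothetical illustrations, not anything from the original answer; the point is just the shape of the procedure: rows of options, columns of quantified traits, a scoring rule, and a sort.

```python
# A minimal sketch of the spreadsheet-style comparison described above.
# The options, columns, and weights below are hypothetical examples.

options = [
    {"name": "Option A", "price": 120, "warranty_years": 2, "has_required_port": True},
    {"name": "Option B", "price": 95,  "warranty_years": 1, "has_required_port": False},
    {"name": "Option C", "price": 150, "warranty_years": 3, "has_required_port": True},
]

def score(option):
    """Return a rough desirability score; higher is better."""
    if not option["has_required_port"]:
        return float("-inf")  # an essential trait is wrong: non-starter
    # Cheaper and longer warranty are both good; the weights are arbitrary.
    return option["warranty_years"] * 10 - option["price"] * 0.1

# Sort the "rows" by score, best first, mirroring the sort step above.
for option in sorted(options, key=score, reverse=True):
    print(f'{option["name"]}: score = {score(option):.1f}')
```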

Before going down this rabbit hole of big-R Rationality, I knew enough about cognitive biases and similar effects to feel distrustful of brains, including my own, in situations where I noticed that such distortions seemed relevant. But concrete, everyday Rationality has given me tools to circumvent those biases -- mitigation rather than just detection, treatments rather than just awareness.

Strong yes, and this is true for a large number of people in my orbit (>30).

Jonnston

Apr 22, 2022

20

Yes and no.

On the yes side, I find rationality to be incredibly intellectually stimulating. Often I encounter a concept, framework, or abstraction that floors me and sets my mind on fire for days afterward. I think that sort of mental stimulation is really healthy.

Insofar as rationality can be defined as "the study of correct thought", it's obviously a gateway subject for anyone who enjoys thinking critically in any capacity. When doing intellectual work, I often find myself drawing on "Rationalist" concepts. This helps me organize and clarify my own thoughts in a genuinely impactful way.

On the no side, the vast majority of my life does not consist of intellectual work. In areas like emotional well-being or interpersonal relationships, Rationality has, in fact, harmed me more than it has helped.

Scott Alexander has a beautiful essay about taking advice. You can read that post here (sorry I don't know how to create hyperlinks on mobile):

https://slatestarcodex.com/2014/03/24/should-you-reverse-any-advice-you-hear/

The gist is that people can be inclined to take advice that they don't need. I was the kind of person who already believed in taking a step back and "applying logic" to my emotions and relationships. After encountering Rationality, I doubled down on this approach. In practice, the result was that I became more detached from my emotions, and struggled to relate to others in a non-superficial way.

I don't think that the fault lies with capital R Rationality, or with any of the ideas espoused on LessWrong. My conclusion was that some aspects of my own life just aren't well suited to applications of those ideas.

"Apply logic to your emotions" is not advice that I remember reading on LessWrong. CFAR-style rationality advice would be to apply Gendlin's Focusing. 

It sounds to me like more of a problem of doubling down on existing tendencies than one of taking rationalist advice. 

nim · 2y · 10

"Rationality has, in fact, harmed me more than it has helped."

This framing causes me to wonder whether I experience similar effects but attribute them to causes other than Rationality itself. Would you be willing/able to share some examples of harms you expect that you would not have experienced if you hadn't undertaken this study of correct thought?

Jonnston · 2y · 3
To put it as simply as possible, I think that indoctrinating yourself in rationality pushes you further away from the average person, which makes it more difficult to relate to them emotionally. The vast majority of my major problems stem from my difficulty connecting to other people. Therefore, even though I'm really interested in Rationality, and I've enjoyed studying it, I think it's done me net harm. This won't be the case for everyone, but I think that many people would be better served spending their time doing something else if their goal is to improve their emotional well-being.
nim · 2y · 2
Thank you for explaining. What I hear in this is that rationality also works like an esoteric hobby, and for people who want more friendships built on commonalities, adding an uncommon use of time is counterproductive. I think I don't experience the same negative effects because my "it's good to interact cooperatively with people different from oneself" needs are met instead by some location-based volunteering hobbies. I live in an area with low enough population density that "vaguely competent and willing to show up and do stuff" buys one a lot of goodwill and quality time with others, which is a whole other social hack of its own :)
