[ Question ]

A 'Practice of Rationality' Sequence?

by abramdemski · 3 min read · 14th Feb 2020 · 25 comments



(This is not your typical factual-answer question, but I think it makes sense to format this as a question rather than a post.)

TLDR: Recommend some posts for a "practice of rationality" sequence I want to curate! Proposing posts that should exist but don't is also cool.

I've been thinking recently that it would be nice if rationality were more associated with a practice -- a set of skills which you can keep grinding and leveling up. Testable rationality skills (like accuracy or calibration in forecasting) are obviously a plus, but I'm not referring exclusively to this -- some very real things can be hard to evaluate externally, such as emotional wellness.

A model I have in mind is meditation: meditation is easy to "grind" because the meditator gets constant immediate feedback about how well they're focusing (or at least, they get that feedback if they meet a minimum of focus required to keep track of whether they are focusing). Yet it's quite difficult to evaluate progress from the outside.

(In fact, when I mentioned this desire for a "practice" of rationality to one friend, they were like "I agree, and in fact I think the practice should just be insight meditation.")

This is basically reiterating Brienne's call for tortoise skills (see also), except what I want to do is collect proposed things which could be part of a practice.

Obviously, some CFAR content could already qualify. CFAR doesn't exactly teach it that way -- as far as I've observed, CFAR's focus is on mindset interventions. "Mindset intervention" is the fancy psychology term for getting someone to think differently by having them do something once. For example, the point of "growth mindset" interventions is that you explain it once and this has long-lasting impact on someone's behavior. Another mindset intervention: you ask people to write about what matters to them. Doing this once has been shown to have long-term effects.

In my first CFAR experience (which was an MSFP, fwiw), the phrase "It's not about the exercises!" was kind of a motto. It was explained at the beginning that CFAR teaches exercises not because people learn the exercises and then go out and use the exercises, but rather, going through the exercises a few times changes how you think about things. (The story was that people often go to a CFAR workshop and then improve a bunch of things in their life, but say "but I haven't been doing the exercises!".)

But many of the things CFAR teaches could be used as a practice, and (again referring to my first CFAR experience) CFAR does do some things which encourage you to look at them that way, like the follow-up emails which encourage you to overlearn one exercise per week (practicing that one thing a bunch so that it becomes an automatic mental motion).

Another example pointing at what I want here is bewelltuned.com. The content may or may not be right, but the sort of thing seems exactly right to me -- actionable skills you can keep working on regularly after getting simple explanations of how to do them. And furthermore, the presentation seems exactly right. LessWrong has a tendency to focus on wordy explanations of intellectual topics, which is great, but the bewelltuned style seems like an excellent counterbalance.

I'm using the "question" format so that answers can recommend specific things (perhaps represented by existing LW posts, perhaps not), whereas comments can discuss this more broadly (such as what more general criteria should be applied to filter suggestions, or whether this is even a good idea). The answer list here could serve as a big repository. I'll probably create a sequence which can be my own highly opinionated curation of the suggestions here, plus my own writing on the subject.

I originally intended Becoming Unusually Truth Oriented to be the start of a sequence on the subject written entirely by me. However, some resulting discussion made me question my approach (hence the motivation for this question).

One friend of mine (going off of some of the discussion in comments to that post) voiced a concern about the rationality community falling into the same pitfalls as martial arts. Several articles about this have been written on LW. (I'm not finding all the ones I remember! If you put links to more of them in the comments I'll probably edit this to add them.) The concern is that a martial art of rationality could lead to the same kinds of epistemic viciousness which are seen in literal martial arts -- a practice divorced from reality due to the constraints and incentives of training/teaching.

That same friend suggested that the solution was to focus on empirically verifiable skills, namely forecasting. But in the in-person rationalist community in the Bay Area, I've encountered some criticism of extreme focus on forecasting which suggests that it's making the very mistake we're afraid of here -- Goodharting on the problem. One person asked me to give any examples of Superforecasting-like skills resulting in actual accomplishments, suggesting that planning is the far more valuable skill and differs significantly from forecasting. Another person recounted their experience sitting down with several other rationalists to learn superforecasting skills. It was a group of rather committed and individually competent rationalists, but they quickly came to the conclusion that while they could put in the effort to become much better at forecasting, the actual skills they'd learn would be highly specific to the task of winning points in prediction tasks -- and they abandoned the project, concluding that it would not meaningfully improve their general capability to accomplish things!

So, this seems like a hard problem.

What could/should be a part of a 'practice' of rationality?



7 Answers

I started writing out some notes on my current impressions of the "rationality skill tree". Then I had a vague sense of having written it before. It turned out to be background thoughts on why doublecrux is hard to learn, which (surprise!) I also thought were key background skills for many other rationality practices. 

I haven't rewritten this yet to be non-double-crux-centric, but I think that'd be good to do someday. (The LW team has been chatting about wikis lately, and this feels like something I'd eventually want written up in a way that could be easily collaboratively added to.)

Background beliefs (listed in Duncan's original post)

  • Epistemic humility ("I could be the wrong person here")
  • Good Faith ("I trust the other person to be believing things that make sense to them, which I'd have ended up believing if I were exposed to the same stimuli, and that they are generally trying to find the truth")
  • Confidence in the existence of objective truth
  • Curiosity / Desire to uncover truth

Building-Block and Meta Skills

(Necessary or at least very helpful to learn everything else)

  • Ability to gain habits (see Trigger Action Plans, Reflex/Routines, Habits 101)
  • Ability to notice things. There are many types of things worth noticing, but the most obviously relevant are:
    • cognitive states
    • ways-that-ideas-fit-together
    • physiological states
    • conversational patterns
    • felt senses (see Focusing)
  • Ability to introspect and notice your internal states (Focusing)
  • Ability to induce a mental state or reframe [note: alas, the original post here is gone]
  • Habit of gaining habits

Notice you are in a failure mode, and step out. Examples:

  • You are fighting to make sure a side/argument wins
  • You are fighting to make another side/argument lose (potentially jumping on something that seems allied to something/someone you consider bad/dangerous)
  • You are incentivized to believe something, or not to notice something, because of social or financial rewards
  • You're incentivized not to notice something or think it's important because it'd be physically inconvenient/annoying
  • You are offended/angered/defensive/agitated
  • You're afraid you'll lose something important if you lose a belief (possibly 'bucket errors')
  • You're rounding a person's statement off to the nearest stereotype instead of trying to actually understand and respond to what they're saying
  • You're arguing about definitions of words instead of ideas
  • Notice "Freudian slip"-ish things that hint that you're thinking about something in an unhelpful way. (For example, while writing this, I typed out "your opponent" to refer to the person you're Double Cruxing with, which is a holdover from treating it like an adversarial debate.)

(The "Step Out" part can be pretty hard and would be a long series of blogposts, but hopefully this at least gets across the ideas to shoot for)

Social Skills (i.e. not feeding into negative spirals, noticing what emotional state or patterns other people are in [*without* accidentally rounding them off to a stereotype])

  • Ability to tactfully disagree in a way that arouses curiosity rather than defensiveness
  • Leaving your colleague a line of retreat (i.e. not making them lose face if they change their mind)
  • Socially reward people who change their mind (in general, frequently, so that your colleague trusts that you'll do so for them)
  • Ability to listen (in a way that makes someone feel listened to) so they feel like they got to actually talk, which makes them inclined to listen as well
  • Ability to notice if someone else seems to be in one of the above failure modes (and then, ability to point it out gently)
  • Cultivate empathy and curiosity about other people so the other social skills come more naturally, and so that even if you don't expect them to be right, you still see value in at least understanding their reasoning (fleshing out your model of how other people might think)
  • Ability to communicate in (and to listen to) a variety of styles of conversation, "code switching", learning another person's jargon or explaining yours without getting frustrated
  • Habit of asking clarifying questions that help your partner find the Crux of their beliefs.

Actually Thinking About Things

  • Understanding when and how to apply math, statistics, etc.
  • Practice thinking causally
  • Practice various creativity-related things that help you brainstorm ideas, notice implications of things, etc.
  • Operationalize vague beliefs into concrete predictions

Actually Changing Your Mind

  • Notice when you are confused or surprised and treat this as a red flag that something about your models is wrong (either you have the wrong model or no model)
  • Ability to identify what the actual Cruxes of your beliefs are.
  • Ability to track small bits of evidence as they accumulate (see the sketch after this list). If enough bits of evidence have accumulated that you should at least be taking an idea *seriously* (even if not changing your mind yet), go through the motions of thinking through what the implications WOULD be, to help future updates happen more easily.
  • If enough evidence has accumulated that you should change your mind about a thing... like, actually do that. See the list of failure modes above that may prevent this. (That said, if you have a vague nagging sense that something isn't right even if you can't articulate it, try to focus on that and flesh it out rather than trying to steamroll over it)
  • Explore Implications: When you change your mind on a thing, don't just acknowledge it; actually think about what other concepts in your worldview should change. Do this
    • because it *should* have other implications, and it's useful to know what they are
    • because it'll help you actually retain the update (instead of letting it slide away when it becomes socially/politically/emotionally/physically inconvenient to believe it, or just forgetting)
  • If you notice your emotions are not in line with what you now believe the truth to be (on a system-2 level), figure out why that is.
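As a minimal sketch of the arithmetic behind "tracking bits of evidence": in log-odds form, independent pieces of evidence simply add, so tracking bits can literally mean summing log-likelihood ratios. The prior and likelihood ratios below are hypothetical numbers, purely for illustration.

```python
import math

def bits(likelihood_ratio: float) -> float:
    """Convert a likelihood ratio P(E|H) / P(E|~H) into bits of evidence."""
    return math.log2(likelihood_ratio)

# Hypothetical prior: 1:15 odds against the idea (~6% probability).
log_odds = math.log2(1 / 15)

# Hypothetical likelihood ratios for three observations; each adds its bits.
for lr in [2.0, 3.0, 1.5]:
    log_odds += bits(lr)

odds = 2 ** log_odds
probability = odds / (1 + odds)
print(f"posterior probability: {probability:.2f}")
# Well above the ~6% prior but still short of 50%: enough to take the idea
# seriously and think through implications, not yet to change your mind.
```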

Noticing Disagreement and Confusion, and then putting in the work to resolve it

  • If you have all the above skills, and your partner does too, and you both trust that this is the case, you can still fail to make progress if you don't actually follow up, and schedule the time to talk through the issues thoroughly. For deep disagreement this can take years. It may or may not be worth it. But if there are longstanding disagreements that continuously cause strife, it may be worthwhile.

Getting oriented fast in complex/messy real world situations in fields in which you are not an expert

  • For example, right now one topic to get oriented in would be COVID; I think for a good thinker, it should be achievable after a few days of research to have a big-picture understanding of the situation comparable to a median epidemiologist's
      • Where the point isn't to get an accurate forecast of some global variable asked about on Metaculus, but a gears-level model of what's going on / what the current 'critical points' are which will have outsized impact / ...
      • In my impression, compared to some 'LessWrong-style rationality', this is more heavily dependent on 'doing bounded rationality well' - that is, finding the most important bits / efficiently ignoring almost all information, in contrast to carefully weighing several hypotheses which you already have

Actually trying to change something in the world, where the system you are interacting with has a significant level of complexity & a somewhat fast feedback loop (& it's not super-high-stakes)

  • A few examples of seemingly stupid things of this type I did:
    • filed a lawsuit without the aid of a lawyer (in a low-stakes case)
    • repaired various devices worth much less than the value of my time
    • tinkered with code in a language I don't know
    • tried to moderate a Wikipedia article on a highly controversial topic which two groups of editors are fighting over

One thing I'm a bit worried about in some versions of LW rationality -- and someone should write a post about -- is something like 'missing opportunities to actually fight in non-super-high-stakes matters', in the martial arts metaphor.

I nominate this thing johnswentworth did. In addition to the reasons he gives, I'll add that being able to learn on your own, quickly, seems like a good skill to have, and related to (though maybe not the same thing as) rationality.

I would add active and empathic listening, and nonviolent communication. By improving our skills at communicating and connecting with others, we improve both our effectiveness in cooperation as well as the quality of our relationships.

Prediction

Abram pointed out concerns about focusing rationality practice on prediction. I agree with those concerns, and have said before that many of the skills involved in prediction can be Goodharted well past the point of usefulness more generally. For example, frequently updating, tracking sources of information to quickly capture when a question should have been considered resolved, or tracking the group aggregate are all effective strategies that are minimally helpful for other parts of rationality.

On the other hand, to argue via analogy: while martial arts are clearly too narrowly focused on form, and adapted to constraints rather than optimized for winning, the best mixed martial artists, and I suspect many of the best hand-to-hand fighters in general, are experts in one or several martial arts. That's because even if the practice of any given martial art has been Goodharted well past the point of usefulness, and wastes time because of that, fighters still need all of the skills that martial arts teach.

Similarly, I think that the best rationalists will need to be really good forecasters. The return will drop as you optimize too much for forecasting, obviously, but I imagine that there's a huge return on being in the top, say, 1% of humans at forecasting. That's not an incredibly high bar, since most people are really bad at this. I'll guess that the top 1% level would probably fall somewhere below the median of Good Judgement Open's active participant rankings - but I think it's worth having people interested in honing their rationality skills participate and improve enough to get to at least that level.
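For concreteness: platforms like Good Judgement Open score forecasters with (variants of) the Brier score, so the thing being trained has a simple arithmetic core. A minimal sketch, with made-up forecasts and outcomes:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and binary outcomes.
    0.0 is perfect; always guessing 50% scores 0.25; lower is better."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical track record: probabilities assigned to four events that
# did (1) or did not (0) occur.
forecasts = [0.9, 0.2, 0.7, 0.4]
outcomes = [1, 0, 1, 1]
print(f"Brier score: {brier_score(forecasts, outcomes):.3f}")  # 0.125
```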

The usual caveats of "what do you mean by rationality?" seem likely to crop up immediately here. (i.e. epistemic vs instrumental). "Being able to form accurate beliefs" and "Being able to form good plans in confusing domains" seem like two main things you might want to train.

I think it's plausible that superforecasting (and "forming accurate beliefs" in general) doesn't lead to overwhelmingly great good life outcomes, but is still, like, a skill that is worth gaining for the same reason many other skills are: it's valuable to other people, and you might get paid for it (either by being directly economically valuable, or longterm-public-good valuable so philanthropists would subsidize it). 

How to Measure Anything seems to lay out one particular set of skills that fit at the intersection of epistemic and instrumental rationality. It doesn't give "exercises", but I think it's designed for an environment (i.e. making decisions for organizations) where you have a reasonable stream of actions + feedback loops, albeit on a slower timescale.

The Hammer Time sequence is the obvious LessWrong place to start.

My light review of the pedagogical literature suggests four things with large effect size: deliberate practice, test taking, elaboration of context (cross-linking knowledge), and teaching the material to others.

I also suspect debate would make the cut if tested. I think there's too little of the good kind of fighting in a lot of discourse, and I sort of blame California culture for not being a good influence here. I think the intuition of comparing it to sparring is right: a sort of collaborative fight, where fighting can and should be playful and exploratory. This is less scalable since it requires skill-matched real-time collaboration.

On the object level, I'll reiterate that we're still failing to engage with Korzybski's assertion that we'd be radically less confused if we trained up in noticing type errors in language/representation.

More speculatively: most people are excessively tense most of the time. Example: right now check your brow, jaw, throat, shoulders, gut, pelvis. Given the interaction between physiology and mindset, and given the need for exploratory research, this winds up being of deceptive importance. Relaxation is a trainable skill.