[ Question ]

What are good rationality exercises?

by Ben Pace · 1 min read · 27th Sep 2020 · 24 comments

Exercises / Problem-Sets · Rationality

I want to know: what are good rationality exercises?

I was just on a call with Liron and PhilH, hanging out after the weekly LessWrong weekend event, and we discussed exercises that could happen on LessWrong.

Here is the list we generated:

  • Thinking Physics
  • Fermi Estimates
  • Project Euler
  • Calibration Training
  • Basic probabilistic reasoning
  • Basic have-you-read-the-sequences knowledge test (e.g. "Which of the following is an example of 'belief as attire'?")

Another user on the call (whose name I forget) suggested it could be fun to have a daily Fermi Estimate on LessWrong, where everyone submits their number and the model they used to reach the number. I think this would be quite exciting.

Please write answers with other exercises that you think are or might be great for rationality training, some explanation of why you think it could be good, and a suggestion of how it could be incorporated into LessWrong. I'll probably add some of the above myself.

10 Answers

Things that interest me:

  • Let's go exploring. Eliezer took a pretty low-bar activity (fan fic) and created something original (HPMOR). Why don't we pick some notorious areas of the internet where we think a little LW-style overthinking could go a long way?
  • A rational approach to cultivating imagination, creativity, and meditation. We have so many tools here for modeling questions of fact. Can't rationality help us develop the right side of the brain as well as the left?
  • Business ideas we could collaborate on, that hinge primarily on rational thinking, learning how to learn, and conscientiousness.

I would not participate in activities that boil down to arbitrary left-brain problem solving.

"Doing impossible things"

  • Get 100 strangers to show up at a specific place at a specific time.
  • Make 5,000 counterfactual dollars in a weekend.
  • Be featured in a major print publication in less than a month.
  • etc.

Answer: Check My Understanding

Here's how it'd work. Suppose I want to improve my understanding of Aumann's Agreement Theorem. I would write up my thoughts, doing my best to explain what I know about it. Then other people would comment on what I'm missing and where I went wrong.

This seems useful for a few different reasons:

  • As an author, the comments provide you with personalized feedback and allow you to "fill in the gaps".
  • As an author, the act of doing the initial write-up seems like it'd be very beneficial. Ditto for readers writing out their comments. (I have the Feynman Technique in mind.)
  • As a reader, you may have a decent understanding of Aumann's Agreement Theorem, but seeing it explained by a different author might help some things "click" for you (I have Non-Expert Explanation in mind).

Answer: Writing Your Hypothetical Apostasy

See Write Your Hypothetical Apostasy on Overcoming Bias.

Imagine, if you will, that the world's destruction is at stake and the only way to save it is for you to write a one-pager that convinces a jury that your old cherished view is mistaken or at least seriously incomplete.  The more inadequate the jury thinks your old cherished view is, the greater the chances that the world is saved.  The catch is that the jury consists of earlier stages of yourself (such as yourself such as you were one year ago).  Moreover, the jury believes that you have been bribed to write your apostasy; so any assurances of the form "trust me, I am older and know better" will be ineffective.  Your only hope of saving the world is by writing an apostasy that will make the jury recognize how flawed/partial/shallow/juvenile/crude/irresponsible/incomplete and generally inadequate your old cherished view is.

I'm not sure exactly how this fits into group rationality practice. I personally am always more motivated to write when it's something I will publish, so having a place where we publish hypothetical apostasies could be useful for motivational reasons. It would also be useful because you'd get feedback on your thought process, although that point could be made for many other exercises.

I was thinking that if the sequences and other LW classics were a high school class, we could make something like an SAT subject test to check understanding/fluency in the subject. That could then become a badge on the site, and potentially a good credential to have in your career.

The questions could look something like:

1. If a US citizen has a legal way to save $500/year on their taxes, but it requires spending 1 hour/day on boring paperwork, 5 days a week, should they do it?

   a. Virtually everyone should do it
   b. A significant fraction (10-90%) of the population should do it
   c. Virtually no one should do it

2. With sufficient evidence and a rational deliberation process, is it possible to become sure that the Loch Ness Monster does/doesn't exist?

   a. We CAN potentially become sure either way
   b. We CAN'T potentially become sure either way
   c. We can only potentially become sure that it DOES exist
   d. We can only potentially become sure that it DOESN'T exist
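
For what it's worth, here is the back-of-the-envelope arithmetic behind question 1 (my own working, not part of the original answer):

    5 hours/week × 52 weeks ≈ 260 hours/year
    $500 / 260 hours ≈ $1.90/hour

At under $2/hour, virtually no one should take the deal. Question 2 is presumably gesturing at the classic point from the sequences that a calibrated reasoner can approach, but never reach, probability 0 or 1.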

Answer: Discussing Updates

See the Updates Thread. Basically: take note of the belief updates you perform and discuss why you performed them. What did you previously believe, what do you currently believe, and why did the data you observed move you from there to here?
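
If you want to make the "why" quantitative, the odds form of Bayes' rule is natural bookkeeping for an update. A minimal sketch, with numbers that are purely my own illustration:

    # Odds form of Bayes' rule: posterior odds = prior odds * likelihood ratio.
    prior_prob = 0.20                            # what I believed before
    prior_odds = prior_prob / (1 - prior_prob)   # 0.25, i.e. 1:4
    likelihood_ratio = 8.0        # data is 8x likelier if the belief is true
    posterior_odds = prior_odds * likelihood_ratio
    posterior_prob = posterior_odds / (1 + posterior_odds)
    print(f"{prior_prob:.0%} -> {posterior_prob:.0%}")  # 20% -> 67%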

Answer: Betting With Real Money

From the end of Inadequate Equilibria:

I don’t have good, repeatable exercises for training your skill in this field, and that’s one reason I worry about the results. But I can tell you this much: bet on everything. Bet on everything where you can or will find out the answer. Even if you’re only testing yourself against one other person, it’s a way of calibrating yourself to avoid both overconfidence and underconfidence, which will serve you in good stead emotionally when you try to do inadequacy reasoning. Or so I hope.

Eliezer seems to be referring to real money here. And I recall him talking elsewhere about how it is useful to put real money on the line.

This meshes with my experiences playing poker. It's one thing to study and learn that X is a mistake. It's another thing to make the mistake of X and lose a big pot because of it. There's something about losing real money that cements it in your head. And I'm not just referring to my own experiences. From talking to other poker players, it seems that this is the norm.

However, real money is a touchy subject and I'm not sure how we would actually pull this off. But I figure that there is still value in bringing it up.

Making bets is a good exercise too. If you can't find other people to bet with, you can also make public predictions.
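
As a rough illustration of how public predictions could be scored for calibration, here is a minimal sketch (the sample predictions and the choice of Brier scoring are my own assumptions, not from the answer above):

    # Minimal calibration check over a list of public predictions.
    # Each entry: (stated probability the event happens, whether it did).
    predictions = [
        (0.9, True),
        (0.7, False),
        (0.6, True),
        (0.8, True),
    ]

    # Brier score: mean squared error between stated probability and outcome.
    # 0.0 is perfect; always guessing 50% scores 0.25.
    brier = sum((p - (1.0 if happened else 0.0)) ** 2
                for p, happened in predictions) / len(predictions)
    print(f"Brier score: {brier:.3f}")  # 0.175 for the sample above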

Answer: Fermi Estimates

Fermi estimates are attempts to answer a quantitative question using order-of-magnitude style reasoning. These are questions like "How many people fly on airplanes each day?" or "How many atoms are in my arm?". In contrast to things like calibration practice, these are much more generative, attempting to tie together parts of your world model to come up with a model that answers a question.
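
To make that concrete, here is a minimal worked example for the airplane question (the input numbers are my own rough assumptions, purely illustrative):

    # Fermi estimate: "How many people fly on airplanes each day?"
    # Both inputs are order-of-magnitude guesses.
    commercial_flights_per_day = 100_000  # roughly the pre-2020 global figure
    passengers_per_flight = 100           # regional jets ~50, widebodies ~300

    passenger_trips_per_day = commercial_flights_per_day * passengers_per_flight
    print(f"~{passenger_trips_per_day:,} passenger-trips per day")  # ~10,000,000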

On LessWrong, this could be practically implemented by having a set of 100-1000 questions that users can do either in a weekend blitz, or spaced out over time. A user who got 100 correct (within a factor of 2x) could have a sign on their profile indicating that they completed this task. It could also be implemented as a daily/weekly question for users to answer and then compare notes on.
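
Grading "within a factor of 2x" would be straightforward to automate; a sketch (the function name and defaults are mine, hypothetical):

    def within_factor(estimate: float, truth: float, factor: float = 2.0) -> bool:
        """True if the estimate lies between truth/factor and truth*factor."""
        return truth / factor <= estimate <= truth * factor

    print(within_factor(1.2e7, 1.0e7))  # True: within 2x of the true value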

When I first read the sequences, I thought "What do I know and how do I think I know it?" was pretty banal and useless -- didn't everyone know that? Philosophy 101, question your beliefs, look for hidden assumptions, etc.

The older I get the more I come to think that no, not everyone knows this, and even the people who know it don't practice it enough. I'm not sure though.