[ Question ]

What should experienced rationalists know?

by deluks917 · 2 min read · 13th Oct 2020 · 17 comments

The obvious answer is 'the Sequences', but in my opinion that is neither necessary nor sufficient. The Sequences are valuable, but they are quite old at this point, and they run to over a million words (though Rationality: A-Z is only ~600k). Here is a list of core skills and ideas:

 

1 - Rationality Techniques

Ideally, an experienced rationalist would have experience with most of the CFAR manual. Anyone trying to learn this material needs to actually try the techniques; theoretical knowledge is not enough. If I had to make a shorter list of techniques I would include:

  • Double Crux / Internal DC
  • Five-minute Timers
  • Trigger Action Plans
  • Bucket Errors
  • Goal/Aversion Factoring
  • Gears Level Understanding
  • Negative Visualisation / Murphy-Jitsu
  • Focusing

 

2 - AI Risk: Superintelligence

The rationality community was founded to help solve AI risk. Superintelligence gives an updated and complete version of the 'classic' argument for AI risk. Superintelligence does not make as many strong claims about takeoff as Eliezer's early writings. This seems useful given that positions closer to Paul Christiano's have been gaining prominence. I think the 'classic' arguments are still very much worth understanding. On the other hand, Superintelligence is ~125K words and not easy reading.

I think many readers can skip the first few chapters. The core argument is in chapters five through fourteen.

  • 5. Decisive strategic advantage
  • 6. Cognitive superpowers
  • 7. The superintelligent will
  • 8. Is the default outcome doom?
  • 9. The control problem
  • 10. Oracles, genies, sovereigns, tools
  • 11. Multipolar scenarios
  • 12. Acquiring values
  • 13. Choosing the criteria for choosing
  • 14. The strategic picture

 

3 - Cognitive Biases: Thinking Fast and Slow

Priming is the first research area discussed in depth in TF&S. Priming seems to be almost entirely BS. I would suggest skipping the chapter on priming, and keep in mind that the book's discussion of the 'hot hand fallacy' also seems incorrect. Another potential downside is the length (~175K words). However, I don't think there is a better source overall. Many of the concepts in TF&S remain fundamental, the writing is quite good, and the historical value is extremely high. Here is a quick review from 2016.

 

4 - Statistics

It is hard to be an informed rationalist without a basic understanding of Bayesian thinking. You need to understand frequentist statistics to evaluate a lot of relevant research. Some of the most important concepts/goals are listed below.

Bayesian Statistics:

  • Illustrate the use of odds ratio calculations in practical situations (see the sketch after this list)
  • Derive Laplace's rule of succession
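
To make the odds-ratio bullet concrete, here is a minimal sketch (my own illustration, not from the post; the disease-test numbers are invented) of a Bayes update in odds form, plus Laplace's rule of succession:

```python
# Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
def posterior_odds(prior_odds: float, likelihood_ratio: float) -> float:
    return prior_odds * likelihood_ratio

# Invented example: 1% base rate (prior odds 1:99); a test that is 90%
# sensitive with a 5% false-positive rate has likelihood ratio 0.9 / 0.05 = 18.
post = posterior_odds(0.01 / 0.99, 0.9 / 0.05)
print(f"P(disease | positive test) = {post / (1 + post):.3f}")  # ~0.154

# Laplace's rule of succession: with a uniform prior over the success rate,
# P(next trial succeeds | s successes in n trials) = (s + 1) / (n + 2).
def rule_of_succession(successes: int, trials: int) -> float:
    return (successes + 1) / (trials + 2)

print(rule_of_succession(9, 10))  # 10/12 ~ 0.833, not the naive 0.9
```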

Frequentist Stats - Understand the following concepts:

  • Law of large numbers
  • Power, p-values, t-tests, z-tests (see the simulation sketch after this list)
  • Linear Regression
  • Limitations of the above concepts
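
A small simulation makes power and p-values tangible. Here is a minimal sketch (my own illustration; the sample size and effect size are invented) that estimates the power of a two-sample t-test by counting how often it rejects at alpha = 0.05:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, effect, alpha, n_sims = 30, 0.5, 0.05, 2000  # illustrative parameters

rejections = 0
for _ in range(n_sims):
    control = rng.normal(0.0, 1.0, n)
    treatment = rng.normal(effect, 1.0, n)  # true effect: 0.5 standard deviations
    _, p_value = stats.ttest_ind(control, treatment)
    rejections += p_value < alpha

# With n=30 per group and d=0.5, power is only ~0.47: about half of such
# real effects get missed, which is one of the limitations worth internalizing.
print(f"Estimated power: {rejections / n_sims:.2f}")
```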

 

5 - Signalling / The Elephant in the Brain

The Elephant in the Brain is a clear and authoritative source, and the ideas discussed have certainly been influential in the rationalist community. But I am not sure what epistemic status the community assigns to the Hanson/Simler theories around signaling. Any opinions? For reference, here are the topics.

PART I Why We Hide Our Motives 

  • 1 Animal Behavior
  • 2 Competition
  • 3 Norms
  • 4 Cheating
  • 5 Self-Deception
  • 6 Counterfeit Reasons

PART II Hidden Motives in Everyday Life 

  • 7 Body Language
  • 8 Laughter
  • 9 Conversation
  • 10 Consumption
  • 11 Art
  • 12 Charity
  • 13 Education
  • 14 Medicine
  • 15 Religion
  • 16 Politics
  • 17 Conclusion

 

What am I missing? Try to be as specific as possible about what exactly should be learned. Some possible topics discussed in the community include:

  • Economics
  • The basics of the other EA cause areas and general theory? (at least the stuff in 'Doing Good Better')
  • Eliezer says to study evolutionary psychology under the eleventh virtue (scholarship), but I have not been impressed with evo-psych.
  • Something about mental tech? Maybe mindfulness, Internal Family Systems, or circling? I am not confident anything in this space fits.

5 Answers

Econ in general, no. The specific model of rational actors optimizing for outcomes, the intuition for why markets often succeed at delivering on desires (at least for those with money), and the practice of making multi-stage models and following an impact in one area through to others, yes. Nobody needs macro. Lots of people need to reflexively think about how other actors in a system respond to a change, and econ is one of the more effective ways of teaching this. It is critical if you want a good understanding of multipolar scenarios. What I'm talking about is rarely adequately understood just by studying game theory, which usually analyzes games too simple to stick in students' minds, but it matters.

In addition to understanding statistics on a theoretical level, it seems important to have an intuitive understanding that getting the right numbers yields real benefits, and that the right numbers are very precisely defined.

These are both softer skills, but I think that they're important.

This is an excellent question. Here are some of the things I consider personally important.

Regarding probability, I recently asked the question: Why is Bayesianism Important? I found this Slatestarcodex post to provide an excellent overview of thinking probabilistically, which seems way more important than almost any of the specific theorems.

I would include basic game theory - prisoner's dilemma (sketched below), tragedy of the commons, multi-polar traps (see Meditations on Moloch for this latter idea).
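
For readers who haven't seen it worked through, here is a minimal sketch (my own illustration, with the usual invented prison-term payoffs) of why defection dominates in the one-shot prisoner's dilemma:

```python
# payoffs[(my_move, their_move)] = my years in prison (lower is better)
payoffs = {
    ("cooperate", "cooperate"): 1,
    ("cooperate", "defect"):    10,
    ("defect",    "cooperate"): 0,
    ("defect",    "defect"):    5,
}

for their_move in ("cooperate", "defect"):
    best = min(("cooperate", "defect"), key=lambda m: payoffs[(m, their_move)])
    print(f"If they {their_move}, my best response is to {best}")

# Both players reason this way and land on (defect, defect): 5 years each,
# even though mutual cooperation (1 year each) is better for both. The same
# dominance logic, multiplied across many actors, drives the tragedy of the
# commons and Moloch-style multi-polar traps.
```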

In terms of decision theory, there's the basic concept of expected utility, decreasing marginal utility (a toy example below), and the Inside/Outside views.
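
As a toy example (mine, assuming logarithmic utility), decreasing marginal utility is why an expected-utility maximizer can decline a bet that is fair in expected wealth:

```python
import math

# Expected utility: sum of probability * utility over (probability, wealth) outcomes.
def expected_utility(outcomes, utility):
    return sum(p * utility(w) for p, w in outcomes)

wealth = 100.0
fair_bet = [(0.5, wealth + 50), (0.5, wealth - 50)]  # expected wealth stays 100

print(expected_utility(fair_bet, math.log))  # ~4.462
print(math.log(wealth))                      # ~4.605: keeping 100 beats the bet
```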

I think it's also important to understand the limits of rationality. I've written a post on this (pseudo-rationality), there's Barbarians vs. Bayesians, and there are these two posts by Scott Alexander: Seeing Like a State and The Secret of Our Success. Thinking, Fast and Slow has already been mentioned.

The Map is Not the Territory revolutionised my understanding of philosophy and prevented me from ending up in stupid linguistic arguments. I'd suggest supplementing this by understanding how Conceptual Engineering avoids the plague of counterexample philosophy prevalent in conceptual analysis (Wittgenstein's conception of meanings as Family Resemblances is useful too - Eliezer talks about the cluster structure of thingspace).

Most normal people are far too ready to dismiss hypothetical situations. While Making Beliefs Pay Rent can lead to a naïve kind of logical positivism if taken too far, it is in general a good heuristic. Where Recursive Justification Hits Bottom argues for a kind of circular epistemology.

In terms of morality, Torture vs. Dust Specks is a classic.

Pragmatically, there's the Pareto Principle (or 80/20 rule) and I'll also throw in my posts on Making Exceptions to General Rules and Emotions are not Beliefs.

In terms of understanding people better there's Inferential Distance, Mistake Theory vs. Conflict Theory, Contextualising vs. Decoupling Norms, The Least Convenient Possible World, Intellectual Turing Tests and Steelmanning/Principle of Charity.

There seems to be increasingly broad agreement that meditation is really important and complements rationality beautifully, insofar as irrationality is more often a result of a lack of control over our emotions than a lack of knowledge. Beyond this, it can provide extra introspective capacities, and meditative practices like circling can help us relate better to other humans.

One of my main philosophical disagreements with people here is that they often lean towards verificationism, while I don't believe the universe has to play nice: things will often be true that we can't actually verify.

I still think that this problem is intractable so long as people refuse to define 'rationality' beyond 'winning'.

https://www.thelastrationalist.com/rationality-is-not-systematized-winning.html

I, in general, try to avoid using the frame of 'rationality' as much as possible precisely because of this intractability. If you talk about things like existential risk, it's clearer what you should know to work on that.

See my caveats in the comment section; those being said, I'd say the most useful thing I read on my path to being something of an "experienced" rationalist was Gödel, Escher, Bach. The older I get, the more I realize how much was in that book that set me down my path. Yes, there were lots of opportunities to get various things wrong and confused along the way, but in the end I think it might be the single best source I've seen for turning someone to the rationalist mindset, if they really grok what it has to say.

Computational complexity theory (if only the rudiments), I think.