There is a tactic in formal debate where you make several independent arguments to support a claim. Any one of the arguments is sufficient to prove your case. The arguments are redundant. Your case wins even if all but one of your arguments is struck down.

Rationality is the opposite of debate. A rationalist should make only the strongest argument.

Suppose I believe "There is no monkey in my closet." There are two arguments I could put forth to support my claim.

  • I live in the Pacific Northwest. Monkeys don't live in the Pacific Northwest.
  • I looked in my closet and observed that there was no monkey in it.

If I were in a formal debate then I would put forth both claims. But I am a rationalist. My primary objective isn't to persuade other people. It is to identify my own reasons for believing things. I want to pinpoint the facts which, if inverted, would change my mind.

Which of my two contentions is stronger? Which bit of evidence, if inverted, would cause me to change my mind?

  • If I discovered that monkeys actually do live in the Pacific Northwest then I would continue to believe there is no monkey in my closet.
  • If I looked in my closet and saw a monkey then I would believe there is a monkey in my closet, ecology be damned.

A rationalist's arguments should be stripped down to the bare essentials. The whole lattice should collapse with the removal of a single argument. If you can't cut an argument down to its cruxes then you haven't identified your cruxes. If you haven't identified your cruxes then you don't know why you believe what you believe.

Statistical Evidence

What about something like "anthropogenic global warming is real"? Doesn't a lot of evidence go into a conclusion like that?

Since I'm not a climate scientist, I yield to the scientific consensus on climate science. My crux is: "The scientific consensus believes anthropogenic global warming is real." If I discovered the scientific consensus disbelieves "anthropogenic global warming is real" then I would change my mind.

I believe there is a scientific consensus in favor of anthropogenic global warming because various trustworthy sources assure me there is one. No single trustworthy source is a crux. If <trustworthy news source> reported there was no scientific consensus then my confidence that there is a scientific consensus would be weakened, but it would not instantly break.

Does this violate "present only your strongest argument"? No. While you should limit arguments to your cruxes, it is acceptable to aggregate lots of evidence into a single statistic. Changing a single statistical datapoint need not invalidate your argument. It is sufficient for a single datapoint to merely weaken the statistic.

Arguments should be cruxy. Data is allowed to be redundant. Put all of your data behind your single cruxiest argument.


8 comments

It is not perfectly clear whether this post is describing what the actual structure of your beliefs should be, or how you should present them. I think I disagree strongly in both cases.

Suppose I believe that my friend Albert has very recently converted to Christianity, because (1) Albert has told me so and (2) our mutual friend Beth tells me he told her so. These are both good evidence. Neither is conclusive; sometimes people make jokes, for instance. Neither is a crux; if it turned out, say, that I had merely had an unusually vivid dream in which Albert told me of his conversion, I would become less sure that it had actually happened but Beth's testimony would still make me think it probably had.

In this situation I could, I guess, say that I believe in Albert's conversion "because Albert and Beth both told me it was so". But that's purely artificial; one could do that with any set of justifications for a belief. And merely contradicting this belief would not suffice to change my opinion; removing either Albert's or Beth's testimony, but not the other, would falsify "both told me" but I would still believe it.

This is unusual mostly in being an artificially clean and simple case. I think most beliefs, or at any rate a large fraction, are like this. A thing affects the world in several ways, many of which may provide separate channels by which evidence reaches you.

This is true even in what's maybe the cruxiest of all disciplines, pure mathematics. I believe that there are infinitely many prime numbers because of Euclid's neat proof. But mathematics is subtle and sometimes people make mistakes, and maybe you could convince me that there's a long-standing mistake in that proof that somehow I and every other mathematician had missed. But then I would probably (it might depend on the nature of the mistake) still believe that there are infinitely many prime numbers, because there are other quite different proofs, like the one about the divergence of the sum of the reciprocals of the primes, or one that proves an actual lower bound on how many primes there are, or the various proofs of the Prime Number Theorem. To some extent I would believe it merely because of the empirical evidence of the density of prime numbers, which (unlike, say, the distribution of zeros of the zeta function, the empirical evidence of which is also evidence that there are infinitely many primes) seems to be of a very robust kind. To make me change my mind about there being infinitely many prime numbers, the proposition you would have to refute is something like "mathematics is not all bullshit".
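For reference, Euclid's argument mentioned above can be sketched in a few lines (a standard presentation of the proof, not tied to any particular source):

```latex
\textbf{Claim.} There are infinitely many primes.

\textbf{Proof sketch.} Given any finite list of primes $p_1, \dots, p_n$, let
\[ N = p_1 p_2 \cdots p_n + 1. \]
Each $p_i$ divides $N - 1$, so none of them divides $N$. But $N > 1$ has some
prime factor, which therefore lies outside the list. Hence no finite list
contains all the primes. \qed
```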

(Sometimes a thing in pure mathematics has only a single known proof, or all the known proofs work in basically the same way. In that case, there may be an actual crux. But for theorems people actually care about this state of affairs often doesn't last; other independent proofs may be found.)

Outside mathematics things are less often cruxy, and I think usually sincerely so.

Finding cruxes is a useful technique, but there is not the slightest guarantee that there will be one to find.

Perhaps one should present one's beliefs cruxily even when they aren't actually cruxy, either in order to give others the best chance of presenting mind-changing evidence or to look open-minded? I don't think so; if your beliefs are not actually cruxy then lying about them will make it less likely that your mind gets changed when the evidence doesn't really support your current opinion, and if you get caught it will be bad for your reputation.

I think a unified approach could be "you should provide the right amount of evidence -- too little will leave people unconvinced, but too much will waste their time".

If you need 10 bits of evidence to convince someone (number completely made up), and you have two pieces of evidence providing 5 bits each, you need to provide them both. If you have a piece of evidence providing 10 bits, and five more pieces providing 1 bit each, just provide the strongest one. (Otherwise there is a risk that the person will consider the five weakest pieces first and say "sorry, I remain unconvinced, and I am out of the time I decided to spend on this topic".)
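The arithmetic here can be sketched in odds form: bits are log2 likelihood ratios, so independent pieces of evidence add in log-odds space. The numbers below are this comment's made-up figures, and `update_odds` is a hypothetical helper written just for the illustration:

```python
def update_odds(prior_odds, evidence_bits):
    """Multiply prior odds by 2^bits of evidence.

    Bits are log2 likelihood ratios, so independent pieces of
    evidence simply add before being applied to the odds.
    """
    total_bits = sum(evidence_bits)
    return prior_odds * 2 ** total_bits

prior = 1.0  # 1:1 odds, i.e. 50% probability

# Two 5-bit pieces reach the (made-up) 10-bit threshold...
strong_pair = update_odds(prior, [5, 5])         # 1024:1 odds

# ...as does a single 10-bit piece alone, while the five 1-bit
# pieces together only supply 5 bits.
single_strong = update_odds(prior, [10])         # 1024:1 odds
weak_five = update_odds(prior, [1, 1, 1, 1, 1])  # 32:1 odds

print(strong_pair, single_strong, weak_five)
```

The point survives the toy model: the five weak pieces cost the listener five separate evaluations while moving the odds far less than the one strong piece.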

Generally, start with the strongest evidence, and continue until the other person is convinced or you run out of time.

I disagree.  Reality is self-correlated, and many facts can lead to the same conclusion, which different people weigh differently.  Also, there may have been many updates to your prior which led you to a belief, all of which are important.   FINDING the crux takes work and trying out different dimensions of evidence.  Your crux may well be different from your partner's.

There may be some contexts where you have to compress your reasons for your beliefs in order to save time or conversational bandwidth.  That's fine, but depending on the dialog you may want to go deeper than the single best reason for a belief.

The problem with "I looked in my closet and observed that there was no monkey in it" is that it's based on subjective evidence that nobody else can check. 

Let's say I want to convince you that it's possible to cure allergies with hypnosis. I can tell you that I did a process with a friend who had an allergy to cats, and that the next time he was exposed to a cat he didn't show symptoms, and that I also have a friend who has done the process multiple times successfully with other people.

Together, that's quite strong evidence for me. On the other hand, I don't expect anybody on LessWrong to be convinced by the preceding paragraph. To make a case that convinces others I would actually seek more objective evidence that doesn't rely on my own experience and that readers can independently verify.

Often the best evidence is having seen how something works for yourself but unfortunately, that's not easy evidence to convey to other people.

In some cases, real-world reasons may be multifaceted.  If you present to me a plan that has three hundred separate issues that might foil it, each of which has an independent 5% probability of occurring, it may be hard for me to state a single reason why I don't think your plan will work.
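With the hypothetical numbers in this comment (300 independent issues at 5% each), the plan is nearly certain to hit at least one of them, even though no single issue is likely. A quick sketch:

```python
# Probability that a plan survives 300 independent failure modes,
# each with a 5% chance of occurring (the made-up numbers above).
n_issues = 300
p_each = 0.05

p_all_avoided = (1 - p_each) ** n_issues   # about 2e-7
p_at_least_one = 1 - p_all_avoided

print(f"P(plan survives every issue) = {p_all_avoided:.2e}")
print(f"P(at least one issue occurs) = {p_at_least_one:.6f}")
```

So there is no crux: refuting any one of the 300 issues barely changes the conclusion.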

I think the main harm to be avoided here is "clutter": a very low ratio of beliefs to language used. The clutter could come from repeatedly no-true-Scotsmanning away from a defeated position, from making overtly disjunctive claims, or from any such bias.

However, I think the important thing there is that the claim is central rather than the strongest. If your main reason to believe something is weak, that is not an excuse for not going with it. If you have a lot of non-impactful technicalities that are easily defended but your real crux is weak, a frankness-seeking conversation will put the weak crux forward.

I think having single claims on which truth or belief hinges violates conservation of evidence. Some things being reasons to believe something doesn't mean they are reasons equally. The sin is in burying a high-weight claim/factor under or over a low-weight one. If you are asked to list 3 reasons why you believe a claim and you list your 4th, 5th and 6th, that hides the true cruxes. But if you give only 1 and claim that it would be erroneous to have a 2nd and 3rd, you are committing a kind of black-and-whiteness that erases nuance you could easily be aware of.

Say that the claim was that there is a unicorn in my closet. Then even if I "saw a unicorn" in my closet I would still think it quite likely that it is an animatronic costume or a fraudulently dressed-up horse, even if I can't come up with any more striking or "direct" evidence in the direction of there being a unicorn.

While it can be an error not to have considered some things, I do think that "mootness structures", i.e. not having really thought about some things, are real. In those cases you only start to think about a question once the base claims that make it meaningful come to be believed. Expecting people to provide the hinge question for all of their claims implicitly assumes they have thought through the logical structure of their beliefs. Logical omniscience is nice, but it is also hard. So in discussion, "surprising implications" are not a sign of laziness or dishonesty per se. People who make non-central claims on deeply debated topics, or in fields they should know, are being deceptive, because they talk about the aspects they know/feel they are right about rather than the parts they know, or should know, they are wrong about. This is, in effect, a failure to apply mootness. If people knew, or were aware of, the more central stuff, they would not be motivated to talk about the fringe stuff. But with some attention control we end up talking about stuff that should be moot.

Beliefs are quantitative, not qualitative. The more evidence you pile up in favor of a claim, the stronger your confidence in it should be. Observing that there is no monkey is much stronger evidence than the geography-based argument, and it's probably enough, but belief is not binary, so having both arguments should result in a higher probability assigned to the claim than having just one, no matter how much stronger that single argument is.

In practice, think about it this way: what if the monkey heard you coming and managed to hide so well that you couldn't find it even after looking? This is a very unlikely scenario, but still a possibility - and it's less likely to happen in the Pacific Northwest than in, say, India. So the geographic argument reduces the probability of the hidden-monkey scenario - even if only by a little bit - and thus increases the overall probability of a monkeyless closet.
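This aggregation can be sketched in odds form: independent pieces of evidence multiply the odds, so even a weak argument nudges the posterior upward. The likelihood ratios below are invented purely for illustration, and `posterior` is a hypothetical helper, not a library function:

```python
def posterior(prior_p, likelihood_ratios):
    """Posterior probability after applying each likelihood ratio
    (odds form of Bayes' theorem; assumes independent evidence)."""
    odds = prior_p / (1 - prior_p)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

prior = 0.5          # agnostic prior on "no monkey in my closet"
geography_lr = 20    # invented: the weaker, geography-based argument
looked_lr = 1000     # invented: looking in the closet is much stronger

only_looked = posterior(prior, [looked_lr])
both = posterior(prior, [geography_lr, looked_lr])

print(f"looked only: {only_looked:.6f}")   # ~0.999001
print(f"both:        {both:.6f}")          # ~0.999950
```

Looking alone is "probably enough", but stacking the geographic argument on top still raises the posterior a little, which is the commenter's point.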

(Your first point is not very convincing; it could easily be a dead monkey anyway. "I have not put a monkey in my closet" seems a better choice?)
