Rationality Quotes September 2014

  • Please post all quotes separately, so that they can be upvoted or downvoted separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
  • Do not quote yourself.
  • Do not quote from Less Wrong itself, HPMoR, Eliezer Yudkowsky, or Robin Hanson. If you'd like to revive an old quote from one of those sources, please do so here.
  • No more than 5 quotes per person per monthly thread, please.
  • Provide sufficient information (URL, title, date, page number, etc.) to enable a reader to find the place where you read the quote, or its original source if available. Do not quote with only a name.

A good rule of thumb might be, “If I added a zero to this number, would the sentence containing it mean something different to me?” If the answer is “no,” maybe the number has no business being in the sentence in the first place.

Randall Munroe on communicating with humans

Related: When (Not) To Use Probabilities:

I would advise, in most cases, against using non-numerical procedures to create what appear to be numerical probabilities. Numbers should come from numbers. (...) you shouldn't go around thinking that, if you translate your gut feeling into "one in a thousand", then, on occasions when you emit these verbal words, the corresponding event will happen around one in a thousand times. Your brain is not so well-calibrated.

This specific topic came up recently in the context of the Large Hadron Collider (...) the speaker actually purported to assign a probability of at least 1 in 1000 that the theory, model, or calculations in the LHC paper were wrong; and a probability of at least 1 in 1000 that, if the theory or model or calculations were wrong, the LHC would destroy the world.

I object to the air of authority given these numbers pulled out of thin air. (...) No matter what other physics papers had been published previously, the authors would have used the same argument and made up the same numerical probabilities
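Taken at face value, the two stated bounds imply a joint risk of at least one in a million; the quoted post objects to the numbers rather than combining them, so the arithmetic below is only an illustration:

    # Combining the speaker's two stated lower bounds -- an illustration,
    # not a step taken in the quoted post:
    p_paper_wrong = 0.001          # P(theory/model/calculations wrong)
    p_doom_given_wrong = 0.001     # P(LHC destroys the world | paper wrong)
    print(p_paper_wrong * p_doom_given_wrong)   # 1e-06, i.e. at least one in a million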

For the opposite claim: If It’s Worth Doing, It’s Worth Doing With Made-Up Statistics:

Remember the Bayes mammogram problem? The correct answer is 7.8%; most doctors (and others) intuitively feel like the answer should be about 80%. So doctors – who are specifically trained in having good intuitive judgment about diseases – are wrong by an order of magnitude. And it “only” being one order of magnitude is not to the doctors’ credit: by changing the numbers in the problem we can make doctors’ answers as wrong as we want.

So the doctors probably would be better off explicitly doing the Bayesian calculation. But suppose some doctor’s internet is down (you have NO IDEA how much doctors secretly rely on the Internet) and she can’t remember the prevalence of breast cancer. If the doctor thinks her guess will be off by less than an order of magnitude, then making up a number and plugging it into Bayes will be more accurate than just using a gut feeling about how likely the test is to work. Even making up numbers based on basic knowledge like “Most women do not have breast cancer at any given time” might be enough to make Bayes Theorem outperform intuitive decision-making in many cases.
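For reference, here is a minimal sketch of the Bayes calculation Yvain is referring to, assuming the usual numbers for the mammogram problem (1% prevalence, 80% sensitivity, 9.6% false-positive rate), which the quoted passage does not restate:

    def posterior(prior, sensitivity, false_positive_rate):
        """P(cancer | positive mammogram) via Bayes' theorem."""
        true_positive = sensitivity * prior
        false_positive = false_positive_rate * (1 - prior)
        return true_positive / (true_positive + false_positive)

    print(posterior(0.01, 0.80, 0.096))  # ~0.078, i.e. the 7.8% answer
    print(posterior(0.05, 0.80, 0.096))  # ~0.30 -- a prior guessed 5x too high
                                         # still beats the intuitive ~80%

Even with the prior badly misjudged, the posterior stays much closer to the true 7.8% than the intuitive answer of about 80%, which is Yvain's point about made-up numbers plus Bayes outperforming gut feeling.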

I tend to side with Yvain on this one, at least so long as your argument isn't going to be judged by its appearance. Specifically on the LHC question, I think making up the 1-in-1000 figures makes it possible to argue substantively about the risks in a way that "there's a chance" doesn't.