We use heuristics when we don't have the time to think more, which is almost all the time. So why don't we compile a big list of good quality heuristics that we can trust? (Insert eloquent analogy with mathematical theorems and proofs.) Here are some heuristics to kick things off:

Make important decisions in a quiet, featureless room. [1]

Apply deodorant before going to bed rather than any other time. [1]

Avoid counterfactuals and thought experiments when talking to other people. [Because they don't happen in real life. Not in mine at least (anecdotal evidence). For example, with the trolley problem, I would not push the fat man because I'd be frozen in horror. But what if you wouldn't be? But I would! And all too often the teller of a counterfactual abuses it by crafting it so that the other person has to give either an inconsistent or unsavory answer. (This proof is a stub. You can improve it by commenting.)]

If presented with a Monty Hall problem, switch. [1]

Sign up for cryonics. [There are so many. Which ones to link? Wait, didn't Eliezer promise us some cryonics articles here in LW?]

In chit-chat, ask questions and avoid assertions. [How to Win Friends and Influence People by Dale Carnegie]

When in doubt, think what your past and future selves would say. [1; also, there was an LW article about a prince with multiple personality disorder chaining himself to his throne that I can't find. Also, I'm not sure if I should include this because it's almost Think More.]

I urge you to comment on my heuristics and add your own. One heuristic per comment. Hopefully this takes off and turns into a series of wiki pages. Edit: We should concentrate on heuristics that save time, effort, and thought.


Decisions with a utility equivalent of less than $0.50 should be made after at most 10 seconds, by coin flip if necessary.

Your time is more valuable.

2RobinZ15y
That corresponds to valuing a marginal increment of your time at $180/hr, which seems a bit high - the base concept makes sense, though.
1jimmy15y
Not quite. Say your time is worth $90/hr. If you spend 20 seconds thinking about the answer, you've done worse than instantly picking at random. You've done just as badly as instantly picking the wrong answer. If it's worth spending any time at all thinking about the question, it's worth spending considerably less than 20 seconds. On a binary question, you should spend <10 seconds even if you approach certainty at t = 10s (at $90/hr). Depending on your certainty/time profile, it could be even less.
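The arithmetic both commenters are running is easy to make explicit. A minimal sketch; the $90/hr and $180/hr rates are the commenters' own assumptions, not data:

```python
# Break-even deliberation time: how many seconds of thought cost as much
# as the entire expected value at stake?

def break_even_seconds(stakes_dollars, time_value_per_hour):
    """Seconds of deliberation whose cost equals the full stakes of the decision."""
    return stakes_dollars / (time_value_per_hour / 3600.0)

print(break_even_seconds(0.50, 180.0))  # 10.0 s: RobinZ's implied rate
print(break_even_seconds(0.50, 90.0))   # 20.0 s: jimmy's $90/hr example
```

At $90/hr, 20 seconds of thought costs the entire $0.50 at stake, which is why the cutoff on a binary question should be well under 10 seconds.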

In a field in which you personally are not an expert, the closest you can come to the truth is to accept the opinion of the majority of the experts in the age in which you live.

(Courtesy of my father.)

4Z_M_Davis15y
The problem is that the very fact that experts are listened to and respected creates incentives to become certified as an expert or to claim to be an expert, and that these incentives are non-truth-tracking. If you lust for fame and glory, or you want to seem original, or if you have a political agenda, or if you're worried about what your publisher thinks will sell---all these sorts of things might help your bid to be certified as an expert or hinder it, but they're not directly about the map that reflects the territory, and everything that's not directly about the map that reflects the territory just adds noise to the process. In a physical science with conclusions nailed down for decades, sure, don't even think about questioning the consensus. But on an issue people actually care about (sorry, physics nerds, but you know what I mean), if you have a concept of epistemic rationality, and you know about Aumann agreement and updating your beliefs on other people's beliefs as a special case of updating your beliefs on any data you get from your environment and you take all of this dead seriously, and you've read the existing literature, and you've spent many, many hours thinking about it, and you still find yourself disagreeing with the consensus---I'm not going to say you should forfeit your vision. You can't trust the mainstream, because the mainstream is insane. The fact that you're insane too doesn't mean you can just trust the authorities; it means you have to lower your confidence in everything. But please---don't take my word for it!
3Paul Crowley15y
I agree, except that a non-expert needs to have some rules by which they can distinguish fields which really do have experts (e.g. climate science) from those that don't (e.g. theology).
5FiftyTwo12y
Within theology, I will accept the views of theology professors (e.g. about the exact nature of the trinity) but not on the assumptions their field depends (e.g. whether God exists).
3FAWS12y
Weren't things like the current positions on the trinity arrived at by political processes, including persecution of dissenters as heretics? Why should such positions be expected to be more likely to be true, even assuming divine beings? Or do you mean you will accept their expert opinion on what the church positions are, their history, and so on?
5FiftyTwo12y
I am presuming a theology professor is more likely to have a view based on the arguments (if you can call them that) and textual evidence than a random member of the public or follower of that religion. (A priest would be a different matter, as they have a strong investment in the doctrine.) Analogously, I will likely trust the opinion of the head of a Harry Potter fandom group, who has likely been involved in debates on the topic, about some point of the minutiae of Harry Potter lore (how old his parents were when they died, for example). But that doesn't entail accepting the premise 'Harry Potter is real.' Edit: Upon more thought, I think the issue may be that I was working from the premise "Theology professors are not invested emotionally in the results of a debate, but argue based on theory and textual evidence" which, while it has been my experience, may not be universal and may not be a premise you share.
2FAWS12y
I'd trust the head of a Harry Potter fandom group to get questions about the fictional character Harry Potter right, but not for questions about a hypothetical culture of real wizards, even if someone were claiming the books to have been based on such.
2FiftyTwo12y
But (assuming for the sake of argument the books count as documentary evidence) would you say they had a higher probability of being right than 'someone who had read the books once' or 'someone who had never read the books'? Or would you expect them all to be equally likely to be right or wrong?
0FAWS12y
Someone who has read the books, but isn't a fan > a dedicated fan > someone who never read the books. I'd expect dedicated fans to over-count the books as evidence and to not give very different scenarios enough consideration, or fail to think of them at all.
0FiftyTwo12y
But surely they are also more likely to have inconsistent beliefs that a person who had engaged in discussion wouldn't have? (E.g. misunderstanding a section in a way that could easily be noticed in discussion.) Analogously, very few theology professors believe in the literal creation story, for obvious reasons, and are likely to have slightly more coherent conceptions of free will/sin/miracles.
3CronoDAS15y
An expert who disagrees with the majority opinion in his field is an iconoclast. Such experts are usually at least acknowledged as experts by other experts, and, sometimes, their opinions turn out to be right all along. A layman who disagrees with the majority of the experts in a field is a crank, and cranks that turn out to be right are rarer than winning lottery tickets.
2John_Maxwell15y
I'm not convinced that the initial disclaimer is necessary. Would it make sense for a non-expert to base his opinion on that of one expert or a large group? Why does it make sense for an expert to base his opinion on his perceptions only instead of looking at his entire group?
1CronoDAS15y
Note that a relevant application of this heuristic would be global warming.
3SilasBarta15y
Warning: once you couple an argument to a current political debate, people quickly lose their ability to think rationally about it...
2RobinZ15y
Only if they don't make their saving throw. Dawkins gets a lot of deconversion stories in his email.
1Vladimir_Nesov15y
Yet to understand the opinions on any nontrivial question, you have to become enough of an expert yourself to have at least some say in judging the validity of experts' opinions.
1Cyan15y
The heuristic seems to be about what to do when understanding the opinions of experts is not a practical option.

If you don't know what you need, take power. [1, Power can be converted into almost everything else. Also, money is power.]

3FiftyTwo12y
Does the quote have any origin beyond "Final Words"? I started there, but the search only brought me here. I would be interested in more discussion around it before I adopt it as a method.
1John_Maxwell15y
Certain kinds of knowledge are also power. Intelligence is power; you can build intelligence through dual-n-back.
2Mike Bishop15y
I believe this has some effect on some type of intelligence, but I remain unconvinced that the boost is large enough and generalizable enough that it's worth the opportunity cost.
1John_Maxwell15y
Quote from brainworkshop.sourceforge.net: Fluid intelligence is considered one of the two types of general intelligence. The other is crystallized intelligence. See http://en.wikipedia.org/wiki/Fluid_and_crystallized_intelligence
7Z_M_Davis15y
*sputters* What does that even mean? I know what it means for a rock to be 40% heavier than some other rock, or for a car to be travelling 40% faster than some other car, and I know what it means to go from the fiftieth percentile to the ninetieth percentile, but saying that subjects got 40% more items right on some particular test tells me nothing useful; we only care about the test insofar as it gives us evidence about this intelligence-thingy, and the raw score gives me no basis for comparison. Looking at the actual PNAS paper (hoping that I'm competent to read it), it looks like the experimental group saw a gain of 0.65 standard deviations (Cohen's d) on a test of Gf, said figure which actually tells me something---if we assume a Gaussian distribution, then a score in the fiftieth percentile among the untrained would be in the twenty-fifth percentile amongst the trained. (The control group also gained 0.25 standard deviations, probably due to a retest effect.) Huh. d=0.65 is pretty impressive ...
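For anyone who wants to check the percentile arithmetic, a quick sketch (assuming Gaussian scores, as the comment does):

```python
# Where does the untrained median land in the trained distribution, given a
# gain of d standard deviations? An untrained-median scorer sits d SD below
# the trained mean.
from scipy.stats import norm

print(norm.cdf(-0.65))  # ~0.258: the "twenty-fifth percentile" figure
print(norm.cdf(-0.25))  # ~0.401: same calculation for the control group's gain
```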

If it feels like someone won't accept your basic, obviously-true point, the culprit is a communication error.

This is as opposed to what it will feel like in the moment: they are stupid, they are obstinate, they won't listen, etc. If you have no good reason to believe the other party has stopped acting like a reasonable social being, then back up and find the communication error before proceeding. Maybe you are accidentally attaching riders to your point. Maybe they are reading too much into your point. Who knows. But it's probably not that whoever you're talking to suddenly turned into a bad human being, which points to a communication error of some sort.

0Dan_Moore15y
I think this is a good heuristic. However, another possibility is that either you or your discussant is unduly influenced by an informational cascade.

If the situation you are considering is novel, your intuition about it is probably wrong. Use more reliable, if less powerful, tools.

Don't trust simple solutions to old unsolved problems.

(Optional xkcd link.)

3Blueberry14y
Chesterton said something similar about reforming social institutions that you don't understand. I'm not a fan of that first XKCD, though. It seems to suggest that any possibility of alternative sexuality is doomed to failure, whereas many people do form alternative arrangements of various types.
3RobinZ14y
I'll agree with you that alternative sexuality is real and works for many people - but I'm fairly sure (based on other things Randall Munroe has said about e.g. gender) that the xkcd comic is not mocking anything like that. I think it's mocking ... well, the kind of thinking that produced The Open Source Boob Project fiasco.* Something which sounds like a good idea, something which maybe even works at first ... but which has been shown, at the very least, to be unlikely to scale smoothly and gracefully. And it results in drama, obviously. * Ursula Vernon's takedown is fairly good, if you're interested in that kind of thing.
3Blueberry14y
I've read a few different accounts about what occurred with the OSBP, and from what I understand, it was done among a very small number of women who mostly knew each other and were comfortable with each other, or who had agreed to participate by wearing a button, and everyone was very sensitive and careful about consent. So I'm reluctant to call it a "fiasco". It seems like the only people who were uncomfortable with it were the ones who misunderstood it after the fact. Though I wasn't there and don't really know for sure. If you mean sexuality is frequently emotionally complex and often results in drama, I'd agree, but that's true whether you change the rules or not. Relationships are hard, and people have to try to make rules that work for them. It's not as if there's an official book of rules anyway.
2RobinZ14y
I read the followup when I tracked down the link - I don't disagree with you. But, at the very least, the writeup meant that The Ferrett felt obliged to promise not to attend specific future events and to close comments, and that seems to me like more drama than most sexual relationships I've heard of. (I know nearly nothing, mind.)
2FiftyTwo12y
Reading up on it (severely after the fact, admittedly) I found it hard to work out what the problem was. As far as I can tell no one was involved against their will, and those involved were not put under any obligations. If everyone involved was consenting adults, how did it become a 'fiasco'? Did people simply object aesthetically to it happening in the places they were, or were there plans to expand it in some seemingly detrimental way?
2RobinZ12y
The latter - the drama wasn't due to the original event, but due to the suggestion that it be formalized as an Event for the next year. Which, for reasons which were elaborated in many places, would likely have not been successful.
2FiftyTwo12y
But even then, if all participants are consenting adults, who could grope each other informally anyway, who cares?
2Vaniver12y
As for why doing the project again would have been a mistake, asking people for consent is not a cost-free thing, and many such events work far better with fewer participants for reasons both obvious and subtle. The real mistake theferret made was posting about this on the internet. I was involved in a discussion about the OSBP on the xkcd forums when the post happened, and was amazed by the degree of misunderstanding and overreaction among people condemning it. That was the sort of reaction theferret should have seen coming, and kept the project an invite-by-referral thing rather than a public recruiting thing.
0RobinZ12y
The event as proposed did not control sufficiently for "consenting". (Or "adult", for that matter.) That was the exact problem, in fact.
2jimmy15y
I would change that to "easily conceived" instead of "simple", and make sure to distinguish between "unsolved" and "unagreed upon".
1RobinZ15y
Technically, "easily conceived" is more accurate, but the hindsight bias might make that hard to determine.
0RobinZ14y
(Even better optional xkcd link.)

Make important decisions in a quiet, featureless room

Might this prime you to make a quiet, featureless decision?

To be more specific and a little less snarky: I tend to be too socially withdrawn and a bit of a loner. To make a decision about, say, whether or not to go to a party, in a quiet, featureless room, would be a mistake.

Look things up if they are important.

1FiftyTwo12y
Corollary, test them if possible.

Smile.

Also, the post you mention in your last heuristic is here.

Edit: Missed the line about one heuristic per comment.

Don't trust heuristics, unless you can (1) re-derive them, (2) know their limits explicitly, or (3) are willing to accept the risks for the moment, but will reevaluate them later.

The limit of this heuristic is that it relies on self-knowledge, and so is vulnerable to self-deception. It breaks down when we start operating with heuristics for domains where we can no longer trust ourselves as much.

0Vladimir_Nesov15y
I'm not sure that I read your point (3) correctly. One feature of heuristics is that you need to trust them to do a better job than you can do without them. As you refine your understanding of where the heuristics are appropriate, their expected effectiveness increases, but all the way heuristics need to pay rent. The useful side of a heuristic needs to win over the bias part, which is more of an issue for heuristics than for declarative beliefs.
2MendelSchmiedekamp15y
Vladimir, (3) is the branch for when urgency prevents the use of cognitive or data collection resources needed to adequately trust the heuristic under normal circumstances, but that same urgency requires a decision. Loosely speaking, it is the emergency clause. So the heuristic for that branch is to use the best heuristic available under what resources you can muster, and schedule a reevaluation at a later time, to recover from the habit-forming nature of what you might have just done. Of course, many heuristics are far too complex to personally derive or even to fully and explicitly describe their limits (at least in a single evaluation), so instead we need to keep calling (3) to manage them even outside of a proper emergency. What this means is that heuristics "paying rent" is a sub-heuristic of the heuristic I propose here. Of course the overall limit of this heuristic remains (it applies to the "rent charging" dynamic, as well as the other applications). To manage this limitation requires a higher level check (likely itself heuristic) to enable the aspiring rationalist to operate with greater caution in domains where self-trust is less reliable.

If you meet Omega, take one box - the transparent one. [1, Think about it: what is the probability of you meeting an actual Omega versus it being a prank by fellow rationalists? The opaque box probably contains a spring loaded boxing glove.]

4Vladimir_Nesov15y
Related to: The Parable of the Dagger.

YKUTWIDNTIMWYTIM. "Heuristic" is not synonymous with "tip".

Prefer food with fewer than five ingredients on the label.

0GuySrinivasan15y
This is a good heuristic? I wouldn't have guessed it, but then I don't pay a lot of attention to what exactly I eat. Why is this a good idea, do you think? For now I'll adopt your beliefs, but I'd like more evidence. :)
2Alicorn15y
Ideally, you want food with one ingredient (e.g. "cherries" or "peas" or "olive oil" or "oregano") and then you assemble it into multi-ingredient food yourself at home (or in the case of the cherries you can eat the one ingredient by itself). If you need to buy multi-ingredient things, then the fewer ingredients they have, the less likely they are to contain weird pseudo-food like coloring agents, the distressingly vague "natural flavors", more preservatives than you really want in your lunch, etc. This being a heuristic, not a comprehensive meal plan, it has to be simple and easy, so "fewer than five ingredients" is what I said instead of "avoid the following evil food additives". I go into a little more detail in this post on Improv Soup.
1CronoDAS15y
I absolutely cannot stand cooking. :(
3astray15y
Just um... think of it as deck construction? Get your land balance right and you'll have an excellent aggro dish. It sounded like a better suggestion in my head...
3CronoDAS15y
Mostly, I simply have no patience for it. Any minute spent on food preparation is a wasted minute I'll never get back. Even frying an egg is too much trouble for me to bother with, when I could just have a bowl of cold cereal instead. I do like good-tasting food, but not nearly enough to make it myself when I could just grab a slice of cheese or something and continue surfing the Internet instead.
0astray15y
This is a problem I often have myself. I will note that cooking for two ameliorates much of the pain, and cooking with two is even better.
1Alicorn15y
Why not?
0GuySrinivasan13y
Two years later: This is a good heuristic for cooking! Edit: it doesn't always work, especially when trying new atomic ingredients. I'd say stick to things you've at least kind of done before if you're feeding other people.

You're confusing deodorant with antiperspirant.

0CannibalSmith15y
Explain the difference please.
0CronoDAS15y
Deodorant kills odor-causing bacteria, and often contains perfumes and such. Antiperspirant prevents you from sweating in the area in which it is applied, preventing the bacteria from making odors. (Or something like that.) Most of the time, a stick of "deodorant" you buy in the store contains both.
0anonym15y
http://www.wisegeek.com/what-is-the-difference-between-antiperspirant-and-deodorant.htm Executive summary: ANTIperspirant tries to eliminate sweating by blocking pores, deodorant aims to eliminate or hide the bad smell of sweat.

If you previously committed to a decision for good reasons, don't reverse your choice without good reason. (Related to "When in doubt, think what your past and future selves would say", but applies in broader circumstances.)

Do what other people do in your situation.

3orthonormal15y
Better: Imitate what successful people do/did in your situation. Or perhaps: Adopt the most successfully tested strategy; if you think you've figured out something better, ask first why you don't see others doing it already.
2HalFinney15y
The problem is that you are more likely to know how things turned out for successful people than for unsuccessful ones. A policy which has a large chance of disaster but a small chance of great success might appear to be very good under this heuristic, since it worked great for everyone you've heard of.
0orthonormal15y
Excellent point. I was thinking more in terms of social strategies, which don't seem to have devastating black swan outcomes in the way that "guaranteed" gambling or investment strategies do. Is there a pithy way to make that distinction?
2Alicorn15y
I am skeptical that enough people do the best thing enough of the time to make this a good heuristic, even if you ignore the fact that "what other people do in your situation" isn't always available information.
0HalFinney15y
You can also replace "do" with "believe". One interesting question is whether you should believe what the experts do, or what the majority of people do, in situations where they differ. (See CronoDAS's suggestion on this page about believing the experts.)
1Drahflow15y
No, you should not believe what others believe unless they presented serious arguments. Otherwise:

* information cascades
* memes gain strength.

Doing is different here, as it is more costly than believing.
0HalFinney15y
The fact that this policy may contribute to an information cascade is (mostly) a cost to other people rather than a cost to yourself. If your goal is the truth, the presence of this cost is not relevant. The real question is whether the beliefs of others are a reliable guide to the truth, and if not, what is better. Judging the quality of arguments has IMO not been shown to be something that most people can successfully implement - too much opportunity for bias to creep in.
0John_Maxwell15y
I suggest the following revision: If you don't think it's worth your time to analyze your options, choose whatever option people seem to be choosing. Exceptions in the case of situations where too many people choosing one option is bad for all of them (for example, too many people with degrees in y is bad for all of them.)

Tolerate tolerance. [1]

1CronoDAS15y
Nitpick: To what extent should I tolerate tolerance of evil? (For example, I'd condemn a reporter who writes an uncritical piece about some new kind of medical quackery, such as the healing powers of magnets, or what have you.)
1Alicorn15y
Giving something positive publicity is not just tolerating it. You might well criticize the journalist who writes an article about the practitioners of magnetic healing without ever mentioning that it doesn't work under controlled circumstances. You should probably not criticize the guy who never bothers to write about magnets because they don't seem newsworthy.

Given an important decision and unlimited time, think until your thoughts repeat themselves, and no more.

Never decide what to do until you've thought of at least half a dozen alternatives beyond the ones you immediately thought of. [Sometimes the obvious thing is the best, but do it because you actually made that decision.]

3Vladimir_Nesov15y
Link: Hold Off On Proposing Solutions.
2SilasBarta15y
Didn't you just violate that heuristic? Don't you pretty much have to, unless you want to live your life in permanent decision paralysis? Limit it to large, important decisions and I'd agree.
1Richard_Kennaway15y
It's a heuristic. It's up to one's judgement how or whether to apply it in any situation. Myself, I'd draw the line wider than just large, important decisions.
3SilasBarta15y
Yes, it's a heuristic, but that means it needs to be usually correct. Yours is rarely correct. You make numerous decisions throughout the day, such as how to word your comment. Coming up with 6 alternatives to everything would guarantee that you would Lose. But if you're just going to fall back on "but you apply it with your judgment", then you miss the point of a heuristic, which is to assist with your judgment. Why not have just one universal, all-encompassing heuristic: "Use judgment."
1HughRistik15y
Ok, this answers my question above. Perhaps it's useful, when discussing heuristics, to describe the type of problem they are best applied to. The worth of the heuristic doesn't just lie in itself, but also lies in knowing when to apply it.
0HughRistik15y
What types of problems do you expect this heuristic to be successful with? If the problem is something like improvising jazz, it will fail miserably.
1Vladimir_Nesov15y
If it's easy to judge that a given heuristic fails for a certain problem, then the heuristic is not at fault: it can easily be seen not to apply there, and so won't introduce bias in that situation. The trouble lies where you think the heuristic applies but it doesn't.
0Richard_Kennaway15y
Problems that require decisions. I doubt that any of the heuristics mentioned here would have any relevance to jazz improvisation. More generally, I consider heuristics to be not substitutes for thought, but pointers to get thought moving in the most promising direction first.
0[anonymous]15y
But that's Think More.

Distrust any impression given by fragmented quotations, be they text, audio, or video.

(The mere existence of the phrase "out of context" reflects the danger of trusting these. Note, however, that this doesn't apply merely to quotes. To give an example I personally fell for: a false impression as to who said what in a 'documentary' about a psychic detective was given by rapidly cutting between the accounts of the officers working the case and the account given by the detective.)

At the Overcoming Bias meetup a couple days ago, Robin Hanson mentioned that the Singularity Institute should devote half its people to working on AI problems and the other half to improving the tools used by the first half. Any way we could turn this into a heuristic?

Some questions: Should the tool-improving group also split itself in half so that half of them can help with the tools used by the tool-improvers? Has there been any academic research on what the right ratio of workers to tool-improvers is? How do things change when the group consists of o... (read more)

1anonym15y
I've always thought of this in terms of "improving the first derivative", or working not only on current knowledge but on the rate at which we are acquiring knowledge. Improved tools are a great way to increase the rate of change. Some other techniques are improving understanding of foundational topics (dependencies), inventing better representations of the problem domain (e.g., notation in mathematics and computer science), improving one's health (so as to operate at peak efficiency) through things like good diet and exercise (there are many cognitive benefits of exercise), and to the extent that fluid intelligence may be malleable, working to improve intelligence itself (e.g., dual n-back as in the 2008 Jaeggi et al. study).
0rwallace15y
How about this for a heuristic: Exploiting the resources, tools, techniques, etc. that you presently have, and coming up with better ones for the future, are both important and neither should be neglected. "50/50 split" obviously shouldn't be taken too literally; the point is that it shouldn't be 1/99 or 99/1.
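One way to see why the right split depends on circumstances is a toy model (entirely illustrative assumptions, not Hanson's numbers): workers produce output at a rate set by current tool quality, while tool-improvers compound that quality each period.

```python
# Toy model: fraction f of a team improves tools; the rest do direct work.
# Tool quality compounds with the number of improvers. All parameters are
# made up for illustration.

def total_output(f, team=10, periods=20, improvement_rate=0.02):
    improvers = f * team
    workers = team - improvers
    tools = 1.0   # tool-quality multiplier
    output = 0.0
    for _ in range(periods):
        output += workers * tools
        tools *= 1 + improvement_rate * improvers
    return output

for f in (0.0, 0.25, 0.5, 0.75):
    print(f"{f:.0%} on tools -> {total_output(f):.0f} units")
# Short horizons favor all-direct-work; longer horizons favor heavier tool
# investment. The optimum shifts with the horizon, so "50/50" is a slogan,
# not a law.
```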
10djcb15y

Good quality heuristics would indeed be useful.

But I thought heuristics were about experience-based techniques, of the type 'when X occurs, there's a pretty good chance that Y happens as well'. The example heuristics do not really follow that pattern.

'Sign up for cryonics' does not seem like a heuristic at all - how does it follow from experience? Also, for me to trust them, heuristics have to be supported by facts -- either my own experiences or those of some trusted other party. I'd only use Dale Carnegie's lessons after some experimentation of my own - no mat... (read more)

0rwallace15y
It's true that we don't have any experience telling us we will survive if we sign up for cryonics, so there's no way to even estimate its chances of success. We do, however, have lots of experience making it very clear that we are definitely dead if we don't.
0djcb15y
My point was not so much about cryonics per se, but about the fact that most of the example 'heuristics', and many of the ones posted, are not heuristics at all - but more like little 'wisdoms'. I understand the reasoning of the pro-cryonics side. But heuristics are not about reasoning - they are about experience. The interesting point of some heuristics is that we do not really understand the reasoning -- we just see the correlations. But if there is no experience, no correlation, there is no heuristic. Even the examples that actually have some evidence are problematic. E.g., only by reasoning can you get from 'people are susceptible to priming' to 'Make important decisions in a quiet, featureless room' (example 1). For a real heuristic, we'd need to see a correlation between the quality of decisions and the kind of room they were made in, not some psych paper + reasoning. This article could have been better had it started with a clear definition of what it considers a 'heuristic' and then proceeded from there.
0Vladimir_Nesov15y
If a heuristic is adaptive, it takes a form depending on experience, more optimal than a fixed procedure: sometimes successful, sometimes terribly wrong. Simpler kinds may not be adaptive. You use a heuristic because it's useful, and "proof" of usefulness may involve any connection between concepts at all; only extreme cases of such connections constitute direct experience.

For purposes of making a decision, any statement which leads to the conclusion that the decision has no effect, is false.

0John_Maxwell15y
What if you're trying to figure out how much time to spend deciding? Don't read the following sentence if you follow jimrandomh's heuristic. What is your favorite three-color combination?

“Apply deodorant before going to bed” lacks information. If I hadn't seen the previous discussion, I would assume the point was "Do apply deodorant", not "...rather than in the morning".

0CannibalSmith15y
Fixed.
0SilasBarta15y
So deodorant can withstand a shower and even be stronger afterward? (I shower in the morning.)
2Alicorn15y
I've been doing this since I read the OB article. My showering times vary widely, but when I do shower in the morning, it still seems to work fine.
1MendelSchmiedekamp15y
According to the linked article, yes. The critical thing seems to be that the period of time between the application and the bathing where you perspire less, lets the deodorant enter your pores. So for people who perspire less when they sleep than when they are awake, they should apply deodorant before bed. Not being one of those people myself, I keep my own counsel on the subject.

I'm pretty sure that 'if presented with a Monty Hall problem, switch' is a bad heuristic: you'd need to know Monty's strategy for deciding whether or not to open any doors before you could make a sensible decision.

A better heuristic might be 'If presented with a Monty Hall problem, ask Monty why he decided to open a door and show you a goat'.

0Rune15y
Why? Regardless of his strategy, you do no worse by switching.
1Douglas_Knight15y
What if he only makes the offer to people whose initial choice of door was the car? I read somewhere that on the show itself, the odds were about 50-50. Here's an interview in which he doesn't quite say that.
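The disagreement in this thread is easy to settle by simulation. A quick hypothetical sketch: "standard Monty" always opens a goat door and offers the switch; "selective Monty" is the host Douglas_Knight describes, who only makes the offer when your first pick was the car.

```python
# Monty Hall under two host strategies. The host's strategy, not just the
# opened door, determines whether switching helps.
import random

def play(selective_host, switch):
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    if selective_host and pick != car:
        return None  # no offer is made; this trial doesn't count
    # Host opens a goat door that isn't your pick; switching means taking
    # the one remaining closed door.
    goat_open = next(d for d in doors if d != pick and d != car)
    if switch:
        pick = next(d for d in doors if d != pick and d != goat_open)
    return pick == car

def win_rate(selective_host, switch, n=100_000):
    wins = trials = 0
    for _ in range(n):
        result = play(selective_host, switch)
        if result is not None:
            trials += 1
            wins += result
    return wins / trials

print(win_rate(False, True))   # ~0.667: standard Monty, switching wins
print(win_rate(False, False))  # ~0.333: standard Monty, staying
print(win_rate(True, True))    # 0.0: selective Monty, switching always loses
```

Under the selective host, staying wins every offered game and switching loses every one, so "you do no worse by switching" holds only under an assumption about the host.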

"Avoid counterfactuals and thought experiments." Seems inconsistent with: "If presented with a Monty Hall problem, switch."

You're probably not going to encounter an actual Monty Hall problem, but maybe something kind of similar. I think "If presented with a Monty Hall problem, Think" is a better heuristic.

Perhaps the most important heuristics are the ones that tell you when to stop using heuristics.

Distrust the point with the long-winded proof. In this post, it would be #3. (Because thought experiments have been historically useful, e.g. EPR.)

1CannibalSmith15y
Other heuristics link to articles that are longer than #3's proof.

If some talk includes obvious rhetorical tricks, flip the bozo bit on the whole talk.

The speaker probably prepared for maximum effect on human brains. Thus the arguments in the talk are likely one-sided and omit essential data.

Also, by ignoring the talk you are likely to counterbalance the undue influence over most of the rest of the audience.