"Outside the laboratory, scientists are no wiser than anyone else."  Sometimes this proverb is spoken by scientists, humbly, sadly, to remind themselves of their own fallibility.  Sometimes this proverb is said for rather less praiseworthy reasons, to devalue unwanted expert advice.  Is the proverb true?  Probably not in an absolute sense.  It seems much too pessimistic to say that scientists are literally no wiser than average, that there is literally zero correlation.

    But the proverb does appear true to some degree, and I propose that we should be very disturbed by this fact.  We should not sigh, and shake our heads sadly.  Rather we should sit bolt upright in alarm.  Why?  Well, suppose that an apprentice shepherd is laboriously trained to count sheep, as they pass in and out of a fold.  Thus the shepherd knows when all the sheep have left, and when all the sheep have returned.  Then you give the shepherd a few apples, and say:  "How many apples?"  But the shepherd stares at you blankly, because they weren't trained to count apples - just sheep.  You would probably suspect that the shepherd didn't understand counting very well.

    Now suppose we discover that a Ph.D. economist buys a lottery ticket every week.  We have to ask ourselves:  Does this person really understand expected utility, on a gut level?  Or have they just been trained to perform certain algebra tricks?
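
To make the gut-level point concrete: the expected value of a lottery ticket takes only a few lines to compute. A minimal sketch, with entirely invented ticket price, jackpot, and odds (real lotteries have many prize tiers; this toy version has one):

```python
def expected_value(ticket_price, jackpot, win_probability):
    """Expected net gain, in dollars, from buying one ticket."""
    return win_probability * jackpot - ticket_price

# Illustrative, made-up figures: a $1 ticket, a $10M jackpot,
# and roughly 1-in-100-million odds of winning.
ev = expected_value(1.00, 10_000_000, 1e-8)
print(f"Expected net gain per ticket: {ev:.2f} dollars")
```

Even with a ten-million-dollar jackpot, the ticket's expected net gain is negative: this is the calculation the economist can do on paper but may not feel.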

    One thinks of Richard Feynman's account of a failing physics education program:

    "The students had memorized everything, but they didn't know what anything meant.  When they heard 'light that is reflected from a medium with an index', they didn't know that it meant a material such as water.  They didn't know that the 'direction of the light' is the direction in which you see something when you're looking at it, and so on.  Everything was entirely memorized, yet nothing had been translated into meaningful words.  So if I asked, 'What is Brewster's Angle?' I'm going into the computer with the right keywords.  But if I say, 'Look at the water,' nothing happens - they don't have anything under 'Look at the water'!"

    Suppose we have an apparently competent scientist, who knows how to design an experiment on N subjects; the N subjects will receive a randomized treatment; blinded judges will classify the subject outcomes; and then we'll run the results through a computer and see if the results are significant at the 0.05 significance level.  Now this is not just a ritualized tradition.  This is not a point of arbitrary etiquette like using the correct fork for salad.  It is a ritualized tradition for testing hypotheses experimentally.  Why should you test your hypothesis experimentally?  Because you know the journal will demand so before it publishes your paper?  Because you were trained to do it in college?  Because everyone else says in unison that it's important to do the experiment, and they'll look at you funny if you say otherwise?

    No: because, in order to map a territory, you have to go out and look at the territory.  It isn't possible to produce an accurate map of a city while sitting in your living room with your eyes closed, thinking pleasant thoughts about what you wish the city was like.  You have to go out, walk through the city, and write lines on paper that correspond to what you see.  It happens, in miniature, every time you look down at your shoes to see if your shoelaces are untied.  Photons arrive from the Sun, bounce off your shoelaces, strike your retina, are transduced into neural firing frequencies, and are reconstructed by your visual cortex into an activation pattern that is strongly correlated with the current shape of your shoelaces.  To gain new information about the territory, you have to interact with the territory.  There has to be some real, physical process whereby your brain state ends up correlated to the state of the environment.  Reasoning processes aren't magic; you can give causal descriptions of how they work.  Which all goes to say that, to find things out, you've got to go look.

    Now what are we to think of a scientist who seems competent inside the laboratory, but who, outside the laboratory, believes in a spirit world?  We ask why, and the scientist says something along the lines of:  "Well, no one really knows, and I admit that I don't have any evidence - it's a religious belief, it can't be disproven one way or another by observation."  I cannot but conclude that this person literally doesn't know why you have to look at things.  They may have been taught a certain ritual of experimentation, but they don't understand the reason for it - that to map a territory, you have to look at it - that to gain information about the environment, you have to undergo a causal process whereby you interact with the environment and end up correlated to it.  This applies just as much to a double-blind experimental design that gathers information about the efficacy of a new medical device, as it does to your eyes gathering information about your shoelaces.

    Maybe our spiritual scientist says:  "But it's not a matter for experiment.  The spirits spoke to me in my heart."  Well, if we really suppose that spirits are speaking in any fashion whatsoever, that is a causal interaction and it counts as an observation.  Probability theory still applies.  If you propose that some personal experience of "spirit voices" is evidence for actual spirits, you must propose that there is a favorable likelihood ratio for spirits causing "spirit voices", as compared to other explanations for "spirit voices", which is sufficient to overcome the prior improbability of a complex belief with many parts.  Failing to realize that "the spirits spoke to me in my heart" is an instance of "causal interaction", is analogous to a physics student not realizing that a "medium with an index" means a material such as water.
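
The likelihood-ratio requirement can be spelled out numerically. A minimal sketch of Bayes's rule in odds form, with invented numbers standing in for the prior odds on the spirit hypothesis and for how much more probable that hypothesis makes the experience of "spirit voices" than mundane explanations do:

```python
def posterior_odds(prior_odds, likelihood_ratio):
    """Bayes's rule in odds form: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * likelihood_ratio

# Invented numbers, for illustration only.
prior = 1e-9   # prior odds on a complex, many-part hypothesis
lr = 2.0       # "spirit voices" only twice as likely given actual spirits
print(posterior_odds(prior, lr))  # still astronomically small
```

Unless the likelihood ratio is enormous, a weakly favorable observation leaves a wildly improbable hypothesis wildly improbable; the experience counts as evidence, just not nearly enough of it.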

    It is easy to be fooled, perhaps, by the fact that people wearing lab coats use the phrase "causal interaction" and that people wearing gaudy jewelry use the phrase "spirits speaking".  Discussants wearing different clothing, as we all know, demarcate independent spheres of existence - "separate magisteria", in Stephen Jay Gould's immortal blunder of a phrase.  Actually, "causal interaction" is just a fancy way of saying, "Something that makes something else happen", and probability theory doesn't care what clothes you wear.

    In modern society there is a prevalent notion that spiritual matters can't be settled by logic or observation, and therefore you can have whatever religious beliefs you like.  If a scientist falls for this, and decides to live their extralaboratorial life accordingly, then this, to me, says that they only understand the experimental principle as a social convention.  They know when they are expected to do experiments and test the results for statistical significance.  But put them in a context where it is socially conventional to make up wacky beliefs without looking, and they just as happily do that instead.

    The apprentice shepherd is told that if "seven" sheep go out, and "eight" sheep go out, then "fifteen" sheep had better come back in.  Why "fifteen" instead of "fourteen" or "three"?  Because otherwise you'll get no dinner tonight, that's why!  So that's professional training of a kind, and it works after a fashion - but if social convention is the only reason why seven sheep plus eight sheep equals fifteen sheep, then maybe seven apples plus eight apples equals three apples.  Who's to say that the rules shouldn't be different for apples?

    But if you know why the rules work, you can see that addition is the same for sheep and for apples.  Isaac Newton is justly revered, not for his outdated theory of gravity, but for discovering that - amazingly, surprisingly - the celestial planets, in the glorious heavens, obeyed just the same rules as falling apples.  In the macroscopic world - the everyday ancestral environment - different trees bear different fruits, different customs hold for different people at different times.  A genuinely unified universe, with stationary universal laws, is a highly counterintuitive notion to humans!  It is only scientists who really believe it, though some religions may talk a good game about the "unity of all things".

    As Richard Feynman put it:

    "If we look at a glass closely enough we see the entire universe. There are the things of physics: the twisting liquid which evaporates depending on the wind and weather, the reflections in the glass, and our imagination adds the atoms. The glass is a distillation of the Earth's rocks, and in its composition we see the secret of the universe's age, and the evolution of the stars. What strange array of chemicals are there in the wine? How did they come to be? There are the ferments, the enzymes, the substrates, and the products. There in wine is found the great generalization: all life is fermentation. Nobody can discover the chemistry of wine without discovering, as did Louis Pasteur, the cause of much disease. How vivid is the claret, pressing its existence into the consciousness that watches it! If our small minds, for some convenience, divide this glass of wine, this universe, into parts — physics, biology, geology, astronomy, psychology, and so on — remember that Nature does not know it! So let us put it all back together, not forgetting ultimately what it is for. Let it give us one more final pleasure: drink it and forget it all!"

    A few religions, especially the ones invented or refurbished after Isaac Newton, may profess that "everything is connected to everything else".  (Since there is a trivial isomorphism between graphs and their complements, this profound wisdom conveys exactly the same useful information as a graph with no edges.)  But when it comes to the actual meat of the religion, prophets and priests follow the ancient human practice of making everything up as they go along.  And they make up one rule for women under twelve, another rule for men over thirteen; one rule for the Sabbath and another rule for weekdays; one rule for science and another rule for sorcery...
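
The parenthetical about graphs can be checked mechanically: complementation is its own inverse, hence a bijection on graphs, so a complete graph ("everything is connected to everything else") singles out a structure exactly as informative as an edgeless one. A toy sketch, representing edges as frozensets of vertex pairs:

```python
from itertools import combinations

def complement(vertices, edges):
    """Complement of a graph: flip every possible edge on/off."""
    all_edges = {frozenset(pair) for pair in combinations(vertices, 2)}
    return all_edges - edges

V = {1, 2, 3, 4}
complete = {frozenset(p) for p in combinations(V, 2)}  # everything connected
empty = set()                                          # nothing connected

# Complementation swaps the two graphs, and applying it twice is the
# identity - a bijection - so "all edges present" pins down a structure
# no more informative than "no edges present".
assert complement(V, complete) == empty
assert complement(V, empty) == complete
assert complement(V, complement(V, complete)) == complete
```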

    Reality, we have learned to our shock, is not a collection of separate magisteria, but a single unified process governed by mathematically simple low-level rules.  Different buildings on a university campus do not belong to different universes, though it may sometimes seem that way.  The universe is not divided into mind and matter, or life and nonlife; the atoms in our heads interact seamlessly with the atoms of the surrounding air.  Nor is Bayes's Theorem different from one place to another.

    If, outside of their specialist field, some particular scientist is just as susceptible as anyone else to wacky ideas, then they probably never did understand why the scientific rules work.  Maybe they can parrot back a bit of Popperian falsificationism; but they don't understand on a deep level, the algebraic level of probability theory, the causal level of cognition-as-machinery. They've been trained to behave a certain way in the laboratory, but they don't like to be constrained by evidence; when they go home, they take off the lab coat and relax with some comfortable nonsense.  And yes, that does make me wonder if I can trust that scientist's opinions even in their own field - especially when it comes to any controversial issue, any open question, anything that isn't already nailed down by massive evidence and social convention.

    Maybe we can beat the proverb - be rational in our personal lives, not just our professional lives.  We shouldn't let a mere proverb stop us:  "A witty saying proves nothing," as Voltaire said.  Maybe we can do better, if we study enough probability theory to know why the rules work, and enough experimental psychology to see how they apply in real-world cases - if we can learn to look at the water.  An ambition like that lacks the comfortable modesty of being able to confess that, outside your specialty, you're no better than anyone else.  But if our theories of rationality don't generalize to everyday life, we're doing something wrong.  It's not a different universe inside and outside the laboratory.

    Addendum:  If you think that (a) science is purely logical and therefore opposed to emotion, or (b) that we shouldn't bother to seek truth in everyday life, see "Why Truth?"  For new readers, I also recommend "Twelve Virtues of Rationality."


    But when it comes to the actual meat of the religion, prophets and priests follow the ancient human practice of making everything up as they go along. And they make up one rule for women under twelve, another rule for men over thirteen; one rule for the Sabbath and another rule for weekdays; one rule for science and another rule for sorcery...


    I thought those rules were the outcome of competition between different factions. The factions with the better rules were more likely to win. For example, a century or two ago, part of the Jewish community decide...

    Not only competition, but what seemed logical. I'm only 5 years late to this, but I figure I'll add this regardless: Shrimp made people sick, so it only made sense to make rules against eating shrimp, regardless of the reason behind it making people sick. A lot of the Old Testament is pretty much a survival guide. That link is however a church, and as far as I can tell does not represent the Jewish faith. From what I know, it's not that shrimp were bad, and hated by God, but that since people got sick, it was not a great idea to eat it. Same logic that founded rules about washing your hands before dinner - they didn't think God hated your hands, they just figured out some correlation between sickness and filth. That said, it's not all good, but it seems to me that at least SOME rules were based on logic. And that whoever had the worse rules DID die more frequently.

    Eh, I see this as a purely selective / survivorship-bias process:

    All the little minority groups that didn't have weird rules got assimilated into the mainstream culture and lost their identity as little minority groups. They became Persians or Greeks or Romans or Christians or Muslims, when those empires were in ascendancy. Therefore, all the little minority groups that have remained distinct for thousands of years have weird rules.

    It's not that the weird rules were good for individuals' survival. Pretty often, you're better off individually if you join the mainstream. But weird rules are good for maintaining group identity.

    Interesting. I've not thought of it like that, but it would make sense - groups would drop their weird rules if they didn't fit the larger group which they were integrated into. However, in this case at least, it IS so that the weird rules increased survival. Rules about keeping clean were seen as weird, but were generally beneficial for the individual. Example linked to the discussion: During the Black Plague fewer Jews got infected, mainly due to the weird rules. Only negative was that this was suspicious, and these Jews were believed to be the cause... A bit of a lose-lose situation, with good intentions.
    What I hear there is that in one particular circumstance, the weird rules may have increased survival from disease, but decreased survival from persecution. Net result probably nil for the individual. But persecution also maintains the minority group's distinct status.
    True. In this case, it most likely did harm in the long run, but the intentions behind were good, and logical. It's not always rational to generalize, but you make a good argument. Though I'm not sure - for the most part, weird rules in religion seem to be based on public opinion as much as group identity or logic. In short: Can be good or bad depending on circumstances, no matter what it is based on. But it's late and I'm beginning to fear for my mind. I'll stop before I embarrass myself too much.

    Joseph, how did they get these "competing rules" in the first place? By making them up as they went along. So, in accordance with human psychology, they make up lots of different rules for different occasions that "feel different". Both sides (or all sides) of any religious battle do this, and it doesn't matter who wins, they still won't come up with a unified answer.

    Shouldn't that lead to at least some (if very poor) "testing" of rules over time? Some (such as taboos which strengthen social cohesion or which inadvertently help avoid dangerous behavior) would help the group adapt, whilst others (which do neither) would be unlikely to continue.


    Q) Why do I believe that special relativity is true? A) Because scientists have told me their standards of evidence, and that the evidence for special relativity meets those standards.

    I haven't seen anything contract when moving close to the speed of light. I haven't measured the speed of light in a vacuum and found that it is independent of the non-accelerating motion of the observer. I haven't measured a change in mass during nuclear reactions. I simply hear what people tell me, and decide to believe it.

    George Orwell put it far more elegantly, and...

    Seems to me that a lack of patience is part of the problem. Some people would like to be able to really understand why special relativity is true, and to go through the arguments and experiments, but they'd have to invest quite some time doing so before they'd find out for themselves. So too with various other things people would like to know, but believe they haven't got the time to deeply examine. Couple that with a compressed curriculum in education, where students now need to know more than ever before and know it in less time. Couple that with a society that puts information into increasingly small packets, that spends vast amounts of advertising dollars on convincing people in the shortest possible time to buy some item, and it's revealed that people are time-poor when it comes to deeply understanding and investigating what they want to know.

    Now, with regard to "we still don't even know what kind of food is best to eat!": that is a question we do know the answer to! But you probably won't find it in advertising material, you probably won't find it in one particular book, and you most certainly won't find it in one particular eatery/restaurant. You will find the answer from a professional dietician/nutritionist (whatever your country calls them) who has spent about three years studying to find out the answer in all its complexity.

    Shall we trust that professional, shall we have faith in that professional? Or do we want to find out the answer for ourselves... whilst we struggle with paying the mortgage, getting the kids to school and meeting our work commitments? When we dismiss "faith" and "trust" (and I don't mean in a deity, I mean faith and trust in other humans), we are left in the very precarious position of having to work it all out for ourselves.
    Are you asserting that there is no controversy among credentialed nutritionists about what kind of food is best to eat?
    I assert that suitably diligent nutritionists do make a series of measurements of a particular individual and then offer accurate advice on what are the best kinds of food to eat in that circumstance, that they will retest those measurements and refine their advice as appropriate. It's my opinion that much of the "controversy" with regards to what are the best kinds of food to eat is based in the fact that many people, including some of those who hold a certification in nutrition/diet make no measurements before they make a judgment. My opinion is that it's the generalist answer that is controversial, not the specific.

    Are you asserting that there is no controversy among credentialed nutritionists about what kind of food is best to eat?

    Nutritionist here. The protected word is "dietician"; literally anyone can legitimately call themselves a "nutritionist", whereas you actually have to have some relevant credentials before you're an accredited dietician.

    As a nutritionist, my professional opinion is that bricks are quite healthy, due to their high iron content.

    Yes, academics largely train people to follow various standard procedures as social conventions, instead of getting people to really understand the reasons for those conventions. Apparently it is very hard to teach and test regarding the underlying reasons. That is the fact that really gives me pause.


    "Now suppose we discover that a Ph.D. economist buys a lottery ticket every week. We have to ask ourselves: Does this person really understand expected utility, on a gut level?"

    Tricky question.  If we look purely at the financial return, the odds, then no.  If we look at the return in utility, possibly yes.

    Is $1 too much to pay for a couple of days of pleasurable dreams about what one would do if one won? Don't we think that such fleeing from reality has some value to the one entering such a fantasy, a suspension of the rules of the real world?

    If we don't agree that that has some value, then it's going to be terribly difficult to explain why people spend $8 to go to the movies for 90 minutes.

    I don't buy lottery tickets, but I still dream about what I'd do if I won. I realised a while back that I don't actually have to pay to have those dreams.

    Yes, I agree. Part of my brain does not understand the difference between a small chance of something happening and a really small chance of something happening. Probably the same thing is true of most people, including PhD economists. It doesn't seem unreasonable to spend $10 a year to humor one's inner moron. It might be a different story if those same PhD economists were spending thousands of dollars a year on lottery tickets. But even then, the most likely explanation is that the PhD economist has a gambling problem. And like most addicts, he knows that he's behaving irrationally; he just has a hard time controlling himself.
    Even if that's the justification, you can do better: http://lesswrong.com/lw/hl/lotteries_a_waste_of_hope/ It's not clear that lotteries are a good use of time: you aren't thinking 24/7 about your dreams, you dream for maybe a few minutes total, and from that perspective, $1 is far too much to pay when you can, say, download a totally engrossing movie from the Internet for $0. And that argument still serves to ban more gambling than say $10 a day, which many gamblers routinely violate.
    I partially AWYC, but unless there's some aspect to the experience that I don't get, I don't see why actually buying the lottery ticket is necessary. Going to a movie helps one escape into fantasy. A lottery ticket seems like a much less helpful prop for this. I can - and have, particularly as a child - fantasize about what I would do with wealth and status (although the means of achieving such, in my fantasies, has generally been the slightly lesser improbability of becoming a famous author or something similar) completely unaided. In fact, it might be better to do as I did and imagine achieving your incredible wealth by some means other than lottery winnings, precisely because winning the lottery is so improbable. Thanks to a horrifying history of akrasia on that front and some amount of realization that I really want to do science instead, I haven't actually made any effective moves towards becoming an author, but nevertheless it is, I think, an accepted fact that people will be more motivated to do what they fantasize about. Why not let them be motivated to do something actually useful?

    In sum, I agree, but one small issue I take is when you argue that someone acts contrary to their learning it demonstrates that they don't really understand it. I'm sure this is often the case, but sometimes it's a matter of akrasia: the person knows what they should do and why, even deep down inside, yet finds themselves unable to do it.

    Humans suffer heavily from their biases. I recall that in middle school I came to the conclusion that no deities existed, yet it took me a long while to act on it because of social pressures, so I continued to behave cont...

    Depending on the circumstances and your priorities, pretending to have religious beliefs might have been the most rational thing to do (not knowing either, I don't know if that's true of course).

    Tao was described (I wouldn't say invented...) long before Isaac Newton, and yet expresses Feynman's sentiments almost exactly. But then it isn't a religion, either.

    Apparently it is very hard to teach and test regarding the underlying reasons.

    Does "apparently" (in general) mean you aren't using additional sources of information? In this case, are you concluding that it's difficult simply from the fact that it isn't done? That only seems to me like evidence that it's not worth it. Unfortunately, the value driving the system is getting published, not advancing science.

    Douglas, I have found it hard to teach when I have tried, but I'm sure another reason it is rarely done is that academic rewards for it tend to be small relative to the costs.

    Tim Worstall, if a PhD economist has pleasurable dreams about winning the lottery, that is exactly what I would call "failing to understand probability on a gut level". Look at the water! A calculated probability of 0.0000001 should diminish the emotional strength of any anticipation, positive or negative, by a factor of ten million. Otherwise you've understood the probability as little symbols on paper but not what it means in real life.

    Also, a good economist should be aware that winning the lottery often does not make people happy - though one must take into account that they were the sort of people who bought lottery tickets to begin with.

    Tim Worstall, if a PhD economist has pleasurable dreams about winning the lottery, that is exactly what I would call "failing to understand probability on a gut level"

    In that case, wouldn't you say that anyone who suffers from akrasia (which is pretty much everyone at some time) has a failure of understanding on a gut level? My subconscious mind doesn't seem to understand that it's a bad idea to eat a box of pizza every night; so I have to rely on my conscious mind to take charge, or at least try to.

    Occasionally even health-conscious people eat stuff like pizza, which is arguably the equivalent of buying the occasional lottery ticket. In each case, the conscious mind is aware that one is doing something counter-productive. In the case of a lottery ticket, one is enjoying the fantasy of being free from his day-to-day financial worries, even though there is essentially zero chance of actually succeeding. In the case of pigging out, one is enjoying the feeling of being stuffed with tasty food, even though there is essentially zero chance that there will be a food shortage next week which will justify his having pigged out.

    What's wrong with healthy people (in particular, gluten-tolerant) eating pizza?
    It's high carb? It gives me heartburn (probably gluten intolerance?). If you are trying to go on a cut, i.e. want a six-pack, it's a bad idea.
    And why is that a problem? You seem to be implying that a low-carb diet is The Only True Way which looks doubtful. The claim was about "health-conscious" people, not body-image-conscious.
    Because of the negative effects it has on your insulin response, leading to pancreas fatigue and type 2 diabetes. I was under the impression that a low body fat percentage was healthier. Perhaps I'm wrong. I must admit my beliefs are influenced by aesthetics. I'd bet on low abdominal fat being optimal, via a low-ish carb diet.
    We know that low-carb is effective at losing weight. The jury is still out on whether low-carb is healthy in the long term. Similarly, while it is clear that being obese is unhealthy, I don't think that there is any evidence to show that being very thin (having low body fat %) is healthier than being normal.
    See here, though it uses BMI rather than body fat %.
    Yes, and it does show the expected U-shaped curve. BMI is pretty useless as an individual metric, though.
    That was the point. (I also incorrectly remembered that the minimum was shifted a bit to the right of what's usually called “normal weight”, i.e. 18.5 to 25, but in the case of healthy people who've never smoked it looks like that range is about right.)
    Depends on what you mean by normal?
    The usual: 10-20% BF for men (you can have less if you're actually an athlete), 20-30% for women.
    Oh you mean healthy not normal? Few men are at 10-20%.
    I mean "normal" in the sense of "not broken", NOT in the sense of "average". Having said that, about 20% of US men under 40 have less than 20% body fat. Source
    In which case you should take “healthy people” to mean those who are not trying to go on a cut because they already have a six-pack.
    The main problem is that for a large percentage of people, pizza is a super-stimulus, i.e. it tastes far better than what was normally available in the ancestral environment, so that it's difficult to avoid over-consuming it. Of course the health dangers of over-consumption of food are well known. If you think pizza is a bad example, feel free to substitute candy bars or coca-cola.
    I don't think this is true. Or, rather, if you think that pizza is a super-stimulus food, most food around is super-stimulus (with exceptions for things like stale cold porridge). Super-stimulus foods are either very sugary or very salty. Pizza is neither. What pizza is, is a cheap, easily-available, high-calorie convenience food. That makes it easy to abuse (=overconsume), but doesn't make it inherently unhealthy.
    I disagree, depending on how you define "most food around" of course. If you are talking about food that you can go into a restaurant or fast food joint and buy, then I would have to agree with you. If you are talking about the dinners mom cooked back in the 70s, then I would not agree. Well, do you agree that pizza tastes really good? Do you agree that (generally speaking) small children LOVE pizza? It's unhealthy for the reasons I stated earlier. But let me ask you this: What is a food or drink which you do consider to be unhealthy?
    I define it as food I see and eat in my home as well as food in the restaurants. I like yummy food and I see no reason to eat non-yummy food. You seem to think that any tasty food is super-stimulus food. That's not how most people use the term. Depends. There's a lot of bad pizza out there. You can get very good pizza but you can also get mediocre or bad pizza. I don't see why this is relevant. Small children in general also like pasta and even you probably wouldn't consider it a super-stimulus food. The dose makes the poison. In small amounts or consumed rarely, pretty much no food or drink is unhealthy (of course there are a bunch of obvious exceptions for allergies, gluten- or lactose-intolerance, outright toxins, etc.). With this caveat, I generally consider to be unhealthy things like the large variety of liquid sugar (e.g. soda or juice) or, say, hydrogenated fats (e.g. margarine, many cookies).
    Or fatty. Shouldn't pretty much any cooked food be a super-stimulus, considering the relevant ancestral environment and why we cook food in the first place? Super-stimuli could be different for different age groups. I've never seen anyone love plain pasta; they like their ketchup and sauce too.
    Not sure about that. Fat makes food more tasty (mostly through contributing what's called "mouth feel"), but it doesn't look like a super-stimulus to me. Well, it depends on how you want to define "super-stimulus". I understand it to mean triggering hardwired biological preferences above and beyond the usual and normal desire to eat tasty food. The two substances specifically linked to super-stimulus are sugar and salt. Again, super-stimulus is not the same thing as yummy.
    I'm not sure it's that simple -- chocolate is more of a super-stimulus than fruits for most people.
    True. On the other hand, take away the sugar and see how many chocoholics are willing to eat 99% dark chocolate :-/
    Ever seen a child lick butter off a slice of bread? Don't tell me they would lick off just salt too.
    I've seen both. In the case of salt it's lick finger, stick it into the salt bowl, lick clean, repeat.
    Ah, now that you reminded me I've seen the latter too, dammit.
    Dunno about 99% (though if you set the bar as low as "willing to eat" I probably would), but I do find 85% dark chocolate quite addictive (as in, I seldom manage to buy a tablet and not finish it within a couple days). But I know I'm weird.
    A couple of days! :-) That's not what "addiction" means.
    I meant it in the colloquial ‘takes lots of willpower to stop’ sense, not the technical ‘once I stop I get withdrawal symptoms’ sense. (Is there a technical term for the former?) (OK, it does seem to me that whenever I eat chocolate daily for a few weeks and then stop, I feel much grumpier for a few days, but that's another story, and anyway it's not like I took enough statistics to rule out it being a coincidence.)
    Addiction vs physical dependence.
    The verb "like" and a variety of synonyms :-D
    Not quite -- I'm talking about the upper extreme of what Yvain here calls “wanting”, though that word in the common vernacular has strong connotations of what he calls “approving”.
    I know some chocoholics. Trust me, if it takes you a couple of days to finish a chocolate bar, you're not addicted :-D
    Is there something wrong with binging or compulsion? Withdrawal symptoms would imply dependence, but not necessarily addiction.
    Did our preferences mostly evolve for "tasty food" or for raw meat, fruit, vegetables, nuts etc? I thought super-stimulus usually means something that goes beyond the stimuli in the ancestral environment where the preferences for the relevant stimuli were selected for. I don't understand how you draw the line between stimuli and super-stimuli without such reasoning. I guess it's possible most of our preferences evolved for cooked food, but I'd like to see the evidence first before I believe it. ETA: I don't think there's necessarily anything wrong with super-stimuli, so let's drop the baggage of that connotation.
    Well, I actually don't want to draw the line. I am not a big fan of the super-stimulus approach, though obviously humans have some built-in preferences. This terminology was mostly used to demonize certain "bad" things (notably, sugar and salt) with the implication that people can't just help themselves and so need the government (or another nanny) to step in and impose rules. I think a continuous axis going from disgusting to very tasty is much more useful.
    Well, sure. Similarly, a continuous axis designating typical level of risk is more useful than classifying some activities as "dangerous" and others as "safe." Which doesn't mean there don't exist dangerous activities.
    So you disagreed with the connotation. I disagree with it too, and edited the grandparent accordingly. I still like the word though, and think it's useful. I suppose getting exposed to certain kind of marketing could make me change my mind.
    According to what I read in Scientific American, the human digestive system has evolved to require cooked food; humans can't survive on what chimpanzees and other primates eat.
    Oh God! Please never utter those two words in the same sentence where an Italian can hear you. I was about to barf on the keyboard! :-) Then again, people (other than me, at least) don't usually binge on flat bread without toppings, either.
    Are you saying that plain pasta and bread without toppings are super-stimuli for you? Are you not even using oil? :) I can understand the bread part if it's fresh, but as far as I'm concerned pasta doesn't taste much like anything. Perhaps I've just eaten the wrong kind of bland crap.
    When I was a kid, my grandmother had some trick that caused her bland spaghetti (possibly with some oils and stuff, but mostly things that weren't visible after it was prepared) to be the best food that I knew of. If not superstimuli, then close to it. Unfortunately she's no longer alive, and she never passed the trick on to anyone else, so I can't say whether I would get the same pleasure out of it as an adult.
    Do you know if it was fresh? I hear that fresh pasta is comparable to fresh bread.
    Interesting. Links or stories? I am very much aware of the difference between fresh-baked bread and "plastic bread" from the supermarket. It's huge. Are people claiming freshly-made pasta is different to the same degree?
    It appears not. [1] [2] [3] Fresh pasta has a more pronounced flavor, and is generally made with a superior variety of flour (that doesn't keep as well), which means less of the flavor work is done by the sauce. (I don't think I've ever had fresh pasta, and so don't have any first-hand reports. I do think fresh bread is worlds better than supermarket bread, though.) Also, in America at least, making fresh pasta is a very grandmothery thing to do, and so my prior was high enough to be remarkable.
    Hmm... I am getting curious. Not yet to the degree of making fresh pasta myself, but I recall that there is a Whole Foods nearby that sells it... On the other hand, pasta is basically boiled wheat dough and I generally find dough as bread to be yummier than dough as pasta.
    No, unless I misremember terribly it was ordinary market spaghetti.
    Hmm. Well, you can vary the taste by throwing salt into the pot, but I've never found a level of salt that I thought would raise the quality more than a point on a ten point scale. Adding spices while boiling, like powdered garlic, will alter the taste somewhat but I think they're more effective in sauces / applied afterwards, and are often visible.
    If someone wants to experiment, my starting point would be this: Take some good olive oil (extra-virgin, first cold press, etc.) and grate fresh garlic into it. Stir and let it stand covered for an hour or so. Once your pasta is ready, drain it, and then toss with the garlic-infused olive oil.
    Apparently my mother tried to make some spaghetti according to my grandmother's instructions, but it never tasted the same to me. So either it was something really subtle, or there was a placebo effect involved (or both). ETA: Though now that I think of it, I'm not entirely sure of the "bland" thing anymore - there might have been a sauce involved as well. Damn unreliable memories.
    Olive oil, lots and lots of it. Thank me later. I have been drenching food with it and getting compliments on my cooking skills for years, and I also used to say it's a secret, given that my GF would freak out due to the high calories. (disclaimer: I weigh 260 pounds)
    No, I eat pasta with sauces other than ketchup. And I do eat much more plain bread than the average person, e.g. when I'm at a restaurant and I'm waiting for the dishes to arrive, but I think it's got more to do with boredom and hunger than anything else -- it's not like I have to refrain from keeping any bread at home whenever I'm trying to lose weight lest I binge on it, the way I do with cookies. Anyway, my general point was that comparing pizza with toppings to pasta without toppings (in terms of how much people, in particular small children, enjoy them) isn't a fair comparison.
    I binge on (fresh) bread without toppings, but I find pasta much more enjoyable with ketchup or some sort of spice.
    It's ok! I'll prepare a tomato, garlic, and basil sauce with some Merlot cooked in, stat!
    Do you still believe that fatty equals not good for you? Plus who the hell puts ketchup anywhere near pasta?
    No. Why would you think that? People who torture kittens for fun. Both are an acquired taste.
    I suppose I just expect it from people, even intelligent people on LW. The reverse correlation doesn't work because I torture kittens too.
    It doesn't?
    Probably depends on how much you eat it, and what kind. Let's not oversimplify things.
    I'm not sure what kind of food you keep in your home, but considering the fact that a huge percentage of American adults are overweight or obese, I would probably agree that "most food around" is super-stimulating. Well you asked me why I consider pizza to be a problem. If you don't want to use the word "super-stimulus," it doesn't really affect my point. Pizza tastes good enough to most people that it's difficult to resist the urge to over-eat. That's my answer. Oh come on. Please use the Principle of Charity if you engage me. When I assert that "pizza tastes really good," you know what I mean. Well small children are naive enough to come right out and express a strong preference for the foods they love. And they don't beg their parents for pasta parties. Well let me put the question a slightly different way: Do you agree that there exist certain foods which taste really good; which a lot of people have a problem with; which in many ways are like an addiction?
    From what I remember, I did occasionally beg for pizza around that age, but if I'm modeling my early childhood psychology right that had as much to do with cultural/media influence as native preference. Pizza is the canonical party food in American children's media, and its prominence in e.g. Teenage Mutant Ninja Turtles probably didn't help. Media counts for a lot! Show of hands, who here found themselves craving Turkish delight after reading The Lion, the Witch, and the Wardrobe without actually knowing what it was?
    Do you agree that part of the reason kids beg for pizza is that it tastes really good? Let me ask you this: If you gave lab rats a choice between pizza and oatmeal, which do you think they would choose?
    I don't know the answer to this, but I'd caution against using lab rats, which, keep in mind, have quite different dietary needs, as an indicator of human dietary preferences.
    Well you are capable of estimating some probabilities, no? I agree that caution is in order, but I feel pretty confident, perhaps 90% probability, that lab rats will choose pizza over oatmeal. Here's a study which might affect your probability assessments: http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0060407
    I'd take the other side of the bet. Anybody willing to test this?
    I think pizza, at least in the United States and during the years around my own childhood, occupied a cultural position that's not fully describable in terms of its nutritional content. Stimulus concerns are sufficient to explain favoring it over something like (plain) oatmeal, but not over something like spaghetti and meatballs or chicken-fried steak. I'm told curry occupies a similar position in Japan. Other cultures probably have their own equivalents.
    Ok, I guess I read your first post too quickly. You don't seem to dispute my basic claim that pizza tastes really good. You also don't seem to dispute my claim that children's preference for pizza is evidence of this. Because whatever food children beg for -- whether it's pizza, hot dogs, or curry -- is probably going to be something that tastes good. I do agree that children ask for pizza -- as opposed to other tasty foods -- for cultural reasons. But I don't think that contradicts any argument I have made.
    My kids didn't want pizza (pretty much ever), until they started school, and then they wanted pizza primarily when having friends over. I think it's more social/cultural than anything else. Also, they are pizza snobs - I'm not allowed to order from a local place because it's "too salty, and too greasy." They'd prefer no pizza, or a usual dinner (stir fry or something), to the wrong pizza. Also, I'm not sure if "super stimulus" foods are super-stimuli consistently. I hate fast food burgers, and have since I was little (but sit me down in a hole-in-the-wall Mexican place and I'll eat until I wish I was dead). Just adding a few anecdotes.
    Well do you agree that despite your experiences, there do seem to be certain foods which are considered tasty and difficult to resist by large numbers of people?
    I actually live in a fairly healthy "bubble," so I don't know many significantly overweight people. I know the stereotypes, I guess, that fat people guzzle sodas and pound McDonald's. I guess the one example of someone who eats typical bad-for-you foods is my wife's sister, who basically grew up only eating burgers (an extremely picky eater with very permissive parents; she still pretty much only eats burgers). But she weighs 125 lbs and runs marathons. But again, these are my selective anecdotes. I don't claim representative knowledge.
    And the overweight people you know don't seem to have any specific foods or types of foods which they have trouble resisting?
    Anecdote time! There was a period when I loved pasta but wouldn't eat pizza because I had not yet grasped that Tomatoes Are Awesome. Also that book made me classify Turkish Delight as a drug, and Drugs Are Bad don'tcha know. And then when I finally got some I realized it also tastes bad.
    Turkish Delight isn't just one thing. I've had mediocre bright-colored (and probably artificially flavored) Turkish delight, and delicious fresh transparent Turkish delight flavored with rose water. If you care about the subject, you should see if you have access to a Middle Eastern shop where you can get the good stuff. Tentative theory: the good stuff isn't packaged, so it has to be fresh. If it wasn't fresh, it would have dried out.
    Thanks for the tip! The only Turkish delight I remember having was bright-colored and came in a box.
    Sigh. So you really think that the cause of obesity is that food is just too yummy, too attractive? Before you answer, think about different countries, other than the US. Japan, maybe? France? Please try to avoid the typical mind fallacy. People around me don't seem to have the urge to overeat pizza. A lot of them just don't like it, others might eat a slice once in a while but no more. Nobody is obsessed with pizza and I doubt many will agree that "pizza tastes really good" -- they'll either say "it depends" or shrug and say that pizza is basic cheap food, to be grabbed on the run when hungry. No one -- not a single person around me -- shows signs of having to exert significant will power to avoid stuffing her face with pizza. Presumably there is a logical "AND" between your sentence parts. Depends on what you mean by "taste really good" (see above about pizza) and by "a lot". People generally overeat not because the food is too yummy. People generally overeat for hormonal and psychological reasons.
    What is your hypothesis for why obesity rates have exploded to such an extent in the last several decades?
    Oh, dear. There are what, a few dozens of books on the topic, not to mention uncountable papers and articles? I think it's complicated and not attributable to a single easy-to-isolate factor.
    Well, here's an easy one that I've even got some empirical evidence for: refined sugars being added to common foods where you simply don't expect sugars to be. I know that when I'm here in Israel, I have an easy time controlling my eating (to the point that skipping meals sometimes becomes my default), but when I'm in the States, I have a very hard time controlling my eating. I've noticed that when I even partially cut refined sugars from my diet, I get through the day with a much clearer mind, particularly in the realm of executive/self-disciplining functions. It's to the point that I'm noticeably more productive at work without refined sugar. There are lots of differences in diet between Israel and the USA, but the single biggest background factor is that in Israel, sweets are sweets and not-sweets are not sweetened. Whereas in the US, everything but the very rawest raw ingredients (including even sliced bread) has some added refined sugars. With a large background level of "derp drug" in your basic foodstuffs, it's probably quite easy to suffer blood-sugar problems, get cravings, and lose a degree of focus and self-control. It's certainly what I experience when I'm there.
    ISTM that for me in the short run it's the other way round, but that's probably got to do with the fact that most of my sources of refined sugars are sources of caffeine and water as well.
    Try unsweetened black tea or coffee. Seriously: it works wonders.
    Absolutely. (And too available.) I've been thinking about this question pretty intensely for a couple of years now. Where did you get the impression that I am going just by my own experiences? Roughly what percentage of the people around you are overweight or obese? Of those who are overweight or obese, do they seem to have the urge to eat any foods or types of foods to excess? For purposes of this exchange, I will define "taste really good" as being at the high end of "yummy." Since you used the word "yummy" before, you presumably know what you meant. I will define "a lot" as more than 5 million Americans. Ok, now do you agree that there exist certain foods which (1) are considered to be very yummy by a majority of Americans; (2) which a lot of Americans have a problem with (in the sense that they have difficulty controlling their consumption of these foods); and (3) which are like an addiction (in the sense that some people feel compelled to overconsume such foods despite knowing or having received professional advice that they are consuming too much food)?
    Well then, you have an unusual viewpoint :-) Any evidence to support it? Because you didn't offer any data or other evidence. It looked just like a classic stereotype -- look at all these fat Americans who can't stop shoving pizzas into their pieholes! 10-15%, maybe? Nope, not to my knowledge. Of course some might be wolfing down bags of cookies in the middle of the night, but I don't know about it :-) I will still say no because I don't think food is addictive. But let me try to see where you want to get to. Let's take full-sugar soda, e.g. Coca-Cola. There certainly have been lots of accusatory fingers pointed at it. The majority of Americans drink it, so I guess (1) is kinda satisfied. Do people have difficulty controlling their consumption of it? Yep, so (2) fits as well. On the other hand, these people tend to have difficulty controlling a lot of things in their lives, for example credit cards, so I'm not sure there is anything food-specific going on here. Is it like an addiction? Nope, I don't think so. "Knowing professional advice" is way too low an incentive for people to change their ways.
    You're not doing it either, y'know. I think you have now (re?)defined at least two words, super-stimulus and addictive, to fit your purposes. Tobacco doesn't fit your definition of addictive either.
    I'm neither proposing nor defending a hypothesis. I did define "super-stimulus", but I don't think I tried to define "addictive" (and that's a slippery word, often defined to suit a particular stance).
    Have you read this relevant article? It's confusing when you say you're disagreeing with a definition, when you actually mean you're disagreeing with the connotation.
    I am not sure what you are referring to...?
    Addiction is "a slippery word, often defined to suit a particular stance". Super-stimulus is "mostly used to demonize certain 'bad' things (notably, sugar and salt) with the implication that people can't just help themselves and so need the government (or another nanny) to step in and impose rules". Sure, you finally explicitly said these things, but you could have said you disagreed with the connotations in the first place, which would have made the discussion about definitions pointless and perhaps dissolved some disagreement.
    I do, but I prefer to stay focused on the subject at hand. Let's see if I have this straight -- any time someone makes a generalization about human nature without simultaneously volunteering data or other evidence, one can reasonably assume that they are engaged in the typical mind fallacy? Do I understand you correctly? And of those 10-15%, roughly what percentage have tried to lose weight and failed? So let's see if I understand your position: You deny that there are a lot of people who consume certain foods even while knowing that they are consuming too much food?
    If it contradicts one's personal experience then yes, one can reasonably assume. Subject to being corrected by evidence, of course. I don't know. None of them visibly yo-yos. Pretty much everyone once in a while says "I could lose a few pounds", but it's meaningless small talk on the order of "Weather is beastly today, eh?" No, I don't deny that, I just think that the word "addiction" is not the appropriate one.
    Well your personal experience contradicts mine. So please try to avoid engaging in the Lumifer Typical Mind Fallacy. Thank you. But you do know that none of them have a difficult-to-resist urge to eat certain foods or types of foods? Well please answer the question I asked and not the question you imagine I had asked. I asked (among other things) if there were certain foods which "are like an addiction (in the sense that some people feel compelled to overconsume such foods despite knowing or having received professional advice that they are consuming too much food)" I was careful to say "like an addiction" and to describe what I actually meant. So it seems you DO agree with me that there exist certain foods which (1) are considered to be very yummy by a majority of Americans; (2) which a lot of Americans have a problem with (in the sense that they have difficulty controlling their consumption of these foods); and (3) which are like an addiction (in the sense that some people feel compelled to overconsume such foods despite knowing or having received professional advice that they are consuming too much food) Right?
    Good. Do notice that, as opposed to you, I did not attempt to "make a generalization about human nature" on the basis of my personal experience. Of course not. I am not inclined to play fisking games (or let's-adjust-this-definition-to-split-the-hair-in-half games) on these forums. No, I do not agree with you. You have enough information to figure out how and why.
    Ummm, here's one thing you said before: 1. You didn't offer any evidence or data to back this up. 2. It contradicts my personal experience. Therefore you have committed the Lumifer Typical Mind Fallacy. Please try to avoid it in the future. Lol, then your personal experience doesn't even contradict my basic point. Say what? You just redefined my words so that you could answer a different question. I asked (among other things) if you agreed that there are foods which are "like an addiction (in the sense that some people feel compelled to overconsume such foods despite knowing or having received professional advice that they are consuming too much food)" You reinterpreted that question as though I was asking whether certain foods are addictive. So that you could easily answer "no" using your own definition of "addictive." Please answer the question I asked -- not the question you wish or imagine I asked. Yes, I have enough information to make a pretty good guess as to why you are evading my question.
    Contrary opinion: Also, while I don't find pizza to be at all addictive, my experience is that hamburgers are very much so. I've had experiences where I successfully avoided eating any meat for two months in a row, then succumbed to the temptation of eating a single hamburger and then ate some several times a week for the next month.
    Interesting. I just get such consistent meat cravings that I don't even bother trying to not eat meat. I just buy a certain amount and eat it as a basic food group.
    Yes, I am aware that such exist :-) It's really a definitions argument, about what one can/should apply the word "addiction" to. As such it's not very interesting, at least until it gets to connotations and consequences (e.g. if it's an addiction, the government can regulate it or make it illegal). It's human to succumb to temptations. Not all temptations are addictions.
    Succumbing to a temptation occasionally is one thing. But even a single case of that happening leading to a month-long relapse? That's much more addiction-ish.
    The government can regulate or ban things as public health risks which are not deemed addictions though, and things which are recognized as addictive are not necessarily regulated or banned.
    All true, but if you look at it from a different side: if you want to regulate or ban something, would you rather call it an addiction or an unfortunate exercise of free choice? :-)
    Well, the latter characterization would certainly not aid me in my attempts to get it banned, but if calling it an addiction were likely to result in semantic squabbling, I'd probably just call it a public health risk.
    If you're liberal enough about what people are allowed to do, should you call anything an addiction? I'm not sure if politics connotatively hijacking scientific terminology is a good reason to change the terminology. Would you suggest something like that?
    Sure. I would call things which change your personal biochemistry in the medium term (e.g. opiates) addictive. I think it's a reasonable use of the term.
    There are opiate receptors in the brain because your brain produces transmitters that bind to those receptors. You should expect certain behaviours you engage in to change your personal biochemistry over various time spans as well.
    A fair point. I should probably add the necessity of a positive feedback loop to the definition.
    Casomorphins in dairy have opioid effects, as does chocolate. Overconsumption of high-sugar high-fat foods alters opioid receptors in the brain. Naloxone, a drug for treating opiate overdose, is effective in reducing binging. It also seems that food scientists specifically try to make food as addictive as possible, which seems like an expected outcome from a capitalist food market -- whatever encourages the most consumption will win greater market share. Is it an addiction on par with heroin, alcohol, or tobacco? I doubt it, but using an addiction model might be helpful in treating overeating.
    Don't have links handy but my impression is that this was tried, lots of times, and failed badly. As to the general question of food being addictive, this is mostly an issue of how you define "addictive". I find it useful to draw boundaries so that food (as well as, say, sex or internet) do not fall within them. On the other hand, I don't see a sharp divide between "food" and "drugs". Eating certain kinds of food clearly has certain biochemical consequences.
    What word would you use for people who eat so much they can't move, get HIV from prostitutes, or play WoW with such dedication they die? These people clearly have something in common, and it's definitely more specific than stupidity.
    That is not self-evident to me. Sick (in the medical sense, I bet their hormonal system is completely screwed up). Regular guys with bad judgement and worse luck. Guys who do not know their limits.
    An unlucky choice of examples, I guess. Switch the question to "could brains that seem unable to regulate their behaviour, to the point where they're severely damaged by it, have something in common in their basic physiology that predisposes them to dysregulation when exposed to certain sensory stimuli?" This is still vague enough that there's room for evasion, so if you want to continue that way, I suppose it's better we forget about this.
    Well, as I have said several times it's a matter of definition and how wide you want to define "addiction" is arbitrary. Sure, you can define it as positive-feedback loops that subvert conscious control over behavior or something like that -- but recall that all definitions must serve a purpose and without one there is no reason to prefer one over another. What's the purpose here? Note that the purpose cannot be "Can we call eating disorders addictions?" because that's a pure definition question -- however you define "addiction" will be the answer.
    The purpose is to recognize harmful behaviours that people could benefit from fixing and that those behaviours might have similarities that can be exploited. If you browse porn 12 hours a day, it's quite probable you realize you have a problem, but have significant difficulty in changing your behaviour. If you want to browse porn 12 hours a day, then that's fine too, and nobody should try to fix you without your permission. I don't care what you call them, it suffices that the above purposes are fulfilled and that people understand each other.
    I am highly suspicious of calling a variety of behaviors "addiction" as it implies both the lack of responsibility on the part of the subject and the justification of imposing external rules/constraints on him. I don't know of any successful attempts to treat obesity as if it were a true-addiction kind of disorder. One of the problems is that the classic approach to treating addiction is to isolate the addict from the addictive substance. Hard to do that with food and hard to avoid yummy stuff outside of a clinic.
    Taboo responsibility. What does this mean? That some people need bariatric surgeries to limit their eating is a pretty clear indicator they can't control their eating. The kind of isolation rehab you're talking about is an extreme measure even when treating drug addictions, and comprises a marginal proportion of addiction treatment. Think nicotine replacement and varenicline for tobacco addiction or naltrexone and disulfiram for alcoholism and we'll start to be on the same page. Note that I'm not implying these are hugely successful either. All addictions are difficult to treat. Also certain addiction vocabulary and self awareness techniques like identifying triggers could be relevant for treating compulsive behaviour.
    I'd guess it's got to do with affordability and convenience as well as taste. If I had to cook my own food or spend a sizeable fraction of my monthly wage on it, I would be much less likely to eat it unless I'm really hungry, no matter how good it tasted.
    I would agree, but the same thing could be said about pretty much any super-stimulating good or service. If a dose of heroin were available for a nickel at any convenience store, then probably a lot more people would abuse heroin.
    There are foods which, even when I'm not particularly hungry, once I start eating them it'd take a sizeable amount of willpower for me not to eat inordinate amounts of; these include chocolate, certain cookies, certain breakfast cereals, but not pizza. This doesn't mean I don't like pizza: I'm generally very happy to eat pizza for dinner, unless I've had copious amounts of pizza in the last few days.
    I do agree that problem foods are not the same for everyone. However if you talk to people who have difficulty controlling their eating, the same foods and kinds of foods seem to come up pretty regularly. Chocolate is one of them. As a side note, I get the sense that among people who have difficulty controlling their eating, some tend to have more difficulties with sweet foods like chocolate, cookies, cake, etc. Others seem to have more problems with foods which are fatty but not sweet, like potato chips, hot dogs, bacon, nachos, french fries, lasagna, and yes, pizza. Even so, the tastiness of all of these types of foods seems pretty universal.
    I don't think this is at all accurate as a generalization. Insofar as any food can be said to qualify as a superstimulus, some of the best contenders are savory foods which are high in fats and starches, which in our ancestral environment would have been valuable sources of calories, calorie overabundance being far too rare a problem for us to be evolutionarily prepared against. Peanut butter is a good example of a food which would have been an extreme outlier in terms of nutrient density in our ancestral environment (not for nothing is it the main ingredient in a therapeutic food to restore bodily health to people afflicted by famine) which is extremely moreish, despite not being especially high in either sugar or salt. Cheese is a similar case.
    Not an outlier at all. Paleo hunter-gatherers certainly ate nuts. And meat (not the lean muscle meat, but the whole-animal meat including organs and fat) is probably higher in nutrient density.
    Nuts would have been one of the richest sources of macronutrients by density in our ancestral environment, and they wouldn't have been available in great quantity, which is probably in large part why they're such an addictive food. (My girlfriend has a nut allergy, and since I've started having to keep track of nut content in foods, I've noticed that the "snack" aisles in grocery stores can be divided, with fairly little remainder, into chips, pretzels, and nut-based foods.) Liver is higher in micronutrients than nuts, or just about anything else for that matter, and I suspect that it avoids being a superstimulus to our senses because it would be one of the few food sources in our ancestral environment that it's actually possible to get a nutrient overdose on (many species' livers contain toxic concentrations of vitamins, not to mention the various toxins they've filtered out of their hosts' blood.) In terms of macronutrients, nuts have a higher calorie concentration than any animal tissue other than lard (a cut of flesh which is as calorie dense as nuts would have to be about two thirds fat by weight.) Lard of course is not known for being a very tasty food on its own (it's also very incomplete nutrition,) but is used extensively in cooking foods which people have a pronounced tendency to overeat.
    It can be: http://en.wikipedia.org/wiki/Lardo and http://en.wikipedia.org/wiki/Salo_(food).
    Except that, empirically speaking, there are lots and lots of people who actually can and do consume candy bars, soda pop, or pizza in moderation. Which makes me wonder about the actual mind-mechanisms behind "superstimulus", since we seem to be so very good at learning to deal with it. (Yes, I do have a hypothesis regarding obesity epidemics that's more complex than "Everyone in whole countries is getting caught in a superstimulus feedback loop with their eating habits.")
    It strikes me as an overstatement to say that "we" seem to be very good at dealing with it. In most Western countries, the rates of overweight and obesity are quite high and/or rising. Surely a large majority of those people are failing to eat some kinds of food in moderation. And I doubt those people are overconsuming fresh vegetables and oatmeal. Anyway, do you agree that there is a problem with a decent percentage of people overconsuming foods which tend to be far richer in calories/salt/fat/sugar/etc. than what was typically available in the ancestral environment? And if you agree, what do you think is the cause of the problem?
    I think that "decent percentage" is imprecise, but there's definitely something going on that's making people fatter. It could be bad habits. It could be superstimulus effects (though I'm suspicious regarding the lack of professional literature on a concept that primarily seems to be LessWrongian rather than empirically studied). It could be food additives. I don't know yet; I need to see some actual studies to make a judgement.
    Putting aside the "why" question, do you agree that if you look at people who are overweight or obese, their overconsumption problems tend to focus on certain types of foods, which tend to be very high in calories?
    Overconsumption means "high in calories" almost (if not quite) by definition. Someone who eats raw cabbage nonstop simply isn't going to get to overconsumption levels.
    So that means your answer is "yes"? Also, it sounds like you are saying that among people who have difficulty resisting the urge to eat, there is no particular preference for foods like ice cream, french fries and cookies over foods like cabbage, tomatoes, and broccoli, it's just that the former foods are more likely to cause obesity because they are higher in calories. Do I understand you correctly?
    I'm saying that I don't know of particular preferences within the set of high-calorie foods. There is also the problem of consuming mid-calorie foods like bread or pasta (which humans did for millennia without getting too damn fat until about the 1990s) in completely excessive amounts, for instance. So basically, I don't think you can yell "COOKIES ARE SUPERSTIMULUS, REDUCE COOKIE PRODUCTION NOW!" when in fact lots of fat people are consuming massive amounts of pasta while plenty of thin people consume small amounts of cookies. The picture is much more complicated than simply assuming some arbitrarily constructed reference class of "things not in the ancestral environment" (besides, ancestral hunter-gatherers often got plenty more calories than ancestral peasant farmers, despite coming earlier: which one is our "ancestral environment" here?), which we choose to label as "superstimulus" (does that term have a scientific grounding?), will automatically short-circuit people's decision making.
    This bears repeating. Also keep in mind, many people with western European ancestry have a much higher threshold for diabetes, due to that ancestry's post-agricultural dietary habits. After several thousand years, agriculture becomes part of the evolutionary environment. (In the long view, I often stop and ponder whose ancestral environment and population we are, and how the cultural and environmental choices we're making today will shape the genetic predispositions of our 61st century descendants.)
    Maybe our 61st century descendants will have genes, but if we haven't managed to beat the crap out of evolution and impose our own life-optimization criteria by the year 6000, I will be extremely disappointed.
    That doesn't seem to contradict my point. It sounds like you do agree with me that there are certain foods or types of foods which (generally speaking) tend to be difficult for obese people to resist eating. Right?
    Once again, no. Please attempt to understand my view here instead of trying to force your own. I do not necessarily believe, in the absence of evidence, that the obesity epidemic arises from certain foods (tasty, unhealthy, or otherwise) drugging people into addiction just by being more intense than prehistoric foods. No, food is not in and of itself a drug that can magically alter our decision-making apparatus in some way that doesn't wash out when placed next to the other elements of individual lifestyle. Some foods may contain drugs. Chocolate, for instance, contains theobromine, a mild stimulant and euphoric I find quite enjoyable. Beer contains alcohol, a fairly strong depressant. Some cheeses are said to contain opiates, which supposedly explain the "addictive" quality of cheeseburgers (though studies don't seem to indicate very much evidence beyond that expected of motivated reasoners). Yet nobody eats or drinks chocolate-laced beer with cheese in it. I think that attempting to talk about the obesity epidemic as a failure of rationality due to superstimulus in foods is an attempt to kick a sloppy variable and turn it into a stiff one. I think we need a competing alternate hypothesis. For one thing, it's not as if healthy foods are all dull! A simple chopped-vegetable salad made with fresh ingredients is tasty and healthy, for instance. (Of course, this assumes you live somewhere in which fresh, nutritious veggies are affordable in bulk.... hmm, another contributing factor to the obesity problem?)
    I am trying to understand your view, and you are not helping things by evading my questions. The question I asked you said nothing about the obesity epidemic or the causes of obesity. You read that into the question yourself. I will try one last time: Put aside the causes of obesity and the obesity epidemic. I'm simply asking if you agree with me that for obese people, there tend to be certain foods or types of foods which are difficult to resist eating. It's an extremely simple yes or no question.
    And, to the best of my knowledge, the answer is no. Obese people don't have a hard time not-eating some foods, they have a hard time not eating in general.
    Here's some research which may change your mind: http://jn.nutrition.org/content/133/3/835S.full By the way, is it a surprise to you that chocolate holds the spot as the most craved food as opposed to, say, raw cauliflower? Here's another big surprise for you:
    Since chocolate contains a stimulant/euphoric drug, no, this is not surprising, and I even mentioned it. What would be surprising is if we could see a correlation between obesity and cravings for specific non-chocolate items, or even some way of showing that people who don't eat chocolate are massively less likely to be obese.
    So are you conceding that at least chocolate is a specific food or type of food which many obese people tend to have difficulty resisting? And what of the claim that "Women in particular report extreme liking of or craving for foods that are both sweet and high in fat (e.g., candies, cakes or pastries, ice cream)" Do you dispute it? Is it a surprise to you?
    No, I'm saying that people have some difficulty resisting chocolate. That includes thin people.
    And "people" includes "obese people," agreed? Also, please answer my other question: Do you dispute the claim that "Women in particular report extreme liking of or craving for foods that are both sweet and high in fat (e.g., candies, cakes or pastries, ice cream)"? Is it a surprise to you?
    Are we trying to find things out anymore, or are you just trying to hammer home "HA! OBESITY IS CAUSED BY SUPERSTIMULUS! THERE'S SOME MINOR EVIDENCE OF THINGS THAT SOUND KINDA LIKE SUPERSTIMULUS BEING SUBJECT TO CRAVINGS! TAKE THIS, YOU IGNORAMUS!"? Because this is sounding like the latter.
    Yes, I am trying to nail down your position so that I can figure out exactly where we disagree. You keep trying to change the subject to the causes of obesity. Which is an important question but not the question I have been addressing. The threshold question is whether there are certain foods or types of foods which are particularly difficult to resist. If we agreed on that, then we could go on to discuss why such foods or types of foods are difficult to resist -- is it because they are super-stimulus foods or some other reason? We could also discuss the role such foods play in obesity at an individual or societal level. But those are different questions. You seem to have denied that there exist certain foods or types of foods which are difficult to resist. However, you seem to have made an exception for chocolate. I have presented evidence that there are other foods which are difficult to resist: "foods that are both sweet and high in fat (e.g., candies, cakes or pastries, ice cream)" -- at least for women. You refuse to tell me if you dispute this evidence. Why are you playing hide the ball with your position? Trust me, the sky won't fall if you simply admit that you were wrong. Do you dispute the claim that "Women in particular report extreme liking of or craving for foods that are both sweet and high in fat (e.g., candies, cakes or pastries, ice cream)"? (And if not, is it a surprise to you?) This is the last time I will ask.
    Ok, I've spotted the issue. I thought you were linking the two things: "These foods are hard to resist because they are superstimuli. Here, let me prove there are foods that are 'hard to resist' (whatever that means). Now that I've done so, it must be because they are superstimuli." My problems with this are: you need to separate the experience of cravings in absence of food (i.e., I can crave chocolate but not have chocolate) from the actual "difficulty to resist" (that needs definition) when the food item is in front of you. You then also need to define "superstimulus" such that the definition makes predictions, and justify belief in such a concept via showing that it applies to your examples of craved foods. I've made an "exception" for actual drugs, as separate from the other content of food. To show what I mean, it should be plain that if I lace a pitcher of water with morphine, you will slowly develop an addiction to the water in my pitcher. This is not because water is difficult to resist, it's because I drugged the water. The fact that theobromine or caffeine occur naturally doesn't make the food "hard to resist", it makes it contain a drug. I don't see a working definition of "difficult to resist", is the issue. Lots of people get cravings and don't act on them, so getting a craving is not evidence that these women actually display less power of self-control when confronted with, say, cake, versus a control group. In the same fashion, lots of people might say, "I need a damn drink!" when they're stressed-out, but the overwhelming majority of them don't become alcoholics, and most don't even actually take a drink! Basically, you seem to my eyes to be failing to differentiate between "People like X" and "People can't control themselves around X".
    It's reasonable to believe that if people report "extreme liking of or craving" for certain foods or types of foods, then a large percentage of people will find such foods difficult to resist. No reasonable person would dispute this without very strong evidence. But anyway, we can't even get to that point because you won't even concede that people (or at least women) report "extreme liking of or craving" for certain foods or types of foods. I asked you three times if you disputed this claim and you ignored my question each time. Instead, you have decided to strawman me: There's a difference between "extreme liking or craving for X" and "liking X." There is also a difference between "people have difficulty resisting X" and "people can't control themselves around X." Sorry, but I have no interest in engaging with people who insist on playing hide the ball with their position. Nor do I engage with people who exaggerate my position to make it sound unreasonable. This exchange is concluded. Goodbye.
    So... what is it? Why do you think I have a definite position? My "position" here is that the vocabulary for hypotheses is ill-formed. We have effectively spent an entire conversation saying nothing at all because the terms were never defined clearly.
    The Rat Park experiments suggest otherwise, at least as regards morphine.
    That's the really mysterious bit to me. I don't think excessive quantities are likely to be the problem, though. I read a caloric breakdown once of the lifestyle of a 10th-century Scandinavian farmer; the energy requirements turn out to be absurd by modern standards, something like six thousand kcal just to stay upright at the end of the day in peak season. (Winter life was a bit more sedentary, but still strenuous by modern standards.) If you're consuming that much food regularly, an extra five hundred kcal here or there is a rounding error; it's implausible that everyone back then just happened to manage their consumption to within a few percent. Nor was the civilization as a whole calorie-bound, as best we can tell. But judging from skeletal evidence, they didn't suffer from many of the diseases of civilization that we do. The obvious diff here is exertion, but the nutritional literature I've read tends to downplay its role. Or you could blame portion sizes relative to exertion, but larger portions are only fattening because of the excess calories, which brings us back to the original mystery. So either some novel aspect of the post-1900 diet is making modern Westerners fat, or the archaeology or the nutritional science is wrong, or I'm missing a step. And I don't think I'm missing a step. If I had to venture a guess, I might blame lots of simple sugars in the modern diet -- honey was the only sweetener available for most of human history, and it was rare and expensive. But that's extremely tentative and feels a little glib.
    The really creepy part? Whatever it is, it's making Western animals fat. Including the ones that aren't fed scraps of human food.
    That is remarkably interesting-if-true. Data?
    This article contains links to several peer-reviewed research studies on the matter.
    I'd like to know how you'd justify this claim. Remember that pizza has been available in the United States since the beginning of the 20th century and has been popular since at least the 1950s, yet the obesity epidemic has only happened relatively recently.
    Also, potato chips were invented in the 19th century; ice cream has been around for ages; ditto for french fries. Of course, obesity has also been growing as a problem over the years too. I think what's changed is that these types of foods have become much more easily available in terms of cost, convenience, and marketing.
    I don't think cost has changed much. Reportedly, in the 1950s a burger cost 15 cents (about $1.30 in today's money) and a slice of pizza cost 25 cents (about $2.20 in today's money). Convenience might have changed but not by a lot, and that may just be because people now just go out for food more often than making it at home. However, marketing could be the big factor here.
    What do you mean by that exactly? How many burgers could the median worker in 1950 buy with their hourly wage, and how many can the median worker today buy with theirs?
    That's a very, very complex (and controversial!) topic because 'median worker' or 'median household' is not well-defined. Many households during that era were single-income (not nearly as many as popular opinion would suggest, but still far more than today). There's also the fact that there were more married couples and more children than today. You also have to consider that food no longer makes up the bulk of household expenditures in modern times. Today food accounts for 10-15% of the average family's living expenses, and from the limited information I was able to find, it was about 30% in 1950. To answer your question, I honestly don't know.
    Just based on my general observations, I would have to disagree. Just walking down the street in New York, there are lots of places where you can get a large slice of pizza for $1.00. That's about 8 minutes of work at the minimum wage. Back in 1985, I remember the minimum wage was $3.35 per hour, so 8 minutes of work would have been about 45 cents. I don't recall ever seeing a large slice of pizza for 45 cents back in the 80s. Also, during the 80s, I remember spending about $5.00 for a typical deli lunch consisting of a turkey sandwich and a can of soda. Twenty-five years later, it costs about $6.00 and there are still places where you can get it for $5.00. Or less.
    Besides that, EITC has increased the effective wage.
    It also occurs to me that portion sizes have perhaps increased. If you do a Google image search for "portion sizes over time," you get all kinds of charts making this claim. I wasn't around in the 1950s, but it does seem that, at a minimum, soda sizes have increased. I vaguely remember that it was common to get a 10 ounce bottle of soda 30 or 40 years ago. I haven't seen a 10 ounce bottle in years; it seems that 16 ounces is the standard single serving bottle size and 20 ounces is pretty common too. Here's an article which seems to agree: http://abcnews.go.com/WNT/story?id=129685 So if you look at things in terms of dollars per calorie, the decline in the price of prepared foods may very well be even more dramatic than it seems on the surface.
    I guess it depends on whether you eat it for dinner, or as a snack in addition to whatever else you'd normally have for breakfast, lunch and dinner. I suspect he's thinking of the latter. (Likewise, I guess that so long as you're not lactose-intolerant a large cone of ice cream isn't particularly unhealthy as modern foods go, if it's all you're having for lunch.)
    Bad analogy. Eating pizza (or any other high-energy food that you happen to like) is intrinsically rewarding. You don't do it all the time because you trade off this reward with other rewards (e.g. not being fat and hence ugly and unhealthy). Buying a lottery ticket is not intrinsically rewarding if you don't win, which happens with a negligible probability. Well, buying a lottery ticket may be intrinsically rewarding if you suffer from gambling addiction, which means that you've screwed your reward system and by gambling you are doing a sort of wireheading. That's pretty much like doing drugs. At the level of conscious preferences, you don't want to do that.
    I don't know about you, but when I buy a lottery ticket, I usually end up having a few nice daydreams about hitting the $400 million jackpot or whatever. So I would say that for me (and probably many other people), it's intrinsically rewarding. FWIW I'm not a gambling addict. Agree, that's pretty much the point. Of course some forms of wireheading are so dangerous that even occasional indulgence is a bad idea, for example heroin and cocaine. Other forms are less dangerous so that occasional indulgence is safe for most people.
    I don't know, I've never bought lottery tickets; I may only gamble token amounts of money at events where it is socially expected to do so. Maybe I'm wired differently than most people, but what do you find rewarding about it? We are not talking of something like tasty food or sex, which your ancestors' brains were evolutionarily adapted to seek since the time they were lizards; gambling opportunities did not exist in the environment of evolutionary adaptedness, and you need some high-level cognitive processes to tell a lottery ticket from any random piece of paper. It's true that people have difficulties reasoning informally about low-probability high-payoff (or high-cost) events, which explains why gambling is so popular, but gambling is also one of the few high-uncertainty scenarios where we can apply formal methods to obtain precise expected (monetary) value estimations. Once you do the math, you know it's not worth the cost. But obviously you knew that already, so my question is, how can you still daydream about winning the lottery without experiencing cognitive dissonance?
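    The "do the math" step alluded to above can be sketched in a few lines. The jackpot size and odds here are invented round numbers for illustration (roughly Powerball-scale), not the figures for any actual lottery:

```python
# Expected monetary value of one lottery ticket, with made-up numbers.
ticket_price = 2.00
jackpot = 400_000_000        # assumed headline jackpot
p_win = 1 / 292_000_000      # assumed odds of hitting it

# Expected value = (probability of winning * payout) - cost of playing
expected_value = p_win * jackpot - ticket_price
print(f"{expected_value:.2f}")  # about -0.63: each ticket loses money on average
```

    Taxes, split jackpots, and smaller secondary prizes would change the exact figure, but not its sign.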
    As mentioned above, the pleasant daydream of hitting the big jackpot. I disagree; for example one can easily envision a hypothetical caveman deciding whether to hunt for a big animal which may or may not be in the next valley. I don't know. But I can tell you that it's a pleasant feeling. Let me ask you this: Do you ever daydream or fantasize about things which (1) you wish would happen; and (2) are extremely unlikely to happen?
    Sure. But would this hypothetical caveman still decide to hunt if he was pretty much certain that the animal was not there? Uh, sexual fantasies aside (which I can blame my "reptile brain" for), I don't think so.
    I'm not sure; it would probably depend on his assessment of the costs, benefits, and risks involved. In any event, I don't see the point of your question. You asserted that gambling opportunities did not exist in the ancestral environment; that's not so. I think you are pretty unusual; my impression is that most people daydream. But let me ask you this: Do you agree that there are a decent number of people like me who are not gambling addicts but still occasionally buy lottery tickets? If you agree, then what do you think is the motivation?
    That's just decision making under uncertainty. I was talking about proper gambling, such as buying lottery tickets. My point is that you need some high-level ("System 2") processing to associate the action of buying a ticket to the scenario of winning vast riches, since these are not the sort of things that existed in the ancestral environment. But if you understand probability, then your System 2 should not make that association. Given army1987's comment I suppose it is possible to get that association from social conditioning before you understand probability. On further reflection I think I overstated my claim. I do speculate/daydream about fictional scenarios, and I find it rewarding (I used to do that more often as a child, but I still do it). Therefore I suppose it is possible to counterfactually pretend to have won the lottery using suspension of disbelief, in the same way as when enjoying or creating a work of fiction. But in this case, you don't actually need to buy a ticket, you can just pretend to have bought one! Yes. Habit created by social conditioning looks like a plausible answer.
    You either are using "System 2" with a narrower meaning than standard or are making a factually incorrect assumption. (There were no cars in the ancestral environment, and some people have driven cars while sleepwalking.)
    Once you learn how to drive a car, you can do it using only System 1, but you need System 2 to learn it.
    I still have no idea what your point was. "Proper junk food" didn't exist in the ancestral environment; "proper pornography" did not exist in the ancestral environment either. So what? Do you need System 2 processing to associate an erotic story with sexual release? To associate the words "Coca Cola" with a nice sweet taste? Well, when you were a child, did you play with toys, for example toy trucks? And was the play more enjoyable if it were a somewhat realistic toy truck as opposed to, say, a block of wood? It's not very plausible to me. For example, if it were credibly announced that all of the winning tickets for a particular drawing had already been sold, I doubt that occasional lottery players would buy tickets for that drawing.
    I used to daydream a lot, in particular of winning the lottery, when I was a child, but I'm pretty sure it's something I was taught to do by family, teachers and mass media. (The first lottery with really big jackpots in my country had just been introduced, and everybody was talking about what they would do with all that money.) It's not like everything is either evolved or relies on cold emotionless System 2 only. I mean, it's easy for people to get hooked on TVTropes, but it's not like it fulfils any obvious ancestral desire.
    For what value of ‘intrinsically’? It sure isn't rewarding for a paperclip maximizer, and IIUC you seem to be implying that doing drugs isn't intrinsically rewarding for non-addicted people.
    I think for the value of "biologically hardwired into humans".
    (I was going to say ‘then so is alcohol’ (specifically, the feeling of being tipsy), then I remembered of this claim and realized I was probably about to commit the typical mind fallacy.)
    I'm not quite sure about this; there are certainly humans who find pizza inedible for cultural reasons. I suppose you could argue that the composition of pizza is such that it would appeal to a hypothetical "unbiased" human, but that might still be problematic.
    I think the argument is really for "any ... high-energy food that you happen to like", not for culture-specific things like pizza.
    Do I have to specify that I was talking about humans? Non-addicted people generally understand that addictive drugs like heroin or cocaine can give them short-term rewards but potentially hamper the satisfaction of their long-term preferences, hence they assign a negative expected utility to them. On the other hand, eating pizza in moderate amounts is consistent with the satisfaction of long-term preferences.

    A calculated probability of 0.0000001 should diminish the emotional strength of any anticipation, positive or negative, by a factor of ten million.

    I don't play the lottery, but I sometimes have pleasurable daydreams about what I'd do if I were some great success - found the cure for cancer, proved P=NP, won a Nobel prize... objectively speaking, the probability is extremely low, but it doesn't scale my pleasure down by a million times.

    A calculated probability of 0.0000001 should diminish the emotional strength of any anticipation, positive or negative, by a factor of ten million.

    And there goes Walter Mitty and Calvin, then. If it is justifiable to enjoy art or sport, why is it not justifiable to enjoy gambling for its own sake?

    if the results are significant at the 0.05 confidence level. Now this is not just a ritualized tradition. This is not a point of arbitrary etiquette like using the correct fork for salad.

    The use of the 0.05 confidence level is itself a point of arbitrary etiquette. The idea that results close to identical, yet one barely meeting the arbitrary 0.05 confidence level and the other not, can be separated into two categories of "significant" and "not significant" is a ritualized tradition indeed perhaps not understood by many scientists. There are important reasons for having an arbitrary point to mark significance, and of having that custom be the same throughout science (and not chosen by the experimenter). But the actual point is arbitrary etiquette.

    The commonality of utensils or traffic signals in a culture is important, even though the specific forms that they take...
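    The cliff-edge nature of the 0.05 convention is easy to demonstrate numerically. In this sketch (invented coin-flip data, exact binomial tail sums under a fair-coin null) two experiments differing by a single head land on opposite sides of the threshold:

```python
from math import comb

def two_sided_p(heads, n=100):
    # Exact two-sided p-value under a fair-coin null (symmetric case):
    # twice the upper-tail probability of seeing >= `heads` heads out of n.
    # Only meaningful here for heads > n/2.
    tail = sum(comb(n, i) for i in range(heads, n + 1)) / 2**n
    return 2 * tail

p60 = two_sided_p(60)  # ~0.057 -> "not significant"
p61 = two_sided_p(61)  # ~0.035 -> "significant"
print(round(p60, 3), round(p61, 3))
```

    Two nearly identical experiments, two opposite verdicts; that's what it means for the cutoff itself to be a point of etiquette rather than a fact about the world.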

    The vast majority of scientists, by your standard, don't really understand science. Humans have certain built-in biases and consistently make certain kinds of bad judgements. Even statisticians and mathematicians make common errors of judgement. The fact is that people are often not rational and are driven by emotion, biases, and other non-rational factors.

    While it's useful to study and understand these biases and it's healthy to try to avoid common errors of judgement, it's not accurate to declare that anyone who acts irrationally is not truly a scientist or doesn't actually understand science. You are merely observing that they are human.


    I apologize for responding to this where you are highly unlikely to see ... but you seem to be missing an essential point. It is not necessary to understand science to do science any more than it is necessary to understand control theory to balance on one leg. What is disappointing is that even the population of scientists - who would appear the most likely to understand science - make errors that demonstrate that they do not.

    Even so, we rationalists ought not to be deterred from improving our minds by their failure to. That would be an improper use of humility.

    Comments like the parent are the reason I'm glad we don't have a norm against responding to ancient comments.


    Probability theory still applies.

    Ah, but which probability theory? Bayesian or frequentist? Or the ideas of Fisher?

    How do you feel about the likelihood principle? The Behrens-Fisher problem, particularly when the variances are unknown and not assumed to be equal? The test of a sharp (or point) null hypothesis?

    It does no good to assume that one's statistics and probability theory are not built on axioms themselves. I have rarely met a probabilist or statistician whose answer about whether he or she believes in the likelihood principle or in the logical...

    John, I consider myself a 'Bayesian wannabe' and my favorite author thereon is E. T. Jaynes. As such, I follow Jaynes in vehemently denying that the posterior probability following an experiment should depend on "whether Alice decided ahead of time to conduct 12 trials or decided to conduct trials until 3 successes were achieved". See Jaynes's Probability Theory: The Logic of Science.

    The 0.05 significance level is not just "arbitrary", it is demonstrably too high - in some fields the actual majority of "statistically significant" results fail to replicate, but the failures to replicate don't get into the prestigious journals, and are not talked about and remembered.

    I'm sorry, that seems just wrong. The statistics work if there's an unbiased process that determines which events you observe. If Alice conducts trials until 3 successes are achieved, that's a biased process that's sure to ensure that the data ends with at least one success. Surely you accept that if Alice conducts 100 trials and only gives you the successes, you'll get the wrong result no matter the statistical procedure used, so you can't say that biased data collection is irrelevant. You have to either claim that continuing until 3 successes were achieved is an unbiased process, or retreat from the claim that that procedure for collecting the data does not influence the correct interpretation of the results.

    The universe doesn't care about Alice's intentions. The trials give information and that information would have been the same even if the trials were run because a rock fell on Alice's keyboard when she wasn't watching.

    Surely you accept that if Alice conducts 100 trials and only gives you the successes, you'll get the wrong result no matter the statistical procedure used

    Yes, he does.

    so you can't say that biased data collection is irrelevant.

    Here is where the mistake starts creeping in. You are setting up "biased data collection" to mean selective reporting. Cherry picking the trials that succeed while discarding trials that do not. But in the case of Alice the evidence is all being considered.

    You have to either claim that continuing until 3 successes were achieved is an unbiased process, or retreat from the claim that that procedure for collecting the data does not influence the correct interpretation of the results.

    The necessary claim is "continuing until 3 successes are achieved does not produce biased data", which is true.

    This is a question that is empirically testable. Run a simulation of agents that try to guess, say, which of a set of weighted dice are in use. Pit your 'care what Alice thinks' agents against the bayesian agent. Let them bet among themselves. See which one ends up with all the money.

    If Alice decides to conduct 12 trials, then the sampling distribution of the data is the binomial distribution. If Alice decides to sample until 3 successes are achieved, then the sampling distribution of the data is the negative binomial distribution. These two distributions are proportional when considered as functions of the parameter p (i.e., as likelihood functions). So in this specific case, from a Bayesian point of view the sampling mechanism does not influence the conclusions. (This is in contradistinction to inference based on p-values.) In general, you are correct to say that biased data collection is not irrelevant; this idea is given a complete treatment in Chapter 6 (or 7, I forget which) of Gelman et al.'s Bayesian Data Analysis, 2nd ed.
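    The proportionality claimed here is easy to check numerically; a minimal sketch (the example values of p are mine, not from the comment):

```python
# Check that the binomial likelihood (fixed n = 12 trials, 3 successes) and
# the negative binomial likelihood (run until the 3rd success, which arrives
# on trial 12) are proportional as functions of p.
from math import comb

n, k = 12, 3

def binom_lik(p):
    # Alice fixed 12 trials in advance and observed 3 successes
    return comb(n, k) * p**k * (1 - p)**(n - k)

def negbinom_lik(p):
    # Alice ran trials until the 3rd success, which arrived on trial 12
    return comb(n - 1, k - 1) * p**k * (1 - p)**(n - k)

ratios = [binom_lik(p) / negbinom_lik(p) for p in (0.1, 0.25, 0.5, 0.9)]
print(ratios)  # constant ratio: comb(12, 3) / comb(11, 2) = 220 / 55 = 4
```

    Since the ratio does not depend on p, it cancels in Bayes' theorem, which is exactly why the stopping rule drops out of the posterior.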

    I thought the exact same thing, and wrote a program to test it. Program is below:

    from random import random

    p_success = 0.10

    def twelve_trials(p_success=0.25):
        # Runs twelve trials, counts the successes
        success_count = 0
        num_trials = 0
        for i in range(12):
            if random() < p_success:
                success_count += 1
            num_trials += 1
        return success_count

    def trials_until_3(p_success=0.25):
        # Runs trials until it hits three successes, counts the trials
        success_count = 0
        num_trials = 0
        while success_count < 3:
            if random() < p_success:
                success_count += 1
            num_trials += 1
        return num_trials

    for i in range(100):
        num_tests = 10000
        twelve_trials_successes = 0
        for i in range(num_tests):
            # See how often there are at least 3 successes in 12
    ... (read more)
    Upvoted for actually testing the theory :)

    I don't believe this is true. Every individual trial is individual Bayesian evidence, unrelated to the rest of the trials except in the fact that your priors are different. If you run until significance you will have updated to a certain probability, and if you run until you're bored you'll also have updated to a certain probability. Sure, if you run a different number of trials, you may end up with a different probability. At worst, if you keep going until you're bored, you may end up with results insignificant by the strict rules of "proof" in Science. But as long as you use Bayesian updating, neither method produces some form of invalid results.

    Ding ding ding! That's my hindsight-bias-reminder heuristic going off. It tells me when I need to check myself for hindsight bias, and goes off on thoughts like "That seems obvious in retrospect" and "I knew that all along." At the risk of doing your thinking for you, I'd say this is a case of hindsight bias: it wasn't obvious beforehand, since otherwise you wouldn't have felt the need to do the test. That means it's not an obvious concept in the first place, and it only becomes clear when you consider it more closely, which you did. Saying "it's obvious in retrospect" then has no value, and actually devalues the time you put in.

    Try this: (From the Comment Formatting Help)
    You have to be very careful that you're actually asking the same question in both cases. In the case I tested above, I was asking exactly the same question (my intuition said very strongly that I wasn't, but that's because I was thinking of the very similar but subtly different question below). The "fairly obvious in retrospect" refers to that particular phrasing of the problem (I would have immediately understood that the probabilities had to be equal if I had phrased it that way, but since I didn't, that insight was a little harder-earned).

    The question I was actually thinking of is as follows. Scenario A: You run 12 trials, then check whether your odds ratio reaches significance and report your results. Scenario B: You run trials until either your odds ratio reaches significance or you hit 12 trials, then report your results. I think scenario A is different from scenario B, and that's the one I was thinking of (it's the "run subjects until you hit significance or run out of funding" model). A new program confirms my intuition about the question I had been thinking of when I decided to test it. I agree with Eliezer that it shouldn't matter whether the researcher goes to a certain number of trials or a certain number of positive results, but I disagree with the implication that the same dataset always gives you the same information. The program is here, you can fiddle with the parameters if you want to look at the result yourself.

    I did. It didn't indent properly. I tried again, and it still doesn't.
    Actually, it's quite interesting what happens if you run trials until you reach significance. Turns out that if you want a fraction p of all trials you do to end up positive, but each trial only ends up positive with probability q<p, then with some positive probability (a function of p and q) you will have to keep going forever. (This is a well-known result if p=1/2. Then you can think of the trials as a biased random walk on the number line, in which you go left with probability q<1/2 and right otherwise, and you want to return to the place you started. The probability that you'll ever return to the origin is 2q, which is less than 1.)
    Ah, but that's not what it means to run until significance -- in my interpretation in any case. A significant result would mean that you run until you have either p < 0.005 that your hypothesis is correct, or p < 0.005 that it's incorrect. Doing the experiment in this way would actually validate it for "proof" in conventional Science. Since he mentions "running until you're bored", his interpretation may be closer to yours though.
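    The 2q return probability mentioned above is easy to check with a quick Monte Carlo; a sketch with my own parameter choices (the walk length is capped, which slightly undercounts very late returns):

```python
# Biased random walk: step -1 with probability q < 1/2, +1 otherwise.
# The claimed probability of ever returning to the starting point is 2q.
import random

def returns_to_origin(q, max_steps=500):
    pos = 0
    for _ in range(max_steps):
        pos += -1 if random.random() < q else 1
        if pos == 0:
            return True
    return False  # with the drift away from 0, late returns are vanishingly rare

random.seed(0)
q = 0.3
trials = 20_000
est = sum(returns_to_origin(q) for _ in range(trials)) / trials
print(est)  # should land close to 2 * q = 0.6
```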

    I don't follow this. If we meet a scientist who is a Marxist, or a Democrat, or a libertarian, or a Republican, or whatever, we don't point out that there is no empirical proof that any of those political programs would achieve its desired aims, and that no true scientist would hold political beliefs. We accept that political decisions are made on different (some would say weaker) evidence that scientific decisions. More generally, there are many questions that the methods of science can't be used to answer.

    Assuming consistent priorities, it should be possible to empirically determine (given sufficient ability to experiment and proper procedures) which policies will best achieve those objectives. It should also be possible to predict empirically (to a degree of accuracy superior to intuition) whether a policy will achieve its goals.
    Perhaps. But how do you decide empirically what goals to pursue?

    I consider myself a 'Bayesian wannabe' and my favorite author thereon is E. T. Jaynes.

    Ah, well then I agree with you. However, I'm interested in how you reconcile your philosophical belief as a subjectivist when it comes to probability with the remainder of this post. Of course, as a mathematician, arguments based on the idea of rejecting arbitrary axioms are inherently less impressive than to some other scientists. After all, most of us believe in the Axiom of Choice for some reason like that the proofs needing it are too beautiful and must be true; th... (read more)

    Sorry, ambiguous wording. 0.05 is too weak, and should be replaced with, say, 0.005. It would be a better scientific investment to do fewer studies with twice as many subjects and have nearly all the reported results be replicable. Unfortunately, this change has to be standardized within a field, because otherwise you're deliberately handicapping yourself in an arms race. This probably deserves its own post.
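    For what it's worth, the "twice as many subjects" figure roughly checks out under a normal approximation; a sketch assuming a two-sided z-test at 80% power (my assumptions, not the comment's):

```python
# Required sample size for a z-test scales as (z_{alpha/2} + z_{power})^2,
# so compare alpha = 0.05 against alpha = 0.005 at fixed 80% power.
from statistics import NormalDist

z = NormalDist().inv_cdf  # standard normal quantile function

z_power = z(0.80)
n_ratio = ((z(1 - 0.005 / 2) + z_power) / (z(1 - 0.05 / 2) + z_power)) ** 2
print(round(n_ratio, 2))  # about 1.7x the subjects, in the ballpark of "twice"
```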

    In my head, I always translate so-called "statistically significant" results into (an often poorly-computed approximation to) a likelihoo... (read more)

    I'd rather have two studies at p < 0.05 on the same claim by different scientists than one study at p < 0.005. Proving that scientific studies are replicable by actually replicating them is better than going for an even lower p value.
    I wouldn't. Two studies opens the door to publication bias concerns and muddles the 'replication': rarely do people do a straight replication. From Nickerson in http://lesswrong.com/lw/g13/against_nhst/
    Eliezer Yudkowsky
    Agreed. It's much easier for a false effect to garner two 'statistically significant' studies with p < .05 than to gain one statistically significant study with p < .005 (though you really want p < .0001).
    If you put the general significance standard at P<0.005 you will even further decrease the number of straight replications. We need more straight replication, not less. A single study can be wrong due to systematic bias. One researcher could engage in fraud and thereby get a P<0.005 result. He could also simply be bad at blinding his subjects properly. There are many possible ways to get a P<0.005 result by messing up the underlying science in a way that you can't see by reading a paper. Having a second researcher reproduce the effects is vital to knowing that the first result is not due to some error in the experimental setup of the first study.

    "And there goes Walter Mitty and Calvin, then. If it is justifiable to enjoy art or sport, why is it not justifiable to enjoy gambling for its own sake?"

    You don't have to believe (at any level) that there's a higher chance of you winning than there actually is to enjoy gambling. You just have to consider that the "thrill" payoff inherent in the uncertainty itself is high enough to justify the money that will be statistically spent. I think exactly the same argument could be made about sport.

    OT: Eliezer, what do you think about null hypotheses? E.g. what's the correct null hypothesis regarding the probability distribution of the size of (cubic) boxes produced by the box factory, where a flat distribution over the length would produce a polynomial distribution over surface area and volume, for instance?

    pdf, that gets into the issue of ignorance priors which is a whole bagful o' worms in its own right. I tend to believe that we should choose more fundamental and earlier elements of a causal model. The factory was probably built by someone who had in mind a box of a particular volume, and so that, in the real world, is probably the earliest element of the causal model we should be ignorant about. If the factory poofed into existence as a random collection of machinery that happened to manufacture cubic boxes, it would be appropriate to be ignorant about the side length.

    Sorry, ambiguous wording. 0.05 is too weak, and should be replaced with, say, 0.005. It would be a better scientific investment to do fewer studies with twice as many subjects and have nearly all the reported results be replicable. Unfortunately, this change has to be standardized within a field, because otherwise you're deliberately handicapping yourself in an arms race.

    Ah, yes, I see. I understand and lean instinctively towards agreeing. Certainly I agree about the standardization problem. I think it's rather difficult to determine what is the best number, though. 0.005 is as equally pulled out of a hat as Fisher's 0.05.

    From your "A Technical Explanation of Technical Explanation":

    Similarly, I wonder how many betters on horse races realize that you don't win by betting on the horse you think will win the race, but by betting on horses whose payoffs exceed what you think are the odds. But then, statistical thinkers that sophisticated would probably not bet on horse races.

    Now I know that you aren't familiar with gambling. The latter is precisely what the professional gamblers do, and some of them do bet on horse races, or sports. Professional gamblers, unlike the ama... (read more)

    To add to the comment about gambling-- professional gamblers are well aware of the term Dutch book, if not necessarily with arbitrage (though arbitrage is becoming more commonly used).

    Heh. Fair enough, John, I suppose that someone has to arbitrage the books. I'll add it to Jane Galt's observation regarding the genuine usefulness of salad forks.

    I agree that 0.005 is equally pulled out of a hat. But I also agree on your earlier observation regarding there being some necessity for standardization here.

    Personally, I would prefer to standardize "small", "medium", and "large" effect sizes, then report likelihood ratios over the point null hypothesis. A very strong advantage of this approach is that it lets so... (read more)


    The point is not that scientists should be perfect in all spheres of human endeavor. But neither should anyone who really understands science, deliberately start believing things without evidence. It's not a moral question, merely a gross and indefensible error of cognition. It's the equivalent of being trained to say that 2 + 2 = 4 on math tests, but when it comes time to add up a pile of candy bars you decide that 2 + 2 ought to equal 5 because you want 5 candy bars. You may do well on math tests, when you apply the rules that have been trained into you, but you don't understand numbers. Similarly, if you deliberately believe without evidence, you don't understand cognition or probability theory. You may understand quarks, or cells, but not science.

    Newton may have been a hotshot physicist by the standards of the 17th century, but he wasn't a hotshot rationalist by the standards of this one. (Laplace, on the other hand, was explicitly a probability theorist as well as a physicist, and he was an outstanding rationalist by the standards of that era.)

    What makes you think it's a deliberate act to start believing things without evidence? What if it's somewhere along a spectrum of time required to make a rational decision? On the x-axis, on the far left we've got no time; on the far right we've got all the time of our lives. On the y-axis we've got the effectiveness of decision making; the higher it is, the better the performance. Looks like the Yerkes-Dodson inverted "U" relationship. If we spend very little time on making the decision, then it's likely an ineffective decision. If we spend heaps of time making the decision, then it's possible the decision is over-analysed and could well be a less effective decision than one where we've spent some optimum amount of time making the decision.

    How much time could we spend on deciding to eat an apple? We could just grab it off the shelf and eat it - that might be ok, or it could result in us taking a bite of a rotten apple. We could examine the apple for rottenness, we could examine the shop for their overall health standards, we could trace the journey of the apple back through the transport system, all the way back onto the tree, we could do a soil and pest analysis of the environment the apple grew in - this is probably over-analysis. Instead we could have an optimum decision with only 30 seconds of observing the apple: squeezed it and it didn't squish, looked over its surface and there are no obvious holes or yucky markings.

    The scientist does increase the time spent on making a decision within their field; they believe that their optimum amount of decision-making process is moved to the right in the aforementioned graph, because that's their field, their job and reputation. When they turn off their "work" processes they will move back to the left on the graph. Are they now being irrational, or have they simply acknowledged that their optimum decision making no longer needs to be so strict? How much evidence is required to decide that the apple is safe? What

    Yes, there have been many great scientists who believed in utter crap - though fewer of them and weaker belief, as you move toward modern times.

    And there have also been many great jugglers who didn't understand gravity, differential equations, or how their cerebellar cortex learned realtime motor skills. The vast majority of historical geniuses had no idea how their own brains worked, however brainy they may have been.

    You can make an amazing discovery, and go down in the historical list of great scientists, without ever understanding what makes Science work. Though you couldn't build a scientist, just like you couldn't build a juggler without knowing all that stuff about gravity and differential equations and error correction in realtime motor skills.

    I still wouldn't trust such a person's opinion about a controversial issue in which they had an emotional stake. I couldn't rely on them to know the difference between evidence versus a wish to believe. If they can compartmentalize their brains for a spirit world, maybe they compartmentalize their brains for scientific controversies too - who knows? If they gave into temptation once, why not again? I'll find someone else to ask for their summary of the issues.


    If 'spirituality' is a catch-all dumpster for 'a common explanation toward things I cannot show empirically at this very moment,' then the criticism is correct: this is a useless discarding of knowledge.

    On the other hand, if you don't start with an assumption of first principles, you have no science at your disposal. Certainly, new knowledge can build upon established knowledge; but what did that knowledge build upon? Sooner or later, we must come up with a blank. No question can be asked, and tested, unless it starts with something to build upon. The ... (read more)


    have you ever actually seen an infinite set?

    Wait, are you a finitist or an intuitionist when it comes to the philosophy of mathematics? I don't think I've ever met one in person before.

    Clearly you have to deal with infinite sets in order to apply Bayesian probability theory. So do you deal with mathematics as some sort of dualism where infinite sets are allowed so long as you aren't referring to the real world, or do you use them as a sort of accounting fiction but always assume that you're really dealing with limits of finite things but it makes the ma... (read more)

    Are you also an empiricist in mathematics, akin to Quine and Putnam?

    Sorry, posted too soon. I'm a little confused because you said that you rejected coherentist views of truth, but most mathematical empiricists these days use the idea of coherence to justify mathematics. (Mathematics is necessary for these scientific theories; these theories must be taken as a whole; therefore there is reason to accept mathematics, to grossly simplify.)

    Q) Why do I believe that special relativity is true? A) Because scientists have told me their standards of evidence, and that the evidence for special relativity meets those standards.

    I also know that GPS satellites work with high precision, and that they wouldn't if they didn't correct for relativity.

    GPS? You can do better than that! I believe special relativity because it's implied by Maxwell's equations, which I have experienced. Normal human speeds are enough to detect contraction, if you do it by comparing E&M.

    Does anyone know how to do this? Looks like Douglas_Knight2 hasn't been here for a while, so he probably isn't going to say. I don't think the path ahead of me is going to have its colors shift as I run faster, so the simplest approach isn't going to work. This would be a really cool science experiment if it were really possible.
    That seems rather bizarre. Was he making some kind of joke? Humans aren't fast, heavy, small or sensitive enough to notice anything that advanced happening to ourselves.
    It takes very low speeds to see macroscopic magnetic effects from electric charges. I'm not sure that that 'implies special relativity', because it's also consistent with the previous theory. But from a relativistic point of view, that's a relativistic effect of much the same kind as time dilation/length contraction.
    I have a vague memory from an electrodynamics course more than twenty years ago that the electromagnetic field is a four-vector that transforms the same way that spacetime vectors transform under boosts. So what in Victorian physics were two separate things became components of one thing, in the same way that space and time merged into spacetime. And Maxwell's four equations in three dimensions + time became two equations in spacetime.

    With the old physics, if you had two stationary charged things, they'd attract each other by means of the electric field and there would be no magnetism involved. But two things moving side by side (i.e. the same situation, but you've changed your idea of what it means to stop), attracting each other in exactly the same way, had to be explained by saying things like 'a moving charge generates a magnetic field, and the other charge, moving in a magnetic field, feels a force'. Another way of saying that is that by moving, you can turn electric fields into magnetic fields.

    In relativistic physics, there's just the one thing, 'the electromagnetic field', and your motion affects your measurements of the two different components, in much the same way that there's only 'spacetime', and your motion affects your measurements of space and time. Because the electric and magnetic fields are so strong, this interchange is perceptible with simple instruments at low speeds.

    It was all a long time ago. Perhaps a passing physicist can explain better or correct my flailings?
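    Since the comment invites correction: strictly speaking, the electromagnetic field is an antisymmetric rank-2 tensor F^{mu nu} rather than a four-vector (the four-potential A^mu is the four-vector), but the remembered picture is otherwise right. The standard transformation of the field components under a boost with velocity v is:

```latex
\begin{aligned}
\mathbf{E}'_{\parallel} &= \mathbf{E}_{\parallel}, &
\mathbf{B}'_{\parallel} &= \mathbf{B}_{\parallel}, \\
\mathbf{E}'_{\perp} &= \gamma\,(\mathbf{E} + \mathbf{v}\times\mathbf{B})_{\perp}, &
\mathbf{B}'_{\perp} &= \gamma\,\Bigl(\mathbf{B} - \frac{\mathbf{v}\times\mathbf{E}}{c^{2}}\Bigr)_{\perp}, &
\gamma &= \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
\end{aligned}
```

    With B = 0, the moving observer sees a magnetic field of order v x E / c^2 even at low speeds where gamma is essentially 1, which is the 'moving charges make magnetic fields' effect described above.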

    John Thacker:

    I consider myself a finitist, but not an ultrafinitist; I believe in the existence of numbers expressed using Conway chained arrow notation. I am also willing to reject finitism iff a physical theory is constructed which requires me to believe in infinite quantities. I tentatively believe in real numbers and differential equations because physics requires (though I also hold out hope that e.g. holographic physics or some other discrete view may enable me to go digital again). However, I don't believe that the real numbers in physics are rea... (read more)

    Ramana Kumar
    Shouldn't you add probability theory to the list [physics, differential equations]? Only because probabilities are usually taken to be real numbers. I'm curious what you think of real numbers... how would you construct them? I guess it must be some way that looks like a limit of finite processes operating on finite sets, right?
    Does this mean that infinite sets are not logical implications of some general beliefs you already have?
    Many schools of math philosophy do without infinity:

    * http://en.wikipedia.org/wiki/Finitism
    * http://en.wikipedia.org/wiki/Ultrafinitism
    * http://en.wikipedia.org/wiki/Mathematical_constructivism
    * http://en.wikipedia.org/wiki/Intuitionism
    So if someone (A) publishes a proof of theorem T in a maths journal, it isn't actually true until someone else shows that it corresponds to reality in a lab, and publishes that in a science journal? Or maybe (B) all we need is for some theorems to work, in which case we can backtrack and suppose the axioms are correct, and then forward-track to all the theorems derivable from those axioms, which is a much larger set than those known to correspond to reality? I haven't seen e, i, pi or 23 either.

    Eliezer wrote: "Godel's Completeness theorem shows that any first-order statement true in all models of a set of first-order axioms is provable from those axioms. Thus, the failure of Peano Arithmetic to prove itself consistent is because there are many "supernatural" models of PA in which PA itself is not consistent; that is, there exist supernatural numbers corresponding to proofs of P&~P."

    This is getting far from the topic but... I really don't see how Completeness entails anything about PA's failure to prove itself consistent (m... (read more)

    "PA is not expressible as a first-order statement." Countable sets of first-order statements still count. But, yes, this is getting rather far off-topic.


    "Those not willing to examine this evidence are following in the footsteps of Cardinal Bellarmine with his refusal to look through Galileo's telescope. And the refusal is for the same essential reasons: sociology and arrogance."

    From the Crackpot Index: "40 points for comparing yourself to Galileo, suggesting that a modern-day Inquisition is hard at work on your case, and so on."

    Anyway, based on what I've read, Sheldrake's experiments do return statistically significant results, but there tend to be problems with the experiments themselves that suggest the results are not caused by anything currently unknown to physics. For more details, just check out www.randi.org and search for Sheldrake's name.

    A lot of these comments are getting pretty far off topic. Why not create new posts on these interesting topics?

    From the Crackpot Index: "40 points for comparing yourself to Galileo, suggesting that a modern-day Inquisition is hard at work on your case, and so on."

    Are you seeking truth, or seeking to confirm your current beliefs? Do you deny that the mainstream of the scientific establishment has sociological parameters and taboos, and that these are extremely hostile to the possibility of telepathy and related topics? In that case, you might find this essay by the editor of the Journal of Consciousness Studies, of interest (JCS is a mainstream journal ... (read more)

    The fact that Randi and Shermer did not even give Sheldrake's claim a primary reading, let alone investigate it, reminded me too much of Professor Verres's pseudo-rationality in HPMOR. In particular, they don't seem to follow this little dictum

    I know for a fact that some scientists, even some world-renowned scientists, are morons outside of their own field. I used to manage construction at a Big 10 University, and had many conversations like this one:

    BRILLIANT SCIENTIST, looking over my estimate for a remodelling project on his floor: "What the heck is this, $4000 for a door? A door? I just replaced the front door of my house for $500!"

    ME: "Sir, your house is made of wood, and the doors don't have to meet any particular fire code. This building is concrete and steel, and the doors have to be 90-minute fire-rated. This means, among other things, that the door slab has to be hollow metal, which means it is heavy, which means that the frame, hinges, latch, and handle all have to be much sturdier than the hardware on wood doors. Also, the carpenter who will install this door is probably getting paid more than carpenters who work residential, and he's going to have to spend more time on it because it is more complicated. Finally, the lock core has to match all the rest of the cores in this building, so as not to mess up the keying system."

    BS: "Don't give me that! This is ridiculous!"

    I wish I had a dime for every time this happened . . . .

    Do you have any idea of whether the first flash of stubborn anger (probably status driven) ever gets undercut by later reflection?
    I haven't seen anyone complain about the doors, but the chairs.... oh my, the chairs. They have to be certified as both totally fireproof/acid proof/base proof and highly ergonomic. Unlike the above case, you can see why, and if it didn't cost 100 times as much, I'd agree. But it does.

    One certified blessed fireproof / corrosionproof / rustproof / knidproof +5 ergonomics chair costs multiple thousands of dollars. A nice chair satisfying all of the first set of criteria costs multiple tens of dollars. It just won't be +5 ergonomic.

    Okay… but… We don't sit in one place for 20 minutes, let alone 8 hours! Give us reasonably comfortable metal chairs and stools! Sigh.

    But the proverb does appear true to some degree, and I propose that we should be very disturbed by this fact. We should not sigh, and shake our heads sadly. Rather we should sit bolt upright in alarm. Why? Well, suppose that an apprentice shepherd is laboriously trained to count sheep, as they pass in and out of a fold. Thus the shepherd knows when all the sheep have left, and when all the sheep have returned. Then you give the shepherd a few apples, and say: "How many apples?" But the shepherd stares at you blankly, because they weren't

    ... (read more)

    Hmm. Personally, as a Christian and a student of science (doing a Bachelor of Aviation Technology), I have to say that my thought processes were entirely different from what you described in your article.

    I went with Pascal's Wager, or at least a modified version of it. Any sort of existence is infinitely better than not existing at all; this eliminates atheism, Buddhism, and Hinduism from consideration, along with other reincarnation-oriented religions. Judaism is almost impossible to convert into, so it's out of the running. Of the religions that remain, ... (read more)

    Hmm? If Atheism is correct, I cease to exist after I die no matter what I believe in. If it isn't, I'll either wind up burning in Hell, going to a relatively mediocre afterlife, or ceasing to exist, depending on which religion is correct. What incentive could I possibly have to decide to be an atheist? It seems to be more likely to be true judging by most present science, but that doesn't automatically make it the most rational decision to make. The best-case scenario is that I'm wrong and I wind up as a minor functionary in the Celestial Bureaucracy or something.
    You are an atheist. You just said so. If you verbally self-identify as a Christian, then you'll be a lying atheist. EDIT: And if the reason that you verbally self-identify as a Christian is because you are enticed by Pascal's Wager, then you've made a (subtle) mistake. I can explain the subtle mistake if you want.
    To avoid being punished by the God of Rationality. Since there's no evidence for gods, It sends all theists to Hell.
    And I'd thank him for it, since it's better to spend eternity burning in Religious Hell than ceasing to exist. At least in Religious Hell, I'm still me. ;) Also, I should probably be going to bed since I live in Australia and it's half-past midnight and I have university tomorrow.
    That belief in an afterlife tends to go with belief in a deity doesn't make disbelief in an afterlife a logical consequence of atheism.
    Yes, but it seems fair to say that P(Afterlife|A deity exists) > P(Afterlife|~ A deity exists).
    Why? How do you measure that the P of a caring personal god who saves human souls from extinction is higher than the P of an unthinking mechanism ("akashic chronicle", "reincarnation wheel") doing the same?
    I don't, but something like a reincarnation wheel or an akashic chronicle is not inconsistent with the existence of a deity so I don't need to.
    For a real-life example: one Russian kook preaches exactly this doctrine, strong atheism combined with strong belief in the immortality of souls. Add Holocaust denial, moon-landing denial, and admiration of Stalin as the greatest hero who ever lived, and you have something that sells dozens of books and gains many dedicated followers. Anything more about him would belong in an "irrationality quotes" thread, if one existed...
    Interesting. Never heard of this guy. Link?
    "Interesting." As interesting as picking up rocks and observing insects crawling under them, IMHO. "Never heard of this guy. Link?" http://en.wikipedia.org/wiki/Yury_Ignatyevich_Mukhin - most of his works are online, in Russian of course; links from the Russian wiki page.
    What, insects are fascinating!
    This seems far from exhaustive. Edit: To clarify, my objection is not that you've ignored certain current religions; my objection is that you've restricted the field to current religions in the first place, as if they were somehow inherently more plausible than the vast unexplored majority of religionspace.
    Good for you. Now you only have to renounce all pride, glory and luxury and spend your life praying for the gift of faith. It will eventually come, as Pascal reassures us. http://www.indepthinfo.com/extended-quotes/necessity-of-the-wager.shtml (scroll down to note 233 for Pascal's famous wager argument in its full context)

    I'm curious whether you actually put as much thought into this as you claim. I'm also curious whether you grew up in a largely Christian environment; this entire piece sounds a bit like motivated cognition. In particular, I have to wonder whether your justification for throwing out Judaism as "almost impossible to convert into" reflects an actual attempt to investigate the matter. Depending on the denomination/movement, conversion can take anywhere from a few weeks or months (in some Reform versions) to as long as 2-3 years (in Orthodox forms). It also seems like you didn't do much research, because under your framework there are much stronger reasons to reject Judaism: the vast majority of forms of Judaism don't believe in eternal damnation, and those that do generally severely limit the set of people to whom it applies. You have a related problem in generalizing about Christianity and Islam, in that there are universalist or near-universalist forms of both religions. Not only that, but even among non-universalists there is a chance for members of other religions to go to heaven. (If one were looking just at the Abrahamic religions and trying to minimize one's chance of hell, Judaism might make the most sense, since many forms of Christianity and Islam are fine with members of other faiths being saved.) But you also seem to have simply avoided thinking about many religious traditions, such as Mormonism and the Baha'i Faith.

    I'd like to think I put as much thought into it as I think I did! :P I don't think I wrote down the answer first and then filled in the proof, but I suppose I can't be totally sure. I was raised in a Christian environment, but we were hardly the type to go to church every week. One of my friends as a teenager had a mother who was converting to Judaism; apparently people who convert have to follow the dietary strictures and whatnot extra-strictly. That's what I meant by "almost impossible to convert into". My understanding of the Jewish view of the afterlife is that people either go to Heaven or cease existing (Sheol), which is infinitely worse than eternal hellfire, and a decent Christian would still get into Jewish heaven by following the Noahide laws, so that way I'm covered. Mormonism I rejected because the guy who founded it was a known con man, and the nature of the Book of Mormon is such that if it is true, you can't disbelieve it and go to Heaven, and if it isn't, you can't believe it and go to Heaven. Since he was a con man, and it therefore probably isn't true, it's probably not a good idea to believe in it. I'll admit I didn't do much looking into the Baha'i Faith, other than seeing that it basically splintered off of Islam. Looking it up on Wikipedia, though, it looks like they believe in reincarnation? Bleh. My mind is who I am; if it gets deleted when I go on to the next world, there's no point.
    This remark makes it sound even more like you didn't do much research. The belief that one ceases to exist was historically floating around in some sects, but it wasn't a prominent viewpoint from about 100 CE to 1800 CE, when it got picked up again by the more weakly theistic and deistic strains of Judaism (such as some Reform and Conservative types). Most Orthodox Jews, for example, believe in a heaven and a (temporary) hell pretty similar to those of Christianity (although even this is complicated by the lack of any strong doctrinal statements; there's a lot more fracturing, without anything like the statements of faith or catechisms found in many forms of Christianity). Also, while it is clear that Muslims follow the Noachide laws by most approaches, it is actually far from clear that Christians count as doing so. In particular, the belief in the divinity of a human, Jesus, according to many opinions runs afoul of the prohibition on idolatry. Islam doesn't have this problem, because no claim is made that Muhammad is divine; quite the opposite. ETA: The thing about converts keeping the laws extra-strictly is only true in some strains. Note also that in some strains this simply amounts to actually requiring converts to keep the rules: in the United States, for example, only about half of all Conservative Jews keep kashrut, but converts are expected to keep some form of it. (The Conservative Movement's leaders believe everyone should keep kashrut, but in practice they can't get most of their members to do so.) As for Mormonism, that's not true: many Mormons believe that non-Mormons can go to heaven. The only caveat is that non-Mormons don't progress as much as Mormons.
    Also, keeping kashrut only seems almost impossible if it's something you don't want to do. Obviously, a great many people do it, though the feasibility depends greatly on where you live. The sort of conversion which seems to be extremely difficult is one which will get you Israeli citizenship.
    Especially since Mormons are in the habit of converting non-Mormons after their deaths.
    One other thought: if you are as concerned about continuing to exist as you say, then you should be much more worried about religions in which believers don't stop existing and non-believers do stop at death. In that case, your options become a bit more limited. I take it you aren't either a Jehovah's Witness or a classical Karaite?
    My value system is just the opposite. To me eternal hellfire is the worst thing possible, hence infinitely worse than nonexistence. But since the chances for it appear infinitesimal, I easily assign greater expected utility to the freedom from cognitive dissonance that consistent empiricism affords me.

    "everything is connected to everything else". (Since taking complements is a trivial, information-preserving bijection on graphs, this profound wisdom, a complete graph, conveys exactly the same useful information as a graph with no edges.)
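    To make the parenthetical concrete: complementing a graph loses nothing, so the complete graph ("everything is connected") carries exactly as much information as the edgeless graph. A minimal sketch in plain Python (the function name and the 4-vertex example are mine, purely for illustration):

```python
from itertools import combinations

def complement(n, edges):
    """Complement of an undirected graph on vertices 0..n-1."""
    all_edges = set(combinations(range(n), 2))
    return all_edges - set(edges)

n = 4
complete = set(combinations(range(n), 2))  # "everything is connected to everything else"
empty = complement(n, complete)

print(empty)                              # set(): no edges at all
print(complement(n, empty) == complete)   # True: complementing is reversible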

    This seems like it is somehow connected to the fact that pantheism and solipsism are identical beliefs with different terminology for that which provides sensation.

    And yes, that does make me wonder if I can trust that scientist's opinions even in their own field - especially when it comes to any controversial issue, any open question, anything that isn't already nailed down by massive evidence and social convention.

    Not all scientists go around tallying up the expectations paid by their beliefs. If they have a freeloading belief they haven't examined, one that doesn't affect their science, so what of it?

    There's something fundamentally different between a gambling economist and a theist scientist: the thinking requi...

    To use slightly different language, I would suggest it is always a warning flag, but only an actual problem when it affects real-life expectations or field-related claims. I say always a warning flag because the kind of brain that can maintain religious belief despite scientific education and experience tends to have traits that I distrust.
    Placing a warning flag on a theist scientist's work would only be justified if you had evidence in support of the claim: P(good science | scientist is theist) < P(good science). Less Wrong provides many excellent philosophical examples in support of that claim. But what about real-world examples? Do theist scientists actually tend to do lower-quality science?
    Don't confuse a prior with a priori. ;)
    Fixed. Thanks. I didn't realize that my statement read, "A priori reasoning can only be justified if it's a posteriori." Edit: so what about my actual statement? Or, are we done having this discussion?
    I've seen statistics showing that scientists tend to be less theistic than the general population, and that the best scientists (National Academy members, for example) tend to be less theistic than scientists in general. So that provides the correlation you are asking for. But I strongly suspect that in this case, correlation does not imply causation. I have seen numerous examples, though, in which scientific inquiry whose choice of subject matter is motivated by theism is of lower quality than science done without that motivation. However, the same kinds of bad results can arise from motivation by social activism, personal animosity, or simply prideful intransigence.
    Absolutely. Hence, the warning flag. A scientist expecting to find evidence of God doesn't just have freeloading beliefs, but beliefs that pay rent in wrong expectations. That's akin to a gambling economist. I'd say it's good evidence in favor of P(good science | scientist is theist) < P(good science). Of course, your point about correlation not being causation is very valid, too. Someone in the discussion once said that atheism on average adds ~40 to IQ (I might be remembering incorrectly). I suppose high IQ is correlated both with excellence as a scientist and with the ability to reconsider and abandon theism if the question ever arose. My specific interest is whether or not atheism alone makes scientists better.
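    The confounding worry can be sketched numerically. In the toy model below (every number is invented for illustration), science quality is independent of theism once ability is fixed, yet the marginal gap P(good science | theist) < P(good science) still shows up:

```python
import random

random.seed(0)

# Toy model: "ability" drives both quality and (inversely) theism;
# quality is independent of theism *given* ability.
N = 100_000
good_given = {"high": 0.8, "low": 0.4}    # P(good science | ability), made up
theist_given = {"high": 0.2, "low": 0.6}  # P(theist | ability), made up

samples = []
for _ in range(N):
    ability = "high" if random.random() < 0.5 else "low"
    theist = random.random() < theist_given[ability]
    good = random.random() < good_given[ability]
    samples.append((theist, good))

def p_good(pred):
    """Empirical P(good science) among samples where pred(theist) holds."""
    rel = [g for t, g in samples if pred(t)]
    return sum(rel) / len(rel)

print(round(p_good(lambda t: t), 3))     # P(good | theist): lower
print(round(p_good(lambda t: True), 3))  # P(good): the baseline
```

    So observing the gap alone cannot distinguish "theism causes worse science" from "something upstream drives both."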
    Buster, that's the kind of brain you have. We're not built well, and not built too differently either. Even if you don't believe in a big dude in the sky who will preserve your identity after your physical form is destroyed, you have a brain that is completely suited to believing that, and your non-belief is a sign of the particular experiences you have had. The question is whether you believe that the set of experiences required to become a good scientist necessarily include those experiences that force one to adopt atheism. I think the number of important discoveries made by theists throughout history, and even in the modern day, indicates otherwise.
    Do not refer to me as buster. You may note that in the very sentence you quote I refer to experiences, a rather critical part of my claim. While I am not inclined to go into detail on personality research right now there is, in fact, a relationship between the strength of a person's compartmentalisation ability and other important traits. Genetics plays a critical part in the formation of beliefs from stimulus and there is some information that can be inferred from the expression of said beliefs.
    I apologize, but I am also confused. Is this an issue with gender, formality, or something else? I feel like I should be able to generalize from your taking issue with that to other things, and also avoid all of those, but it would be helpful for you to explain. I still feel that, in MoreOn's terms, P(good science | scientist is theist) is close enough to P(good science) that starting from a position of distrust is probably over-filtering. I don't think that resorting to the personality traits that might explain that relation is important, unless we know an individual's traits well enough to use them to estimate the kind of science she will produce.
    There is a positive correlation between an individual thinking well in one area and thinking well in another area, a relationship which I do not consider terribly controversial. A (loosely) related post is the Correct Contrarian Cluster.
    Like being able to judge whether some knowledge is dangerous, and public relations?
    Correlations. Not deductive certainties. A correlation that has perhaps been fully accounted for and then some in that case. And do we really need to bring that up? Really, it's all been said already...
    AAaand the graph gives me a coughing fit. Good job.
    Oh, trust me, I wouldn't defend this one. Some profs showed it as an example of a utility function for which gambling would make sense, rationally. I'd say if your utility function looks like this, you have problems far worse than gambling.
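    For anyone wondering what such a utility function looks like: a convex ("risk-loving") utility values a fair gamble above the sure thing, which is the standard textbook way to make gambling come out rational. A small sketch (the quadratic utility and the numbers are my own illustrative choices, not necessarily the graph the comment refers to):

```python
# With a convex utility such as u(w) = w**2, a fair coin flip beats
# keeping your wealth for certain, in expected-utility terms.
def expected_utility(outcomes, u):
    """outcomes: list of (probability, wealth) pairs."""
    return sum(p * u(w) for p, w in outcomes)

u = lambda w: w ** 2          # convex: marginal utility grows with wealth

wealth = 100
gamble = [(0.5, wealth - 10), (0.5, wealth + 10)]  # fair bet, expected wealth 100

print(expected_utility(gamble, u))               # 10100.0, versus u(100) = 10000
print(expected_utility(gamble, u) > u(wealth))   # True: the gamble is preferred
```

    A concave utility (diminishing marginal utility of money) reverses the inequality, which is why gambling at negative or even zero expected value is usually judged irrational.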
    That depends on whether you think a propensity to compartmentalize is a good thing or not.

    In learning we have two drives: to find truth and to avoid error. These rarely come into conflict, and even more rarely into any major conflict, but there are still times when we have to decide whether we consider it more important to find truths or to avoid errors. Religion, for instance.

    Let us imagine that a time traveler arrives from some distant point in the future and teaches me five facts which are not currently known to the world. Four of these facts can be explained with current scientific knowledge, and when they are t...




    This article reminds me of a question one of my favorite teachers asked his classes: are you learning to enrich your life, or to avoid pain? What he wanted us students to question was our motivation for sitting in his class, taking notes, and memorizing the curriculum. Was it because we wanted to do what society tells us we need to do (get good grades, go to college, make a lot of money), or because we genuinely wanted to learn? Obviously the answer for the vast majority of students is the former. The same could be said of the scientists who operate differentl...