
    Homo sapiens' environment of evolutionary adaptedness (aka EEA or "ancestral environment") consisted of hunter-gatherer bands of at most 200 people, with no writing.  All inherited knowledge was passed down by speech and memory.

    In a world like that, all background knowledge is universal knowledge.  All information not strictly private is public, period.

    In the ancestral environment, you were unlikely to end up more than one inferential step away from anyone else.  When you discover a new oasis, you don't have to explain to your fellow tribe members what an oasis is, or why it's a good idea to drink water, or how to walk.  Only you know where the oasis lies; this is private knowledge.  But everyone has the background to understand your description of the oasis, the concepts needed to think about water; this is universal knowledge.  When you explain things in an ancestral environment, you almost never have to explain your concepts.  At most you have to explain one new concept, not two or more simultaneously.

    In the ancestral environment there were no abstract disciplines with vast bodies of carefully gathered evidence generalized into elegant theories transmitted by written books whose conclusions are a hundred inferential steps removed from universally shared background premises.

    In the ancestral environment, anyone who says something with no obvious support is a liar or an idiot.  You're not likely to think, "Hey, maybe this guy has well-supported background knowledge that no one in my band has even heard of," because it was a reliable invariant of the ancestral environment that this didn't happen.

    Conversely, if you say something blatantly obvious and the other person doesn't see it, they're the idiot, or they're being deliberately obstinate to annoy you.

    And to top it off, if someone says something with no obvious support and expects you to believe it - acting all indignant when you don't - then they must be crazy.

    Combined with the illusion of transparency and self-anchoring, I think this explains a lot about the legendary difficulty most scientists have in communicating with a lay audience - or even communicating with scientists from other disciplines.  When I observe failures of explanation, I usually see the explainer taking one step back, when they need to take two or more steps back.  Or listeners, assuming that things should be visible in one step, when they take two or more steps to explain.  Both sides act as if they expect very short inferential distances from universal knowledge to any new knowledge.

    A biologist, speaking to a physicist, can justify evolution by saying it is "the simplest explanation".  But not everyone on Earth has been inculcated with that legendary history of science, from Newton to Einstein, which invests the phrase "simplest explanation" with its awesome import: a Word of Power, spoken at the birth of theories and carved on their tombstones.  To someone else, "But it's the simplest explanation!" may sound like an interesting but hardly knockdown argument; it doesn't feel like all that powerful a tool for comprehending office politics or fixing a broken car.  Obviously the biologist is infatuated with his own ideas, too arrogant to be open to alternative explanations which sound just as plausible.  (If it sounds plausible to me, it should sound plausible to any sane member of my band.)

    And from the biologist's perspective, he can understand how evolution might sound a little odd at first - but when someone rejects evolution even after the biologist explains that it's the simplest explanation, well, it's clear that nonscientists are just idiots and there's no point in talking to them.

    A clear argument has to lay out an inferential pathway, starting from what the audience already knows or accepts.  If you don't recurse far enough, you're just talking to yourself.

    If at any point you make a statement without obvious justification in arguments you've previously supported, the audience just thinks you're a cult victim.

    This also happens when you allow yourself to be seen visibly attaching greater weight to an argument than is justified in the eyes of the audience at that time.  For example, talking as if you think "simpler explanation" is a knockdown argument for evolution (which it is), rather than a sorta-interesting idea (which it sounds like to someone who hasn't been raised to revere Occam's Razor).

    Oh, and you'd better not drop any hints that you think you're working a dozen inferential steps away from what the audience knows, or that you think you have special background knowledge not available to them.  The audience doesn't know anything about an evolutionary-psychological argument for a cognitive bias to underestimate inferential distances leading to traffic jams in communication.  They'll just think you're condescending.

    And if you think you can explain the concept of "systematically underestimated inferential distances" briefly, in just a few words, I've got some sad news for you...

    106 comments

    The explanation from ancestral environment seems likely. However, there is also a rational argument for refusing to accept a claim unless all the steps have been laid out from your own knowledge to the claim. While there are genuine truth seekers who have genuinely found truth and who we therefore should, ideally, believe, nevertheless a blanket policy of simply taking these people at their word has the unfortunate side-effect of also rendering us vulnerable to humbug, because we are not equipped to tell apart the humbug from the true statements many steps removed from our knowledge.

    At the same time, people do not universally reject claims that are many steps removed from their own experience. After all, scientists have made headway with the public. And unfortunately, humbug also regularly makes headway. There have always been niches in society for people claiming esoteric knowledge.

    I think it's about the extent to which you have reason to trust authority without evidence. What if someone meets 'omega', who in their empirical experience is as trustworthy as the laws of gravity? Then it's rational to trust them, perhaps even over their own senses, which are sometimes illusory. Inducing fear gets people to stick with the status quo or make a non-choice; frustrated anger gets them to take risks. When people are told something without explanation, they might react with fear out of awe, or with anger out of frustration that you haven't presented something rational to them. Vice versa is possible too. Therefore, I would predict that inferential distance doesn't have a 1:1 relationship with the uptake of that information.

    Eliezer, this is a great insightful observation.

    The young seem especially vulnerable to accepting whatever they are told. Santa Claus and all that, but also any nonsense fed to them by their schools. Schools for the young are particularly effective instruments for indoctrinating a population. In contrast, the old tend to be quite a bit more resistant to new claims - for better and for worse.

    An evolutionary explanation for this is fairly easy to come up with, I think. Children have a survival need to learn as much as they can as quickly as they can, and adults have a vital role as their teachers. In their respective roles, it is best for adults to be unreceptive to new claims, so that their store of knowledge remains a reliable archive of lessons from the past, and it is best for the young to accept whatever they are told without wasting a lot of time questioning it.

    It is too easy to come up with a just so story like this. How would you rephrase it to make it testable?

    Here is a counterstory:

    Children have a survival need to learn only well-tested knowledge; they cannot afford to waste their precious developmental years believing wrong ideas. Adults, however, have already survived their juvenile years, and so they are presumably more fit. Furthermore, once an adult successfully reproduces, natural selection no longer cares about them; neither senescence nor gullibility affect an adult's fitness. Therefore, we should expect children to be skeptical and adults to be gullible.

    This counterstory doesn't function.

    A child's development is not consciously controlled, and children are protected by adults, so temporarily believing incorrect things doesn't harm their development at all.

    If you wish to produce a counterstory, make it an actual plausible one. Even if it were the case that children tended to be more skeptical of claims, your story would REMAIN obviously false; whereas Constant's story would remain an important factor, and would raise the question of why we don't see what would be expected given the relevant facts.

    I've just learned that there is interesting research on this topic. Sorry I don't have better links.
    Interesting. Although that strongly suggests that children in fact are more gullible specifically about religious stories. I'd have to wonder whether they actually are more gullible about those, have been primed to think that religious stories are allowed to have more fantastic elements and still be true, or have found out that expressing skepticism of such stories is more likely to result in negative consequences. The last seems unlikely to me.

    As long as we're on the subject of evolutionary psychology/sociobiology/whatever: if someone tries to argue against it by saying it's just a bunch of reactionaries trying to justify inequity, you can point to the data which says it ain't so. Another soldier sent against the army of reductionism defeated; surely a signal from Providence that all will be assimilated.


    For example, talking as if you think "simpler explanation" is a knockdown argument for evolution (which it is)

    I don't quite agree - by itself, X being "simpler" is a reason to increase my subjective belief that X is true (assuming that X is in a field where simplicity generally works) but it's not enough to prove e.g. creationism false. Rather, it is the total lack of evidence for anything supernatural that is the knockdown argument - if I had reason to believe that even one instance of say, ghosts or the effects of prayer were true, then I'd have to think that creationism was possible as well.

    This is certainly an insightful post. I'm not sure the example is that compelling though.

    If you argue with a young-earth creationist, they could very well understand what you mean but simply disagree, claiming that "God did it" is a simpler explanation still. In fact, if we were to presuppose that an intelligent being of infinite power existed and created things, it seems it would actually be a simpler explanation.

    Most people, though perhaps not all, who have no belief in an omnipotent designer will pretty quickly accept evolution. So that might not be t... (read more)

    Not necessarily. The introduction of God into the story actually makes the theory quite a bit more complex, as far as amount of information stored goes. The length of time it takes to explain your theory does not necessarily correlate with how simple it is. "God did it" is monumentally more complex than "The random process of natural selection ensures that those organisms which have mutations that lend them a better chance of survival will, on average, be more likely to survive and pass those mutated genes on to the next generation than an organism without beneficial mutations, etc etc etc." Though actually, if you look closely at the two arguments above, they don't necessarily contradict each other. :3 I personally feel that "God did it" is a simpler explanation than "Amino acids magically combined via processes we don't understand and haven't been able to duplicate, creating life essentially ex nihilo"... but that doesn't at all mean that either of these explanations is objectively simple!
    Is "God did it" a simpler explanation than "amino acids combined via complex and unlikely processes we understand and can even replicate crudely, creating life from a perhaps murky but essentially non-magical source"? What is your gut reaction?
    When isolated in this manner, my gut reaction is "no".
    Paul Crowley
    Have you read Occam's Razor?
    I just reread it; thank you for allowing me to see one of Eliezer's posts in a new light. Always a pleasure. However, I have other data at hand that seems to lend credence to the "God exists" theory; I don't have to rely on the results of one test. If I did, then by that same logic, we would always have to assume that a coin, once flipped, was 100% biased toward the side on which it landed. Your program, in order to describe the universe, has to be the best model of every single point in the universe. I'm sure there were people who argued that Newton's equations were simpler than General Relativity. But the data cannot be denied.
    I think there are two distinct concepts here: one of them is Bayesian reasoning, and the other is Solomonoff induction (which is basically Occam's Razor taken to its logical extreme). Bayesian reasoning is applicable when you have some prior beliefs, usually formalized as probabilities for various theories being true (e.g. 50% chance God did it, 50% amino acids did it), and then you encounter some evidence (e.g. observe angels descend from the sky), and you now want to update your beliefs to be consistent with the evidence you encountered (e.g. 90% chance God did it, 10% amino acids did it). To emphasize, Bayesian reasoning is simply not applicable unless you have some prior belief to update. Sounds like you're referring to Bayesian reasoning here. You're saying that without that "other data", you have some probabilities for your various theories, but when you add in that data, you're inclined to update your probabilities such that "God did it" becomes more probable. In contrast, Occam's Razor and Solomonoff induction do not work with "prior beliefs" (in fact, Solomonoff induction is often used, in theory, to bootstrap the Bayesian process, providing the "initial belief" from which you can start updating). When using Solomonoff induction, you enumerate all conceivable theories, and then for each theory you check whether it is compatible with the data you currently have. You don't think in terms of "this theory is more probable given data set 1, but that theory is more probable given data set 2". You simply mark each theory as "compatible" or "not compatible". Once you've done that, you eliminate all theories which are "not compatible" (or equivalently, assign them a probability of 0). Now all that remains is to assign probabilities to the theories that remain (i.e. the ones which are compatible with the data you have). One naive way to do that is to just assign uniform probability to all remaining theories. Solomonoff induction actually states that you should assign each remaining theory a weight proportional to 2^-L, where L is the length in bits of the shortest program that expresses it, so that simpler (shorter) theories receive more of the probability mass.
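    The two procedures contrasted in this comment can be sketched as a toy in Python (the theory names, bit-lengths, and likelihoods below are invented for illustration, not drawn from the thread):

```python
from fractions import Fraction

def bayes_update(prior, likelihood):
    # Bayesian updating: multiply prior belief in each theory by the
    # likelihood of the observed evidence under that theory, renormalize.
    posterior = {t: prior[t] * likelihood[t] for t in prior}
    total = sum(posterior.values())
    return {t: p / total for t, p in posterior.items()}

def solomonoff_prior(theories, data):
    # Toy Solomonoff-style induction: mark each theory compatible or not,
    # discard the incompatible ones, weight survivors by 2**-length
    # (shorter programs get more mass), then normalize.
    surviving = {name: Fraction(1, 2 ** bits)
                 for name, bits, predict in theories
                 if all(predict(x) == y for x, y in data)}
    total = sum(surviving.values())
    return {name: w / total for name, w in surviving.items()}

# Bayesian step: start 50/50; the evidence is nine times likelier under A.
print(bayes_update({"A": Fraction(1, 2), "B": Fraction(1, 2)},
                   {"A": Fraction(9, 10), "B": Fraction(1, 10)}))

# Three made-up theories with made-up description lengths in bits.
theories = [("constant-0", 5, lambda x: 0),
            ("identity", 8, lambda x: x),
            ("square", 12, lambda x: x * x)]
print(solomonoff_prior(theories, [(0, 0), (1, 1)]))
```

    Here "constant-0" is eliminated outright by the data, while "identity" and "square" both survive and split the probability 16:1 in favor of the shorter program.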
    There is no scientist who claims amino acids magically appeared on earth. We have been able to simulate amino acid synthesis using conditions and simple inorganic molecules present on the young earth. Read the Wikipedia article for abiogenesis for a primer if you want to educate yourself.
    Once you have posited a God to take care of the creation of the amino acids, "God did it" becomes much simpler an explanation of the rest - referring to an entity that has been established to exist is not a terribly long message.
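    The point running through this thread, that a theory's simplicity is the length of the program needed to reproduce the observations rather than the length of the English sentence stating it, can be sketched with a toy comparison (the two "theories" below are illustrative stand-ins):

```python
# Both programs print the same 100 observations; only their lengths differ.
observations = [x * x for x in range(100)]

rule_theory = "print([x * x for x in range(100)])"   # short generative rule
lookup_theory = "print(" + repr(observations) + ")"  # hard-codes every datum

# A short sentence that hides its detail corresponds to the long program:
# the detail still has to be encoded somewhere.
print(len(rule_theory), len(lookup_theory))
```

    A three-word sentence can correspond to an enormous program once all the detail it quietly relies on is spelled out; that is the sense in which "length of time it takes to explain" and "simplicity" come apart.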

    When you say "A clear argument has to lay out an inferential pathway, starting from what the audience already knows or accepts. If you don't recurse far enough, you're just talking to yourself."

    this strongly reminds me of what it is like to try talking, as an atheist, with a Christian about any religious issue. I concluded years ago that I just shouldn't try anymore, that reasonable verbal exchange is not possible...

    I suppose that I should recurse... but how, and how far, I am not sure.

    I'm sure that the Christian feels the same way. ;D The problem there isn't inferential differences. It's belief in belief. The best way to disabuse a Christian of any false notions - under the assumption that those notions are false - would be to lead them to Less Wrong. :P Of course, you can lead a horse to water...

    The best way to disabuse a Christian of any false notions - under the assumption that those notions are false - would be to lead them to Less Wrong. :P

    I don't agree. I think the best way to disabuse them of such notions would be to lead them to extremely high status atheists including a community of highly attractive potential mates. You change group affiliation beliefs by changing desired group affiliation.

    I think our disagreement stems from a fuzzy definition of the word "best". I believe that it is better to believe something for good (read: valid) reasons than to believe it for bad reasons, regardless of the truth value of the thing being believed. So yes, your suggestion may lead more Christians to toss their Christianity, but mine makes them more rational thinkers, which (under the assumption that their Christian beliefs are wrong, which assumption I decline to assign a truth value in this post) leads them to atheism as a side benefit. Essentially, this is the question posed: Which is the greater sin, if Christianity is wrong? Christianity, or irrationality?

    So yes, your suggestion may lead more Christians to toss their Christianity, but mine makes them more rational thinkers

    The same influences that make people toss Christianity are also what will influence people to become more rational. Leading people to lesswrong on average makes them scoff then add things to their stereotype cache.

    Which is the greater sin, if Christianity is wrong?

    If Christianity is wrong then I'd say neither. ;)

    This, if true, is horribly sad, and I concede the point, letting go of my faith in the inherent open-mindedness of humanity. Of course, I might have known better; my own efforts have reaped no fruit except my wife thinking of Eliezer Yudkowsky as a rabid crackpot. :/ Ha! Then let me elucidate, and define the term "sin" to mean that action which runs against a given moral code.
    You often say things with a certain simple realism that jibes with me. I've definitely learned to appreciate the style more since I joined LW, and ten times more so since really absorbing a few subskills of a few SingInst folk. How much social psychology-like stuff have you studied? I get a weak impression that it's not much more than the average LW regular, but that unlike the average LW regular you have the good habit of regularly explicitly talking about (and thus assuredly explicitly thinking about) certain simple but oft-ignored phenomena of standard social epistemology---or perhaps they'd generally be better described as signalling games/competitions with an epistemic flavor. The very-related skill of "being constantly up a meta level" is really the only prerequisite skill for building the master-skill of being able to automatically and immediately generate decent models of any real or imagined social epistemic scenario, or automatically, with some effort, generate thorough complex models. You strike me as one of the people on LW who could build up this skill and make it a very sharp weapon, which would be generally useful to any community or organization in the coming years that is trying to raise its sanity waterline. (Vladimir_M also obviously has some kind of related skillset.) I could link you to a concrete example or two in LW comments if you don't quite follow what skill it is I'm getting at or how it's cool.
    Quite a lot but it is not specialised (into PUA etc). I've also probably forgotten a lot, since my interest peaked a few years back.
    This is probably because of the site design and not necessary.
    That no doubt makes a difference but my appeal was to universal human behavior. Exposure to new, unusual behaviours from a foreign tribe will most often invoke a rejection and tweaking of social/political positions rather than an object level epistemic update. Because that's what humans care about. (This doesn't preclude directing interested parties to lesswrong or other sources of object level information. We must just allow that there will be an extremely low rate of updating.)
    I think this would depend considerably on which particular non-Christian set of beliefs turned out to be right. Asking "how should we behave in a non-Christian universe?" sounds to me like asking "what should we feed to a non-cat?".
    I'll ask you to review the child of this post wherein I provide a clearer definition of the term "sin". It is a generally held consensus that there is in fact an objective morality which is causally disconnected from (or at least causally unaffected by) any extant religion. In that sense, my question is, I believe, sensical. The above is predicated upon my inference, from your comment, that you read into my use of the word "sin" a religious connotation. Another possible inference is that you legitimately believe that we live in a Christian universe, and therefore that supposing counterfactuals is useless. In that case, I wonder how you get by during the day without making any plans based upon hypothetical events. .... and I also, in that case, appreciate not being the only Christian on this site. ;D But that doesn't forgive your error.
    I did see the comment in which you defined sin. I'm not sure where our assumptions disconnect, so I'll just try to spell out as many of mine as I can think of. I assume that Christianity contains or constitutes claims about what the correct moral code is, such that accepting Christianity is true necessarily implies accepting a certain standard of right and wrong. I further assume that there exist at least two mutually-incompatible non-Christian claims about what the correct moral code is. That is, if we reject Christian moral values, we still have to decide between Buddhism and Hinduism.
    Let me verify your meaning before I respond in earnest: You are operating under the proposition that morality necessarily derives from religion?
    ...not exactly. It would be more accurate to say that I'm assuming that most religions, and Christianity in particular, imply moralities, but there may also be nonreligious moralities. I realize I'm hugely oversimplifying (for example, by treating "Christianity" as internally homogeneous), but I need to omit most of the variables in order to get anything done in finite time. This started with the phrase "if Christianity is wrong"; are you saying that this was not meant to imply anything along the lines of "if Christian morality is wrong", that it was meant entirely as an empirical proposition, holding moral values constant? [edit: ...holding terminal moral values constant?]
    Oh! I see. :3 Yes, that is what I'm saying. If I wasn't Christian, I certainly wouldn't start murdering people.
    Interesting. Do you believe, then, that God commands a thing because it is good, rather than that a thing is good because God commands it?
    Yes and no. :3 This is one of those "large inferential distances" things, but I'll take a stab at explaining. First, there are laws that God is bound to; laws of morality, not just laws of physics, although I think He's also, in all probability, bound by the laws of physics (not necessarily as we understand them). This is evidenced by the number of times that God has told us that He is "bound"; if He did not follow these rules, He would "cease to be God". On the other hand! God gave rules to the Jews (a la all of Deuteronomy) that do not apply to modern-day Christians, because Jesus' sacrifice "fulfilled" that law. God gives different commands at different times to different people: for example, God has at various times in history endorsed polygamy for various peoples, but He has indicated that polygamy outside His explicit instructions is sinful (cf. Jacob 2, D&C 132). So: Everything that God commands us to do is Good, but not everything that is Good is something that God has explicitly commanded us to do.
    Is reviving dead threads frowned upon here? That was an incredibly insightful comment to me because it explains my deconversion (from Catholicism) and Leah Libresco's conversion to it (she has a blog on patheos called unequally yoked)*. I wonder how general this is? *Status is obviously defined by the person whose group affiliation is changing. The high status atheists that changed my desired group affiliation were some atheists on, who were a lot more like me than any catholics I had met. The high status Catholics that changed Leah's desired group affiliation were her friends, the people in her debating club and her Catholic boyfriend, whom she went to mass with (willingly) for more than a year.
    No, by all means go ahead and comment wherever you have something to say.
    As wedrifid said, reviving "dead threads" is fully acceptable and even encouraged on many occasions, AFAICT. The one thing to be careful of is entering argument mode, asking questions, or offering specific, targeted insight to a particular poster on a very old post. Many of us have wasted some time early on by answering the questions or debating the assertions of an old comment originally made on Overcoming Bias before the transfer, where the author is long gone or never came to LessWrong in the first place.
    That is what happened to me.

    This reminds me of teaching. I think good teachers understand short inferential distances at least intuitively if not explicitly. The 'shortness' of inference is why good teaching must be interactive.

    I think Vygotsky's expression "zone of proximal development" means "one inferential step away", so in theory professional teachers should understand this. I prefer to imagine knowledge like a "tech tree" in a computer game. When teaching one student, it is possible to detect their knowledge base and use their preferred vocabulary. I remember explaining some programming topics to a manager: source code is like a job specification; functions are employees; data are processed materials; exceptions are emergency plans. Problem is, when teaching the whole class, everyone's knowledge base is very different. In theory it shouldn't be so, because they all supposedly learned the same things in recent years, but in reality there are huge differences -- so the teacher basically has to choose a subset of the class as the target audience. Writing a textbook is even more difficult, when there is no interaction.

    Psychohistorian: it depends on what you mean by "simple" and "explanation". The sense in which "it's the simplest explanation" is a powerful argument for something is not one in which "God did it" is the simplest explanation for anything.

    Eliezer_Yudkowsky: I've seen the kinds of failures of explanation you refer to, and there's also the possibility that the explainer just isn't capable of explaining all of the inferential steps because he doesn't know them. In that case, the explainer is basically "manipulating symbols without understanding them". This is why I've formulated that principle (sort of a corollary to what you've argued here) that:

    "If you can't explain your idea/job/research to a layman, given enough time, and starting from reference to things he already understands, you don't understand it yourself."

    That seems so simple as to be tautological. After all, you were a layman yourself once. Ideas/jobs/research don't spring whole-spun from the ether. You have to be led along that same path yourself - either by a teacher, or by your own mind bumping along down dark corridors.
    But it's not true. Consider by analogy: if you can't explain something to a 4-year-old, you don't understand it yourself. After all, you were a 4-year-old once yourself. No, actually, sometimes you can't explain something to someone because you don't have a good enough understanding of their mental processes. It doesn't matter if you once experienced those same mental processes; the relevant memories of that time are very likely lost to you now. Explaining math to novices is a different skill from understanding math. It requires the ability to figure out why the other person has got it wrong and what they need to hear. That isn't a mathematical skill. A distinguished math professor is probably worse at explaining arithmetic to 8-year-olds than an experienced mathematics educator, but that doesn't mean the latter has the better understanding of math. They just have a better understanding of 8-year-olds.

    I have experienced this problem before-- the teacher assumes you have prior knowledge that you just do not have, and all of what he says afterwards assumes you've made the logical leap. I wonder to what extent thoughtful people will reconstruct the gaps in their knowledge assuming the end conclusion is correct and working backwards to what they know in order to give themselves a useful (but possibly incorrect) bridge from B to A. For example, I recently heard a horrible biochem lecture about using various types of protein sequence and domain homology to predict function and cellular localization. Now, the idea that homology could be used to partially predict these things just seemed logical, and I think my brain just ran with the idea and thought about how I would go about using the technique, and placed everything he said piece-wise into that schema. When I actually started to question specifics at the end of the lecture, it became clear that I didn't understand anything the man was saying at all outside of the words "homology" and "prediction", and I had just filled in what seemed logical to me. How dangerous is it to try to "catch up" when people take huge inferential leaps?

    Yes, this is good stuff, I wish I could identify the inferential gaps when I communicate!

    Silas, aren't there some things it is simply impossible for some people to understand?

    Yes (maybe?), but that lends no argument against Silas' corollary. If you cannot explain, then you do not understand. Therefore: If you do understand, then you can explain. If no one can understand, then the antecedent in the above is false, meaning that we cannot give the consequent any truth value.

    TGGP: Yes for people below some IQ threshold. No for someone of the same IQ as the explainer.

    (I probably should have added the intelligence criterion the first time around, I guess, but I was simplifying a bit.)

    This is an excellent post, Eliezer!

    Taking this phenomenon into consideration not only gives me cause to go back over my own teaching technique (of a rather specialized trade) and make sure I am not leaving out any steps that seem obvious to me (the specialist), but, like Laura, it helps me to understand times when I was baffled by a speaker or writer whose tone implied I'd be an idiot not to follow along easily.

    When I write for a very bright "puzzle-solving-type" audience, I do the mental equivalent of deleting every fourth sentence or at least the tail part of every fourth sentence to prevent the reader from getting bored. I believe that practice helps my writings to compete with the writings around it for the critical resource of attention. There are of course many ways of competing for attention, and this is one of the least prejudicial to rational thought. I recommend this practice only in forums in which the reader can easily ask followup questi... (read more)


    Richard, you may or may not care that having read the above my willingness to read anything you write in future has somewhat decreased.

    I would add, Richard, that writing "dear reader" on a medium like this comes off as patronizing.

    Some of your claims about the EEA are counterintuitive to me. Basically, it's not obvious that all information not strictly private would have been public. I'm thinking, for example, of present-day isolated cultures in which shamans are trained for several years: surely not all of their knowledge can be produced in a publicly comprehensible form. There must be a certain amount of "Eat this herb -- I could tell you why, but it would take too long to explain". Or so I imagine.

    So how much of your description of knowledge in the EEA is your guesstimation, and how much is the consensus view? And where can I find papers on the consensus view? My Google-fu fails me.

    I present to you Exhibit A from the field of computer programming.


    I find an easy way to get some of the complicated inferential jumps for free is to find a similar set of inferential jumps they have made in a similar subject. It is much easier to correct a "close" inferential jump than it is to create a new one out of thin air.

    Example: When discussing the concept of programming you can use the concept of an assembly line to get their head into a procedural mode of thinking. Once they think about an object visiting a bunch of stations in a factory you can replace "object" with "program" and "station" with "line of code." They still have no idea how programming works, but they can suddenly create a bunch of inferential jumps based on assembly lines.

    In my experience, they now start asking questions about programming as related to assembly lines and you can fill in the gaps as you find them.

    "So what happens at the end of the line?"
    "Well, the program generally loops back around and starts over."
    "Oh. So it follows the same line forever?"
    "Not necessarily. Sometimes the line takes a detour and heads off into a new area of the plant for awhile. But it generally will come ... (read more)
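    The assembly-line analogy maps onto even a trivial program. A minimal sketch in Python (the station names and numbers are my own invention, purely for illustration):

```python
# A toy "assembly line": each station is one line of code,
# and the item moving down the line is the program's data.
def assembly_line(item):
    item = item * 2   # station 1: double the item
    item = item + 3   # station 2: add three
    return item       # end of the line

# The line "loops back around": each new item passes through the same stations.
for raw in [1, 2, 3]:
    print(assembly_line(raw))  # prints 5, then 7, then 9
```

    Once the learner can picture data moving station by station, swapping "station" for "line of code" costs almost nothing.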

    Ben Pace:
    I like how it took me until the end to realise you'd re-invented the concept of analogies :-)
    And I had to read past the end to realize that...

    As someone who has done (some) teaching, I think this is absolutely correct. In fact, the most difficult thing I find about teaching is trying to find the student's starting knowledge, and then working from there. If the teacher does not go back enough 'inferential steps', the student won't learn anything - or worse, they might think they know when they don't.

    Excellent stuff.

    Now I think of it, this reminds me of something Richard Dawkins used to say at some talks: that we (the modern audience) could give Aristotle a tutorial. Being a fantasist myself, I've sometimes wondered how that could be possible. Leaving aside the complications of building a time machine (I leave that to other people), I wondered how it would be to actually meet Aristotle and explain to him some of the things we now know about life, the universe & everything.

    First of all, I'd have to learn ancient Greek, of course, or no communication would be possible... (read more)

    Actually, evolution might be the easiest one. It's inevitable if you have variation and selection. It's a really pretty theory.

    I don't know how hard it would be to convey that observation and experimentation will take you farther than just theorizing.

    If I brought back some tech far advanced over Aristotle's period (and I wonder what would be most convincing), it might add weight to my arguments.

    And personally, even if I had a time machine and the knowledge of ancient Greek, I don't know how hard it would be to get him to listen to a woman.

    You're right - evolution might be easier than, say, how an iPhone works (not that an iPhone would work very well in Ancient Greece, or for very long, anyway). Having some high tech to show to good old Aristotle maybe would convince him you come from a very strange land, and maybe he would want to hear more of what you have to say instead of just dismissing you as a lunatic. But imagine how much you would have to explain to make him even dimly aware of the way an iPhone works! Electronics, electricity, computation, satellites and astronomy (goodbye lunar sphere), calculus, chemistry, physics... I can barely think of all the relevant topics! Of course, as you point out, misogyny would be a great obstacle too. One more of the 'steps' that separate ancient peoples from modern societies.
    What you want to teach depends on what you're trying to accomplish. I don't think there's much point in trying to give Aristotle an overview of modern scientific conclusions. Assuming we want to accelerate technological progress, I'd rather teach him scientific method, decimal notation, evolution, and maybe what Feynman said (iirc) was the most important conclusion-- that matter is made of tiny bits of elements. I don't know what other specific subjects might be a good idea. Bayes? Calculus? I don't know what would be convincing experiments for atoms. One more thing I'd want to teach him: that you can learn a lot by doing careful measurement and thinking about the results. I don't know what Aristotle would come up with, given all that-- he was very smart.
    Assuming you convinced him of the epistemological primacy of experiment, I see two obvious paths: 1. The kinetic theory of gases, particularly the ideal gas law; 2. Stoichiometry in chemistry - for example, electrolysis of water.
    I would add Brownian motion to that list.

    From a practical point of view teaching the germ theory of disease would probably have the most immediate benefit.

    Using water droplets as rudimentary microscopes. How big a jump would it be to give them lens-making tech?
    You could probably explain geometrical optics without too much trouble.
    I would sort of expect any woman who showed up with apparently magical powers to be put into the goddess category. Even someone like Aristotle, who probably didn't believe that gods and goddesses literally existed, would be culturally conditioned to treat a woman who appeared to have super-powers with some respect.
    How would you implement that? What do we have the tech to build today for a reasonable outlay of money (less than a million euros, for example) that could blow minds in that era?
    I don't know what it would take to pass as a goddess, but a stash of cool stuff could be impressive. An iPad (with appropriate power source-- what would it take to power it from a water wheel?). Stainless steel blades. A Jacquard loom. What else?

    A hunting or sniper rifle, a pistol, a remote controlled helicopter with wireless video, broad spectrum antibiotics, powerful painkillers, explosives.

    A better list than mine. Do you think you'd need to go back with a group just to not have your stuff stolen? If you want to bring back something useful for the educational project rather than just being impressive, a batch of slide rules would be good. Could ancient Greeks make printing presses if they had designs for them? I'm sure they could at least do wood block printing.
    I think that would somewhat depend on how convinced people were by your 'godlike' powers. That's where modern day weaponry would prove quite effective I'd imagine. A taser would probably be useful as a bullet wound would be recognizable physical damage whereas the effect of a taser would probably seem like the power of the gods. If I was on my own I'd probably want body armor, motion sensors and other defensive equipment as well to be on the safe side. Mixing healing powers in would be just as valuable in self-preservation as demonstrating offensive capability. You would probably want to obfuscate the nature of your 'healing magic' so that people would not easily be able to replicate it if they managed to steal some of your stock of medical supplies. Special pills that had to be given in combination to be effective would be useful.
    If you're planning to teach a scientific worldview, it might be well to not be too godlike.
    True, but at least initially personal survival and not getting all your stuff nicked would probably require some compromises on teaching science. Once you had an established power base and some loyal local followers you could start to focus on the teaching.

    A plastic bottle out of the trash. It's transparent but flexible and almost weightless. See how well the lid has been made? It makes a water-tight seal.

    It might be the most valuable object in Greece.


    And then when you've got his attention, show him decimal notation.

    And stirrups for his horse. And lances.

    Once he's hooked, show him why things float. And how a ball rolling down an inclined plane moves 1, 4, 9, 16 as it accelerates.

    Show him Cartesian geometry. And how to play go with lines scratched in the ground and coloured stones. Make a recorder and play him some songs.

    He'll teach you Greek.

    Show him how to send messages using flashing mirrors. Show him Playfair's cipher. Perspective drawing. How to make a magnifying glass. Newton's cradle. Make a model boat out of bronze.

    I suspect in a day in Ancient Greece, you'd see so many easily solved problems that my list would look naive. You don't need modern technology. You need the things that were discovered just after the mediaevals recovered what the Greeks already knew.
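    The inclined-plane numbers above are Galileo's square law: distance grows as the square of time, so marks at equal time intervals sit at 1, 4, 9, 16, and the gaps between successive marks follow the odd numbers. A quick check in Python (the acceleration value is arbitrary, chosen so the numbers come out whole):

```python
# Distance down a uniform incline: s = 0.5 * a * t**2
a = 2.0  # acceleration along the plane, arbitrary units
distances = [0.5 * a * t**2 for t in range(1, 5)]
print(distances)  # [1.0, 4.0, 9.0, 16.0]

# Gains over each successive time interval follow the odd numbers.
gains = [later - earlier for earlier, later in zip([0.0] + distances, distances)]
print(gains)  # [1.0, 3.0, 5.0, 7.0]
```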


    I suspect in a day in Ancient Greece, you'd see so many easily solved problems that my list would look naive.

    This is one of the more interesting approaches to the Connecticut Yankee in King Arthur's Court (as I dub this species of thought problem) - that you don't need any special preparation because your basic background means that you'll spend the rest of your life in the past rushing around yelling 'don't do that, you idiot, do it this way!'

    Diplomacy might actually be the best preparation.

    Viktor Riabtsev:
    Oh god. That is actually just humongous in its possible effect on warfare. I mean, add simple ciphers to it and you literally add another whole dimension to warfare. Communication lines set up this way are almost like adding radio. Impractical in some situations, but used in regional warfare with multiple engagements? This is empire-forming stuff: reflective stone plus semi-trivial education equals dominance.
    Heck, you don't need a million Euros. I could easily blow minds with 100. A simple Zippo lighter should do the trick. So could an adjustable-beam flashlight, for that matter. A music player with earphones or speakers is another obvious choice. Candy bars maybe? They'd be shocked you brought ambrosia... Pretty much anything that emits light, sound, heat, cold, etc. is likely to have some serious impact. Remember superstimulus.
    Ultimately, I suppose the key question is, "how long do you need to keep up the act?"
    With a budget closer to 5,000 EUR, access to firearms, and enough willingness to use Dark Arts, I could probably keep it up for a decade or more. Possibly even pass on knowledge to selected disciples who would likewise guard these technological secrets, even as they rule the ignorant peasants. If, on the other hand, our purpose is as originally stated - prove to the scholars of the time that I have knowledge worthy of them becoming my disciples so I can impart as much knowledge to them as possible - I probably won't need much more than the superstimuli I described and a couple of afternoons. Something decidedly useful could cement this, and still on budget: a map of the world, a geographically appropriate taxonomy book, and a wristwatch (doubles as a nautical navigational aid) would be enough. And I don't think I've even reached 100 EUR yet, all told :) What we can achieve with today's technology is so marvelous, it's amazing how ordinary it seems to us. One day I turned on the faucet at my house and just marveled at the incredible and unlikely wonder of having fresh drinking water at practically limitless capacity being instantly transported to my residence, at my whim. This isn't just magic. It's better than magic.
    A gun could blow minds in any era. I'm sorry, I couldn't help myself.
    One more thing beside a time machine, knowledge of ancient Greek, and a stash of cool stuff-- the ability to argue well enough to convey your ideas to Aristotle and convince him you're right. This is probably at least as hard as it sounds.

    See also: Reddit anti-"repost" rage, and the comments like this that appear in practically every online community.

    [This comment is no longer endorsed by its author]

    This is the reason it's a Bad Thing that so many of the deeper concepts of Mormonism have become public knowledge. The first question I get asked, upon revealing that I'm a Mormon, is often, "So, you believe that if you're good in this life, you'll get your own planet after you die?" There are at least three huge problems with this question, and buried deep beneath them, a tiny seed of truth. But I can't just say "The inferential distance is too great for me to immediately explain the answer to that question. Let me tell you about the Plan o... (read more)

    The "your own planet" thing isn't a huge selling point that you'd want to lead with?

    Ha! I'd never thought of it like that! :3 Unfortunately, I have a problem with the idea of "selling" a religion. Just because you like an idea doesn't mean it's true... Besides, the type of person who bothers saying "You get your own planet?" instead of "You're religious?" usually views getting your own planet as The Ultimate Sacrilege, so it's not the best selling point, no. :/

    This is one of those things that I realized was so obvious once I thought about it, but until it was pointed out to me, I would never have seen it.

    To Mazur’s consternation, the simple test of conceptual understanding showed that his students had not grasped the basic ideas of his physics course: two-thirds of them were modern Aristotelians...“I said, ‘Why don’t you discuss it with each other?’” Immediately, the lecture hall was abuzz as 150 students started talking to each other in one-on-one conversations about the puzzling question. “It was complete chaos,” says Mazur. “But within three minutes, they had figured it out. That was very surprising to me—I had just spent 10 minutes trying to explain t

    ... (read more)

    If each inference step has some probability of being faulty, then longer chains of inference are exponentially less likely to be valid, with the longest trustworthy chain length inversely proportional to the negative logarithm of the per-step fidelity. Long hand-waved inferences can have unbelievably low probability of correctness, and thus be incredibly weak as evidence.

    Furthermore, informal arguments very often rely on 'I can't imagine an alternative' in multiple of their steps, and this itself has proven unreliable. It is also too easy to introduce, deliberately or otherwise, a huge number of impl... (read more)
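    The exponential decay is easy to make concrete. Assuming each step is independently valid with probability p, an n-step chain survives with probability p**n; the function names below are mine, not the commenter's:

```python
import math

def chain_validity(p, n):
    # Probability that an n-step chain is valid when each step
    # is independently valid with probability p.
    return p ** n

def max_reliable_length(p, threshold=0.5):
    # Longest chain whose overall validity stays above threshold:
    # largest n with p**n >= threshold, i.e. n <= log(threshold) / log(p).
    return math.floor(math.log(threshold) / math.log(p))

print(chain_validity(0.9, 10))   # ~0.349: ten decent steps, roughly a coin flip
print(chain_validity(0.9, 100))  # ~2.7e-5: a hundred steps, nearly worthless
print(max_reliable_length(0.9))  # 6
```

    This is why a long hand-waved argument can be weaker evidence than a single well-checked step.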

    The lack of this knowledge got me a nice big "most condescending statement of the day award" in lab a year ago.

    I don't think this is quite right, but taking up the challenge may be helpful when writing:

    I have always figured that if I can't explain something I'm doing to a group of bright undergraduates, I don't really understand it myself, and that challenge has shaped everything I have written.

    Daniel Dennett

    And if you think you can explain the concept of "systematically underestimated inferential distances" briefly, in just a few words, I've got some sad news for you...

    "I know [evolution] sounds crazy -- it didn't make sense to me at first either. I can explain how it works if you're curious, but it will take me a long time, because it's a complicated idea with lots of moving parts that you probably haven't seen before. Sometimes even simple questions like 'where did the first humans come from?' turn out to have complicated answers."

    Of course it's not actually a simple question, it's really a broad inquiry. In fact it doesn't even need to have an answer, and even when it does, it usually alters the question slightly... the hard part is asking the right questions, not finding the answer. (It just dawned on me that this was the whole point of The Question in The Hitchhiker's Guide to the Galaxy, thanks for that.)

    And if you think you can explain the concept of "systematically underestimated inferential distances" briefly, in just a few words, I've got some sad news for you...

    "This is going to take a while to explain."

    Did I do it? Did I win rationalism?!

    “If you understood everything I said, you’d be me” ― Miles Davis
    Luke Allen:
    I'd go with "echo chambers." Or if I weren't feeling pedantic, I'd say "There's a reason this concept takes a whole semester to teach."

    Expecting short inferential distances -- wouldn't that be a case of rational thought producing beliefs which are themselves evidence? :P Manifested in over-explaining to the point of cognitive dissonance? How about applying Occam's Razor and going the shorter distance: improve the clarity of the source by means of symbolism, through a reflective correction (as if to compensate for the distortion in the other lens). To me it means to steel-man the opponent's argument to the point where it becomes non-falsifiable. See, the fact that science works by falsificati... (read more)

    that last sentence ha


        I think this concept may be fundamental in explaining the mechanics of cohesion and division in society. This could help us understand why politics tends to get more and more divided. Especially on the internet, but also IRL, people tend to confirm their ideas rather than confront them with different ones, as first observed by P. C. Wason ("On the failure to eliminate hypotheses in a conceptual task", Quarterly Journal of Experimental Psychology, 1960), and confirmed since. Or, one could argue, that reinforcing one's idea... (read more)

    It's been nearly a century since relativity and quantum physics took the stage; we have gathered considerably more data and filled in a great many areas since then, yet we are still waiting for the next big breakthrough.

    The problem may be in the way we approach scientific discovery.

    Currently, when a new hypothesis is advanced, it is carefully evaluated on how well it stands on its own and on whether, eventually, it is worthy of being adopted as a recognized theory and becoming a part of our overall model of how everything works.

    This might be the prob... (read more)