Here's the new thread for posting quotes, with the usual rules:

  • Please post all quotes separately, so that they can be voted up/down separately.  (If they are strongly related, reply to your own comments.  If strongly ordered, then go ahead and post them together.)
  • Do not quote yourself.
  • Do not quote comments/posts on LW/OB.
  • No more than 5 quotes per person per monthly thread, please.
467 comments

"if we offer too much silent assent about mysticism and superstition – even when it seems to be doing a little good – we abet a general climate in which skepticism is considered impolite, science tiresome, and rigorous thinking somehow stuffy and inappropriate. Figuring out a prudent balance takes wisdom."

– Carl Sagan

Everyday words are inherently imprecise. They work well enough in everyday life that you don't notice. Words seem to work, just as Newtonian physics seems to. But you can always make them break if you push them far enough.

--Paul Graham, How to Do Philosophy

[surprisingly not a duplicate]


An idealist is one who, on noticing that a rose smells better than a cabbage, concludes that it will also make better soup.

-- H. L. Mencken, describing halo bias before it was named


I like the pithy description of halo bias. I don't like or agree with Mencken's non-nuanced view of idealists. It's sarcastically funny, like "a liberal is one who believes you can pick up a dog turd by the clean end", but being funny doesn't make it more true.

The point is that idealists suffer from a halo bias around their chosen ideal.
Do roses make for good soup? They make for good chocolate.
Rose water is used for flavoring, sometimes. Roses have essentially no nutritional value, though, and cabbages are widely held to taste better than they smell.
I've had rosewater flavoured ice cream. I bet cabbage ice cream does not taste as nice.

Not everything that is more difficult is more meritorious.

-Saint Thomas Aquinas

I wish I had memorized this quote before attending university.

*This comment was inspired by Will_Newsome's attempt to find rationality quotes in Summa Theologica.

Summa Theologica is a good example of what happens when you have an excellent deductive system (Aquinas was great at syllogisms) and flawed axioms (a literal interpretation of the Bible).

Aquinas probably meant something different by "literal interpretation" than you think. For instance, I'm pretty sure he agreed with Augustine that the six days of creation were not literally six periods of 24 hours.
Out of curiosity, where did Augustine say that? It's interesting that anyone bothered doubting that the six days were literal before the literal interpretation became embarrassingly inconsistent with established science.
The first three "days" happened before the sun and moon were created, so a literal interpretation was problematic even then.
Eh, there's an easy hack around that: God already knew what the length of a day was before it created the sun and the moon.
The literalness or otherwise of the description wasn't really an issue of major debate one way or the other until there was a strong alternative hypothesis. There's no political or signalling benefit to supporting a bizarre position when you have nothing to compare it to.
Yes. So, the question is: which alternative hypotheses were on the table before Darwin, and why were they considered compelling?
If I was copying over rationality quotes from the Summa I'd have gone for way different stuff, Aquinas was a fucking beast of a rationalist. I was just testing LW. Karma is not nearly as useful as accurate beliefs.
I don't know about a beast, but in general philosophers from the Middle Ages are far underrated compared to, say, philosophers from the "Enlightenment".
I think that's a product of people being evaluated by the 'rightness' of their conclusions rather than the validity of their arguments, so someone who rationally derived a wrong conclusion from bad data is less respected than someone who found a conclusion similar to our present ones by bad reasoning or sheer chance (e.g. certain ancient philosophers).
Maybe, but that doesn't explain why there is so much misinformation about medieval philosophy in popular sources. For instance, as Will Newsome tried to point out, Saint Thomas Aquinas was arguably a compatibilist with respect to the problem of free will, but I was taught in university that the "right" solution to the problem of free will (compatibilism) had to wait for a cognitive scientist (specifically, Daniel Dennett). There are numerous issues where thinkers from the Middle Ages did come to roughly the "right" answer, yet moderns teach that they didn't. There has got to be more to the story.
I am surprised by this. The proto-compatibilism of Aquinas might be little-known, but I thought it was common knowledge that compatibilism has a long pedigree before the late 20th century, including most logical positivists like Ayer and earlier British empiricists like Hume (I would include Spinoza as well). What Dennett gives is a version informed by modern cognitive science, but not especially novel in its basic features.

The road to wisdom? — Well, it's plain
and simple to express:
Err
and err
and err again
but less
and less
and less.

--Piet Hein


This has been quoted by Yvain before, but not here.
I was very surprised to see this was not a dupe; checking, the copy in my Mnemosyne was simply taken straight from a collection of his grooks. A missed opportunity.
Do you mean you have a deck for quotes? As I'm just getting into trying out spaced repetition and trying to come up with things to memorize, I'm wondering about your reasons for memorizing quotes (if that is indeed what you're doing). Do you have some sort of system of question/answer pairs that help you remember quotes that are applicable to certain situations? Or are you trying to memorize quote authors? Or what?
I add quotes because it's a handy sort of quotes file (many people keep them) and because I like being able to reel off quotes or just have them handy in my memory for writing. There's nothing fancy about them: the question is the quote, and the answer is all the sourcing and bibliographic information. I grade them based on whether I feel I could paraphrase them in a relevant context. ("Ah yes, good old Box's quote about how 'all models are wrong but some are useful'. Good to remember for statistical discussions. Mark that one a 4.")
Are the decks you personally use available anywhere?
Thanks a bunch :)

Do not accept any of my words on faith,
Believing them just because I said them.
Be like an analyst buying gold, who cuts, burns,
And critically examines his product for authenticity.
Only accept what passes the test
By proving useful and beneficial in your life.

-- The Buddha, Jnanasara-samuccaya Sutra

Good instrumental rationality quote; not so good for epistemic rationality.
Why do you say that?
"Proving useful in your life" (but not necessarily "proving beneficial") is the core of instrumental rationality, but what's useful is not necessarily what's true, so it's important to refrain from using that metric in epistemic rationality. Example: cognitive behavioral therapy is often useful "to solve problems concerning dysfunctional emotions", but not useful for pursuing truth. There's also mindfulness-based cognitive therapy for an example more relevant to Buddhism.
I suppose that is a tension between epistemic and instrumental rationality. Put in terms of a microeconomic trade-off: The marginal value of having correct beliefs diminishes beyond a certain threshold. Eventually, the marginal value of increasing one's epistemic accuracy dips below the marginal value that comes from retaining one's mistaken belief. At that point, an instrumentally rational agent may stop increasing accuracy. On the other hand, it may be a problem of local-versus-global optima: The marginal value of accuracy may creep up again. Or maybe those who see it as a problem can fix it with the right augmentation.
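The trade-off described above can be sketched as a toy numerical model. All numbers and function shapes here are made up purely for illustration; nothing in this sketch comes from the thread itself:

```python
# A hypothetical agent that keeps refining a belief only while the marginal
# value of extra accuracy exceeds the (constant) marginal value of keeping
# the comfortable mistake. Functions and constants are illustrative only.

def marginal_value_of_accuracy(accuracy: float) -> float:
    """Diminishing returns: each step toward full accuracy is worth less."""
    return (1.0 - accuracy) ** 2

def marginal_value_of_comfort() -> float:
    """Constant payoff from retaining the mistaken-but-pleasant belief."""
    return 0.04

def refine_until_indifferent() -> float:
    """Scan accuracy levels; stop at the first where refining no longer pays."""
    for k in range(101):
        accuracy = k / 100
        if marginal_value_of_accuracy(accuracy) <= marginal_value_of_comfort():
            return accuracy
    return 1.0

print(refine_until_indifferent())  # → 0.8: the agent stops well short of full accuracy
```

With these made-up curves the agent stops at 80% accuracy: beyond that point, each further correction is worth less than the comfort it forfeits, which is the "local optimum" worry the comment raises.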
There is no tension. Epistemic rationality is merely instrumental, while instrumental rationality is not. They are different kinds of things. Means to an end don't compete with what the end is.
Upvoted for this
It is useful for pursuing truth to the extent that it can correct actually false beliefs when they happen to tend in one direction.
This sometimes comes at the expense of other truths, just as pursuing evidence for your preferred conclusion turns up real evidence but a less accurate map.
Related quote from Epictetus.

“A casual stroll through the lunatic asylum shows that faith does not prove anything.”

-- Friedrich Nietzsche
That would seem to be an odd notion of "faith"; is the translation untrue to the original or is Nietzsche just being typically provocative? (I also personally don't see how the quote is at all profound or interesting but that's a separate issue and more a matter of taste.)

I apologize for practicing inferior epistemic hygiene. Thank you for indirectly bringing this to my attention. I knew that the quote was commonly attributed to Nietzsche, but I had never seen the original source. It would seem to be a rephrasing of this quote from The Antichrist:

The fact that faith, under certain circumstances, may work for blessedness, but that this blessedness produced by an idée fixe by no means makes the idea itself true, and the fact that faith actually moves no mountains, but instead raises them up where there were none before: all this is made sufficiently clear by a walk through a lunatic asylum.

Ah, that sounds a bit more like the Nietzsche I know and kinda like! Thanks for digging up the more accurate quote.
I'd parse the quote as meaning "Believing in something doesn't make it true", in which case it's something that pretty much everyone on this site takes for granted, but that the average person hasn't necessarily fully internalized. Yudkowsky felt the need to make a similar point near the end of this article, and philosophers as diverse as St. Anselm and William James have built entire epistemologies around the notion that faith is sufficient to justify belief, so obviously it's a point that needs to be made.
I dunno about St. Anselm but I found James's "The Will to Believe" essay reasonable as a matter of practical rationality. The sort of Bayesian epistemology that is Eliezer's hallmark isn't exactly fundamental, and the map-territory distinction isn't either, so I don't find it too surprising that e.g. Kantian epistemology looks a lot more like modern decision theory than it does Bayesian probability theory. I suspect a lot of "faith"-like behaviors don't look nearly as insane when seen from this deeper perspective. So on one level we have day-to-day instrumental rationality where faith tends to make sense for the reasons James cites, and on a much deeper level there's uncertainty about what beliefs really are except as the parts of your utility function that are meant for cooperation with other agents (ETA: similar to Kant's categorical imperative). On top of that there are situations where you have to have something like faith, e.g. if you happen upon a Turing oracle and thus can't verify if it's telling you the truth or not but still want to do hypercomputation. Things like this make me hesitant to judge the merits of epistemological ideas like faith which I don't yet understand very well.
This sort of taxonomy seems to deserve a more thorough treatment in a separate post.

...when you do have a deep understanding, you have solved the problem and it is time to do something else. This makes the total time you spend in life reveling in your mastery of something quite brief. One of the main skills of research scientists of any type is knowing how to work comfortably and productively in a state of confusion.


(emphasis mine)

Teaching, for me and several other people I know, serves the purpose of reveling in your mastery. In fact, Feynman said it best:

In any thinking process there are moments when everything is going good and you've got wonderful ideas. Teaching is an interruption, and so it's the greatest pain in the neck in the world. And then there are the longer periods of time when not much is coming to you. You're not getting any ideas, and if you're doing nothing at all, it drives you nuts! You can't even say "I'm teaching my class."

If you're teaching a class, you can think about the elementary things that you know very well. These things are kind of fun and delightful. It doesn't do any harm to think them over again. Is there a better way to present them? The elementary things are easy to think about; if you can't think of a new thought, no harm done; what you thought about it before is good enough for the class. If you do think of something new, you're rather pleased that you have a new way of looking at it.

Teaching helps me a lot in this respect, because I become very insecure sometimes when I do my research.

I can't tell if he presents this as a good thing or a bad thing.
At the very edge it's also useful to be able to work while in a state of sheer existential dread.
In my experience, if you find yourself in "a state of sheer existential dread", that probably means you've done something wrong, most likely made a category error somewhere along the way.

"Never interrupt your enemy while he is making a mistake." -- Napoleon Bonaparte

(This has been mentioned before on LW but not in a quote thread. I figured it was fair game.)

Just make sure to only apply this one to your actual enemies, and not to people who generally wish you well but disagree on some key point.
Interrupting even neutral associates when they are making a mistake does not necessarily have good outcomes for you either. Being the messenger has a reputation...
It's apparently a misattribution, sadly.
It looks like it's in the attribution section to me, not the misattribution section.

...some people requested that I be prohibited from studying. One time they achieved it through a very holy and simple mother superior who believed that studying would get me in trouble with the Inquisition and ordered me not to do it. I obeyed her for the three months that she was in office in as far as I did not touch a book, but as far as absolutely not studying, this was not in my power. [...] Even the people I spoke to, and what they said to me, gave rise to thousands of reflections. What was the source of all the variety of personality and talent I found among them, since they were all one species? [...] Sometimes I would pace in front of the fireplace in one of our large dormitories and notice that, though the lines of two sides were parallel and its ceiling level, to our vision it appears as though the lines are inclined toward each other and the ceiling is lower in the distance than it is nearby. From this it can be inferred that the lines of our vision run straight, but not parallel, to form the figure of a pyramid. And I wondered if that was the reason that the ancients questioned whether the earth was a sphere or not. Because although it seemed so, their vision mig


"When picking fruit, an excellent first choice is the low-hanging ladderfruit. It is especially delicious."

--Frank Adamek

While you're there, enjoy the laddergoat.
Now in live action


Fabius actually seems a little irrational in this quote. At first he objects to Augustus's interpretation because Augustus is not an expert on the interpretation of signs, which is reasonable. But then when Augustus does have an interpretation that's coming from an augur, Fabius still continues to question it, pitting his view against expert opinion like it was still just the opinion of Augustus. Since it is not established that Fabius would be an augur himself, this seems like motivated cognition / not properly updating on evidence. Alternatively, it could be that Fabius doesn't actually believe in omens, but in that case first appealing to the need to get an expert opinion is pretty dishonest. Of course, Alejandro's comment below does clarify that Livia is probably lying about the augur's testimony, but I'm going by the quote as it was posted (and as most people probably read/voted it).
Fabius does not want to argue with a fool more than is necessary. He engages the heavy guns only when he needs to, this time at the end of the dialogue. My kind of (dishonest, you say) guy.
Because days is the Schelling point interpretation, and if gods are communicating with you they'll probably go for the Schelling point. Lightning implies Zeus-Jupiter, so Augustus should look into historical examples of Zeus talking to people to see if Zeus tends to be misleading in ways similar to those Fabius warns of; in fact the augur had probably already considered things like this before speaking with Livia. And Fabius should trust the augur, who is a specialist in the interpretation of signs and probably has more details of the case than he does. I mean seriously, what are the chances that the letter C would get struck by lightning? We are beyond the point of arbitrary skepticism. Deny the data or trust the professionals. (I'm not familiar with the series in question, I'm just filling in details in the most likely way I can think of.) ETA: Wait, maybe Fabius is trolling Augustus/me? ...Nice one Fabius! I approve of your trolling. Downvote retracted. (Oh yeah and this is an excuse to link to the Wiki article on assassination markets.)
For everyone who knows that Livia is the Magnificent Bastard of the series (which is made clear from the first episode, so no spoiler there), the highest probability mass goes to the hypothesis that she was lying about having spoken to an augur or about what he told her, and that she wanted Augustus to question her and only feigned to resist. And "everyone who knows" at this stage probably includes Fabius, and every other character but Augustus.
So the leader of the relevant transhumanly intelligent entities is on the side of the Magnificent Bastard? If I was Augustus I'd seriously consider being nice to the Jews and asking YHWH for guidance. (Rationality: it works even better in magical universes! (Like, ahem, the one we're in.))

Chu-p’ing Man studied the art of killing dragons under Crippled Yi. It cost him all the thousand pieces of gold he had in his house, and after three years he'd mastered the art, but there was no one who could use his services.

-- Chuang Tzu

So he decided to teach others the art of killing dragons.

-- René Thom


In questions of this appalling magnitude, I find the best way to "overcome bias" is often to find perspectives which seem to make each answer obvious. Once we recognize that both A and B are obviously true, and A is inconsistent with B, we are in the right mindset for actual thought.

--Mencius Moldbug

Remember sources please; "How Dawkins got pwned (part 7)", 8 November 2007
You have a thing for Moldbug too, don't you? ^_^
This sounds like bad advice. In Moldbug's application of it, for example, making things "obvious" corresponds to making bad arguments - arguments that, in some alternate reality, possibly made of straw, would correspond to some possibly straw person who found the argument very obvious. And then you say "well, obvious argument #1 is awful, so by process of elimination let's go with obvious argument #2! Q.E.D."

Human knowledge and human power meet in one; for where the cause is not known the effect cannot be produced. Nature to be commanded must be obeyed; and that which in contemplation is as the cause is in operation as the rule.

Francis Bacon

Wow, I'm surprised this had not been posted before. Good catch.
I was very surprised, too. I'd found a similar quote -- one that I'll put in a top-level comment -- and checked for the Bacon quote.

"Don't ask whether predictions are made, ask whether predictions are implied."

--Steven Kaas


"Is it hard?"

"Not if you have the right attitudes. It’s having the right attitudes that’s hard."

-- Robert M. Pirsig, Zen and the Art of Motorcycle Maintenance

Use only that which works, and take it from any place you can find it.

--Bruce Lee

That seems rather applause-lighty. The reversal is abnormal; who would say "Use some things that don't work"? Maybe in some traditionalist cultures "Resist the appeal of using things that work but come from unworthy places" would sound wise, but on LessWrong it would likely get stares.

Bruce Lee was a martial artist, and martial arts is a field where a lot of people go by tradition rather than checking on what works.

awesomely relevant video: Joe Rogan on MMA and Kung Fu

That seems rather applause-lighty.

I think many cited quotations sound applause-lighty. They are meant to be pithy encapsulations of LW themes, after all. And I don't think that's necessarily a problem; applause lights are a problem for things that might be taken as reasoning, like posts.

"Use only that which works" is obvious enough to be unhelpful, but "take it from any place you can find it" was pretty novel in the context in which he proposed it, and still is to a lot of people in a lot of domains. The existence of the Traditional branch of Jeet Kune Do (as opposed to the Concepts branch,) which exclusively teaches the martial art as Bruce Lee practiced it at the time of his death, is testament to the strength of humans' tendency to behave counter to this advice.

... if anyone thinks they can get an accurate picture of anyplace on the planet by reading news reports, they're sadly mistaken.

--Bruce Schneier

It’s not a good idea for members of the faith-based community like Hitchens to proclaim things like: Science proves we’re all genetically equal, so therefore you shouldn’t be beastly toward people of other races. The obvious flaw in this strategy is that eventually people will figure out that you are lying about what the science of genetics says, and therefore, by your own logic, that discredits the perfectly valid second half of your assertion.

--Steve Sailer

Good chop, bro.

In short, they made unrealistic demands on reality and reality did not oblige them.

Cory Doctorow talking about DRM, but I think there are some wider applications.

Reminiscent of one of my favorite Bruce Schneier quotes.

Imagine willpower doesn't exist. That's step 1 to a better future.

Second slide of this powerpoint by Stanford's Persuasive Tech Lab.

"This has been a good day... I haven't done a single thing that was stupid..."

"Have you done anything that was smart?"

--Peanuts (Nov. 23, 1981) by Charles Schulz

Most people are theists not because they were "reasoned into" believing in God, but because they applied Occam's razor at too early an age. Their simplest explanation for the reason that their parents, not to mention everyone else in the world, believed in God, was that God actually existed. The same could be said for, say, Australia.

--Mencius Moldbug

Please remember sources; this is from "How I Stopped Believing in Democracy", 31 January 2008.
Is it conventional to add sources when it is an online source? Sorry, didn't know that was expected, since it wasn't in the posting rule set. Will remember to add sources in the future. BTW gwern, sometimes your attention to detail is as unnerving as it is helpful and impressive.
I thought it was, but then, I may be interested only because it makes it easier in the future to track down citations if there is a title and URL (and because if I click on a URL, it goes into my archive bot). It's just time-wasting... Heck, I time-waste on my time-wasting, I'm supposed to be adding citations on how people are biased against spaced repetition even when their scores are better with SR to my respective article.

"The last enemy that shall be destroyed is death"

--1 Corinthians 15:26

(I wonder what Eliezer would've made of it - as far as I know, he never read Deathly Hallows and so never read about the tombstone.)

Well, he knows about the Hallows themselves via wiki-readings. I think he would have written the story the way it is whether he knew about the tombstone or not, but I put fairly high probability that he does know about the tombstone and how fantastically awesome an endcap it's going to be on the story.
Mm. Maybe:
I think there's a close to 100% chance that the tombstone will be alluded to, because even if Eliezer DIDN'T know about it before, he will by the time the story ends (because I will have questioned and informed him about this), and after that I just can't imagine him making such a terrible mistake as to NOT include the tombstone's quote. I do think a simple bet of "did he already plan this?" is feasible. We can just ask him. (I put odds at 75%). (By "close to 100%" I mean maybe 95. I can think of scenarios where he hadn't originally planned for the tombstone and where it would be hard to integrate it)
Oh fine: But you'd better ask him now!
Eliezer Yudkowsky
I was already aware of the quote. It's on James and Lily's tombstone (in canon).
I see; but the prediction/question wasn't whether you were aware of it at all, but whether you were planning to incorporate it ex ante, and whether you did ex post.
Eliezer Yudkowsky
If it's incorporated it will have been planned beforehand.
You and your silly hatred of spoilers. (The recent experimental evidence, BTW, suggests spoilers are not harmful but helpful for enjoyment.) But I guess that statement works.
For what it's worth, there are stories where I've appreciated going in with no knowledge except for some reason to think I'd like it (the movie Hugo 3D is a recent example, for Mieville's Un Lun Dun I just had a reasonable guess about genre). I think I lost some of the impact of A Deepness in the Sky because I knew what Focus was before I started reading.
I think whether spoilers are harmful varies among works and among readers. (For example, ‘finding out how it ends’ was the only reason why I finished reading Digital Fortress by Dan Brown rather than throwing it in the garbage bin right after the first couple chapters; if I had already known the ending I would likely not have enjoyed it at all (except possibly for laughing at it).)

I think whether spoilers are harmful varies among works and among readers. (For example, ‘finding out how it ends’ was the only reason why I finished reading Digital Fortress by Dan Brown rather than throwing it in the garbage bin right after the first couple chapters;

This is an example of when spoilers are good, right? Every person saved from reading Dan Brown...

I'm confused by what this is an example of. Had you known how it ended, would you have finished reading the book? If so, why? If not, how would that have been harmful?
1) Probably not; 2) that would have taken away from me the enjoyment of reading the book to find out the ending. (I was quite bored that day, and I didn't have my computer or my music player or anything else to do with me.)
(nods) OK, sure... if the most enjoyable thing I can do right now is read a book that isn't enjoyable to read, in order to get the enjoyment of reading the book and being surprised by its ending, then telling me the ending is harmful. Agreed. I have trouble imagining actually being in that state personally, but of course people vary.
The existence of bookshops in train stations and airports selling badly-written suspense novels suggests this is a common state.
Well, I too have bought a number of books in airports and train stations over the years, and I don't see how the fact that airports and train stations sell the books they sell provides evidence to choose between the theory that army1987's state is common, and the theory that my state is common. (Of course, the reality could also be both, or neither.)
If I had been thinking better I would have specified "did he know" rather than "did he plan" so that we could resolve the issue. (I think there is at least a 30% chance one (if not both) of us will have forgotten this wager by the time the reveal happens)
That's what PredictionBook is for. So far I have a good record for long-term use of it...

“The general method that Wittgenstein does suggest is that of ’shewing that a man has supplied no meaning for certain signs in his sentences’.

I can illustrate the method from Wittgenstein’s later way of discussing problems. He once greeted me with the question: ‘Why do people say that it was natural to think that the sun went round the earth rather than that the earth turned on its axis?’ I replied: ‘I suppose, because it looked as if the sun went round the earth.’ ‘Well,’ he asked, ‘what would it have looked like if it had looked as if the earth turned on its axis?’

This question brought it out that I had hitherto given no relevant meaning to ‘it looks as if’ in ‘it looks as if the sun goes round the earth’.

My reply was to hold out my hands with the palms upward, and raise them from my knees in a circular sweep, at the same time leaning backwards and assuming a dizzy expression. ‘Exactly!’ he said.”

– Elizabeth Anscombe, An Introduction to Wittgenstein’s Tractatus (1959); apropos of a recent Scott Sumner blog post

Another great quote by Sumner in that same post:

The Great Depression was originally thought to be due to the inherent instability of capitalism. Later Friedman and Schwartz blamed it on a big drop in M2. Their view is now more popular, because it has more appealing policy implications. It’s a lot easier to prevent M2 from falling, than to repair the inherent instability of capitalism. Where there are simple policy implications, a failure to do those policies eventually becomes seen as the “cause” of the problem, even if at a deeper philosophical level “cause” is one of those slippery terms that can never be pinned down. [Bold added]

Science isn't just a job, it's a means of determining truth. Methods of determining truth that aren't trustworthy in the laboratory don't become trustworthy when you leave it. There is no doctrine of applying scientific methodology to every aspect of one's life; you either follow trustworthy methods of investigation or you don't, and "follow trustworthy methods of investigation" is the core of science.

~Desertopa, TVTropes Forum

There are types of valid evidence that aren't scientific. In particular science is also partially a social process, whereas you trying to find the truth for yourself is not.


A critical analysis of the present global constellation -- one which offers no clear solution, no "practical" advice on what to do, and provides no light at the end of the tunnel, since one is well aware that this light might belong to a train crashing towards us -- usually meets with reproach: "Do you mean we should do nothing? Just sit and wait?" One should gather the courage to answer: "YES, precisely that!" There are situations when the only truly "practical" thing to do is to resist the temptation to engage immediately and to "wait and see" by means of a patient, critical analysis.

Slavoj Žižek, Violence, emphasis added. Admittedly not the most clear elucidation of the subject of how urgency (fabricated or otherwise) should affect ethical deliberation, but see also his essay "Jack Bauer and the Ethics of Urgency" -- if you're into that sort of thing.

The ultimate theological question is: ‘Where does the Sun go at night?’.

The answer that so many civilisations agreed for so long was: ‘The Sun is driven by one of the gods, and at night it goes under the Earth to fight a battle. There is at least some risk that the god will lose this battle, and so the Sun may not rise tomorrow’. It’s something the human race understood was a cast iron fact before they knew how to cast iron. It survived as the working model twenty-five times longer than the four hundred years we’ve understood the Earth goes around the Sun.

Lance Parkin, Above us only sky

This is less a rationality quote than a "yay science" quote, but I find that impressive beyond words. For millenia that was a huge and frightening question, and then we went and answered it, and now it's too trivial to point out. We found out where the sun goes at night. I want to carve a primer on cosmology in gold letters on a mountain, entitled something in all caps along the lines of "HERE IS THE GLORY OF HUMANKIND".

It survived as the working model twenty-five times longer than the four hundred years we’ve understood the Earth goes around the Sun.

Is it excessive nitpicking to point out that the daily disappearance and reappearance of the Sun has to do with the Earth's rotation on its axis, not its rotation about the Sun? (Probably not, as the first comment on Parkin's blog posting points out the same.)

Is it excessive nitpicking to note that not only did he misuse the word "ultimate", he used it to mean basically the opposite of what it actually means?
No. Thank you for inspiring me to look up the word and learn its true meaning.
Do you mean cosmology or astronomy?
Both. Cosmological content: "stuff goes around other stuff"; astronomical content: "this applies to the stuff we sit on"; philosophical content: "finding this out proves we are awesome"; gastronomical content: "here's a recipe for cake to celebrate".

The truth is common property. You can't distinguish your group by doing things that are rational, and believing things that are true.

Paul Graham, Lies We Tell Kids

The truth is common property. You can't distinguish your group by doing things that are rational, and believing things that are true.

It would seem that if no other humans are behaving rationally and your group is behaving rationally, then even Sesame St could tell you which of these things is not the same.

If no other groups of humans are behaving as rationally as yours is, then it's likely no other humans are capable of easily identifying that your group is the one with the high level of uniquely rational behavior. To the extent that other groups can identify rational behaviors of yours, they will have already adopted them and will not consider you unique for having adopted them too. You can signal the uniqueness of your group by believing and doing things that are both rational and unpopular, but to most outsiders this only signals uniqueness, not rationality, because the reason such things are unpopular is that most people don't find them to be obviously rational. And the outsiders are usually right: even though they're wrong in your particular actually-is-rational case, that's outnumbered by the other cases which, from the outside, all appear to be similar arational group-identifying behaviors and rationalizations thereof. E.g. at first glance there's not a huge difference between "I'm going to get frozen after I die", "I don't eat pork", "I avoid caffeine and hot drinks", etc.
Not actually true. I'd like it to be!
Damn skippy. I'd even settle for the above being true of my group with respect to other groups.
Depends on how immediate and/or dramatic the benefits of the rational behavior are.
then you're probably insanely wrong.
Why do you say that? That doesn't sound true. Humans are monkeys - I should be surprised if a group of monkeys acted perfectly rationally. I suggest that, however insane I may be, this issue is straightforward.
My original comment was meant to be a mildly elaborate adianoeta that is more than the sum of its parts (except that the addition of "insanely" was a regrettable and meaningless rhetorical flourish). So if I seem straightforwardly wrong then maybe something was lost in interpretation or I just didn't do it right.
It's been a while since I read that essay. I can't tell whether that quotation's meant to be an example of a lie we tell kids, or one of Paul Graham's own beliefs! (An invertible fact?)
It is Graham's own belief.
Yes, a look at it in context in the essay confirms that — but isn't it a strange belief for someone like Paul Graham to have? It looks false to me (although "truth is common property" is ambiguous). I think a group could make itself very distinct by believing certain truths and doing certain rationally justified things.
I don't know whether it's strange for Graham to think this; I haven't read much of his stuff. I found the phrase "common property" odd too. I associate the phrase with "commons," as in tragedy of the commons. I think LessWrong is distinctive, and part of its distinctiveness comes from its members' attempts to do the above.
Most groups of weapon developers probably hope to keep their knowledge distinct from that of other groups for as long as they can...
What? I don't get this. Also, why should weapons developers care whether their products are distinctive? Having better weapons helps, and being better is being distinctive, but so is being worse.
I apologize. I should have been clearer. I mean that if a group of weapons developers, such as, for instance, the Manhattan Project, discovers certain critical technical data necessary to their weapons, such as, for instance, the critical mass of Pu-239, they will often prefer that these truths not spread to other groups. For as long as they are able to keep this knowledge secret, it is indeed a set of truths that makes this set of weapons designers distinct from other groups.
Oh, I see now. Thanks for clarifying. But if other developers are incorrect, then you'd want to be correct; and if other developers are correct, you'd still want to be correct. Put game-theoretically, accuracy strictly dominates inaccuracy. By contrast, isn't distinctiveness only good when it doesn't compromise accuracy?

If some persons died, and others did not die, death would indeed be a terrible affliction.

--Jean de la Bruyère

But we all die, so that makes death alright?
That is one source of acceptance of death.

Prompted by Maniakes', but sufficiently different to post separately:

It cannot have escaped philosophers' attention that our fellow academics in other fields--especially in the sciences--often have difficulty suppressing their incredulous amusement when such topics as Twin Earth, Swampman, and Blockheads are posed for apparently serious consideration. Are the scientists just being philistines, betraying their tin ears for the subtleties of philosophical investigation, or have the philosophers who indulge in these exercises lost their grip on reality?

These bizarre examples all attempt to prove one "conceptual" point or another by deliberately reducing something underappreciated to zero, so that What Really Counts can shine through. Blockheads hold peripheral behavior constant and reduce internal structural details (and--what comes to the same thing--intervening internal processes) close to zero, and provoke the intuition that then there would be no mind there; internal structure Really Counts. Manthra is more or less the mirror-image; it keeps internal processes constant and reduces control of peripheral behavior to zero, showing, presumably, that external behavior Reall

... (read more)
Eliezer Yudkowsky (Some discussions here, such as those involving such numbers as 3^^^3, give me the same feeling.)
I don't understand that quote. A good Bayesian should still pick the aposteriori most probable explanation for an improbable event, even if that explanation has very low prior probability before the event.
I suspect the point is that it's not worthwhile to look for potential explanations for improbable events until they actually happen.
I think it's more than that - he's saying that if you have a plausible explanation for an event, the event itself is plausible, explanations being models of the world. It's a warning against setting up excuses for why your model fails to predict the future in advance - you shouldn't expect your model to fail, so when it does you don't say, "Oh, here's how this extremely surprising event fits my model anyway." Instead, you say "damn, looks like I was wrong."
I don't, however, think it's meant to be a warning against contrived thought experiments.
Absolutely: I strongly recommend you not try to explain how 3^^^3 people might all get a dustspeck in their eye without anything else happening as a consequence, for example.
It's Yudkowsky. Sorry, pet peeve.
Is Eliezer claiming that we aren't living in a simulation, claiming that if we are living in a simulation, it's extremely unlikely to generate wild anomalies, or claiming that anything other than those two is vanishingly unlikely?
Sorry to be so ignorant but what is 3^^^3? Google yielded no satisfactory results...
TheOtherDave's other comment summed up what it means practically. Also, see
Ah thank you, that clarifies things greatly! Up-voted for the technical explanation.
A number so ridiculously big that 3^^^3 * X can be assumed to be bigger than Y for pretty much any values of X and Y.
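For concreteness, `^^^` is Knuth's up-arrow notation: one arrow is exponentiation, two arrows is an iterated power tower, and each additional arrow iterates the operation below it. A minimal sketch (the `up_arrow` helper is my own naming, not anything from the comments above; only tiny inputs are feasible, which is rather the point):

```python
def up_arrow(a, n, b):
    """Compute a (up-arrow^n) b in Knuth's up-arrow notation.

    One arrow (n=1) is plain exponentiation; each extra arrow
    iterates the operation one level down.
    """
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 2, 2))  # 3^^2 = 3^3 = 27
print(up_arrow(3, 2, 3))  # 3^^3 = 3^(3^3) = 3^27 = 7625597484987

# 3^^^3 = 3^^(3^^3): a power tower of 3s of height 7,625,597,484,987.
# Far beyond anything computable or physically representable.
```

Calling `up_arrow(3, 3, 3)` would never terminate in practice, which is why the number only ever appears symbolically in thought experiments.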
Bloody p-zombies. Argh. Yes.

"A Confucian has stolen my hairbrush! Down with Confucianism!"

-GK Chesterton (on ad hominems)

As in the Roman empire age, the theoretical concepts, taken out of the theories assigning their meaning and considered instead real objects, whose existence can be apparent only to the initiated people, are used to amaze the public. In physics courses the student (now unaware of the experimental basis of heliocentrism or of atomic theory, accepted on the sole basis of the authority principle) gets addicted to a complex and mysterious mythology, with orbitals undergoing hybridization, elusive quarks, voracious and disquieting black holes and a creating Big Bang: objects introduced, all of them, in theories totally unknown to him and having no understandable relation with any phenomenon he may have access to.

Lucio Russo, The Forgotten Revolution: How Science Was Born in 300 BC and Why it Had to Be Reborn

Some people will always have to take most of natural science on authority. Sure you can make that sound bad, but to me it sounds like "children take 9*9=81 on authority! spoooooky."

Ye gots to wiggle yer fingers when ye say it.

A "preview" electronic version of this book is available through the translator's website here: I enjoyed the book a lot. It's true that the author reads Hellenic scientists in the most favorable possible light while reading Renaissance scientists in the least favorable possible light. But he gives extensive quotations from the available sources, so that you can judge for yourself whether his interpretations are stretched.

"A “lie-to-children” is a statement which is false, but which nevertheless leads the child’s mind towards a more accurate explanation, one that the child will only be able to appreciate if it has been primed with the lie." "“Yes, you needed to understand that,” they are told, “so that now we can tell you why it isn’t exactly true.” It is for the best possible reasons, but it is still a lie."

--(The Science of Discworld, Ebury Press edition, quotes from pp 41-42)

Uncertainty, in the presence of vivid hopes and fears, is painful, but must be endured if we wish to live without the support of comforting fairy tales.

— Bertrand Russell, History of Western Philosophy (from the introduction)

A stoic sage is one who turns fear into prudence, pain into information, mistakes into initiation, and desire into undertaking.

Nassim Nicholas Taleb

Ninety per cent of most magic merely consists of knowing one extra fact.

Terry Pratchett

We should venture on the study of every kind of animal without distaste; for each and all will reveal to us something natural and something beautiful.

Aristotle, Parts of Animals


And now my labor is over. I have had my lecture. I have no sense of fatherhood. If my genetic and personal histories had been different, I should come into possession of a different lecture. If I deserve any credit at all, it is simply for having served as a place where certain processes could take place. I shall interpret your polite applause in that light.

--B.F. Skinner

As an experimental psychologist I have been trained not to believe anything unless it can be demonstrated in the laboratory on rats or sophomores.

Steven Pinker, Words and Rules

Invertible fact alert: I can't tell if Pinker means that as (mostly) a good or a bad thing!
I take it as ha ha only serious. Pinker knows that people are generally appallingly inaccurate and believe untruthful things, and that psychology is right to throw out every other belief and only depend on what it has rigorously verified; but he also knows the rigorous verification has been done on weird subjects and so psychology has thrown out a lot of correct beliefs as well. Accepting this tension is the mark of an educated man, as Aristotle says.
Given the history of psychology as a field, I'd assume he's praising the merits of experimental evidence.

We made our oath to Vavilov
We'd not betray the solanum
The acres of asteraceae
To our own pangs of starvation

"When The War Came", by The Decemberists

(from memory, will fix any errors later)

"While developing his theory on the centres of origin of cultivated plants, Vavilov organized a series of botanical-agronomic expeditions, collected seeds from every corner of the globe, and created in Leningrad the world's largest collection of plant seeds. This seedbank was diligently preserved even throughout the 28-month Siege of Leningrad, despite starvation; one of Nikolai's assistants starved to death surrounded by edible seeds."

Thank you kind sir.
Can you elucidate the connection to rationality?

A few Google searches resolved this question for me, and proved very interesting besides. Vavilov was a Soviet botanist focused on the cultivation of efficient seeds to mitigate hunger. In World War Two, Vavilov's Leningrad seedbank came under siege by the Nazis, who apparently wanted to steal/destroy the seeds. Considering the supplies vital to Russia's long-term survival, several of the scientists swore oaths to protect the seedbank against German forces, starving foragers, and rats.

They succeeded in doing so. The scientist-guards were so loyal that many of them died of starvation despite being in a facility full of edible seeds, as well as potatoes, corn, rice, and wheat. The seedbank endured the siege and was replenished after the city was liberated.

Vavilov himself did not live to see the victory of his researchers, as he had been sent to a camp thanks to his disapproval of the scientific fraud of Lysenkoism and died (ironically, of malnutrition) before the war ended.

The Pavlovsk seed bank is at risk, but not yet doomed.
That is awesome. Thanks.
At first glance, it looks like a clear case of Bayesians vs. Barbarians to me.
Can? Of course. "Will?" Less likely.

I replied as follows: "What would you think of someone who said, "I would like to have a cat, provided it barked"? [...] As a natural scientist, you recognize that you cannot assign characteristics at will to chemical and biological entities, cannot demand that cats bark or water burn. Why do you suppose that the situation is different in the "social sciences?"

-- Milton Friedman

One of these things is not like the others, one of these things does not belong.

There are valid quibbles and exceptions on both counts. Some breeds of cats make vocalizations that can reasonably be described as "barking", and water will burn if there are sufficient concentrations of either an oxidizer much stronger than oxygen (such as chlorine trifluoride) or a reducing agent much stronger than hydrogen (such as elemental sodium).

In the general case, though, water will not burn under normal circumstances, and most cats are physiologically incapable of barking.

The point of the quote is that objects and systems do have innate qualities that shape and limit their behaviour, and that this effect is present in social systems studied by economists as well as in physical systems studied by chemists and biologists. In the original context (which I elided because politics is the mind killer, and because any particular application of the principle is subject to empirical debate as to its validity), Friedman was following up on an article about how political economy considerations incline regulatory agencies towards socially suboptimal decisions, addressing responses that assumed that the political economy pressures could easily be designed away by revising the agencies' structures.

I was actually thinking in terms of 'cats can deliberately meow in an annoying fashion (abstract) like human infants and this behavior seems perfectly modifiable, so a transhumanist could have a decent reason for preferring cats to bark rather than meow; and this is really stupid anyway, since we can change cats easily - we certainly can demand cats bark - but we can't change physis easily and can't demand water burn'.
pfsch. You can burn water if you add salt and radio waves. Or if you put it in an atmosphere containing a reactive fluorine compound. Etc etc etc.
That since their preference harms nobody (apart from unadopted cats) and the utility function is not up for grabs, I have no grounds to criticize them?
The preference alone is mostly harmless. When the preference is combined with the misapprehension that the preference can be fulfilled, it may harm the person asserting the preference if it leads them to make a bad choice between a meowing cat, a barking dog, or delaying the purchase of a pet. If the preference order were (1. Barking Cat, 2. Barking Dog, 3. Meowing Cat, 4. No Pet), then the belief that a cat could be taught to bark could lead to the purchase/adoption of a meowing cat instead of the (preferred) barking dog. Likewise, in the above preference order, or with 2 and 3 reversed, the belief in barking cats could also lead to the person delaying the selection of a pet due to the hope that a continued search would turn up a barking cat. The problem is magnified, and more failure modes added, when we consider cases of group decision-making.
"I would like to have a cat, provided it barked" states that U(barking cat) > U(no cat) > U(nonbarking* cat). Preferring a meowing cat to no cat is a contradiction of what was stated. The issue you raise can still be seen with U(barking cat) > U(barking dog) > U(no pet) > U(nonbarking cat), however - a belief in the attainability of the barking cat may cause someone to delay the purchase of a barking dog that would make them happier.

*In common usage, I expect that we should restrict it from "any nonbarking cat" to "ordinary cat", based on totally subjective intuitions. I would not be surprised by someone who said "I would like an X, provided it Y" for a seemingly unattainable Y, and would not have considered whether they would want an X that Z for some other seemingly unattainable Z. I think they just would have compared the unusual specimen to the typical specimen and concluded they want the former and don't want the latter. This is mostly immaterial here, I think.
I stand corrected.
That's strictly ruled out by the wording in the quote. While people often miscommunicate their preferences, I don't see particular evidence of it there, or even that the hypothetical person is under a misapprehension. To take it back to metaphor: the flip side of wishful thinking is the sour grapes fallacy, and while the quote doesn't explicitly commit it, without context it's close enough to put me moderately on guard.
Here is the full article from which the quote was taken:

I do not pretend to start with precise questions. I do not think you can start with anything precise. You have to achieve such precision as you can, as you go along.

Bertrand Russell

Part of the reason atheism looks the way it does now, and is so lacking in warm fuzzies like "Love and Completeness are Your Spiritual Right," is because it is a refuge for people who think warm fuzzies are bullshit.

-- Dave Gottlieb

What about people who want to reject the claims of religion but still want warm fuzzies? Maybe atheism wouldn't get such a bad rap in the public eye if it felt more welcoming for people who want truths but also want the sense of community provided by religion.
Paganism? It seems like one of the more accepting groups, and you don't need to actually believe to celebrate/be in a community.
Interesting idea, but identifying as pagan will probably raise as many eyebrows as atheism, if not more. I think it would be better if there was more "The universe isn't concerned about us, so it's our job to be concerned about each other" among the atheist community, or something else that sounds welcoming and friendly.
So true, I totally think that way.
But warm fuzzies are bullshit.
I haven't once in my life made a good decision based on feel-good thinking. Naturally I may be an outlier, but overall, models of the world that "feel good" are generally wrong models. I value having an accurate map even if it isn't useful (yes, having a wrong map can be instrumentally valuable, and a positive outlook actually often is). Also, warm fuzzies are one of the easiest ways to manipulate someone. When someone tries to shower me with them I nearly instinctively try to counterbalance them. Hm, now that I think of it, that pattern matches to being a cynic.
I would have expected things to go your way every now and then simply by chance.

Every age has its own outlook. It is specially good at seeing certain truths and specially liable to make certain mistakes. We all, therefore, need the books that will correct the characteristic mistakes of our own period. And that means the old books.

C.S. Lewis, Introduction to a translation of, Athanasius: On the Incarnation

If I may continue it: From
Not likely to be much help if the new outlook is built upon the old in such a way that the mistakes of the old outlook are addressed by the new, but the mistakes of the new were not raised to the point of being able to be addressed within the old.
True; on the other hand, I suspect people around here tend to massively overestimate how often that happens.
Or, you know, some new books with a fresh outlook. Just saying.
Not written yet.

I am often wrong. My prejudices are innumerable, and often idiotic.

--H.L. Mencken

We shape our buildings, and afterwards our buildings shape us.

-Winston Churchill

The rest of the story is interesting; from
An apt comparison would be Napoleon's reconstruction of Paris with broad straight streets, I think. (Code is Law.)
"We are shaped and fashioned by what we love." — Goethe

Songs can be Trojan horses, taking charged ideas and sneaking past the ego's defenses and into the open mind.

John Mayer, Esquire (the magazine, not the social/occupational title)

[I]ntractable problems are not a good reason to attempt impossible "solutions".

-- Eric Raymond

Don't shut up and do the impossible!

The existence of gray does not preclude the existence of black and white.

The existence of dawn and dusk does not preclude the existence of noon and midnight.

I'm not sure who originally said this but I vaguely remember the quotes from law school.

I like to say "there are such things as dawn and dusk, but the difference between night and day is like ..." - and here I pause just long enough for the audience to mentally anticipate me - "the difference between night and day."

A man said to the universe:
"Sir, I exist!"
"However," replied the universe,
"The fact has not created in me
A sense of obligation."

-Stephen Crane

More accurate:

A man said to the universe: "Sir, I exist!"

The universe says nothing.

Right, because Eliezer Yudkowsky wasn't addressing it.
There's no Universe; there's only a set of things which Eliezer Yudkowsky allows to exist!
Note from a "sympathetic outsider": I know you are joking, but the sorts of things like this subthread sometimes come across more creepy than funny.
There's a whole page of them too!
I was making an allusion...

He lifted a hand, his index finger pointing upward. "How many fingers am I holding up?" I paused for a moment, which was more consideration than the question seemed to warrant. "At least one," I said. "Probably no more than six."

-Kvothe, The Name of the Wind

The problem with "electability" is that it requires voters to set aside their own feelings on the basis of what they think other people will think in a general election months in the future. The problem with this is that people are generally bad at predicting what other people will think and feel and are lousy at predicting the future. As a result, voters in primaries who focus on electability either vote based on regurgitated popular wisdom of the moment, or on an assumption that other people won't respond to the same things that they respond to

... (read more)

Am I sure that there is no mind behind our existence and no mystery anywhere in the universe? I think I am. What joy, what relief it would be, if we could declare so with complete conviction. If that were so I could wish to live forever. How terrifying and glorious the role of man if, indeed, without guidance and without consolation he must create from his own vitals the meaning for his existence and write the rules whereby he lives.

Thornton Wilder, The Ides of March.

Most people you know are probably weak skeptics, and I would probably fit this definition in several ways. "Strong skeptics" are the people who write The Skeptics' Encyclopedia, join the California Skeptics' League, buy the Complete Works of James Randi, and introduce themselves at parties saying "Hi, I'm Ted, and I'm a skeptic!". Of weak skeptics I approve entirely. But strong skeptics confused me for a long while. You don't believe something exists. That seems like a pretty good reason not to be too concerned with it.

Edit: authorial instance specified on popular demand.

The next sentence is

It's not like belief in UFOs killed your pet hamster when you were a kid or something and you've had a terrible hatred of it ever since.

Skeptics will tell you that yes, it did. Belief that the Sun needs human sacrifices to rise in the morning killed their beloved big brother, and they've had a terrible hatred of it ever since. And they must slay all of its allies, everything that keeps people from noticing that Newton's laws have murder-free sunrise covered. Even belief in the Easter bunny, because the mistakes you make to believe in it are the same. That seems like a pretty good reason to be concerned with it.

Indeed. In fact there's a website, What's the Harm?, that explains what damage these beliefs cause.

"Victims of Moon Landing Denial". Marvellous.
That actually seems to be a victim of belief in the moon landing by people who have landed on the moon.
I was impressed when a skeptic source (sorry no cite) admitted that most people who read astrology columns do it for entertainment rather than for guidance in how to live their lives. I don't know why some people and groups damp out most or all of the ill effects of their arbitrary beliefs, while others follow arbitrary beliefs to the point of serious damage or destruction. I don't think I've seen this discussed anywhere.

More accurately, Yvain-2004

Is it more accurate to put it thus because Yvain-2012 disagrees with Yvain-2004 on this issue?

I don't know if there's enough of a specific, meaningful claim there for me to disagree with, but Yvain-2012 probably would not have written those same words. Yvain-2012 would probably say he sometimes feels creeped out by the levels of signaling that go on in the skeptical community and thinks they sometimes snowball into the ridiculous, but that the result is prosocial and they are still performing a service.

(really I can only speak for Yvain-2011 at this point; my acquaintance with Yvain-2012 has been extremely brief)

Well, even if Yvain-2012 does not disagree with Yvain-2004, it would be nice to have the year attached. I would like that the year-attachment convention for attributing quotes and ideas becomes more widespread. Right now, the default assumption that everybody makes is that people are consistent over time. In reality, people almost surely change over time, and it is unreasonable to expect them to justify something which their earlier selves said. So, it would be really nice if the default was year-attachment.
That would seem to have benefits relative to no further information (except the author's name), but would the benefits be greater than those afforded by the current convention of citing the relevant work? Or maybe you think people don't follow that convention enough and they would be more likely to cite something if the thing they had to cite was just a date?
Citing the original work would be the best I suppose. But in relatively informal contexts, like internet forums, it is probably easier for the reader to quickly have a sense of when the given quote was said if the year is attached.
I would say that, for instance, I don't believe most alt-med stuff works, but this is exactly the reason I care that others know this and how we know it. This attitude infuriates me.
The fact is that there are many battles worth fighting, and strong skeptics are fighting one (or perhaps a few) of them. (As I was disgusted to see recently, human sacrifice apparently still happens.) However, I also think it's ok to say that battle is not the one that interests you. You don't have the capacity to be a champion for all possible good causes, so it's good that there is diversity of interest among people trying to improve the human condition.
I totally agree: if it's not your cup of tea, fine. What pisses me off is the line about "if you don't believe it exists it seems like a good reason to not be concerned with it".
The previous quotation would seem to speak in favor of more strong skeptics.