Rationality Quotes: February 2011

by gwern · 1 min read · 1st Feb 2011 · 354 comments


Rationality Quotes
Personal Blog

Take off every 'quote'! You know what you doing. For great insight. Move 'quote'.

And if you don't:

  • Please post all quotes separately, so that they can be voted up/down separately.  (If they are strongly related, reply to your own comments.  If strongly ordered, then go ahead and post them together.)
  • Do not quote yourself.
  • Do not quote comments/posts from LW. (If you want to exclude OB too, create your own quotes thread! OB is entertaining and insightful and all, but it is no rationality blog!)
  • No more than 5 quotes per person per monthly thread, please.

At home there was a game that all the parents played with their children. It was called, What Did You See? Mara was about Dann’s age when she was first called into her father’s room one evening, where he sat in his big carved and coloured chair. He said to her, ‘And now we are going to play a game. What was the thing you liked best today?’

At first she chattered: ‘I played with my cousin . . . I was out with Shera in the garden . . . I made a stone house.’ And then he had said, ‘Tell me about the house.’ And she said, ‘I made a house of the stones that come from the river bed.’ And he said, ‘Now tell me about the stones.’ And she said, ‘They were mostly smooth stones, but some were sharp and had different shapes.’ ‘Tell me what the stones looked like, what colour they were, what did they feel like.’

And by the time the game ended she knew why some stones were smooth and some sharp and why they were different colours, some cracked, some so small they were almost sand. She knew how rivers rolled stones along and how some of them came from far away. She knew that the river had once been twice as wide as it was now. There seemed no end to what she knew, and yet her father had not told h

[...]

A long one:

. . . once upon a time men lived among the giants, who were like themselves but far more powerful, and these giants always had a supply of bread, fruit, milk, and all that was necessary to sustain life, which they must have acquired in ways that cost them little, for they would always give away their goods to whoever knew how to please them. And the giants would also carry them wherever they wanted to go, provided they asked in the proper way. So it came about that men never thought of working, nor of walking, nor of building wagons or ships; instead they became natural orators, and spent all of their time watching the giants, figuring out what would please or displease them, smiling at them or imploring them with tears in their eyes; or else simply pronouncing the necessary words, which had to be memorized exactly, though they had no understanding of the changes of humor that would come over the giants, their brusque refusals, or their sudden willingness. Now, if some man, in those days, had tried to get something for himself by his own industry, they would have laughed him to scorn; for the results of his labor would have been puny beside the immense provisions th

[...]

I thought the punchline was going to be that the men were cats.

Matt_Simpson: Nah, definitely dogs. They're the undisputed masters of manipulating humans in the animal kingdom.

Excepting other humans.


I guess I'm far too literal-minded. The whole time I simply assumed the giants were a normal God parable. I was rather nonplussed about the whole quote until I saw "A meditation on childhood" and then my head exploded. I don't even remember being a kid anymore.

CronoDAS: I saw it coming before I read the line that explicitly mentioned childhood.
Costanza: On the next page in the book, the author mentions, "I decided to go through with the fiction of the giants, although the reader will have seen by the third line where I was leading him." Personally, I didn't see it coming when I first read it. My first reaction was pretty much the same as Eneasz'.
false_vacuum: Me too.

This was wasted as a point about 'gods'. The commentary on human social instincts irrespective of belief in literal gods was far more insightful.

Ok, so it seems almost everyone got a different idea of who the giants and the men were. Children and adults, pets and humans, humans and gods, governments and populations (in both directions!), humans and computers...

My first impulse upon seeing this is that it must be a very general phenomenon, one that occurs across a great spectrum of situations, and that all these different situations are isomorphic to one another. The next is that we should come up with a generalized theory of the concept, and maybe coin a word to access the concept more quickly.

SRStarin: I didn't know where it was going at all until I hit the words "instead they became natural orators." It was at that point that I thought of my 17-month-old daughter. Thank you for a very timely message.
gwern: Hah; I read through that entire thing expecting the punchline to be that the giants were computers.
Costanza: Maybe one day they will be. Or we will be, or they'll make paperclips of us all.
AngryParsley: For most of the time I spent reading this quote, I thought the men were celebrities or demagogues and the giants were the populace.
simplyeric: I thought it was a Marxist parable, or something of the sort... an allegorical critique of capitalism, surplus value, the elite exploiting the masses. I must be in a bad mood because of the Cathie Black situation in NYC... where the "giants" are the democratic masses, who protested against the natural orators of our government... Last night was a "change of humor that would come over the giants"... a "brusque refusal"... but in the end the middle/lower classes "seemed nevertheless to be charged with nourishing them and housing them and transporting them, and who eventually carried out their duties, provided they were prayed to" (the "praying" being only the making of promises, "I stand for the middle class", "we'll create jobs for you", "think of the children!!11!!1!"). The masses do, at times, crush the endeavors of the orators (more than one reference to Egypt was made last night)... but for the most part the giant masses do what they are told, as long as they hear the right things, and have a cookie or a coo tossed to them now and then. I freely admit taking too much liberty with all of that... but it really is what I was thinking about as I read it.

I will not procrastinate regarding any ritual granting immortality.

--Evil Overlord List #230

Day ends, market closes up or down, reporter looks for good or bad news respectively, and writes that the market was up on news of Intel's earnings, or down on fears of instability in the Middle East. Suppose we could somehow feed these reporters false information about market closes, but give them all the other news intact. Does anyone believe they would notice the anomaly, and not simply write that stocks were up (or down) on whatever good (or bad) news there was that day? That they would say, hey, wait a minute, how can stocks be up with all this unrest in the Middle East?

--Paul Graham

simplyeric: An interesting concept... but I wonder. I bet at least some people would actually notice that. They'd see unrest in the Middle East and say "hmm... oil prices didn't change the way I expected them to" or something. Sometimes you see things like "[such-and-such] index rises in spite of [bad news]". I think Graham's inference has merit: these people don't really know what's happening... but I think some people at least would notice the anomaly.
benelliott: Well now I want to test this. Do we have anyone here who thinks they know a thing or two about the stock market? If so, would they be amenable to an experiment? I'm thinking that they would agree not to look at any stock price information for a day (viewing all the other news they want). At the end of the day they are presented with some possible sets of market closes, all but one of which are fake, and we see if they can reliably find the right one.
Gurkenglas: Finding the most probable market outcome given a few possibilities and a day's news is easier than noticing by yourself that the news and the market don't fit.
ig0r: I will participate if you'd like to try; there are some problems with the experiment, though.
benelliott: I'm still interested, what changes would you suggest?
ig0r: Sorry for the slow reply, want to do this over email? im gbasin at gmail
benelliott: I'm benelliott3 at gmail. To be honest I'm not very familiar with the stock market, so if you could suggest a procedure for the experiment, including such things as where to get the information, that would be appreciated. Care to precommit to a discussion post about the experiment regardless of the result?
private_messaging: Well, the time Steve Ballmer announced he was to quit Microsoft, Microsoft's stock jumped quite a bit, clearly because Ballmer quit, even though one could perhaps explain either a rise or a fall with Ballmer quitting. The expected square of the change was big from Ballmer quitting, that's for sure. Same goes for any dramatic news, such as the recent gas attack in Syria. And yes, over time one could tell that something is up if the stock market graph is uneventful while there's dramatic news. Bottom line is, a causal link can exist and be inferred even when there is no correlation.

In the past, also, war was one of the main instruments by which human societies were kept in touch with physical reality.

(...)

In philosophy, or religion, or ethics, or politics, two and two might make five, but when one was designing a gun or an aeroplane they had to make four.

-- George Orwell, 1984

"Great is Bankruptcy: the great bottomless gulf into which all Falsehoods, public and private, do sink, disappearing; whither, from the first origin of them, they were all doomed. For Nature is true and not a lie. No lie you can speak or act but it will come, after longer or shorter circulation, like a Bill drawn on Nature's Reality, and be presented there for payment, - with the answer, No effects.

Pity only that it often had so long a circulation: that the original forger were so seldom he who bore the final smart of it! Lies, and the burden of evil they bring, are passed on; shifted from back to back, and from rank to rank; and so land ultimately on the dumb lowest rank, who with spade and mattock, with sore heart and empty wallet, daily come in contact with reality, and can pass the cheat no further.
[...]
But with a Fortunatus' Purse in his pocket, through what length of time might not almost any Falsehood last! Your Society, your Household, practical or spiritual Arrangement, is untrue, unjust, offensive to the eye of God and man. Nevertheless its hearth is warm, its larder well replenished: the innumerable Swiss of Heaven, with a kind of Natural loyalty, gather round it;

[...]
NMJablonski: There is no greater joy than riding the words of Thomas Carlyle. He may not always be correct (although his point above is a blow of hard-hitting truth as great as any ever written) but his phrasing, his metaphors, his analogies are all magnificent.
SilasBarta: But ... but ... what about bankruptcies induced by a liquidity crunch -- the kind the political elite's propagandists have been telling me entitle a "too big to fail" company to receive perpetual government assistance? In those cases, bankruptcy wouldn't suck up falsehoods, would it?
gwern: No. But I think you* are guilty of affirming the consequent [http://en.wikipedia.org/wiki/Affirming_the_consequent]. If something is false, then it will end in bankruptcy - but that does not logically imply that everything ending in bankruptcy was false. So something true could still end in bankruptcy (for whatever reason, like a liquidity crunch).
* Or Carlyle, I suppose, but given the choice between accusing a famous thinker of an elementary fallacy and a quick off-the-cuff Internet comment, I'd rather accuse the latter.
BillyOblivion: If your business is structured such that a liquidity crunch will drive you bankrupt, then some restructuring might be in order. Now, one can (and I certainly would be willing to) make the argument that it is almost impossible for a small to medium sized business to structure itself such that it can survive a liquidity crunch exacerbated by an inept political class and a rapacious bureaucracy (or a rapacious political class and an inept bureaucracy, whatever). In that case it's best to take your ball and go home, leaving the enlightened revolutionaries with the society they voted for.
SilasBarta: Eh, that's what I thought too, but the very idea looks to be beyond the pale. They tell me that we needed to make huge loans on terms no one else could get to prop up some large banks, and it's "only" to provide "liquidity". But my thought is: I find it extremely unsettling to be in an economy where such a huge fraction of it is based on business plans this brittle; and the sooner and more spectacularly they die off, the better the foundation future growth will be built on. But current mainstream thinking doesn't even allow such a thought. I also saw people present, as "evidence" of a liquidity crunch, the fact that overnight lending rates spiked from 4% to 6% annualized. Considering that these are the annualized rates for loans with a life of a few weeks at most, this is a trivial increase in borrowing costs. A business so fragile that it can't withstand paying a few extra pennies for ultra-cheap loans every once in a while... well, any economy dependent on such brittle business plans is living on borrowed time anyway.
RolfAndreassen: Well, I don't know. With the sort of gun you had before modern precision machining, 4.2 would be good enough, maybe 4.3 at a pinch.

"Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered.
We should forget about small efficiencies, say about 97% of the time; premature optimization is the root of all evil."

--Donald Knuth (see also Amdahl's law)

A premature really powerful Optimization Process is the root of all future evil.

CronoDAS: "The first rule of code optimization: Don't."
imaxwell: I never thought of this quote outside the context of programming before reading it here, but it does seem pretty generally applicable. The force behind premature optimization is the force that causes me to spend so much time comparison shopping that the time lost eventually outvalues the price difference, or to fail to give money to charity at all because there may be a better charity to give it to. (I've recently started donating the dollar to Vague Good Cause at stores and restaurants when asked, because it's all well and good to say "SIAI is better," but that defense only works if I then actually give the dollar to SIAI.)

The Company that needs a new machine tool is already paying for it.

-old Warner & Swasey ad

Just saw on reddit a perfect accidental metaphor: jakeredfield posted this in r/gaming:

For the people that have not played Portal yet, be warned, there may be spoilers up ahead for you.

So anyway, I am a huge fan of Portal, I love everything about the game. I bought it upon release and have played through it multiple times. My friends aren't as big of gamers as me so it took them some time to get their hands on Portal. My one friend didn't have a computer capable of running Portal so I let him play on mine.

I pulled up a chair beside him and eagerly watched him play the entire time. He loved the game. I expected him to. It's an awesome game. But here comes the WTF part... (SPOILERS AHEAD)

He got to the part at the last puzzle, right before GLaDOS tries to kill you in the fire. So then, my friend is like, "Oh, so it's one of those games where you die at the end. Haha, it was a good game." And then he immediately shuts it down. I just sat there. Shocked. In awe. I couldn't believe what I just saw. He turns to me and goes, "Good game, I'd play that again."

This is the part where I just hit him and yell, "IT WASN'T OVER YET!" He was so confused. He loaded it

[...]

KanadianLogik adds:

[...] Imagine if you really were Chell, and just accepted your fate....

Aryn: It's possible that if there were several copies of Chell, some of them did.
Normal_Anomaly: Unfortunately, I think I saw somebody else play that section correctly before I played it myself. Still, if I had died, I would've come back at the last time I saved. That would've clued me in that I was supposed to survive, and I probably would've figured it out in one or two more tries tops.
atucker: I am going to shamelessly and totally steal this example when talking to anyone about anti-deathism. Seriously, thank you so much.
Psy-Kosh: You're welcome. No need really to thank me. After all, I shamelessly stole it too. It was just too perfect. :)
imaxwell: I just had to comment on this, it's too perfect. Thanks.
Psy-Kosh: You're welcome. :)

Kräht der Hahn am Mist, ändert sich's Wetter oder es bleibt wie's ist.

-- Common German folk saying

Translates as "If the rooster crows on the manure pile, the weather will change or stay as it is." In other words, P(W|R) = P(W) when W is independent of R.

Another good one:

Ist's zu Sylvester hell und klar, ist am nächsten Tag Neujahr.

"If it's bright and clear on New Year's Eve, the next day will be New Year's."
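The independence claim in the first saying can be checked with a quick simulation. This is only a sketch: the 40% weather-change probability and 30% crow probability are arbitrary illustration values, not anything from the saying itself.

```python
import random

random.seed(0)
N = 100_000

# Draw weather-change W and rooster-crow R independently of each other.
samples = [(random.random() < 0.4, random.random() < 0.3) for _ in range(N)]

# Marginal frequency of a weather change...
p_w = sum(w for w, _ in samples) / N

# ...versus its frequency on just the days the rooster crowed.
crowed = [w for w, r in samples if r]
p_w_given_r = sum(crowed) / len(crowed)

# Up to sampling noise, P(W|R) matches P(W): the crow tells you nothing.
print(round(p_w, 3), round(p_w_given_r, 3))
```

The two printed frequencies agree to within sampling noise, which is exactly what P(W|R) = P(W) asserts.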

D_Alex: I'll chip in with this Russian saying: "It is better to be rich and healthy than to be poor and sick!"
Kutta: Woody Allen had a take on it too:
bbleeker: Als het regent in mei, is april al voorbij. ("If it rains in May, April is already past.")

Speed is not attained by hurrying; it is an unsought by-product of intelligent and continuous work.

-- Frederick Giesecke et al., Technical Drawing, 8th ed.

RichardKennaway: On similar lines, the ancient Latin saying festina lente ("make haste slowly") [http://en.wikipedia.org/wiki/Festina_lente].

"After solving a problem, humanity imagines that it finds in analogous solutions the key to all problems.
Every authentic solution brings in its wake a train of grotesque solutions."

--Nicolás Gómez Dávila, Escolios a un Texto Implícito: Selección, p. 430

roryokane: Can you give some examples of when that has happened? I'm having trouble thinking of any. The widespread use of computers seems to have been a great success, on the whole.
gwern: Who said it was about computers per se? I didn't. I was personally thinking more of electricity and radiation. Electric belts! Electroshock therapy! Electric toothbrushes! The mind as a power grid! Radium salt supplements! Radioactive watches! Well, actually, we still use tritium for that. (Or to take a more recent example, microfilm. "Let's put everything on microfilm and shred all the original newspapers and books! What could possibly go wrong?") We may be a little too close to the computer to see the silliest and most grotesque solutions it has provided, although The Daily WTF may be a good start.
roryokane: I know you didn't mention computers; it was just the first example that came to mind. It seemed like if the quote would apply to anything, it would apply to computers most of all, but it didn't. But good points about electricity and about computers being too recent.

You know in those stories where there's this immortal guy and they talk about how bored they are and how boring life is after 5000 years or whatever? I am going to call something.

I am going to call SHENANIGANS.

You know who writes those stories? MORTALS. Folks using some of their PRECIOUS, FINITE LIFE to write a made-up story in which an imaginary person keeps going on about how being immortal is actually sucky and how they're totes jealous that others get to die someday!

Ridiculous!

And kinda sad!

-- Today's Dinosaur Comic

Things are only impossible until they're not.

-- Jean-Luc Picard

Sometimes not even then.

sketerpot: Except when they really are.
Snowyowl: No, then too.
AstroCJ: Unless...?

As they say in Discworld, we are trying to unravel the Mighty Infinite using a language which was designed to tell one another where the fresh fruit was.

-- Terry Pratchett

TheOtherDave: "Language is a drum on which we beat out tunes for bears to dance to, when all the while we wish to move the stars to pity." -- Flaubert

Some in their discourse desire rather commendation of wit, in being able to hold all arguments, than of judgment, in discerning what is true; as if it were a praise to know what might be said, and not what should be thought.

-- Francis Bacon

Eliezer Yudkowsky: I shall have to quote this a good deal more when dealing with people who chide me for not mentioning all the possible objections that philosophers consider to still be in play.
David_Gerard: It doesn't help that undergraduate philosophy involves rather a lot of enumerating the history of philosophical arguments regardless of quality.
sark: Well, sexual selection chose wit as the target for our intelligence, not discernment of the truth of matters of Far concern. Anybody can figure out the truth of the Near; where is the impressiveness in that? Nobody can verify Far claims, so we don't know who should impress us.

The following reminded me of Arguments as Soldiers:

Statistics for the enemy. Anecdotes for the friend. -- Zach Weiner

I'm sorry to have not found his blog sooner.

Snowyowl: Weiner has a blog? My life is even more complete.

Apathy on the individual level translates into insanity at the mass level.

-- Douglas Hofstadter

TheOtherDave: Insanity will prevail when sane men do nothing? (Apologies to Edmund Burke.)
Kazuo_Thow: I think this adaptation is much more precise than the original.
Matt_Simpson: Not when apathy and insanity are correlated. See, e.g., The Myth of the Rational Voter [http://www.amazon.com/Myth-Rational-Voter-Democracies-Policies/dp/0691129428].

Give a man a fish and he'll eat for a day. Give a man a fishing rod and he'll sell it for a fish.

  • ???

Make a man a fire and he'll be warm for a day. Set a man on fire and he'll be warm for the rest of his life.

(Terry Pratchett, I think.)

I saw a creepy ad recruiting hospice volunteers on the street a few days ago. It said something along the lines of "They will be grateful to you for the rest of their lives." Like an inappropriate joke.

Kaj_Sotala: That's... disturbing, but also weirdly compelling.

I think it's more elegant to say it like this: "Light a man a fire and he'll be warm for a day. Light a man afire and he'll be warm for the rest of his life."

shokwave: In text, yes. I said it aloud a few times and I couldn't tell the two apart easily. Maybe "light a man A fire" / "light a man ON fire"?
Alicorn: I've successfully delivered "a fire"/"afire" aloud, but it's a little tricky to time right.
NihilCredo: A little gesturing will likely help a lot.
Kaj_Sotala: I find my formulation slightly quicker to parse, but otherwise you're right.
Sniffnoy: From Jingo, IIRC. Also I think the second line began "But set fire to him..."
Snowyowl: IIRC, he uses this joke several times.
Sniffnoy: Ah, never mind then.
Kyre: Give a man a fish, feed him for a day. Teach a man to fish, feed him for around 15 years until his major fishery collapses into unprofitability.
MartinB: That looks like a description of one problem with aid to developing countries.
RichardKennaway: Unknown source. But I'm not sure what any of the variants of this have to do with rationality.
TheOtherDave: Well, a lot of instrumental-rationality posts around here are basically about the benefits of devoting effort in the short term to developing techniques for making a class of task easier, rather than devoting effort in the short term to implementing an instance of that class with more difficulty. Also, efficient charity is a recurring theme. Whether either of those things has much to do with rationality is a broader question, but they certainly seem relevant to the quote.

"I submit that claims about God are of this latter sort. There’s simply no reason to take them more seriously than one does claims about witches or ghosts. The idea that one needs powerful philosophical theories to settle such issues I like to call the “philosophy fallacy.”

We will see that people are particularly prey to it in religious discussions, both theist and atheist alike; indeed, atheists often get trapped into doing far more, far riskier philosophy than they need."

--Georges Rey, "Meta-atheism: Religious Avowal as Self-deception" (2009)

(First version seen on http://www.strangedoctrines.com/2008/09/risky-philosophy.html but quote from an expanded paper.)

It's true that the question of God's existence is epistemologically fairly trivial and doesn't require its own category of justifications, and it's also true that even many atheists don't seem to notice this. But even with that in mind, it almost never actually helps in convincing people to become atheists (most theists won't respond to a crash course in Bayesian epistemology and algorithmic information theory, but they sometimes respond to careful refutation of the real reasons they believe in God), which is probably why this point is often forgotten by people who spend a lot of time arguing for atheism.

ChristianKl: Choosing good priors isn't epistemologically trivial. Using the majority opinion of the human race as a prior is a general strategy that you can defend rationally.
komponisto: Use it as a prior all you want; but then you have to update on the (rest of the) evidence.
Will_Newsome: It's really epistemologically difficult to find out what people mean by God in the first place; how then can it be epistemologically trivial to judge the merits of such a hypothesis?

Difficult to pin down within a range of trivial-to-judge positions.

false_vacuum: With, possibly, vanishingly rare exceptions.
DSimon: If a given hypothesis is incoherent even to its strongest proponents, then it's not very meritorious. It's in "not even wrong" territory.
Will_Newsome: I strongly suspect that there is a lot of coherence among many different spiritualists' and theologians' conceptions of God, and I strongly suspect that most atheists have no idea what kind of God the more enlightened spiritualists are talking about, and are instead constructing a straw God made up of secondhand, half-remembered Bible passages. In general I think LW is embarrassingly bad at steel-manning.
ChristianKl: Coherence isn't a necessary factor for a good theory. In artificial intelligence it's sometimes preferable to allow incoherence in order to have higher robustness.
NihilCredo: Could you expand?
false_vacuum: It's Georges Rey. I know because I sat through an entire class that he taught once. I think I also read his book, Contemporary Philosophy of Mind: A Contentiously Classical Approach, during that time, but I don't recall learning anything from it. Can someone who has actually read the paper (I don't feel like it) tell me whether it has the same upshot as the earlier version I seem to remember, viz. that people only pretend to believe in God? (It's possible I've got this mixed up with something else.)
gwern: Thanks. It does; as I said, it's an expanded paper.
Will_Newsome: I... what... is this some kind of atheistic affective death spiral? How could this possibly be construed as a reasonable analogy, even rhetorically? And with such a smug tone? Why are we tolerating blatantly misleading dark arts that appeal to the inductive biases of our epistemological reference class?

What is unreasonable about the analogy? All three are claims about apparently unfalsifiable super-natural entities with no normal epistemological support, and many arguments for God would seem to work as well for other such entities. (As Anselm's contemporary pointed out, his ontological argument served as well to prove the existence of perfect demons or islands or fairies.)

If you disagree, a read of the paper might be in order so you don't have to resort to accusations of the Dark Arts.

Will_Newsome: Bayesians don't care about unfalsifiability; 'supernatural' can only be constructed relative to a limited ontology (things that aren't made up of subatomic particles are supernatural, say; variants on an algorithmic ontology have room for something like a God) and is thus a dangerous and slippery word; and the hypothesis that there exists something important called a God has ridiculous amounts of epistemological support, even if there is lots of evidence against such a hypothesis as well.
benelliott: Bayesians care a lot about unfalsifiability: a theory can only gain probability mass by assigning low probabilities to some outcomes (if you don't believe me, go read Eliezer's technical explanation of technical explanation). "Anything not made from subatomic particles" is a poor definition of the supernatural, since it leaves us irrationally prejudiced against the idea that subatomic particles could be made out of something else, which is a perfectly reasonable hypothesis (currently one with no evidence for it, but we still shouldn't be prejudiced against it). Try "ontologically fundamental mental states" for a better definition of supernatural, and a much better one since there is very good reason to assign a low prior to such claims (there is a huge number of imaginable ontologically fundamental things which are simpler than mental states, so by Occam's razor and the principle of limited probability mass any hypothesis that claims they exist gets a very low prior). Hypothesis: most if not all of this epistemological support of which you speak is bad philosophy, possibly based on the mind projection fallacy, which could just as easily have been constructed to defend witches or ghosts if someone had had enough reason to do so.
Oscar_Cunningham: To be more precise (and more correct) we should say that it can gain probability mass, but only when more precise hypotheses are falsified. If I think a coin is either fair or biased toward heads, and then it comes up tails three times, it's probably fair.
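Oscar_Cunningham's coin example can be worked through numerically. A minimal sketch, assuming the biased coin lands tails 30% of the time (the comment gives no figure, so that number is made up for illustration):

```python
# Two hypotheses with equal priors: the coin is fair, or biased toward heads.
p_tails = {"fair": 0.5, "biased": 0.3}  # the 0.3 bias is an assumed value
posterior = {"fair": 0.5, "biased": 0.5}

# Observe three tails in a row, applying Bayes' rule after each flip.
for _ in range(3):
    unnorm = {h: posterior[h] * p_tails[h] for h in posterior}
    total = sum(unnorm.values())
    posterior = {h: p / total for h, p in unnorm.items()}

print(posterior)  # "fair" ends up near 0.82
```

The fair-coin hypothesis assigned the higher probability to the tails that actually occurred, so the more falsifiable "biased" hypothesis bled probability mass to it, which is the mechanism being described: the less precise hypothesis gains only because the more precise one was (partially) falsified.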
Will_Newsome: Nitpicking; I meant that falsifiability-in-practice, as most people use the word, is not a necessary precondition for determining which hypotheses to pay attention to. Apparently unfalsifiable hypotheses (which are nonetheless probably actually falsifiable with enough computing power), like the existence of a creator God, are thus fair game for Bayesians, and pointing out their apparent unfalsifiability isn't scoring a point for the atheists. Right, but you have to use a poor ontology in order to get a concept that even looks like "supernatural" in the first place... this is an argument against using the word supernatural at all. God is just not supernatural if you are using the right ontology. I don't know what an ontologically fundamental mental state would look like (when I think of people who believe in the supernatural, that does not seem to describe their beliefs at all), and I don't see how that conception is at all relevant to gods, witches, or ghosts. We can follow that digression, as I'm really curious as to what people are trying to explain when they talk about supernaturalism as belief in ontologically fundamental mental states, but it doesn't seem relevant to the OP. Most? Yes, of course yes. To a first approximation, everyone everywhere has always been wrong about everything, including all of atheism and science. But all? Not even close. Here's a basic argument for a somewhat vague Creator God: the universe exists. Things that exist tend to have causes. Powerful things like superintelligences or transcendent uploads are good at causing things. This universe might have been caused by one of those really powerful things. That we feel better when we call those powerful things 'superintelligences' instead of 'gods' just says something about our choice of ontology, not about the righteousness of our epistemology.
8shokwave10yHere's a basic knockdown: "the universe" is not a thing in the way that requires a cause. It's a category of things, so if you must assign it a cause, it is caused by the existence of things (and maybe a desire to refer to everything). By way of demonstration, if you listed every physical thing that makes up the universe, and I found some physical things that existed but were not on your list, would you say "there are things outside the universe" or would you add those things to the list? (That is, your argument needs to point to things that are likely to be caused by superintelligences / transcendences. I would point to all the things we know of so far as being very unlikely to have been caused by superintelligences / transcendences, and claim that the rest of the universe probably shares that same property.)
4benelliott10yWrong. A consequence of Bayes' Theorem is that if two theories A and B fit the data equally well, but A fits hypothetical alternative data better than B does (in other words, B is more falsifiable) then A must assign a lower conditional probability to the actual data than B, by conservation of probability. This means that regardless of where the priors start out, if we keep accumulating evidence without falsifying either, the probability of A must eventually become vanishingly small, too small for any reasonable person to even spare the time to consider the hypothesis. Believing in ontologically fundamental mental states means that you believe that the actual territory, as opposed to a map, contains minds. This can seem reasonable, but the reasonableness is an illusion caused by the fact that our monkey brains are pretty good at thinking about other monkey brains and pretty bad at thinking about much simpler things, such as maths. God falls into this category as normally postulated, since he is usually assumed to be fundamental and is usually assigned mental states as well as exhibiting the complex behaviour typical of minds. Ghosts fall into it since for a person's mind to survive the destruction of the physical entity it was contained in/supervenient upon it must have its own ontologically fundamental properties. Here's the corresponding argument for ghosts: death is an event. People's consciousness tends to continue existing through most events, so it probably continues existing through death even though it has never been observed to do so (the same way no universe has been observed to have a cause). Therefore minds must continue existing after death, and we might as well call them ghosts. Motivated cognition at its worst. No, it shows that we are cautious that the connotations of our statements don't say anything that we don't mean.
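The conservation-of-probability point above can be checked numerically. Below, B is the sharp (more falsifiable) hypothesis, a coin that is exactly fair, while A is a vague one that spreads its belief over every possible bias; the uniform prior over bias and the perfectly balanced data are my own illustrative choices:

```python
from math import comb

def posterior_vague(n, heads, prior_vague=0.5):
    """Posterior of the vague model A (uniform over the coin's bias) vs the
    sharp model B (coin exactly fair), after `heads` heads in `n` flips."""
    # A's marginal likelihood: integral of p^h * (1-p)^(n-h) dp = 1 / ((n+1) * C(n, h))
    p_data_a = 1.0 / ((n + 1) * comb(n, heads))
    p_data_b = 0.5 ** n  # B stakes all its probability on each sequence sharply
    evidence = prior_vague * p_data_a + (1 - prior_vague) * p_data_b
    return prior_vague * p_data_a / evidence

# With balanced data (which B "predicted" more strongly than A did),
# A's posterior drifts steadily downward as evidence accumulates:
for n in (0, 10, 20, 40):
    print(n, round(posterior_vague(n, n // 2), 3))
```

Neither model is ever falsified outright, yet the less falsifiable one bleeds probability on every batch of data the sharper model predicted more tightly.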
-1Will_Newsome10yAtheism is also unfalsifiable in practice, though, so I don't see the relevance. And I think the positive evidence points somewhat towards theism, not atheism. Thus I find theism more likely. Thanks for the explanation. God is normally considered to be outside the universe; why do you say he is usually assumed to be fundamental? Also note that there are many conceptions of God, some of which actually are something like fundamental even if isomorphic to a more detailed description (with less errant connotations) of the structure of the ensemble universe. And this is where I start thinking you're crazy, for thinking this is even close to a corresponding argument. What similarities do you see? Every single thing we have ever seen has a cause. We have seen the universe. We postulate a cause, by simple induction. We have seen people's consciousness fade in and out as they go to sleep or fall into comas. We postulate that it is thus probable that death is like an endless coma. It is hard for me to fathom how you could possibly have seen your argument for ghosts as being at all in the same reference class, except that 'ghosts' and 'gods' are both contemptible hypotheses 'round these parts. Like, seriously, your analogy on the meta level seems to me like motivated cognition at its worst. This is only because we already have a word for 'superintelligence'. Most people don't. My point was that we shouldn't be automatically contemptuous of concepts that are really damn similar to the ones we're already postulating just because they're labeled in the language of the enemy.
4benelliott10yIf God rides down from heaven hurling lightning bolts in all directions and wantonly altering the very nature of reality, I will consider atheism to be falsified. This is an extreme case, but there are many observations that could falsify, or at least provide very strong evidence against, atheism. Don't confuse unfalsified with unfalsifiable. Either God can be reduced to something else or he is fundamental. No conception of God that I have ever heard of can be reduced (I'm not sure how he could create the universe if he was reducible) so it seems likely he is usually assumed fundamental. I'm afraid I can't understand what you mean here. The universe itself has no observed cause, so this statement is false. It seems likely that there is at least one uncaused thing, since otherwise you have an infinite regress, and the universe seems like as good a bet as any for what that thing is, since it has no observed cause and it belongs to a very different reference class to everything else. Ah, but we have never observed consciousness to be ended permanently except by death. You may challenge that this is not evidence since it is true by definition, but if you think about it the fact that the universe has no observed cause is also true by definition, since if it did have an observed cause we would just have included that thing in 'the universe' and then asked what caused it. It's not about 'God' being the language of the enemy, it's about the fact that it has been used by too many people to mean too many things and it has reached the point where even to use it is to imply many of those things. If someone wants to talk about the cause of the universe they should call it 'Flumsy', since that way nobody gets confused. Think about it this way. If you are working on an algebra problem and you have some complicated term in your equation that you want to define so you don't have to write out the whole thing every line, you might decide to call it 'x'. 
This is a perfectly legi
2NihilCredo10yI would be curious to know if you are putting this forward as an hypothetical argument for the sake of the discussion, or as an actual summarised argument that you really do find at least somewhat persuasive.
0Will_Newsome10yI do find something quite like it somewhat persuasive.

If people can't think clearly about anything that has become part of their identity, then all other things being equal, the best plan is to let as few things into your identity as possible.

Paul Graham

Statistics is applied philosophy of science.

A. P. Dawid

People who have been living with serious problems for a long time find it hard to imagine that there's been a solution within their reach all along. For the short term, it's easier to go on putting up with the problem than it is to change one's expectations.

paulwl (quoted here)

ETA: I thought this had the smell of Usenet about it, and on Google Groups I found the original, written by one Alex Clark here. paulwl is actually the person he was replying to.

BTW, there's quite a bit of rationality (and irrationality) on that newsgroup on the subject of people looking for relationships (mostly men looking for women), from way back when. I don't know if 1996 predates the sort of PUA that has been talked about on LW.

"But can people in desperate poverty be considered to be making free choices? Many say no. So, is the choice between starving and selling one’s kidney really a choice? Yes; an easy one. One of the options is awful. To forbid organ selling is to take away the better choice. If we choose to provide an even better option to the person that would be great – but it is no solution to the problem of poverty to take away what choices the poor do have absent outside help."

Katja Grace, on Meteuphoric, Dying for a Donation

Life is tough, but it's tougher if you're stupid.

-John Wayne, Sands of Iwo Jima (1949)

History doesn't repeat itself, but it does rhyme.

-Mark Twain

[-][anonymous]10y 14

On two occasions I have been asked, – "Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?" In one case a member of the Upper, and in the other a member of the Lower House put this question. I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

-Charles Babbage

5gwern10yDupe [http://lesswrong.com/lw/bs/rationality_quotes_april_2009/8ch]
0[anonymous]10yUpvoted. I didn't know it was already posted, I've read quite a few of these quote threads but never commented before or noticed that one. What's the protocol for this? Should I delete the post?
2gwern10yI think you just accept quietly your downvotes or lack of upvotes, and remember to search next time. (Also, Clippy - nice try.)
0[anonymous]10yI did search but rather lazily (just entered the text and logically nothing came up). But that was very sloppy, I should have searched for "Charles Babbage" and skimmed the quotes that came up.
-3Clippy10yThe procedure is to delete the post and then send User:Kevin 10,000 USD, asking that it be credited toward User:Kevin's contract with User:Clippy. Edit: The procedure is NOT to delete the post and then send User:Kevin 10,000 USD, asking that it be credited toward User:Kevin's contract with User:Clippy.
4cata10yFunnier the first time. [http://lesswrong.com/lw/2nz/less_wrong_open_thread_september_2010/2jiy]
1bcoburn10yYou should try asking people to send smaller amounts of money at once, it's slightly more likely to work.

Too broad a viewpoint, too philosophical an outlook paralyzes the will.

-- Robert A Heinlein, Lost Legacy

Go not to the elves for counsel, for they will say both no and yes.

-- Frodo Baggins, conveying one of the many wise sayings that Hobbits chuck around daily. The elf he was talking with thought it was hilarious, but refused to simply agree or disagree with it.

4mwengler10yI prefer its negation: "Go to the elves for counsel, for they will say both no and yes."

After finishing dinner, Sidney Morgenbesser decides to order dessert. The waitress tells him he has two choices: apple pie and blueberry pie. Sidney orders the apple pie. After a few minutes the waitress returns and says that they also have cherry pie at which point Morgenbesser says "In that case I'll have the blueberry pie."

http://en.wikipedia.org/wiki/Independence_of_irrelevant_alternatives

http://en.wikipedia.org/wiki/Sydney_Morgenbesser

1mcandre10ySidney chooses pies on the basis of popularity. Apple pie is more popular than blueberry pie. Apple pie is so popular that pie eaters have grown sick of it. They quickly gorge on the new cherry pie. When the fad dies down, they are still sick of apple pie and begin a blueberry revival. Sidney correctly predicts that blueberry will be more popular.
0endoself10yHis preferences in that scenario do not violate independence of irrelevant alternatives (that might be your point; I'm not sure). This is meant as an intuition pump to show the absurdity of violating IIA, not a watertight argument that the observed behaviour does in fact violate it.
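For anyone who wants the axiom spelled out: a choice rule violates independence of irrelevant alternatives when adding an option flips the choice between options that were already on the menu. A minimal checker, where the menus and the `morgenbesser` rule are made up to caricature the anecdote:

```python
def violates_iia(choose, small_menu, big_menu):
    """True if the choice from the larger menu was also available in the
    smaller menu, yet differs from the choice made there (an IIA violation)."""
    assert set(small_menu) <= set(big_menu)
    small_pick = choose(small_menu)
    big_pick = choose(big_menu)
    return big_pick in small_menu and big_pick != small_pick

def morgenbesser(menu):
    # Caricature of the anecdote: apple from {apple, blueberry},
    # but blueberry once cherry appears on the menu.
    if "cherry" in menu:
        return "blueberry"
    return "apple"

print(violates_iia(morgenbesser,
                   ["apple", "blueberry"],
                   ["apple", "blueberry", "cherry"]))  # True
```

A rule that simply maximizes a fixed preference ordering never triggers the checker, which is the intuition the anecdote pumps.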

Some pirates achieved immortality by great deeds of cruelty or derring-do. Some achieved immortality by amassing great wealth. But the captain had long ago decided that he would, on the whole, prefer to achieve immortality by not dying.

-- The Colour of Magic, Terry Pratchett

5NihilCredo10ySo that's where Woody Allen got it from.
6ata10yI haven't been able to find the original source of the Woody Allen quote, but it seems "The Colour of Magic" was published in 1983, and Google Books finds some copies of the Woody Allen quote predating that.
1NihilCredo10yAhh, nevermind then. (I only looked it up on Wikiquote, which referenced a bio-photo-book from 1993).
2RobinZ10y...I thought you were being ironic. o_o
[-][anonymous]10y 13

To study and not think is a waste. To think and not study is dangerous.

-Confucius

Teachers open the door. You enter by yourself.

-Chinese proverb

Sometimes they only unlock the deadbolt, and you need a friend to help push open the door. Sometimes the door is on the top of a cliff, and you need to climb up the rope of Wikipedia to get there. And so on. A lot of people who are having trouble learning something are having trouble realizing what resources they have available.

8FiftyTwo9yIt's a bizarre feature of university life that it is very difficult to get students to take opportunities for help, even when they are obviously and explicitly provided.

And the reasons those students don't take opportunities for help tend to be embarrassingly pathetic. Like, so embarrassing that they avoid even thinking about it, because if they made their real reason explicit, they would be pained at how dumb it is. (I've done this sort of thing myself, more times than I'm comfortable with.)

For example, I discovered that a significant fraction of the students in a certain class were afraid to ask questions of the professor because they found him scary. Now, I know the professor in question, and he's a friendly person who wishes that his students would talk to him more -- but he has an abrupt, somewhat awkward way of speaking, and an eastern European accent. Such superficial details are apparently what leaves the biggest impression on most people.

Or there are the guys who get depressed and stop coming to class for a week or two, and then keep on not coming to class because they haven't been to class for a while, and it would be hard trying to get back up to speed. I really sympathize with these guys, but that doesn't make their reasoning any saner. (A fair number of them come in at the end of a semester to flunk their final exams. Damn it all, this...

5FiftyTwo9yMy experience of students here at [prominent UK university] is that they are very unwilling to ask for help because they have never needed to do so before, and so consider asking for help as a sign of weakness/low intelligence/low status. This makes a certain amount of sense: the people who have been able to meet entry requirements are likely in the top percentile of their subject and have been the best, or nearly so, at their school. Generally this has been the result of either natural ability or brute-force work (memorising equations and examples etc.) rather than acting strategically and gaining study skills such as the ability to find new sources of information or ask for help. So they either despair at the seeming impossibility of their tasks, or spend increasingly large amounts of time brute-forcing the work and burn out. It takes a lot for people to understand that needing help doesn't mean you are stupid, but that the work is hard and it's supposed to be hard.
3Swimmer9639yI've often found that this is so. I do try to read my textbooks, at least the assigned readings, because...well, because you're supposed to, I guess. But for most of my first year classes (three anatomy courses, psych 101, microbiology) just going to class was enough. (I did of course take detailed notes, with colourful diagrams, and then study from my notes afterwards. I have now bequeathed my anatomy notes to a friend a couple of grades younger.) One possible reason why this is true for me is that I like biology-related subjects, and I've always read anything I could get my hands on, and so I arrived in university to find that I already knew at least 50% of the material. Areas where this isn't true: English classes, history classes, etc, where there are a lot of required readings that cover material not covered in class, and where there are essays or papers to be written on material that isn't covered in class. And of course there's no rule that you can get good grades without reading textbooks. It just happens to be true sometimes, for some people.
3Eliezer Yudkowsky9yThat's not "adversity", that's "solvable problems requiring initiative".

Admitting error clears the score and proves you wiser than before.

--Arthur Guiterman

Will_Newsome pointed out the caveat that it's only good to admit errors when actually in error. I'd add a second caveat, which is that most of the benefit from admitting an error is in the lessons learnt by retracing steps and finding where they went wrong. Each error has a specific cause - a doubt not investigated, a piece of evidence given too much or too little weight, or a bias triggered. I try to make myself stronger by identifying those causes, concretely envisioning what I should have done differently, and thinking of the reference classes where the same mistake might happen in the future.

5Pavitra10yThe wording actually given in this quote avoids the problems discussed by Will_Newsome and jimrandomh: admitting error clears the score, resets it to zero. If you were wrong, this wipes out your negative score, for a net win; if you were right, it wipes out your positive score, setting you back.
2benelliott10yI think you meant to say right instead of wrong in this bit.
0Pavitra10yFixed.
1false_vacuum10yTrue, but clearly unintentional.
3Will_Newsome10y(Unless you weren't in error. Once you start awarding yourself internal karma for admitting that you were wrong, it becomes much easier to do so even when you weren't actually wrong. Of course, this is sidestepped with empiricism.)

At my mother's knee I learned to view religious worship as a practice which lures people away from their duties and pleasures on earth, and breeds in them a thirst for impossible things, the chasing of which can bring no honour or delight but only bewilderment, disappointment, and insanity.

  • K. J. Bishop, "The Etched City"

(a sentiment I think applies to all super-stimuli)

How emotionally entangled are you with your point of view? Test yourself - defend an opposing view, believing your life depends upon it.

-- Marc Stiegler, David's Sling

5Desrtopa10yThere seem to be separate failure conditions here though. You could fail because you're too emotionally invested in your view, or you could fail because you can spot the flaws in all the arguments for the opposing view. If your original view was actually right, then you're not at fault. Since this can be hard to distinguish from motivated cognition, I think the exercise is questionably useful.
4Nornagest10yI don't think the point of the exercise is to successfully defend the opposing point of view but to make a good-faith attempt to come up with an argument for it without getting your original emotions involved. If you can conjure up a coherent argument for the opposing side (allowing for a slightly different set of priors), that's some evidence that you're looking at consequences rather than being strung along by motivated cognition. If you can't -- and this is pretty common -- that's good evidence that the opposing view has been reduced to a caricature in your mind. It's a litmus test for color politics [http://lesswrong.com/lw/gt/a_fable_of_science_and_politics/], in other words. Not a perfect one, but it doesn't have to be.
4Kaj_Sotala10yI keep seeing insightful bits from this book [http://www.amazon.com/Davids-Sling-Marc-Stiegler/dp/0671653695] (for instance, here [http://lesswrong.com/lw/mm/the_fallacy_of_gray/] and somewhere else that I forget). Am I correct when I say it seems worth reading as rationalist fiction?
1Eliezer Yudkowsky10yIt's very nearly one of the only pieces of rationalist fiction out there.
0XiXiDu10yen.wikipedia.org/wiki/Marc_Stiegler [http://en.wikipedia.org/wiki/Marc_Stiegler]
0NancyLebovitz10yI'm quite fond of "The Gentle Seduction", but I eventually noticed how he simplified his problem-- he wrote about a relatively isolated person.
0kpreid10yI haven't read David's Sling; but I read his Earthweb and found it to be not-particularly-deep futurism (primarily presenting the idea of prediction markets).

"Nor let him [the ruler] ever believe that a state can always make safe choices; on the contrary, let him think that he must make only doubtful ones; because this is in the order of things, that one never tries to avoid one inconvenience without incurring another; but prudence consists of knowing how to recognize the kinds of inconveniences, and to take the least sad for good."

--Niccolò Machiavelli, The Prince

It has never mattered to me that thirty million people might think I'm wrong. The number of people who thought Hitler was right did not make him right... Why do you necessarily have to be wrong just because a few million people think you are?

-- Frank Zappa, quoted from The Real Frank Zappa Book

0David_Gerard10yZappa was a fantastic example of someone who kept their head firmly screwed on while simultaneously exercising his inner rampaging weirdness. Everyone should read the book.

We've all bought and enjoyed books called 'Optical Illusions'. We all love optical illusions. But that's not what they should call the book. They should call them 'Brain Failures'. Because that's what it is: a complete failure of human perception. All it takes is a few clever sketches and our brains can't figure it out.

  • Neil deGrasse Tyson

Transcribed from http://www.youtube.com/watch?v=CAD25s53wmE

8Dr_Manhattan10yDisagree, at least in some instances. Many of these are just results of optimizing for normal environment. There is a theorem in machine learning (blanking on the name) that says any "learner" will have to be biased in some sense.
7fiddlemath10yThe No Free Lunch Theorem [http://en.wikipedia.org/wiki/No_free_lunch_theorem]. Also, just because we can't expect to be free of bias doesn't mean that the bias is "proper functioning" of the hardware. An expected failure, perhaps, but still a failure.
3Dr_Manhattan10yI make a finer distinction of "failure" as something that's inefficient for its clear purpose. E.g. the laryngeal nerve of the giraffe. Evolution will do that on occasion. The sensory interpretations that optical illusions are based on are often optimal for the environment, and are a compliment to the power of evolution if anything. Viewing something that is optimal as a failure seems like wishful thinking (though I suspect this is more of a misunderstanding of neurobiology).
2fiddlemath10yActually, that seems kind of fair. Something is a "failure to X" if it doesn't achieve X; something is a "failure" if it doesn't achieve some implicit goal. You can rhetorically relabel something a "failure" by changing the context. Vision works well in our usual habitat, so we should expect it to break down in some corner cases that we can construct: agreed. For me to argue further would be to argue the meaning of "failure" in this context, when I'm pretty sure I actually agree with you on all of the substance of our posts.
2Dr_Manhattan10yI really do not want to argue about semantics either, but our agreed interpretation makes Neil's statement equivalent to "our visual system is not optimal for non-ancestral environments", which is highly uninteresting. I think Dawkins's laryngeal nerve example is much more interesting in this sense, since it points out that body designs do not come from a sane Creator, at least in some instances (which is enough for his point).
3AstroCJ10ySince we do not live in the ancestral environment now, I think the quotation could be just underlining how we should viscerally know our brain is going to output sub-optimal crud given certain inputs. Upvoted original.
0DanielLC10yI don't understand. Does that mean they have priors?
0Dr_Manhattan10yI think it's another way of putting it, though IIRC the biases are not always explicitly prior probabilities, they could just be a way the algorithm is constructed. Choosing the specific construct is acting on a prior.
2Timwi10yHow do you define “illusion”? I think an illusion is a type of brain failure. An optical illusion is even more specific. Therefore, I think the term is wholly appropriate — and “brain failure”, while not at all inappropriate, is just unnecessarily vague.

The world around us redounds with opportunities, explodes with opportunities, which nearly all folk ignore because it would require them to violate a habit of thought ... I cannot quite comprehend what goes through people's minds when they repeat the same failed strategy over and over, but apparently it is an astonishingly rare realization that you can try something else.

-- Eliezer Yudkowsky, putting words in my other copy's mouth

8gwern10yMeta-comment: I think MoR quotes are legitimate for rationality quote pages, since IIRC we previously established that Eliezer quotes from Hacker News were kosher. And if random Eliezer comments not on OB/LW are kosher, then surely quotes from his fiction are kosher.

I disagree. MoR fits the same criteria ("shooting fish in a barrel") as OB/LW.

surely quotes from his fiction are kosher.

I'm happy to see gems from HPMOR done up in needlepoint and hung on the metaphorical wall of the parlor. But it still smells like trayf! Consider:

Quirrell avoids the ban on quoting himself by attributing the quotation to Eliezer. And he then avoids the ban on quoting Eliezer by pointing out that Eliezer was quoting Quirrell. This is clever and slippery and rabbinical and all that, but it jumps the shark when you realize that Quirrell is not just Eliezer's HPMOR character, he is also probably his LW sock-puppet!

Quirrell is not just Eliezer's HPMOR character, he is also probably his LW sock-puppet!

Oh, come on. It's obviously been the other way around all along.

2Normal_Anomaly10yYou simultaneously gave me the lolz and the shivers. Karma for you!
0Vladimir_Nesov10yThat would violate the One Level Higher Than You principle.
3Perplexed10yWhat makes you certain we are not living in a simulation whose computational substrate lives in the HPMOR universe?
6gwern10yI didn't know there was another antonym to kosher besides nonkosher. Interesting. Anyway, I don't think Quirrell is Eliezer; if he is, then most of the usual reasons against self-quoting wouldn't apply anyway. (It's not like Eliezer needs more karma or higher profile here.)
1MartinB10yHe is a clever guy. Be careful!

Heed a lesson from a successful practical propagandist. If you want to persuade people that a premise they unconsciously hold is wrong, do not give it a label they will perceive as insulting! If you do this, you make them reluctant to consciously accept that they hold the premise, which will make it more difficult to argue them out of it!

This rule does not hold for conscious premises. It can be effective to make insulting labels for those.

Eric S. Raymond

(This applies no less strongly to one's own brain.)

Nothing in life is certain except death, taxes and the second law of thermodynamics. All three are processes in which useful or accessible forms of some quantity, such as energy or money, are transformed into useless, inaccessible forms of the same quantity. That is not to say that these three processes don't have fringe benefits: taxes pay for roads and schools; the second law of thermodynamics drives cars, computers and metabolism; and death, at the very least, opens up tenured faculty positions.

-- Seth Lloyd

I would like to get rid of one or two of them. It's painful to see how often really inevitable things get confused with those that could at least in theory be dealt with.

2Armok_GoB10yI read this as an argument against having taxes.
0FiftyTwo10yThe difference being that with taxes nothing is actually 'lost'; it is just relocated, where it can be accessed again. Whereas with energy you can only move from high to low concentrations, so there can be a genuine loss of usable energy. I initially liked it as well; I suppose it's a good example of not believing something merely because it corroborates an already existing belief (most people dislike taxes).
2Matt_Simpson9yWell, taxes can cause a genuine loss of wealth (as distinct from money) depending on how they're spent and how they're collected; however, taxes can also cause a genuine gain in wealth, again depending on how they're collected and spent.

"Alas, how terrible is wisdom
when it brings no profit to the man that's wise!
This I knew well, but had forgotten it,
else I would not have come here."

--Teiresias to the unrelenting Oedipus, Oedipus the King 316-9, Sophocles

(Assigning a specific location to 'here' left as an exercise for the reader...)

Since so many poker opponents often decide at whim, we need to do more than just strategically analyze their actions relative to what they should be doing. We need to watch and listen and determine what they are doing.

--Mike Caro, Caro's Book of Tells

The astro-philosophers of Krull once succeeded in proving conclusively that all places are one place and that the distance between them is an illusion, and this news was an embarrassment to all thinking philosophers because it did not explain, among other things, signposts.

-- Terry Pratchett, "Sourcery"

On simpler solutions:

"But still you did not know the algorithm."

"Yes, but I had some idea that it was related to the Azure/Pufferfish algorithm, which in turn is related to the zeta functions that we studied at Princeton. So I just sat down and said to myself if Rudy were going to build the ultimate cryptosystem on this basis, and if Azure/Pufferfish is a simplified version of that system, then what is Arethusa? That gave me a handful of possibilities."

"And out of that handful you were able to pick the right one."

"No,"

...

"A witty saying proves nothing" --Voltaire

9CronoDAS10y
8ata10yThat's been posted [http://lesswrong.com/lw/152/rationality_quotes_august_2009/10u7] (a few times) before. Though it may be worth repeating.

I appeal to the philosophers of all countries to unite and never again mention Heidegger or talk to another philosopher who defends Heidegger.

-Karl Popper

6JoshuaZ10yI don't like this quote. It is amusing but not very rational. It is not rational to ignore arguments because they were made by an awful person. It also isn't rational to actively refuse to discuss a set of ideas, even if one thinks that those ideas aren't worth considering. The first part of the quote is marginally defensible if Popper is very sure that Heidegger's ideas are a waste of time. The second part of the quote, about refusing to talk to people who defend Heidegger, makes about as much sense as a religion telling its adherents not to listen to some specific critic. (That said, while I'm by no means an expert on this matter, my general opinion is that Heidegger is a waste of time.)

It is not rational to ignore arguments because they were made by an awful person.

In academic philosophy there is a tendency to refer to "Heidegger's arguments and positions" as simply "Heidegger". (This is true of all philosophers, not just Heidegger). Popper, of course, would have been familiar with this; when I read that quote I got the distinct impression of "Heidegger's arguments are hollow and his positions are indefensible; please can we agree on this and stop discussing them?"

5Sniffnoy10yRelevant old LW post: Tolerate tolerance [http://lesswrong.com/lw/42/tolerate_tolerance/].
0wedrifid10yIs his philosophy rubbish (even relative to other philosophy) or is it just a problem with him being a Nazi?
5Daniel_Burfoot10yI think both. But mostly I like this quote because it's hilarious.
0wedrifid10yThat it is. :D
3Mitchell_Porter10yHeidegger's theme from beginning to end was "Being". Why is there something rather than nothing, and what is existence anyway? In practice, it was the second question that dominated his life. He started out in phenomenology, so he was initially interested in being as appearance. We get this idea of existence from somewhere, but where exactly? How does it emerge from appearance? Another theme was the forgetting of Being in favor of beings. The modern mind, with its busyness and technological power, is usually engaged in interaction with one particular thing or another particular thing, and loses sight of the fact of existence as such. This theme led him to a historical examination of the concept of Being in different ages. A distinction between existence and essence - thatness and whatness - develops in Greek philosophy, and persists through the centuries despite many transformations, such as the emphasis on subjectivity and consciousness which characterizes the epistemology-dominated era since Descartes. By the end of his life, Heidegger considered that technology and especially "cybernetics" (computer science and information technology) were the start of a whole new epoch in humanity's relationship to Being; initially one in which the obliviousness to Being itself would persist - the metaphysical oblivion created by the focus on essence having been joined by a daily sensibility which was all about action rather than thought - but also a circumstance in which there could be a "second beginning", in which Being might be encountered anew again. So Heidegger deserves his place in the history of philosophy, and he's not obsolete yet, even if so much about him and his work belongs to a vanished culture and politics.
1Jack10yIf I recall he convinced his son to become a computer scientist on these grounds.
3Matt_Simpson10yI'm not sure what Popper's motivation for saying that was, but I've read a bit of Heidegger and I felt the same way afterward.
9NihilCredo10yI once told a university friend of mine, who was majoring in modern philosophy, that Heidegger was the most empty and nonsensical philosopher I had encountered in high school. He blamed this on translation difficulties and my Marxist teacher, and offered to guide me through a selected reading of Sein und Zeit; an offer on which I took him up. We called it quits (in a friendly manner) after five evenings of heated arguing over whether it was even intellectually permissible to use half of the words Heidegger was using, and I left with the judgment that Heidegger was raping the German language.
1Jack10yI don't know about raping the German language but your friend is right in that a) Heidegger, more than maybe any other philosopher ever, is harder to understand in translation and b) a Marxist might have a lot of trouble explaining Heidegger. He definitely is not an author one should take on by oneself and I definitely can't explain much of anything he's said. I do lean toward the position that he said meaningful, even important things but that's totally based on people whose rationality and intelligence I trust regarding other philosophy telling me so. His obscurity is definitely the cause of a ton of bad philosophy.
2gwern10yHere's another Popper quote on Heidegger. No points for guessing how Popper took this (as is clear from the surrounding context): --"The Unknown Xenophanes", The World of Parmenides, Karl Popper

"Paper clips are gregarious by nature, and solitary ones tend to look very, very depressed." - dwardu

"Please don't hold anything back, and give me the facts" – Wen Jiabao, Chinese Premier (when meeting disgruntled people at the central complaints offices).

The same reign of terror that occurred under Robespierre and Hitler occurred back then in the fifties, as it occurs now. You must realize that there is very little actual courage in this world. It's pretty easy to bend people around. It doesn't take much to shut people up, it really doesn't. In the fifties all I had to do was call a guy up on the telephone and say, "Well, I think your wife would like to know about your mistress."

An upvote to the first person to identify the author of that quote.

5knb10yRonald DeWolf. The son of L. Ron Hubbard.
5SilasBarta10ySo, wait, was it that: a) Most men worth influencing in the 50s had a mistress his wife didn't know about? or that: b) Most men worth influencing in the 50s understood that the guy calling him could persuade the wife that there was a mistress irrespective of whether there was really a mistress?
7TheOtherDave10yOr perhaps that they believed they had a mistress, whether they did or didn't?
3Robin10yI don't know which it was. But I'd say that you're seeing the trees, not the forest. The major point of the quote was that there's a lack of courage in the world, the rest of the quote is just examples.
4SilasBarta10yThe courage to allow one's infidelity to be exposed (let alone falsely exposed) isn't what most people have in mind when they think of courage.
2[anonymous]10yb) fits in better with the reign of terror metaphor.
2BillyOblivion10yDude, SRSLY, 30 seconds with google. http://en.wikiquote.org/wiki/Ronald_DeWolf [http://en.wikiquote.org/wiki/Ronald_DeWolf]
1Eneasz10yI like the quote, but I downvoted. An upvote to the first person to identify why.
7Sniffnoy10yBecause of the "an upvote to whoever can identify the author"?
3TheOtherDave10yGodwin's Law violation?
2MartinB10yIt's wrong.
1Blueberry10yThe comma splice? Please tell us...
4Eneasz10yOh, I assumed the answer was inherent in the question. :) As Sniffnoy said, because of the "an upvote to whoever can identify the author"
2Blueberry10yWhy would that cause a downvote?
2[anonymous]10yBecause Robin can identify the author and a downvote is needed to balance that.
0false_vacuum10yIt's not about rationality?
0komponisto10yThe fact that it wasn't formatted as a ?
0[anonymous]10yIt's not about rationality? (But I prefer Sniffnoy's reason.)

Probable impossibilities are to be preferred to improbable possibilities.

-- Aristotle

"Sherlock Holmes once said that once you have eliminated the impossible, whatever remains, however improbable, must be the answer. I, however, do not like to eliminate the impossible. The impossible often has a kind of integrity to it that the merely improbable lacks." -- Douglas Adams's Dirk Gently, Holistic Detective

In Dirk Gently's universe, a number of everyday events involve hypnotism, time travel, aliens, or some combination thereof. Dirk gets to the right answer by considering those possibilities, but we probably won't.

3DSimon10yI love this quote, but I'm pretty sure I wouldn't describe it as "rational".

I think we could modify our sense of it to mean that if you are down to having to accept a 0.01% probability, because you've excluded everything else, then it's probably better to go back over your logic and see if there's any place you've improperly limited your hypothesis space.

Several paradigm-changing theories introduced concepts that would previously have been thought impossible (like special relativity, or the many-worlds interpretation).

2false_vacuum10yI don't understand this one.
1atucker10yThe way I read it was that he's using "impossibilities" to mean things that you don't think are possible, don't understand, or find inconceivable rather than things which can't actually happen. A probable impossibility is something that will probably happen that a given person doesn't think is possible. An improbable possibility is something that that same person understands, but (whether you know it or not) isn't probable.
1false_vacuum10yI read 'probable impossibility' as 'something that is probably impossible'. It's a poor translation if it means something else; but your version at least makes some kind of sense.

I think some time we should have an irrationality quotes thread, kind of in the "how not to" spirit.

I think such a thread should include an expectation of deconstruction - "this is wrong and this is why".

8Alicorn10yIt's been done [http://lesswrong.com/lw/b0/antirationality_quotes/].
8Perplexed10yI don't think that thread [http://lesswrong.com/lw/b0/antirationality_quotes/] serves the purpose RobinZ seems to have in mind. That one seems to be oriented at laughing at the theists, thus promoting ridicule of them and self-esteem for ourselves. It might instead be nice to have a thread of anti-rationality quotes devoted to advancing our rationality, rather than merely celebrating it. One idea for doing this is to also use "anti- ground rules". Require that the anti-rationality quotes must come from LessWrong. You can quote only yourself or Eliezer. And, as RobinZ suggests, explain why the quotation exemplifies an error of rationality (one you have since recognized and corrected). Do we make enough educational mistakes so that we can populate a thread with them? I suspect we do.
4ata10yWe have one: http://lesswrong.com/lw/b0/antirationality_quotes/ [http://lesswrong.com/lw/b0/antirationality_quotes/] Edit: Oops, Alicorn beat me by 57 seconds.

"What happens when you combine organized religion and organized sports? I don’t know, but I suspect not much would change for either institution."

Scenes from a Multiverse

Opinions are like sex, you should change your positions if it feels wrong

~ garcia1000, Witchhunt game

But unlike sex you shouldn't change positions just for fun and novelty.

1FiftyTwo9yYou should experiment with multiple positions, then use the best one.
1wnoise10yDepends on how useful you think the experience of being a devil's advocate is.
6AlexMennen10yIt would be more accurate to say that you should critically look over the evidence again if your position feels wrong. A belief can be justified by logic and still be at odds with intuition, making it still feel wrong. Example: There are compelling arguments that the simulation hypothesis is at least somewhat likely to be correct. However, my intuition tells me that the simulation hypothesis is just plain false. I know that this is a subject that my intuition is poorly suited for, so I follow the logic and estimate a non-negligible chance of being in a simulation, despite its feeling wrong.

Increasingly each year the wild predictions of science-fiction writers are made tame by the daily papers.

Robert Heinlein

4sketerpot10yAt one point, he quite audaciously predicted that the Soviet Union was headed for collapse. If he'd lived longer, he would have seen that his prediction should have been even crazier: not only did the Soviet Union fall apart, but it did so without starting a major war, or nuking any cities. And don't even get me started on his books where we've got interstellar travel, guided by computers that are the size of a room but barely faster than someone with a slide rule.

[Humanity] had been the mere plaything of nature, when first it crept out of uncreative void into light; but thought brought forth power and knowledge; and, clad with these, the race of man assumed dignity and authority.

-- Mary Shelley, The Last Man

People think of the future as something other people do. But there's something weirder about a society where people don't think about the future. -- Peter Thiel

Seibel: The way you contributed technically to the PTRAN project, it sounds like you had the big architectural picture of how the whole thing was going to work and could point out the bits that it wasn’t clear how they were going to work.

Allen: Right.

Seibel: Do you think that ability was something that you had early on, or did that develop over time?

Allen: I think it came partially out of growing up on a farm. If one looks at a lot of the interesting engineering things that happened in our field—in this era or a little earlier—an awful lot of them come fro... (read more)

To be sure, science is also mistrusted by those who don't like its discoveries for religious, political, ethical, or even esthetic reasons. Some thoughtful people complain that science has erased enchantment from the world. They have a point. Miracles, magic, and other fascinating impossibilities are no longer much encountered except in movies. But in the light shed by the best science and scientists, everything is fascinating, and the more so the more that is known of its reality. To science, not even the bark of a tree or a drop of pond water is dull or h

... (read more)

One might expect self-improving systems to be highly unpredictable because the properties of the current version might change in the next version. Our analysis will instead show that self-improvement acts to create predictable regularities. It builds on the intellectual foundations of microeconomics, the science of preference and choice in the face of uncertainty. The basic theory was created by John von Neumann and Oskar Morgenstern in 1944 for situations with objective uncertainty and was later extended by Savage and Anscombe and Aumann to situations wi

... (read more)
3gwern10yIt's an interesting topic, but what exactly makes this a rationality quote?
1Perplexed10yMaybe it doesn't belong. But I was thinking in terms of something like rationality being an attractor. Minds, whatever their origin, if capable of self-improving, will tend toward a pattern which human economists had already identified as being at the heart of human rationality. The rational direction to guide your own improvement is toward greater rationality. Even if you are not all that rational to begin with. That means that the characteristics we assign to modeled "rational agents" may be universal - they are not just something invented by some lackey of a capitalist patron. Unless Omohundro's analysis is wrong and he just wrote it because he is a lackey, that is.
1timtyler10ySome human irrationality seems adaptive. Humans apparently deceive themselves so they can manipulate others without actually lying - so as to avoid detection.
3Perplexed10yThat does not directly contradict Omohundro. The quotation merely suggests that almost-rational humans will seek to self-modify in the direction of becoming less self-deceptive and better at lying. A look at the self-help literature tends to confirm Omohundro's prediction. That leaves the question, though, as to why Natural Selection didn't take care of this 'improvement' itself. My guess is that it is a life-history, levels-of-selection, and kin-selection issue. Self-help books are purchased by adults. NS tries to optimize the whole life history. It is good for neither children nor their families that they become accomplished liars. Maybe self-deception in children has some advantages as well. Just speculating.
2timtyler10yAre the liars going to win, though? Nature subsidises both transparency and lie detectors, for reasons to do with promoting cooperation. In the future it may get even harder to convince others of things you don't personally believe - as is dramatically portrayed in The Truth Machine [http://en.wikipedia.org/wiki/The_Truth_Machine].

"Meanness and stupidity are so closely related that anything you do to decrease one will probably also decrease the other."

--Paul Graham, here.

-2wedrifid10y* Select the most dominant prisoner in every (male) prison in the country and use them to artificially inseminate 5,000 women each (use IVF with the female top dogs if you wish too). * Punish all observed incidents of stupidity with physical beating. I voted the comment up - because there is a relationship there. There are just other correlations and causal influences that are somewhat stronger in some situations.
0gjm9yThe fact that you had to choose so ridiculous an example suggests that Paul Graham is basically correct. (I think the correct reading of "anything you do to decrease one will probably also decrease the other" is "if you pick something that decreases one, it will probably decrease the other" rather than "literally every single thing that might decrease one will, with high probability given that you do that particular thing, decrease the other".)
4wedrifid9yNo it doesn't. It suggests that when selecting examples for the purpose of countering generalizations wedrifid chooses examples that are clear and unambiguous to anyone who correctly parses the claim rather than choosing the most likely counter example. This is particularly the case when rejecting the extent of a general claim while accepting the gist - as I went out of the way to make explicit. I also reject the idea that the second example I gave is at all unrealistic: Corporal punishment for stupidity is an actual (hopefully mostly historical) thing.
2gwern9yI can't help this quote: --N-Space, Larry Niven
2gjm9yFor the record, I took you to be proposing a single counterexample with two components, rather than two separate counterexamples; I'm sorry for the misunderstanding. Now that I know the second bullet point was meant to be a separate counterexample, I have a different objection to it: I am unconvinced that any implementable version of it would both reduce stupidity and increase meanness. (The most likely outcome, I think, would be to increase meanness while replacing more blatant varieties of stupidity with more widely spread lower-level stupidity.) EDITED to add: Oh, one other thing. If it happens that (1) it was you who downvoted me and (2) you did so because you thought I downvoted your previous comment, then you might want to know that I didn't.
[-][anonymous]10y 3

Happily our civilization possesses two great advantages over past times: scientific knowledge and the scientific spirit. To us have been revealed secrets of life our forebears never knew. And to us has been vouchsafed a passion for truth such as the world has never seen. Other ages have sought truth from the lips of seers and prophets; our age seeks it from scientific proof. Other ages have had their saints and martyrs, dauntless souls who clung to their faith with unshakable constancy. Yet our age has also its saints and martyrs, heroes who can not only f

... (read more)

"All this knowledge is giving me a raging brainer!"

Professor Farnsworth, Futurama

Pendarvis Theory of Technology: "..., it is my theory that everything wrong with everything is the fault of language teachers.

"If a child is taught that it is all right if you mis-spell a word occasionally, or don't always punctuate exactly correctly, then you are teaching that child that small mistakes are okay, as long as people know pretty well what is meant. I feel this is a dangerous attitude to foster in a highly technological society."

-- William Tuning, Fuzzy Bones

Better to teach the child the difference between programming a computer, proving a theorem, and writing an essay.

7Pavitra10yIf you never misspell a word, you're spending too much time proofreading. [http://www.scottaaronson.com/blog/?p=40]
9Wei_Dai10yThat's true if the only benefit of proofreading is finding misspellings. But you should be proofreading to find errors of expression in general, and the optimal amount of proofreading for that may imply that you find and fix all misspellings.
1false_vacuum10yThat may be good advice for most people. (Or maybe not.) But me, I'm a chronic floccinaucinihilipilificationist. (It's one of my more endearing traits.) And no, I don't use a spellchecker. I don' need no steenkeeng spellchecker.
6CronoDAS10yYou routinely estimate things as valueless [https://secure.wikimedia.org/wiktionary/en/wiki/floccinaucinihilipilification]?
1false_vacuum10yOops. I said I knew how to spell it, not what it means. ('If you never misuse a word, you're spending too much time second-guessing yourself/reading the dictionary'?) For some reason I thought 'floccinaucinihilipilification' meant 'nitpicking'. Probably I inferred its meaning incorrectly from the context in which it appeared; that was my standard failure mode, during the era when I assume I picked up that word. (In fairness to my child-self, it was before widespread internet access--but not dictionaries.) Also, I think I was suffering from some kind of localised cognitive impairment when I wrote that comment (sleep deprivation, perhaps). It strikes me as pretty boorish now, as well as incorrect.
4TheOtherDave10yYou may have been misled by a Robert Heinlein novel where the soi-disant genius narrators agree that that's what the word means. ('Number of the Beast,' I think.)
2false_vacuum10yI believe you are correct.
2false_vacuum10yMotivated cognition. It's such a good word to show off with. (At least, it would be if it meant what I thought it meant.) In fact, I'm sure I've looked it up before. Maybe this time I can remember permanently.
1Normal_Anomaly10yDo mean you're sesquipedalian?
1false_vacuum10yNo, but I am.
1[anonymous]10yAnd if you mis-spell too much, or worse, use the wrong word (which is increasingly common with spell-checking), you waste your readers' time trying to figure out what you are trying to say.

Approximate quote: [You should] go in with a thesis, not a conclusion.

From a BBC program about the media and crime in Detroit. The context was the extent to which Detroit is over-reported as a high-crime city, and someone commented that the BBC had sent someone over for a reason, but they were actually looking at the situation instead of assuming they knew what they were going to see.

Whenever I’m about to do something, I think "would an idiot do that?" and if they would, I do not do that thing.

— Dwight Schrute ("The Office" Season 3, Episode 17 "Business School," written by Brent Forrester)

7Desrtopa10ySounds like reversed stupidity.
3RobinZ10yIt would be better to explain why that is a bad thing [http://lesswrong.com/lw/lw/reversed_stupidity_is_not_intelligence/] when you post statements such as that.
3wedrifid10yThe 'least convenient possible world' might be relevant too. I translated the verbal self interrogation as something that would elicit responses along the lines of "would doing this thing distinguish one as an idiot?" In practice the question probably would be useful. In fact, in practice only an idiot would really reverse the stupidity of an idiot when asking that question of themselves. Breathe, eat, etc.
1gwern8yOr a different version of rubber-ducking.
-1Document7yI won't ask what that means, because I could presumably easily find out by searching; but I won't search, because I don't care enough (and I'm already here as a distraction from what I meant to be doing).
1gwern7y/not sure if should provide a useful link or not.
2JoshuaZ10yHow is this a rationality quote?
6roland10yThanks for asking. I linked it on purpose to wikipedia from where I quote: Tempus fugit is a succinct admonition to focus on what is really important as opposed to what is merely salient. Focus on the not urgent but important things (quadrant 2 in the Covey matrix). http://en.wikipedia.org/wiki/File:MerrillCoveyMatrix.png [http://en.wikipedia.org/wiki/File:MerrillCoveyMatrix.png]
4gwern10yI thought it was a good quote, although I'm not sure LWers need to know it. (On the other hand, one might think the same thing of curing aging or helping cryonics, but Eliezer's essay on his dead brother [http://yudkowsky.net/other/yehuda] still got a substantial reaction.) Do you like this one better? --Cato the Elder; Epistles (94) as quoted by Seneca

[E]conomic statistics are a peculiarly boring sub-genre of science fiction; extremely useful, but not to be treated as absolute truth.

-- Paul Krugman

4Costanza10ySpeaking of peculiarly boring sub-genres of science fiction, I am told that Paul Krugman was once the best and most promising of the Jedi Masters of Economics. But somehow, the forces of the Sith seduced him to the dark side, and he has since become Darth Pundit the Mindkillingly Political. In any case, if economic statistics are bad, let them be made better. For that matter, if they're very, very good, let them be made better still, and even then nobody should treat them as the absolute truth.
3gjm9yHe has certainly become political. It might be worth asking: Has he become any less accurate in the process? Another possibility would be that the positions taken up by the major political parties in the US at present are such that it's impossible to tell the truth about some subjects without being (perceived as) highly political. (That's certainly happened often before. For extreme examples, consider cases where an important political movement is based on badly broken racial theories or on a specific religion.)
3[anonymous]9yAccording to this study [http://www.hamilton.edu/documents/Analysis-of-Forcast-Accuracy-in-the-Political-Media.pdf], he does okay, but I'm not impressed with their methodology. For some reason I can't copy/paste the relevant section of the PDF, but they discuss him explicitly on page 15. They looked at "a random sample" of his columns and television appearances (whatever that means) and found 17 predictions, of which 14 were right, 1 was wrong, and 1 was hedged. Only 17 predictions? I thought we did science. "He is, after all, a Nobel-prize winning economist."
3gjm9yI agree that that study is unimpressive, in a number of ways. (And it's comparing his accuracy with that of other pundits, rather than with that of past-Krugman.)
[-][anonymous]10y 1

"Traditional Renaissance perspective conforms most closely to the way people in our Western culture perceive objects in space. In our perceptions, parallel lines appear to converge at vanishing points on a horizon line (the viewer's eye level) and forms appear to become smaller as distance from the viewer increases."

-- Betty Edwards, "Drawing on the Right Side of the Brain" (actually an awesome book, this quote isn't very representative)

Mr President, the Eagle has landed!

  • A note left at Kennedy's grave.

Disputes with men, pertinaciously obstinate in their principles, are, of all others, the most irksome; except, perhaps, those with persons entirely disingenuous, who really do not believe the opinions they defend, but engage in the controversy from affectation, from a spirit of opposition, or from a desire of showing wit and ingenuity superior to the rest of mankind. The same blind adherence to their own arguments is to be expected in both; the same contempt of their antagonists; and the same passionate vehemence, in inforcing sophistry and falsehood. And

... (read more)
[-][anonymous]10y 1

The best of all things is to learn. Money can be lost or stolen, health and strength may fail, but what you have committed to your mind is yours forever.

-- Louis L'Amour, The Walking Drum

[-][anonymous]10y 1

If you don't like to read, you'll have to [go to] college. A college's faculty presumes that students don't read without threats of failure [or] expulsion. But, if you like to read, your education will take place, despite everything.

-- John Coyne & Tom Hebert, This Way Out

Emancipate yourself from mental slavery, none but yourself can free your mind.

An upvote to the first person to correctly identify the first person to say that (the quote is often misattributed, you'll get a downvote if you identify the wrong author).

5Skatche10yMarcus Garvey. I think it works better in this longer form.
3gwern10yBob Marley [http://en.wikiquote.org/wiki/Bob_Marley#Song_Lyrics], although before I checked Google search, Books, and Scholar, I had expected to find it was by Epictetus. Oh well. EDIT: In my defense, Garvey's original is not the same as the Bob Marley version which Robin presented. I think it's a little disingenuous to consider the Bob Marley version 'misattributed'.
2aranazo10yReminded me of... Roy Harper

When we exhort people to Faith as a virtue, to the settled intention of continuing to believe certain things, we are not exhorting them to fight against reason. The intention of continuing to believe is required because, though Reason is divine, human reasoners are not. When once passion takes part in the game, the human reason, unassisted by Grace, has about as much chance of retaining its hold on truths already gained as a snowflake has of retaining its consistency in the mouth of a blast furnace.

C.S. Lewis, "Religion: Reality or Substitute?", in "Christian Reflections".

2Oscar_Cunningham10yAnyone want to try and tease a rationality message out of this?
8gwern10yLewis is saying that if you've disproved faith, your reason is flawed. After all, faith must be right! This is 'extraordinary claims require extraordinary evidence', but in unfamiliar garb. We're not used to seeing it used the other way. (If a study reports ESP, then we ought to suspect problems in how it was conducted or analyzed rather than accept its conclusion - to use a recent example.) I'm sure there are a number of relevant LW posts on the topic like "Einstein's Arrogance" [http://lesswrong.com/lw/jo/einsteins_arrogance/].
3Desrtopa10yThe one that immediately comes to mind for me is making your explicit reasoning trustworthy [http://lesswrong.com/lw/2yp/making_your_explicit_reasoning_trustworthy/]. Lewis was exhorting Christians not to trust their explicit reasoning.
4Normal_Anomaly10yWithin the context of Lewis' Christianity, it could be the valid form of the argument from authority: don't believe appealing falsehoods with a little evidence over unappealing truths with a lot of evidence you don't know. To give an example: you tell kids to believe evolution or special relativity without explaining the evidence in detail, but it would still be right for them to have "faith" instead of changing to believe creationism the first time they read a (bogus, but they wouldn't be able to tell) creationist argument on the internet.
4RichardKennaway10yExcept that Lewis' Christianity was not based on any authority deemed infallible. He reasoned himself into it, while recognising the fallibility of reason. His writings set out his arguments; they do not tout any source of authority whose reliability he has not already argued. But how can one rightly reason, while recognising one's fallibility? That is an issue for rationalists as well. Let me fix the original quote for you: When a long argument produces a conclusion that strikes one as absurd, one sometimes just has to say, "This is bullshit. I don't know what's wrong with the argument, but I'm not going along with it."
3Mass_Driver10yI think the flaw in the syllogism is "the human reason, unassisted, has a low chance of retaining its hold on truths." We certainly forget a great deal of procedural and propositional knowledge if we don't use it on a regular basis, but that's different from letting go of a belief because you are passionate about how inconvenient the belief is. Once a belief takes root -- i.e., after you announce it to your friends and take some actions based on it -- it is usually very difficult to let go of that belief.
4TheOtherDave10yMy take: "Because our cognition is unreliable, we can easily lose sight of truths we started out knowing as we walk along tempting-but-wrong garden paths, especially when strong emotions are involved." In other contexts this is sometimes known as "being so sharp you cut yourself."
2Nornagest10yThat's a good moral, but to me Lewis's quote seems to be more simply interpreted as an exhortation against successful doubt [http://lesswrong.com/lw/jy/avoiding_your_beliefs_real_weak_points/]. Our thinking is certainly unreliable, but compensating for that with a fixed intention to keep believing whatever we're currently obsessed with seems like exactly the wrong thing to do; it essentially enshrines motivated cognition as a virtue.
2TheOtherDave10yHaving a "settled intention of continuing to believe" X shares with having a "high prior probability for" X the property that quite a lot of counterevidence can pile up before I actually start considering X unlikely. This is not a bad thing, in and of itself. Of course, if X happens to be false, it's an unfortunate condition to find myself in. But if X is true, it's a fortunate one. That just shows that it's better to believe true things than false ones, no matter how high or low your priors or settled or indecisive your intentions. Of course, if I start refusing to update on counterevidence at all, that's a problem. And I agree, it's easy to read Lewis as endorsing refusing to update on counterevidence, if only by pattern-matching to religious arguments in general.
3Nornagest10yPoint taken, but Lewis wasn't operating within a Bayesian framework. I haven't read a lot of his apologetics, but what I remember seemed to be working through the lens of informal philosophy, where a concept is accepted or rejected as a unit based on whether or not you can think of sufficiently clever responses to all the challenges you're aware of. From this perspective, a "settled intention of continuing to believe" implies putting a lot more mental effort into finding clever defenses of your beliefs, and Lewis's professed acceptance of reason implies nothing more than admitting challenges in principle. Since it's possible to rationalize pretty much anything, this strikes me as functionally equivalent to refusing to update. And, of course, enshrining the state of holding high priors as virtuous in itself carries its own problems.
2TheOtherDave10y(nods) Mostly agreed.
2RichardKennaway10yYou get it.
[-][anonymous]10y -2

No testimony is sufficient to establish a miracle unless the testimony be of such a kind that its falsehood would be more miraculous than the fact which it endeavors to establish.

David Hume

2JoshuaZ10ydupe [http://lesswrong.com/lw/bs/rationality_quotes_april_2009/8eo] (which includes citation and larger context.)

"Everything works by magick; science represents a small domain of magick where coincidences have a relatively high probability of occurrence."

3false_vacuum10yDoes this merely call attention to the high probability of the existence of unknown unknowns, or does it promote map-territory confusion?
1mkehrt10yI totally knew who said that [http://en.wikipedia.org/wiki/Peter_J._Carroll]. Does that make me a bad rationalist?