Happy New Year! Here's the latest and greatest installment of rationality quotes. Remember:
- Please post all quotes separately, so that they can be voted up/down separately. (If they are strongly related, reply to your own comments. If strongly ordered, then go ahead and post them together.)
- Do not quote yourself
- Do not quote comments/posts on LessWrong or Overcoming Bias
- No more than 5 quotes per person per monthly thread, please
-- Greg Egan, "Border Guards".
Dear LWers: do you have these moods (let us gloss them as "extreme temporary loss of confidence in foundational beliefs"):
[pollid:377]
I have had "extreme temporary loss of foundational beliefs," where I briefly lost confidence in beliefs such as the nonexistence of fundamentally mental entities (I would describe this experience as "innate but long dormant animist intutions suddenly start shouting,") but I've never had a mood where Christianity or any other religion looked probable, because even when I had such an experience, I was never enticed to privilege the hypothesis of any particular religion or superstition.
I answered "sometimes" thinking of this as just Christianity, but I would have answered "very often" if I had read your gloss more carefully.
I'm not quite sure how to explicate this, as it's something I've never really thought much about and had generalized from one example to be universal. But my intuitions about what is probably true are extremely mood- and even fancy-dependent, although my evaluation of particular arguments and such seems to be comparatively stable. I can see positive and negative aspects to this.
Erm...when I was a lot younger, when I considered doing something wrong or told a lie I had the vague feeling that someone was keeping tabs. Basically, when weighing utilities I greatly upped the probability that someone would somehow come to know of my wrongdoings, even when it was totally implausible. That "someone" was certainly not God or a dead ancestor or anything supernatural...it wasn't even necessarily an authority figure.
Basically, the superstition was that someone who knew me well would eventually come to find out about my wrongdoing, and one day they would confront me about it. And they'd be greatly disappointed or angry.
I'm ashamed to say that in the past I might have actually done actions which I myself felt were immoral, if it were not for that superstitious feeling that my actions would be discovered by another individual. It's hard to say in retrospect whether the superstitious feeling was the factor that pulled me back from that edge.
Note that I never believed the superstition...it was more of a gut feeling.
I'm older now and am proud to say that I haven't given serious consideration to doing anything which I personally feel is immoral for a very, very l... (read more)
Occasionally the fundamental fact that all our inferences are provisional creeps me out. The realization that there's no way to actually ground my base belief that, say, I'm not a Boltzmann brain, combined with the fact that it's really quite absurd that anything exists rather than nothing at all (given that any cause we find just moves the problem outwards), is the closest thing I have to "doubting existence".
To be fair, the philosopher Tamar Gendler only coined the term in 2008.
-- Tim Kreider, The Quiet Ones
"This is how it sometimes works", I would have said. Anything more starts to sound uncomfortably close to "the lurkers support me in email."
...but why wait until they'd almost gotten to Boston?
Perhaps because at that point, one is not faced with the prospect of spending several hours in close proximity to people with whom one has had an unpleasant social interaction.
No one wants to appear rude, of course. As this was almost the end of the ride, the person who rebuked them minimized the time he'd have to endure in the company of people who might consider him rude because of his admonishment, whether or not they agreed with him. I wonder if this is partly a cultural thing.
The passage states that he'd already spoken to them twice.
Every actual criticism of an idea/behaviour is likely to imply a much larger quantity of silent doubt/disapproval.
Sometimes, but you need to take into account what P(voices criticism | has criticism) is. Otherwise you'll constantly cave to vocal minorities (situations where the above probability is relatively large).
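To make that concrete, here is a minimal sketch (my own illustration, not from the thread; the function name and numbers are invented) of how the implied number of silent critics scales with an assumed P(voices criticism | has criticism):

```python
# Minimal sketch: how many critics in total a handful of vocal ones implies,
# under different assumed values of P(voices criticism | has criticism).

def implied_total_critics(vocal_critics: int, p_voice_given_criticism: float) -> float:
    """Expected total number of critics, assuming each critic independently
    speaks up with probability p_voice_given_criticism."""
    return vocal_critics / p_voice_given_criticism

for p in (0.01, 0.1, 0.5):
    total = implied_total_critics(vocal_critics=3, p_voice_given_criticism=p)
    print(f"P(voices | has criticism) = {p:.2f}  ->  ~{total:.0f} critics implied")

# A vocal minority (p close to 1) means three complaints imply only a few
# critics; a reticent population (p = 0.01) means the same three complaints
# imply roughly three hundred.
```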
--Freefall
-- TVTropes
Edit (1/7): I have no particular reason to believe that this is literally true, but either way I think it holds an interesting rationality lesson. Feel free to substitute 'Zorblaxia' for 'Japan' above.
Interesting; is this true?
Yes, my Japanese teacher was very insistent about it, and IIRC would even take points off for talking about someone's mental state without the proper qualifiers.
It's not necessarily an advantageous habit. If a person tells you they like ice cream, and you've seen them eating ice cream regularly with every sign of enjoyment, you have as much evidence that they like ice cream as you have about countless other things that nobody bothers hanging qualifiers on even in Japanese. The sciences are full of things we can't experience directly but can still establish with high confidence.
Rather than teaching people to privilege other people's mental states as an unknowable quality, I think it makes more sense to encourage people to be aware of their degrees of certainty.
(a few verbal tics were removed by me; the censorship was already present in the version I heard)
Sympathetic, but ultimately, we die OF diseases. And the years we do have are more or less valuable depending on their quality.
Physicians should maximize QALYs, and extending lifespan is only one way to do it.
I'd vote this up, but I can't shake the feeling that the author is setting up a false dichotomy. Living forever would be great, but living forever without arthritis would be even better. There's no reason why we shouldn't solve the easier problem first.
Sure there is. If you have two problems, one of which is substantially easier than the other, then you still might solve the harder problem first if 1) solving the easier problem won't help you solve the harder problem and 2) the harder problem is substantially more pressing. In other words, you need to take into account the opportunity cost of diverting some of your resources to solving the easier problem.
— Umberto Eco, The Search for the Perfect Language
Éowyn explaining to Aragorn why she was skilled with a blade. The Lord of the Rings: The Two Towers, the 2002 movie.
"I've never seen the Icarus story as a lesson about the limitations of humans. I see it as a lesson about the limitations of wax as an adhesive."
-- Randall Munroe, in http://what-if.xkcd.com/30/ (What-if xkcd, Interplanetary Cessna)
I'd have thought from observation that quite a lot of human club is just about discussing the rules of human club, excess meta and all. Philosophy in daily practice being best considered a cultural activity, something humans do to impress other humans.
Imagine the average high school clique. They would be very uncomfortable explicitly discussing the rules of the group - even as they enforced them ruthlessly. Further, the teachers, parents, and other adults who knew the students would be just as uncomfortable describing the rules of the clique.
In short, we are socially weird for being willing to discuss the social rules - that our discussion is an improvement doesn't mean it is statistically ordinary.
Ah, I see.
Human club has many rules. Some can be bent. Others can be broken.
I think the author is needlessly overcomplicating things.
1) People instinctively form tight-knit groups of friends with people they like. "People they like" usually means people who help them survive and raise offspring. This usually means the socially adept, athletic, and attractive.
2) Having friends brings diminishing returns. The more friends a person has, the less they feel the need to make new friends. That's why the first day of school is vital.
3) Ill feelings develop between Sally and Bob. Sally talks to Susanne, and now they both bear ill feelings towards Bob. Thus, Bob has descended a rung in the dominance hierarchy.
4) Bob's vulnerability is a function of how many people Sally can find who will agree with her about him. As an extension of this principle, those with the fewest friends will get picked on the most. The bullies can come from both the popular and unpopular crowds.
5) Factors leading to few friends - lack of social or athletic ability, conspicuous non-conformity via eccentric behavior, dress, or speech, low attractiveness, or misguided use of physical or verbal aggression.
By the power law, approximately 20% of the kids will be friends with 80% of the network. These are the popular ... (read more)
--Benjamin Franklin
Keith E. Stanovich, What Intelligence Tests Miss: The Psychology of Rational Thought
Possibly, but Stanovich thinks that most heuristics were basically given to us by evolution and rather than choose among heuristics what we do is decide whether to (use them and spend little energy on thinking) or (not use them and spend a lot of energy on thinking).
The Last Psychiatrist (http://thelastpsychiatrist.com/2010/10/how_not_to_prevent_military_su.html)
--Rory Miller
-Buttercup Dew (@NationalistPony)
--Wendy Cope, He Tells Her from the series ‘Differences of Opinion’
-Paul Graham
Partial duplicate
David Wong, "6 Harsh Truths That Will Make You a Better Person", published on Cracked.com
This article greatly annoyed me because of how it tells people to do the correct practical things (Develop skills! Be persistent and grind! Help people!) yet gives atrocious and shallow reasons for it - and then Wong says that if people criticize him, they haven't heard the message. No, David, you can give people correct directions and still be a huge jerk promoting an awful worldview!
He basically shows NO understanding of what makes one attractive to people (especially romantically) and what gives you a feeling of self-worth and self-respect. What you "are" does in fact matter - both to yourself and to others! - outside of your actions; they just reveal and signal your qualities. If you don't do anything good, it's a sign of something being broken about you, but just mechanically bartering some product of your labour for friendship, affection and status cannot work - if your life is in a rut, it's because of some deeper issues and you've got to resolve those first and foremost.
This masochistic imperative to "Work harder and quit whining" might sound all serious and mature, but it does not in fact have the power to make you a "better person"; rather, you'll... (read more)
I've taken a crack at what's wrong with that article.
The problem is, there's so much wrong with it from so many different angles that it's rather a large topic.
My complaint about the article is that it has the same problem as most self-help advice. When you read it, it sounds intelligent, you nod your head, it makes sense. You might even think to yourself "Yeah, I'm going to really change now!"
But as everyone who's tried to improve himself knows, it's difficult to change your behavior (and thoughts) on a basis consistent enough to really make a long-lasting difference.
osewalrus
I try to get around this by assuming that self-interest and malice, outside of a few exceptional cases, are evenly distributed across tribes, organizations, and political entities, and that when I find a particularly self-interested or malicious person that's evidence about their own personality rather than about tribal characteristics. This is almost certainly false and indeed requires not only bad priors but bad Bayesian inference, but I haven't yet found a way to use all but the narrowest and most obvious negative-valence concepts to predict group behavior without inviting more bias than I'd be preventing.
"Just because you no longer believe a lie, does not mean you now know the truth."
Mark Atwood
Orson Scott Card, The Lost Gate
-Bas van Fraassen, The Scientific Image
Believing large lies is worse than believing small lies; basically, it's arguing against the What-The-Hell Effect as applied to rationality. Or so I presume; I did not read the original.
I had noticed it and mistakenly attributed it to the sunk cost fallacy but on reflection it's quite different from sunk costs. However, it was discovering and (as it turns out, incorrectly) generalising the sunk cost fallacy that alerted me to the effect and that genuinely helped me improve myself, so it's a happy mistake.
One thing that helped me was learning to fear the words 'might as well,' as in, 'I've already wasted most of the day so I might as well waste the rest of it,' or 'she'll never go out with me so I might as well not bother asking her,' and countless other examples. My way of dealing it is to mock my own thought processes ('Yeah, things are really bad so let's make them even worse. Nice plan, genius') and switch to a more utilitarian way of thinking ('A small chance of success is better than none,' 'Let's try and squeeze as much utility out of this as possible' etc.).
I hadn't fully grasped the extent to which I was sabotaging my own life with that one, pernicious little error.
Lambs are young sheep; they have less meat & less wool.
The punishment for livestock rustling being identical no matter what animal is stolen, you should prefer to steal a sheep rather than a lamb.
-- Penn Jillette
-http://writebadlywell.blogspot.com/2010/05/write-yourself-into-corner.html
I would argue that the lesson is that when something valuable is at stake, we should focus on the simplest available solutions to the puzzles we face, rather than on ways to demonstrate our intelligence to ourselves or others.
-Jobe Wilkins (Whateley Academy)
I was rereading HP Lovecraft's The Call of Cthulhu lately, and the quote from the Necronomicon jumped out at me as a very good explanation of exactly why cryonics is such a good idea.
(Full disclosure: I myself have not signed up for cryonics. But I intend to sign up as soon as I can arrange to move to a place where it is available.)
The quote is simply this:
Er... logical fallacy of fictional evidence, maybe? I wince every time somebody cites Terminator in a discussion of AI. It doesn't matter if the conclusion is right or wrong, I still wince because it's not a valid argument.
The original quote has nothing to do with life extension/immortality for humans. It just happens to be an argument for cryonics, and it seems to be a valid one: death as failure to preserve rather than cessation of activity, mortality as a problem rather than a fixed rule.
-- Steven Brust, spoken by Vlad, in Iorich
— Gregory Wheeler, "Formal Epistemology"
Is there a concrete example of a problem approached thus?
(If you wonder where "two hundred and forty-two miles" shortening of the river came from, it was the straightening of its original meandering path to improve navigation)
-- John Rawls, Justice as Fairness: A Restatement.
But we've had self-driving cars for multiple years now...
-- Kyubey (Puella Magi Madoka Magica)
http://www.exmormon.org/whylft18.htm
"Two roads diverged in a wood. I took the one less traveled by, and had to eat bugs until the park rangers rescued me."
Wasn't that poem sarcastic anyway? Until the last stanza, the poem says how the roads were really identical in all particulars -- and in the last stanza the narrator admits that he will be describing this choice falsely in the future.
--"Sid" a commenter from HalfSigma's blog
--Sir Francis Galton
-- Groucho Marx
It's hardly fair to describe this tiny modicum of doubt as atheism, even in the umbrella sense that covers agnosticism.
Person 1: "I don't understand how my brain works. But my brain is what I rely on to understand how things work." Person 2: "Is that a problem?" Person 1: "I'm not sure how to tell."
-Today's xkcd
This argument really isn't very good. It works on precisely none of the religious people I know, because:
A: They don't believe that God would tell them to do anything wrong.
B: They believe in Satan, who they are quite certain would tell them to do something wrong.
C: They also believe that Satan can lie to them and convincingly pretend to be God.
Accordingly, any voice claiming to be God and also telling them to do something they feel is evil must be Satan trying to trick them, and is disregarded. They actually think like that, and can quote relevant scripture to back their position, often from memory. This is probably better than a belief framework that would let them go out and start killing people if the right impulse struck them, but it's also not a worldview that can be moved by this sort of argument.
-Woody Allen EDIT: Fixed formatting.
Eh. Would you say that "humans aren't capable of evil. Evolution makes them that way"?
I might, if I was a god talking to other gods. And if I was a gun talking to other guns, I'd tell them to shut up about humans and take responsibility for their own bullets.
-G. K. Chesterton
--Daniel Dennett, Breaking the Spell (discussing the differences between the "intentional object" of a belief and the thing-in-the-world inspiring that belief)
-- Larry Wall
Vannevar Bush
If you are an American perhaps it stood out this time because of all the recent discussion of gun control.
-- Eric Hoffer, The True Believer
-- Eric Hoffer, The True Believer
Beatrice the Biologist
-- Sterren with a literal realization that the territory did not match his mental map in The Unwilling Warlord by Lawrence Watt-Evans
-- someone on Usenet replying to someone deriding Kurzweil
In general, though, that argument is the Galileo gambit and not a very good argument.
There's a more charitable reading of this comment, which is just "the absurdity heuristic is not all that reliable in some domains."
What makes this the Galileo Gambit is that the absurdity factor is being turned into alleged support (by affective association with the positive benefits of air travel and frequent flier miles) rather than just being neutralized. Contrast to http://lesswrong.com/lw/j1/stranger_than_history/ where absurdity is being pointed out as a fallible heuristic but not being associated with positives.
In reference to Occam's razor:
--from Machine Learning by Tom M. Mitchell
Interesting how a concept seems more believable if it has a name...
Well, top two at least. Christianity and Islam take the top spots, followed by Hinduism, which has a Supreme Existence, but no tenets of it being benevolent (at least as far as I've been able to find, maybe some Hindus believe differently, as it's not a very homogeneous religion.) Here's a table of top religions by adherents. It's not clear how to count down from there since some of the items are aggregations of what might not fairly count as individual religions, but after Islam, the next religion down claiming a benevolent supreme being has close to two orders of magnitude fewer adherents.
Not that this affects the point of what our cultural understanding of "God" means, but it does give a bit of a sense of how much that idea is an outlier in human culture.
I don't know if this is as interesting as you're hoping, but my father is an atheist offshoot of a very religious family, and my mother is an agnostic/deist who was once a member of the Tr... (read more)
"De notre naissance à notre mort, nous sommes un cortège d’autres qui sont reliés par un fil ténu."
Jean Cocteau
("From our birth to our death, we are a procession of others whom a fine thread connects.")
It is entirely possible for someone to believe in an evil god, and (quite reasonably) decline to do that god's alleged bidding.
It's not easy to find rap lyrics that are appropriate to be posted here. Here's an attempt.
-Pirkei Avot (5:15)
Deep wisdom indeed. Some people believe the wrong things, some believe the right things, some believe both, and some believe neither.
-- P. W. Bridgman, ‘‘The Struggle for Intellectual Integrity’’
--Scott Derrickson
While affirming the fallacy-of-composition concerns, I think we can take this charitably to mean "The universe is not totally saturated with only indifference throughout, for behold, this part of the universe called Scott Derrickson does indeed care about things."
“There is light in the world, and it is us!”
Love that moment.
Scott Derrickson is indifferent. How do I know this? I know because Scott Derrickson's skin cells are part of Scott Derrickson, and Scott Derrickson's skin cells are indifferent.
-- Steve Smith, American Dad!, season 1, episode 7 "Deacon Stan, Jesus Man", on the applicability of this axiom.
-- Closing lines of Crimes and Misdemeanors, script by Woody Allen.
Jimmy the rational hypnotist on priming and implicit memory:
That seems to cheapen Cthulhu, to be honest. The emotional impact of Lovecraft's stories, and of their descendants such as the Azathoth metaphor, relies not on an immoral or amoral Power (that's well-trod territory in many religions and not a few fantasies) but rather on Powers with motivations fundamentally incompatible with human minds: entities of godlike potency that can neither be mollified nor bargained with nor easily apprehended in native reasoning modes.
That doesn't describe the occupants of any historical mythology I can think of, at least not i... (read more)
-Alfie Kohn, "Punished By Rewards"
That is a hugely unfair assessment of my motives (unlike abody97's comment which claims not to be about my motives, which I also doubt). People say untrue things all the time, e.g. when storytelling. The goal of storytelling is not to directly relate the truth of some particular experience, and I didn't think the goal of posting rationality quotes was either, considering how many quotes these posts get from various works of fiction. I posted this quote for no reason other than to suggest an interesting rationality lesson, and calling that "bullshit" sneaks in unnecessary connotations.
Lots of people in Weimar Germany got angry at the emerging fascists - and went out and joined the Communist Party. It was tough to be merely a liberal democrat.
If the memories of my youth serve me, anger 'leads to the dark side of the force' via the intermediary 'hate'. That is, it leads you to go around frying things with lightning and choking people with a force grip. This is only 'evil' when you do the killing in cases where killing is not an entirely appropriate response. Unfortunately humans (and furry green muppet 'Lannik') are notoriously bad at judging when drastic violation of inhibitions is appropriate. Power---likely including the power to kill people with your brain---will almost always corrupt.
Not nearly as much as David Brin perverts Lucas's message. I in fact do reject the instructions of Yoda, but I reject what he actually says. I don't need to reject a straw caricature thereof.
Automatically. Immediately. Where did this come from? Yoda is 900 years old, wizened and gives clear indications that he think... (read more)
SMBC comics: a metaphor for deathism.
While I am a fan of SMBC, in this case he's not doing existentialism justice (or not understanding existentialism). Existentialism is not the same thing as deathism. Existentialism is about finding meaning and responsibility in an absurd existence. While mortality is certainly absurd, biological immortality will not make existential issues go away. In fact, I suspect it will make them stronger.
edit: on the other hand, "existentialist hokey-pokey" is both funny and right on the mark!
I think this is a mistake, and a missed chance to practice the virtue of scholarship. Lesswrong could use much more scholarship, not less, in my opinion. The history of the field often gives more to think about than the modern state of the field.
Progress does not obey the Markov property.
-- Scenes From A Multiverse
I aspire to be VNM rational, but not a utilitarian.
It's all very confusing because they both use the word "utility" but they seem to be different concepts. "Utilitarianism" is a particular moral theory that (depending on the speaker) assumes consequentialism, linearish aggregation of "utility" between people, independence and linearity of utility function components, utility proportional to "happiness" or "well-being" or preference fulfillment, etc. I'm sure any given utilitarian will disagree with something in that list, but I've seen all of them claimed.
VNM utility only assumes that you assign utilities to possibilities consistently, and that your utilities aggregate by expectation. It also assumes consequentialism in some sense, but it's not hard to make utility assignments that aren't really usefully described as consequentialist.
I reject "utilitarianism" because it is very vague, and because I disagree with many of its interpretations.
Quite sure. I assume you value the life of a sparrow (the bird), all else being equal. Is there a number of sparrows to be spared for which you would consign yourself and your loved ones to the flames? Is there a hypothetical number of sparrows for which you would choose them living over all currently living humans?
If not, then you are saying that not all goals reduce to a number on a single metric, that there are tiers of values, similar in principle to Maslow's.
You're my sparrows.
Any reason whatsoever to think that this particular characteristic contributed wholly or partly to Swartz' suicide, other than its being a known & salient fact about Swartz?
I assume a significant amount of them were. I also tried subtly using God/Fate, God/Freewill, God/Physics, God/Universe, God/ConsciousMultiverse, and God/Chance, as interchangeable "redefinitions" (of course, on different samples each time) and was similarly called on it.
Incidentally, I can't confirm if this suggests a pattern (it probably does), but in one church I tried, for fun, combining all of them and just conflating all the meanings of all the above into "God", and then sometimes using the specific terms and/or God interchangeab... (read more)
John Locke, Essay Concerning Human Understanding
"We are living on borrowed time and abiding by the law of probability, which is the only law we carefully observe. Had we done otherwise, we would now be dead heroes instead of surviving experts." –Devil's Guard
Well, that gets right to the heart of the Friendliness problem, now doesn't it? Mother Brain is the machine that can program, and she reprogrammed all the machines that 'do evil'. It is likely, then, that the first machine that Mother Brain reprogrammed was herself. If a machine is given the ability to reprogram itself, and uses that ability to make itself decide to do things that are 'evil', is the machine itself evil? Or does the fault lie with the programmer, for failing to take into account the possibility that the machine might change its utility ... (read more)
The generation of random numbers is too important to be left to chance. Robert R. Coveyou, Oak Ridge National Laboratory
Sorry. Attempt #2:
If I had infinite storage space and computing power, I would store every single piece of information I encountered. I don't, so instead I have to efficiently process and store things that I learn. This generally requires that I throw information out the window. For example, if I take a walk, I barely even process most of the detail in my visual input, and I remember very little of it. I only want to keep track of a very few things, like where I am in relation to my house, where the sidewalk is, and any nearby hazards. When the walk is ove... (read more)
-- Ricardo, publicly saying "oops" in his restrained Victorian fashion, in his essay "On Machinery".
Randall Munroe
Dr. Seuss
I never found that argument very compelling. The Classical Greeks did a whole lot better than the Christians at developing scientific knowledge, before the Renaissance. Both monotheistic and polytheistic traditions can foster either strong or weak scientific progress. Islam is a good example of a monotheistic tradition moving from high to low scientific productivity by the shifting of ideas within that tradition (see The Incoherence of the Philosophers.)
A polytheist can perfectly easily see the world as functioning according to a single, consistent set of r... (read more)
That line always bugged me, even when I was a little kid. It seems obviously false (especially in the in-game context).
I don't understand why this is a rationality quote at all; Am I missing something, or is it just because of the superficial similarity to some of EY's quotes about apathetic uFAIs?
?!
-- Jonathan Haidt
-- Yvain, on why brinkmanship is not stupid
Respectfully, I think we have reached the limit of our ability to have productive conversation.
(1) I don't desire to have the "Who is more evil: Nazis or Communists?" fight - I'm not sure that discussion is anything more than Blue vs. Green tu quoque mindkiller-ness. The important lesson is "beware 'do not debate him or set forth your own evidence; do not perform replicable experiments or examine history; but turn him in at once to the secret police.'"
(2) It is possible to piece together acceptable moral lessons from Jedi philosophy, ... (read more)
With all the respect that I'm generically required to give, I don't care whether you care or not. The argument I made was addressing what you posted/quoted, not you as a person nor your motives for posting.
-- thedaveoflife
Dupe.
While this is all very inspiring, is it true? Yes, truth in and of itself is something that many people value, but what this quote is claiming is that there is a class of people (that he calls "dissidents") who specifically value this above and beyond anything else. It seems a lot more likely to me that truth is something that all or most people value to one extent or another, and as such, sometimes if the conditions are right people will sacrifice stuff to achieve it, just like for any other thing they value.
South Park, Se 16 ep 4, "Jewpacabra"
note: edited for concision. script
What is a deontological concept and what is a non-deontological concept?
Have worse consequences for everybody, where "everybody" means present and future agents to which we assign moral value. For example, a sufficiently crazy deontologist might want to kill all such agents in the name of some sacred moral principle.
Max likelihood tells you which is most likely, which is mostly meaningless without further assumptions. For example, if you wanted to bet on what the next flip would be, a max likelihood method won't give you the right probability.
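For a worked example of that point (a sketch of my own, not from the original comment; the numbers are invented), suppose you have seen heads on all three flips so far:

```python
# Maximum likelihood says p(heads) = 1 after three heads in a row, which is
# a bad number to bet with.  A Bayesian posterior predictive under a uniform
# prior (Laplace's rule of succession) gives a more sensible betting probability.

heads, flips = 3, 3

p_mle = heads / flips                  # 1.00 -- "the next flip is certainly heads"
p_laplace = (heads + 1) / (flips + 2)  # 0.80 -- (k + 1) / (n + 2)

print(f"max likelihood estimate:      {p_mle:.2f}")
print(f"Laplace posterior predictive: {p_laplace:.2f}")
```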
(Source: Dennettations)
I think that this quote might benefit by tabooing the word "god".
Does it mean "an omniscient, omnipotent being"?
Does it mean "an omniscient, omnibenevolent being that would never ask you to do anything truly evil, but may on occasion ask you to do things that you don't see the sense in, and that in fact appear evil at first glance"?
Does it mean "a being worthy of respect and obedience, even in the most dire circumstances"?
Sure, that's probably true. I don't see what difference it makes, though.
I mean, OK, suppose I wait an hour, or a day, or a week, or however long I decide to wait, and I ask again, and a Voice says "Yes, kill 'em all." Do I believe it's God now? Why?
Conversely, I wait however long I decide to wait and I ask again and a Voice says "No, don't kill 'em." Do I believe that's God? Why?
Do I ask a dozen times and take the most common answer?
None of those seem reasonable. It seems to me that on her account, what I ought to do is rely on my ju... (read more)
The problem Brin is criticizing is that Good is entirely prohibited from feeling strong emotions. Brin explicitly acknowledges that strong emotions can lead to evil acts - he's challenging the implicit idea that strong emotions must lead to evil.
Also, not my downvote.
Not really a rationality quote, is it...
--Michael Huemer
Nelson Goodman
I don't think change can be planned. It can only be recognized.
Jad Abumrad, in a video about the development of Radio Lab and the amount of fear involved in doing original work
Acting the other way around would be trusting my judgement that the AI is friendly.
In any case, I would expect a superintelligence, friendly or not, to be able to convince me to kill my child, or do whatever.
Do you mean CEV_(mankind)?
CEV_(mankind) is a compromise utility function (that some doubt even contains anything) that is different from your own utility function.
Why on earth would I ever voluntarily choose a different utility function, out of a mixture of other human utility functions, over my own? I already have one that fits me perfectly by definition - my own.
If you meant CEV_(Kawoomba), then it wouldn't change the outcome of that particular decision. Maybe refer to the definition here?
It's been that since the start. The Penn quote is just broken and deserves no further attention.
In that case, the Christian's obvious and correct response is "that wouldn't happen", and responding to that with "yeah, but what if? huh? huh?" is unlikely to lead to a fruitful conversation. Penn's original thought experiment is simply broken.
Replace "God" by "rationality" and consider the question asked of yourself. How do you respond?
I asked a religious relative something along these lines.
Her response was that God would never ask people to do bad things, and if it seemed that He was that would just be someone else deceiving her.
I explained the atheist view on this sort of thing and then the conversation shifted directions before I thought to point out the example of God asking someone to sacrifice their child in the Bible.
Exactly what we are discussing. Brin explicitly acknowledges the first point - he's rejecting the second point.
That's not a charitable reading of that point. In the real world, there are lots of different ways to be evil. In Jedi-land, evil = Sith.
Anakin opposes the Sith. Then he feels strong emotions (love of Padme). Then he beco... (read more)
"We are what we repeatedly do. Excellence, then, is not an act, but a habit." — Aristotle
It goes both ways. And it's meaningless to speak of changing "what you are" if you do not, as a result, do anything different.
I don't think the Cracked article, or I, ever said that the only way to change your actions is by changing some mysterious essence of your being. That's actually a rather silly notion, when it's stated explicitly, because it's self-defeating unless you ignore the observable evidence. That is, we can see that changing your act... (read more)
Attribution?
That's not a bad essay (BTW, essays should be in quote marks, and the book itself, The Simpsons and Philosophy, in italics), but I don't think the quote is very interesting in isolation without any of the examples or comparisons.
Um, no. I can't respond to a challenge to give a non-X definition of Y if I don't know what X means.
Smells like consequentialist reasoning. Look, if I had a better example I would give it, but I am genuinely not sure what deontologists think they're doing if they don't think they're just using heuristics that approximate consequentialist reasoning.
You do have a point, but there is another explanation to resolve that, see this comment.
We still have a fundamental disagreement on whether rationality is in any way involved when reflecting on your terminal values. I claim that rationality will help the closet murderer who is firm in valuing pain and suffering the same as the altruist, the pap... (read more)
It can be difficult to know what will be harmful without knowing whether certain things are true.
Hypothetical example: A person kills their child in order to prevent them from committing some kind of sin and going to hell. If this person's beliefs about the existence of hell and how people get in and stay ou... (read more)
Since we're talking about CEV_(individual), the "poetic" definition would be "[my] wish if [I] knew more, thought faster, were more the [man] [I] wished [I] were, (...), where [my] wishes cohere rather than interfere; extrapolated as [I] wish that extrapolated, interpreted as [I] wish that interpreted."
Nothing that would change my top priorities, though I'd do a better job convincing Galactus.
I'm not so sure about that. We have much more exposure to attempts to defend monotheism from polytheism or atheism, so it may appear easier, because there's a glut of arguments coming from that direction. That could just be a historical accident though. Maybe we could have ended up quite easily in a world wh... (read more)
Remember that ~33% of the world is Christian (which is more than any other religion), and so it is not all that surprising that many atheists come from Christian backgrounds, simply because the probability that an arbitrary person came from a Christian background is quite high to start with.
My comment applies just the same, whether you spell god God, G_d, GOD or in some other manner: You can believe such a being exists (making you a theist) without following its moral codex or whatever commands it levies on you. Doesn't make you an atheist.
--Thomas Sowell
The term was coined by Bryan Caplan here
This was long before Less Wrong.
I realized that lower-level discussions of free will were kind of pointless. I abandoned the eternal-springing hope that souls and psychic powers (Hey! Look! For some reason, all of the air molecules just happened to be moving upward at the same time! It seems like this guy is a magnet for one-in-10^^^^^^^10 thermodynamic occurrences! And they always help him!) could exist. I fully accepted the physical universe.
Yes, sorry. I was using the term "confused" in a slightly different manner from the one LWers are used to, and "confusing" fits better. Basically, "meaninglessly mysterious and deep-sounding" would be the more LW-friendly description, I think.
Sorry, can you clarify what you mean here? None of what passes an ideological Turing test? Are you saying something like "theists erroneously conclude that the proponents of evolution must believe in God because evolutionists believe that evolution is what produced all creatures great and small"? What exactly is the mistake that theists make on this point that would lead them to fail the ideological Turing test?
Or, did I misunderstand you, and are you saying that people like Dennett fail the ideological turing test with theists?
I added a link, but I would prefer to suggest a fake name over a generic name.
Let's say that you don't do something that you want to do, because you're not confident enough.
What is the difference between doing that thing, and improving your confidence which causes you to do that thing? What does it even mean to distinguish between those two cases?
And if improving your confidence doesn't cause you to do the thing in question, then what's the point?
Edit: On a reread, I might interpret you as saying that one might try (but fail) to change one's actions "directly", or one might attack the root cause, and having done so, succee... (read more)
So, is there any research done about this kind of stuff? All the discussions of this kind of things I've seen on Wikipedia talk:Manual of Style and places like that appear to be based on people Generalizing From One Example.
The Third Doctor
No, but the temptation was rejected specifically on the grounds that it did not agree with scripture. Therefore, the same grounds can surely be used in other, similar situations, including those where one is unsure of who is talking.
For those unaware of how the story goes:
Taboo "people".
I guess almost never (in the mathematical sense). OTOH, in the real world the difference is often so tiny that it's hard to tell its sign -- but then, the thing to do is gather more information or flip a coin.
-The mayor, in "do the right thing"
That is so. Why unfortunately? Also, why "under the impression"? If you were to tell me some of your terminal values, I'd give you the courtesy of assuming you are telling the truth as you subjectively perceive it (you have privileged access to your values, and at least concerning your conscious values, subjective is objective).
I get it that you hold nothing on Earth more sacred than a hypothetical sufficiently high number of sparrows, we differ on that. It is not a question of epistemic beliefs about the world state, of creating a better match... (read more)
Why?
--Mencius Moldbug, here
I can't overemphasise how much I agree with this quote as a heuristic.
The second statement assumes that the base rate of underdogs and overdogs is the same. In practice I would expect there to be far more underdogs than overdogs.
I agree. For example:
This statement is obviously true. But it sure would be useful to have a theory that predicted (or even explained) when a putative civil disobedience would and wouldn't work that way.
Obviously, willingness to use overwhelming violence usually defeats civil disobedience. But not every protest wins, and it is worth trying to figure out why - if for no other reason than figuring out if we could win if we protested something.
You cannot cross a chasm by pointing to the far side and saying, "Suppose there was a bridge to there? Then we could cross!" You have to actually build the bridge, and build it so that it stays up, which Penn completely fails to do. He isn't even trying to. He isn't addressing Christians. He's addressing people who are atheists already, getting in a good dig at those dumb Christians who think that a monkey gave birth to a human, sorry, that anyone should kill their child if God tells them to. Ha ha ha! Is he not witty!
The more I think about that quote, the stupider it seems.