All of mszegedy's Comments + Replies

How Tim O'Brien gets around the logical fallacy of generalization from fictional evidence

Right, that's true. In the particular case of The Things They Carried, I'd trust O'Brien moderately well to depict what the Vietnam War was like, since he participated in it.

How Tim O'Brien gets around the logical fallacy of generalization from fictional evidence

Before I edited it, it was like the current one, but with the second paragraph removed, the last two sentences of the third paragraph removed, and the third and fourth paragraphs combined into one, roughly. I'm glad gwern posted his comment, though, because I think the post is much better now.

How Tim O'Brien gets around the logical fallacy of generalization from fictional evidence

Are you sure you understood the point? I am highlighting a writing technique where you write the same short story over and over again slightly differently to convey a probabilistic model to the reader in a way that is interesting. HPMoR is not quite this; it's a different story every time, with a different lesson every time, that is treated as a sequence of events.

Gunnar_Zarncke (8y): Ah yes. There are at least two aspects in the 'war stories': the 'probabilistic' aspect, which indeed I didn't mention, and the 'no plot, no sense' part, which I do see in the failure to double-guess and the confusion it leaves the reader in. One could argue, though, that as this is repeated again and again between Harry and Quirrell, it is itself kind of probabilistic.
How Tim O'Brien gets around the logical fallacy of generalization from fictional evidence

He literally tells the same story over and over again, differently every time. He has several stories that he does this to. The book is a fictional autobiography; O'Brien was in the Vietnam War, and writes as though he were recollecting stories from the Vietnam War, but the stories are all made up. Here, I found an excerpt that illustrates the principle in a somewhat okay manner.

EDIT: Here, this is better (PDF warning).

How Tim O'Brien gets around the logical fallacy of generalization from fictional evidence

If you want, read it. Hopefully, though, the principle I was highlighting was clear: while fiction that attaches a probability distribution to each sequence of events is boring, fiction made of many short stories describing the different possible scenarios is interesting, and conveys the same probabilistic model.

Should I give examples of how O'Brien does it? I don't know how much I can type out without violating copyright law.

Punoxysm (8y): I'd say it was pretty unclear. There are many short story collections; most don't tell and retell the same story. Is he doing this literally, or just metaphorically (e.g. soldiers experience battle many times, and each time is similar but different)? And what is the "storyteller"? Is it told through a framing device?
Anchoring and Adjustment

I've found that going by significant digits helps.

"If I represented the date that Einstein came to the US with only one significant digit of precision, what would it be? Definitely 2000. What about two? Definitely 1900. What about three? Probably 1900 again; I'm willing to take that bet. But four digits of precision? I'm not sure at all. I'll leave it as 1900."

The answer came out way off, but hopefully it prevented any anchoring, and it also accurately represents my knowledge of Einstein (namely, I know which properties of physics he discovered, ... (read more)
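The significant-digits heuristic in the quote amounts to rounding a candidate date to successively more digits of precision and asking at each step whether you'd still take the bet. A minimal sketch, in Python (the helper name is my own; Einstein actually moved to the US in 1933, which is why the 1900 guess "came out way off"):

```python
from math import floor, log10

def round_sig(x: int, sig: int) -> int:
    """Round x to `sig` significant digits."""
    magnitude = floor(log10(abs(x)))        # e.g. 1933 -> 3
    return int(round(x, -magnitude + sig - 1))

# Each extra significant digit narrows the estimate the way the
# comment describes: 2000, then 1900, then 1930, then 1933.
for sig in (1, 2, 3, 4):
    print(sig, round_sig(1933, sig))
```

The point of the exercise is that your confidence drops as `sig` grows, so the digit at which you stop betting is an honest summary of what you actually know.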

How to offend a rationalist (who hasn't thought about it yet): a life lesson

That's exactly what I can't make my mind up about, and forces me to default to nihilism on things like that. Maybe it really is irrelevant where the pleasure comes from? If we did wirehead everyone for eternity, then would it be sad if everyone spontaneously disappeared at some point? Those are questions that I can't answer. My morality is only good for today's society, not tomorrow's. I guess strictly morally, yes, wireheading is a solution, but philosophically, there are arguments to be made against it. (Not from a nihilistic point of view, though, which... (read more)

Qiaochu_Yuan (9y): I think you're giving up too early. Have you read the metaethics sequence [http://wiki.lesswrong.com/wiki/Metaethics_sequence]?
How to offend a rationalist (who hasn't thought about it yet): a life lesson

True, I swear! I think I can summarize why I was so distraught: external factors, the fact that this was a trusted friend (also one of my only friends), and that I was offended by related things they had said before. I am seeking help, though.

V_V (9y): That makes more sense. As a general rule, however, I suggest trying to avoid taking personal offense at the contrary opinions of others, especially when discussing philosophical issues.
How to offend a rationalist (who hasn't thought about it yet): a life lesson

You're completely right. I tried, at first, to look for ways that it could be a true statement that "some areas shouldn't have consistent belief systems attached", but that made me upset or something (wtf, me?), so I abandoned that, and resolved to attack the argument, and accept it if I couldn't find a fault with it. And that's clearly bad practice for a self-proclaimed rationalist! I'm ashamed. Well, I can sort of make the excuse of having experienced emotions, which made me forget my principles, but that's definitely not good enough.

I will be ... (read more)

shminux (9y): From your original story, it doesn't look like you have noticed that your cached belief was floating. Presumably it's a one-off event for you, and the next time you feel frustrated like that, you will know what to look for. Now, I am not a rationalist (IANAR?), I just sort of hang out here for fun, so I am probably not the best person to ask about methodology. That said, one of the approaches I have seen here and liked is steelmanning the opposing argument to the point where you can state it better than the person you are arguing with. Then you can examine it without the need to "win" (now it's your argument, not theirs) and separate the parts that work from those which don't. And, in my experience, there is a grain of truth in almost every argument, so it's rarely a wasted effort.
How to offend a rationalist (who hasn't thought about it yet): a life lesson

Oh, okay. That makes sense. So then what's the rational thing to conclude at this point? I'm not going to go back and argue with my friend—they've had enough of it. But what can I take away from this, then?

(I was using the French term philosophe, not omitting a letter, though. That's how my history book used to write it, anyway.)

Qiaochu_Yuan (9y): I've mentioned various possible takeaways in my other comments. A specific thing you could do differently in the future is to practice releasing againstness [http://lesswrong.com/lw/gid/thoughts_on_the_january_cfar_workshop/8e34] during arguments.
How to offend a rationalist (who hasn't thought about it yet): a life lesson

My point was that they probably did mean both things, because the distinction between "it's impossible" and "I don't know how" is not really clear in their mind. But that is not as alarming as it would be coming from someone who did know the difference, and insisted that they really did mean "impossible."

Hmm, I agree, but I don't think that it adequately explains the entire picture. I think it might have been two different ideas coming from two different sources. I can imagine that my friend had absorbed "applying form... (read more)

Qiaochu_Yuan (9y): I think it is deeply misleading to label these "axioms." At best these are summaries of heuristics that you use (or believe you use) to make moral decisions. You couldn't feed these axioms into a computer and get moral behavior back out. Have you read the posts orbiting around Fake Fake Utility Functions [http://lesswrong.com/lw/lp/fake_fake_utility_functions/]?
RichardKennaway (9y): (axioms omitted) I don't see any mathematics there, and making them into mathematics looks to me like an AI-complete problem. What do you do with these axioms?
Eugine_Nier (9y): What do you mean by "positive feelings"? For example, would you support wireheading [http://wiki.lesswrong.com/wiki/Wireheading] everyone?
How to offend a rationalist (who hasn't thought about it yet): a life lesson

It seems possible that when your friend said, in effect, that there can never be any axioms for social justice, what they really meant was simply, "I don't know the axioms either." That would indeed be a map/territory confusion on their part, but it's a pretty common and understandable one. The statement, "Flying machines are impossible" is not equivalent to "I don't know how to build a flying machine," but in the short term they are making a similar prediction: no one is flying anywhere today.

They seemed to be saying both ... (read more)

B_For_Bandana (9y): My point was that they probably did think they meant both things, because the distinction between "it's impossible" and "I don't know how" is not really clear in their mind. But that is not as alarming as it would be coming from someone who did know the difference, and insisted that they really did mean "impossible."

Okay, I'll bite. What are they?
How to offend a rationalist (who hasn't thought about it yet): a life lesson

I don't think your friend's point of view is impossible to argue against (as I mentioned in my other comment you can argue based on results)

I'm talking hypothetically. I did allow myself to consider the possibility that the idea was not perfect. Actually, I assumed that until I could prove otherwise. It just seemed pretty hopeless, so I'm considering the extreme.

it's not obvious to me that you've correctly understood your friend's point of view

Maybe not. I'm not angry at my friend at all, nor was I before. I felt sort of betrayed, but my friend had ... (read more)

But the point is that it, to me, is much more interesting/useful/not tedious to consider this idea that challenges rationality very fundamentally

This is what I mean when I say I don't think you've correctly understood your friend's point of view. Here is a steelmanning of what I imagine your friend's point of view to be that has nothing to do with challenging rationality:

"Different domain experts use different kinds of frameworks for understanding their domains. Taking the outside view, someone who claims that a framework used in domain X is more a... (read more)

How to offend a rationalist (who hasn't thought about it yet): a life lesson

It took me the whole day to figure even that out, really. Stress from other sources was definitely a factor, but what I observed is, whenever I thought about that idea, I got very angry, and got sudden urges to throw heavy things. When I didn't, I was less angry. I concluded later that I was angry at the idea. I wasn't sure why (I'm still not completely sure: why would I get angry at an idea, even if it was something that was truly impossible to argue against? a completely irrefutable idea is a very special one; I guess it was the fact that the implication... (read more)

ChristianKl (9y): Humans are emotional creatures. We don't feel emotions for rational reasons. The emotion you felt is called cognitive dissonance. It's something that humans feel when they come to a point where one of their fundamental beliefs is threatened but they don't have good arguments to back it up. I think it's quite valuable to have a strong reference experience of what cognitive dissonance feels like. It makes it easier to recognize the feeling when you feel it in the future. Whenever you are feeling that feeling, take note of the beliefs in question and examine them more deeply in writing when you are at home.
bsterrett (9y): I was recently reflecting on an argument I had with someone where they expressed an idea to me that made me very frustrated, though I don't think I was as angry as you described yourself after your own argument. I judged them to be making a very basic mistake of rationality and I was trying to help them to not make the mistake. Their response implied that they didn't think they had executed a flawed mental process like I had accused them of, and even if they had executed a mental process like the one I described, it would not necessarily be a mistake. In the moment, I took this response to be a complete rejection of rationality (or something like that), and I became slightly angry and very frustrated.

I realized afterwards that a big part of what upset me was that I was trying to do something that I felt would be helpful to this person and everyone around them and possibly the world at large, yet they were rejecting it for no reason that I could identify in the moment. (I know that my pushiness about rationality can make the world at large worse instead of better, but this was not on my mind in the moment.) I was thinking of myself as being charitable and nice, and I was thinking of them as inexplicably not receptive. On top of this, I had failed to liaise even decently on behalf of rationalists, and I had possibly turned this person off to the study of rationality. I think these things upset me more than I ever could have realized while the argument was still going on.

Perhaps you felt some of this as well? I don't expect these considerations to account for all of the emotions you felt, but I would be surprised if they were totally uninvolved.
Qiaochu_Yuan (9y): Thanks for the explanation. I still think it is more likely that you got angry at, for example, your friend's dismissive attitude, and thinking about the idea reminded you of it. You are a human, and humans get angry for a lot of reasons, e.g. when other humans challenge their core beliefs. 1) I don't think your friend's point of view is impossible to argue against (as I mentioned in my other comment you can argue based on results), 2) it's not obvious to me that you've correctly understood your friend's point of view, 3) I still think you are focusing too much on the semantic content of the conversation.
How to offend a rationalist (who hasn't thought about it yet): a life lesson

Oh. Well, that was a while ago, and I get over that stuff quickly. Very few people have that power over me, anyway; they were one of the only friends I had, and it was extremely unusual behavior coming from them. It was kind of devastating to me that there was a thought that was directed at me by a trusted source that was negative and I couldn't explain... but I could, so now I'm all the more confident. This is a success story! I've historically never actually committed suicide, and it was a combination of other stress factors as well that produced that res... (read more)

How to offend a rationalist (who hasn't thought about it yet): a life lesson

Well, the friend had counterexamples to "math as a basis for society is good". I sort of skipped over that. They mentioned those who rationalized bad things like racism, and also Engels. (We both agree that communism is not a successful philosophy.) Counterexamples aren't really enough to dismiss an idea unless they're stronger than the evidence that the idea is good, but I couldn't think of such evidence at the time, and I still can't think of anything particularly convincing. There's no successful society to point at that derived all of its laws and government axiomatically.

Those are good examples that you need to be really careful applying math to society.

If you come up with a short list of axioms for a social group, and then use them to formulate policy, you're probably going to end up leaving the domain over which those axioms are valid. If you have a lot of power, this can be a really bad thing.

How to Convince Me That 2 + 2 = 3

Wasn't that what Einstein said about QM?

JoshuaZ (9y): Almost. Eliezer is making a bad wordplay with what Einstein said.
How to Convince Me That 2 + 2 = 3

I once conducted an experiment in which I threw a die 500 times, and then prayed for an hour every day for a week that that die consistently land on a four, and then threw the die 500 more times. Correlation was next to zero, so I concluded that God does not answer prayers about dice from me.
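The experiment described above can be sketched as a quick simulation. This is my own illustrative framing (comparing the frequency of fours before and after, rather than the commenter's actual correlation calculation), with a made-up seed, not the commenter's real data:

```python
import random

random.seed(42)  # reproducible illustration

def count_fours(n_throws: int) -> int:
    """Throw a fair die n_throws times and count how often it lands on four."""
    return sum(1 for _ in range(n_throws) if random.randint(1, 6) == 4)

before = count_fours(500)  # the 500 throws before the week of prayer
after = count_fours(500)   # the 500 throws after

# With a fair die, both counts should hover near 500/6 ≈ 83, and any
# gap between them is ordinary sampling noise — the null result the
# comment reports.
print(before, after)
```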

MixedNuts (9y): I wouldn't expect a deity to answer that sort of prayer. You're not being sincere, just trying to test them, which many canonically find annoying because it shows mistrust; you don't need that die to land on a four; it suggests you'd use prayer to lowly ends (e.g. "Let me score a touchdown" rather than "Please solve world hunger"); it gives an easily publishable result, which no deity would characteristically accept - if they didn't want to be discreet they'd still be doing showy miracles. Studies where you pray to cure cancer or something are much stronger evidence.

Haven't you ever heard the saying, "God does not throw dice games"?

The Virtue of Narrowness

Maybe I'm misunderstanding Plato, then? It seems to me that Plato's advocating that you can't learn about things outside by staring at the ceiling, but by interacting with them, which is Yudkowsky's position as well.

Desrtopa (9y): I think you are misunderstanding the Plato quote. He's not saying that you have to go and look at things outside rather than "staring at the ceiling," but that "staring at the ceiling" (making observations about things) isn't a true exercise of reason. He's arguing that only contemplation of that which cannot be perceived by the senses is truly exalting.
The Virtue of Narrowness

Wait, you criticize the fallaciousness of the ancient Greeks, and then follow up with a quote from Plato on the same subject? Doesn't that undermine your statement about them a bit?

Desrtopa (9y): He's taking a critical attitude to the position expressed in the quote, not quoting a passage from Plato which shares his criticism.