Sequences

What is the next level of rationality?

Comments

Welcome to LessWrong! Your story sounds fitting to me. I'd love to read it :)

I think calling these "Attitudes" is alright (and indeed better than "Theories"). But if you're still not happy with it then you might prefer "Dispositions".

He didn't say anything like that in Politics is the Mind-Killer, quite the contrary:

"Politics is an important domain to which we should individually apply our rationality—but it’s a terrible domain in which to learn rationality, or discuss rationality, unless all the discussants are already rational."

"I’m not saying that I think we should be apolitical"

The main point of the post was to not shove politics where it's unnecessary, because it can have all these bad effects. I expect Eliezer agrees far more with the idea that Politics is hard mode, than the idea that "we couldn’t expect our rationality skills to be as helpful in determining truth in politics".

The new comments outline feature is great! Thanks, LW team :)

I don't know what you mean by aesthetic death, but I'm glad to help :)

Can you say exactly which claims Zack is making without showing enough evidence? Is it one or more of these?

(1) For all nouns N, you can't define N any way you want, for at least 37 reasons.

(2) Woman is such a noun.

(3) Therefore, you can't define the word woman any way you want.

Or something else?

Even if it's true that he's obsessed with it and everything he writes is somehow connected to it - what's the problem with that? Couldn't you have said the same thing about Eliezer and AI? I bet there were lots of important contributions that were made by people following an obsession, even to their own detriment.

To me the question is whether it's true and valuable (I think so), not whether he's obsessed. 

I do not think this post serves some greater goal (if it does, then like many others in this comment section, I am confused).

(I'll try to explain as best I understand, but some of it may not be exactly right)

The goal of this post is to tell the story of Zack's project (which also serves the project). The goal of Zack's project is best described by the title of his previous post - he's creating a Hill of Validity in Defense of Meaning.

Rationalists strive to be consistent, take ideas seriously, and propagate our beliefs. This means a fundamental belief about the meaning of words will affect everything we think about, and if it's wrong, it will eventually make us wrong about many things.

Zack saw Scott and Eliezer, the two highest-status people in this group/community, plus many others, make such a mistake. With Eliezer it was "you're not standing in defense of truth if you insist on a word, brought explicitly into question, being used with some particular meaning." With Scott it was "I ought to accept an unexpected [X] or two deep inside the conceptual boundaries of what would normally be considered [Y] if it'll save someone's life."

This was relevant to questions about trans issues, which Zack cares a lot about, so he made a bunch of posts arguing against these propositions. The reason it didn't remain a mere philosophy-of-language debate is that it bumped into the politics of the trans debate. Seeing the political influence made Zack lose faith in the rationalist community, and warranted a post about people instead of just about ideas.

I would put it like this:

  1. Comparative advantage
  2. Incentive to produce
  3. Distributing differential value (could be better phrased)

The option to make exchanges that increase my total value incentivizes me to use my comparative advantage to produce new things that will be low value to me but high value to others, so I can trade them. This increases the total value in the economy by both creating new value and distributing it in a more beneficial manner.
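To make the comparative-advantage step concrete, here's a minimal sketch with invented numbers (the producers, goods, and production rates are all assumptions for the example): with no trade, each producer splits their day between both goods; with trade available, each specializes in the good they're relatively better at, and total output of both goods rises.

```python
# Toy illustration of comparative advantage. All numbers are invented.
# Each producer has one day; rates are units producible per full day.
rates = {
    "alice": {"bread": 4, "fish": 2},  # relatively better at bread
    "bob":   {"bread": 1, "fish": 3},  # relatively better at fish
}

def autarky():
    """No trade: each producer spends half the day on each good."""
    total = {"bread": 0.0, "fish": 0.0}
    for r in rates.values():
        total["bread"] += 0.5 * r["bread"]
        total["fish"] += 0.5 * r["fish"]
    return total

def specialize():
    """Trade available: each produces only their comparative-advantage good."""
    return {"bread": rates["alice"]["bread"], "fish": rates["bob"]["fish"]}

print(autarky())     # {'bread': 2.5, 'fish': 2.5}
print(specialize())  # {'bread': 4, 'fish': 3}
```

With specialization the economy has strictly more of both goods (4 bread and 3 fish versus 2.5 of each), and trade then distributes that surplus so both parties can end up better off than under autarky.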
