With respect to power dynamics points one and two, there is another person known to the community who is perhaps more qualified and is already running something similar in several respects: Geoff Anders of Leverage Research. So this is not precisely the only group attempting something of this sort, though I still find it novel and interesting.

(disclaimer: I was at the test weekend for this house and am likely to participate)

I Want To Live In A Baugruppe

Something like this also happened with Event Horizon, though the metamorphosis is not yet complete...

Strategic Thinking: Paradigm Selection

Broadly agreed - this is one of the main reasons I consider internal transparency to be so important in building effective organizations. In some cases, secrets must exist - but when they do, their existence should itself be common knowledge unless even that must be secret.

In other words, it is usually best to tell your teammates the true reason for something, and failing that you should ideally be able to tell them that you can't tell them. Giving fake reasons is poisonous.

Strategic Thinking: Paradigm Selection

In some cases it can be - and I will discuss this further in a later post. However, there are many situations where the problems you're encountering are cleanly solved by existing paradigms, and looking at things from first principles leads only to reinventing the wheel. For instance, the appropriate paradigm for running a McDonald's franchise is extremely well understood, and there is little need (or room) for innovation in such a context.

Rationality Quotes Thread January 2016

This is one of the worst comments I've seen on LessWrong and I think the fact that this is being upvoted is disgraceful. (Note: this reply refers to a comment that has since been deleted.)

Lesswrong, Effective Altruism Forum and Slate Star Codex: Harm Reduction

So, maybe this is just my view of things, but I think a big part of this conversation comes down to whether you're outside looking in or inside looking out.

I'm on the inside and I think we should get rid of these things for the sake of both insiders and outsiders.

Is that true? I mostly don't notice people scoring cheap points by criticizing religion; I mostly notice them ignoring religion.

See for instance Raising the Sanity Waterline, a post which raises very important points but is so unnecessarily mean-spirited towards religion that I can't particularly show it to many people. As Eliezer writes elsewhere:

Why would anyone pick such a distracting example to illustrate nonmonotonic reasoning? Probably because the author just couldn't resist getting in a good, solid dig at those hated Greens.

Lesswrong, Effective Altruism Forum and Slate Star Codex: Harm Reduction

In terms of weird fixations, there are quite a few strange things that the LW community seems to have as part of its identity - polyamory and cryonics are perhaps the best examples of things that seem to have little to do with rationality but are widely accepted as norms here.

If you think rationality leads you to poly or to cryo, I'm fine with that, but I'm not fine with it becoming such a point of fixation or an element of group identity.

For that matter, I think atheism falls into the same category. Religion is basically politics, and politics is the mind-killer, but people here love to score cheap points by criticizing religion. The fact that things like the "secular solstice" have become part of rationalist community norms and identity is indicative of serious errors IMO.

For me, one of the most appealing things about EA (as opposed to rationalist) identity is that it's not wrapped up in all this unnecessary weird stuff.
