Comments

If it is raining, we will move to the Buvette of the Orangerie.

Self-alignment only works sustainably when both you and your environment want what is Good. Often, people seem to want things that aren't Good. I think this is important to notice because you differ from many other people here: what you want is also what's Good, and your environment incentivizes, or at least tolerates, this.

Aligning your wants with what the world needs is not self-alignment and seems like another important step to figure out. 

From my limited view, it looks like getting what you want will eventually lead most people to want Good things? But it doesn't seem obvious at all.

I would repost this as a top-level comment so it can gain more visibility. Thanks for building this!

I'm not sure I understand the "reality has joints that can be cleaved" thing, but it sounds like a possibly valuable framing.

Do you mean that reality can be broken down into different gears and one can find out how the gears interact?

Would an illustration of this be looking at how humans, on a biological level, could be described as driven in a "selfish-gene" style and, possibly, on a mental level, modeled as multi-agent minds?