Philosophy PhD student. Interested in ethics, metaethics, AI, EA, disagreement/erisology. Former username Ikaxas
Am I the only one who, upon reading the title, pictured 5 people sitting behind OP all at the same time?
The group version of this already exists, in a couple of different versions:
My model of gears to ascension, based on their first two posts, is that they're not complaining about the length for their own sake, but for the sake of people they might link this post to, who would then bounce off because it looks too long. A basics post shouldn't have the property that someone with zero context is likely to bounce off it, and I think gears to ascension is saying that the nominal length (reflected in the "43 minutes") is likely to make readers who get linked to this post bounce off it, even though the length for practical purposes is much shorter.
There seems to be a conflict between putting “self-displays on social media” in the ritual box, and putting “all social signalling” outside it. Surely the former is a subset of the latter.
My understanding was that the point was this: not all social signalling is ritual; some of it is, some of it isn't. Someone might think OP is claiming that all social signalling is ritual, and OP wanted to dispel that impression. That is consistent with some social signalling counting as ritual.
I think the idea is to be able to transform this:
- item 1
- item 2
- item 3

into this:

- item 3
- item 1
- item 2
I.e. it would treat bulleted lists like trees, and allow you to move entire sub-branches of trees around as single units.
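As a minimal sketch of the idea in Python (the node shape and the `move_item` helper are hypothetical, not taken from any particular outliner): each bullet is a node carrying its text plus its sub-bullets, so moving one node moves its entire branch.

```python
# Hypothetical tree model of a bulleted list: each node is
# (text, children), where children are the indented sub-bullets.
# Moving a node automatically carries its whole subtree with it.

def move_item(items, src, dst):
    """Move items[src] (together with its entire subtree) to index dst."""
    node = items.pop(src)
    items.insert(dst, node)
    return items

outline = [
    ("item 1", []),
    ("item 2", [("item 2a", []), ("item 2b", [])]),
    ("item 3", []),
]

# Move "item 3" to the front; "item 2" keeps its sub-items attached.
move_item(outline, 2, 0)
```

The design point is simply that reordering happens at the node level, not the line level, so sub-bullets can never be orphaned from their parent.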
This isn't necessarily a criticism, but "exploration & recombination" and "tetrising" seem in tension with each other. E&R is all about allowing yourself to explore broadly, not limiting yourself to spending your time only on the narrow thing you're "trying to work on." Tetrising, on the other hand, is precisely about spending your time only on that narrow thing.
As I said, this isn't a criticism: the post offers a grab bag of techniques that might work at different times for different people, not a single unified strategy. Still, the tension is worth pointing out.
Putting RamblinDash's point another way: when Eliezer says "unlimited retries", he's not talking about a Groundhog Day style reset. He's just talking about the mundane thing where, when you're trying to fix a car engine or something, you try one fix, and if it doesn't start, you try another fix, and if it still doesn't start, you try another, and so on. So the scenario Eliezer is imagining is this: we have 50 years. Year 1, we build an AI, and it kills 1 million people. We shut it off. Year 2, we fix the AI. We turn it back on, and it kills another million people. We shut it off, fix it, turn it back on, and so on, until it stops killing people when we turn it on. Eliezer is saying that if we had 50 years to do that, we could align an AI. The problem is that in reality, the first time we turn it on, it doesn't kill 1 million people; it kills everyone. We only get one try.