As my timelines have been shortening, I've been rethinking my priorities, as have many of my colleagues. It occurs to us that there are probably general considerations that should cause us to weight towards short-timelines plans or long-timelines plans, besides, of course, the probability of short versus long timelines. For example, if timelines are short, then maybe AI safety is more neglected, and therefore higher-EV for me to work on, so maybe I should be systematically more inclined to act as if timelines are short.
We are at this point very unsure what the most important considerations are, and how they balance. So I'm polling the hive mind!
Thanks. While it's true that shorter timescales mean less ability to shift the system, what I'm talking about is shorter timelines, in which we have plenty of ability to shift the system, because all the important stuff is happening in the next few years.
Roughly, I was thinking that conditional on long timelines, the thing to do is acquire resources (especially knowledge, as you say), and conditional on short timelines, the thing to do is... well, also a lot of that, but with a good deal more direct action of various sorts as well. And so I'...