time to crash this party
unfortunate problems:
this list of topics is legitimately too big.
nobody can cover all of it in a reasonable amount of time.

possible solutions:
the general pattern of 'skip (lazy-evaluate) prerequisites and audit classes' is good, but the best thing you can do if you want to keep up with ai research is to directly court mentors who are well-established in the field.
and to trust those mentors to give you the personal guidance you individually need most in order to make the most rapid progress you can.

this pattern is extremely valuable and more-or-less obsoletes individual recommendations in terms of literature or conceptual categories.  i'm a lakoffian, and will of course tell people to read lakoff, & nietzsche, & pubmed trawls on specific topics in neurochemistry.  but that's because "ai" or "alignment" is more like 'intelligence studies' than any clearly delimited topic area, and the central problems can be reached from linguistics, philosophy, biology, or even the computational challenges of designing videogame cheats and bots in arms races against studio developers' captchas and primitive turing tests.

closing thoughts:
this post is, more or less, uncannily similar to a recommendation that readers roll their own doctoral program.
and that's not a bad thing!
but it's good to keep in mind that tackling a research problem as broad, significant, and challenging as this is best done with peers, advisors, & sources of external feedback to help keep the questant pointed towards useful self-development instead of futile toil.

Everyone has fascinations and pleasures, but only a limited time alive to indulge in them.
If you know that you can expect to grow in potential, capability, and power in almost any area of life by setting your attention to it, you are left with the problem of "what do I intend to make of myself?".
This is a very serious question with very serious implications😅

If you're a person who wanders, uncertain what project deserves the most attention and resources next, maybe you can use "gatekeeping" as a tool to sharpen your mind.
If you feel lost, alone, and tribeless, maybe "gatekeeping" is a silly or counterproductive diversion for you, and ought to be cast away unused.

A priori, it is impossible to decide which advice to give which person, no matter how perfectly generalizable any piece of advice might seem to be.
If you ask yourself "if I like scifi so much, why haven't I written a decent story outline yet?", the most constructive and perhaps rational response may be "perhaps I should begin writing, today".
Questions that may lead to unpleasant or counterproductive answers in the minds of some might only bring delight and motivation to the minds of others.

In this spirit, I hope only to offer a motivational system to nascent could-be creators on the precipice of making their first contributions to a culture, who likely need different, and perhaps sterner-sounding, self-criticism than people at other stages of life.

I feel like there's a very serious risk of turning a 'broad rationalist movement' (one that feeds on PARC-adjacent extreme aspirationals and secretes 'rationalists') into a permanently capped-out minor regional cult, simply by deciding on somewhere that all avowed 'rationalists' should move.
I doubt most 'rationalists', or even most of the people likely to contribute to the literature of a rationalist movement, have yet been converted to the specific sort of tribal self-identification that would lead them to pull up roots and all go to the same place at the same time.
"Let's all leave and pick somewhere obscure" seems a lot more like a way for a movement to gracefully and deliberately coordinate its own self-annihilation than a strategy for growth.