I was talking to someone the other day about the ways I've noticed the [Berkeley] rationalist community changing. One of the main ways was that group houses seemed to be disappearing. People were getting older, moving away, or just moving into their own houses to have kids. It then occurred to me that this doesn't seem to be happening on the EA side of the community. Thinking about it more, it seems to me that EA has quite a strong funnel in the form of student groups. I semi-regularly hear about events, companies, projects, or just impressive people coming out of EA student groups. Meanwhile, I'm not even aware of a rationalist student group (although I'm sure there are some).
When I think about where rationalists came from, my answer is 1) EY writing the original Sequences, and 2) EY writing HPMOR. It feels like those things happened, tons of people joined, and then they stopped happening and people stopped joining. Then people got older, and now we have a population-pyramid problem.
I think this is something of a problem for the mission of preventing AI x-risk. It is of course great to have lots of EAs around, but I think the people the rationalist community would differentially appeal to would provide a lot of value that EA-leaning people would be much less likely to provide (focus on AI, obsessive investigation into confusing yet important subjects, etc.).
Do others agree with the pattern? Do you also see it as a problem? Any suggestions for what we could do about it? Why aren't there many rationalist student groups?