Not sure if this is the best place to ask this question. If not, please suggest a better one.
I came across a great quote about evolution a while back. I thought I grabbed it, but now can't find it anywhere (including searching LW and the Web).
I seem to recall it was on LW and was by Eliezer, but I'm not sure.
It went something like: "Bodies (organisms) are not the meaningful units of evolution. They come and go, like evanescent wisps of smoke. What endures, what persists, are alleles." And I think there was some mention of competition between two animals... the image of two mountain goats butting heads sticks in my mind, but I may be conflating it with something else.
Does this ring a bell with anyone? Thank you!
Ernie
Sorry, I'm not too familiar with the community, so I'm not sure whether this question is about AI alignment in particular or about risks more broadly. Assuming the latter: I think the most overlooked problem is politics. I worry about rich and powerful sociopaths being able to do evil without consequences, or even without being detected (except by the victims, of course). We probably can't do much about the existence of sociopaths themselves, but I think we can and should think about the best ways to increase transparency and reduce inequality. For what it's worth, I'm a negative utilitarian.