AI as a powerful meme, via CGP Grey
In episode 158 of the Cortex podcast, CGP Grey gives his high-level reason for being worried about AI. My one-line summary: AI should not be compared to nuclear weapons but to biological weapons or memes, which evolve under the implicit evolutionary pressures that exist, leading to AIs that are good at surviving and replicating. This perspective is likely already familiar to many in the community, but I had not heard it before.

Interestingly, there have actually been experiments in which random strings of code were placed in an environment where they interact, and self-replicating code emerged. See the Cognitive Revolution podcast episode 'Computational Life: How Self-Replicators Arise from Randomness', with Google researchers Ettore Randazzo and Luca Versari.

I quote the relevant part of the podcast below, but I recommend listening, because the emotion and delivery are impactful. It is from 1:22:00 onwards.

> To be explicit and not beat around the bush, when I try to think, "Oh, what is beyond this barrier, beyond which it might be impossible to predict?" it's like, well, if I'm just in Vegas and placing odds on this roulette wheel, almost all of those outcomes are extraordinarily bad for the human species. There are potentially paths where it goes well, but most are extremely bad for a whole bunch of reasons.
>
> I think of it like this: people who are concerned like me often analogize AI to something like building nuclear weapons. It's like, "Ah, we're building a thing that could be really dangerous." But I just don't think that's the correct comparison, because a nuclear weapon is a tool. It's a tool like a hammer. It's a very bad hammer, but it is fundamentally mechanical in a particular way.
>
> But the real difference, where I disagree with people and where other people disagree with me, is that I think a much more correct way to think about AI is to compare it to biological weaponry. You're building a thing able to act in the world differently than
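To make the experiment concrete, here is a toy sketch of the kind of setup described (my own minimal BFF-style variant, not the researchers' actual code; the instruction set, soup size, program length, and step limits are all my assumptions): byte strings serve as both code and data, and pairs of strings are repeatedly concatenated, executed as self-modifying programs, and split back into the soup.

```python
import random

def execute(tape, max_steps=4096):
    """Run the tape as self-modifying code. Assumed instruction set:
    < > move head0, { } move head1, + - increment/decrement the byte at
    head0, . copies head0 -> head1, , copies head1 -> head0, and [ ]
    loop while the byte at head0 is nonzero. All other bytes are no-ops."""
    pc = head0 = head1 = 0
    n = len(tape)
    for _ in range(max_steps):
        if not (0 <= pc < n and 0 <= head0 < n and 0 <= head1 < n):
            break  # program counter or a head fell off the tape: halt
        c = tape[pc]
        if c == ord('<'): head0 -= 1
        elif c == ord('>'): head0 += 1
        elif c == ord('{'): head1 -= 1
        elif c == ord('}'): head1 += 1
        elif c == ord('+'): tape[head0] = (tape[head0] + 1) % 256
        elif c == ord('-'): tape[head0] = (tape[head0] - 1) % 256
        elif c == ord('.'): tape[head1] = tape[head0]
        elif c == ord(','): tape[head0] = tape[head1]
        elif c == ord('[') and tape[head0] == 0:
            depth = 1
            while depth and pc < n - 1:  # jump past the matching ]
                pc += 1
                if tape[pc] == ord('['): depth += 1
                elif tape[pc] == ord(']'): depth -= 1
        elif c == ord(']') and tape[head0] != 0:
            depth = 1
            while depth and pc > 0:  # jump back to the matching [
                pc -= 1
                if tape[pc] == ord(']'): depth += 1
                elif tape[pc] == ord('['): depth -= 1
        pc += 1
    return tape

def soup_step(soup, rng, prog_len=64):
    """One interaction: pick two random programs, concatenate them,
    execute the result (which may overwrite either half), split back."""
    i, j = rng.sample(range(len(soup)), 2)
    combined = bytearray(soup[i]) + bytearray(soup[j])
    execute(combined)
    soup[i], soup[j] = combined[:prog_len], combined[prog_len:]

# A small random soup. Real experiments use far larger soups and far more
# interactions before self-replicators emerge and take over.
rng = random.Random(0)
soup = [bytearray(rng.randbytes(64)) for _ in range(128)]
for _ in range(200):
    soup_step(soup, rng)
```

The key design point, as I understand the podcast, is that there is no explicit fitness function: programs that happen to copy themselves into other tapes simply end up occupying more of the soup.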
"Today, we're [Goodfire] excited to announce a $150 million Series B funding round at a $1.25 billion valuation." https://www.goodfire.ai/blog/our-series-b
Is my instinct correct that this is a big deal? How does $150 million compare to the funding behind all other interp research combined?