The Personal Implications of AGI Realism
Superintelligence Is On The Horizon

It’s widely accepted that powerful general AI, and soon after, superintelligence, may eventually be created.[1] There’s no fundamental law keeping humanity at the top of the intelligence hierarchy. While there are physical limits to intelligence, we can only speculate about where they lie. It’s reasonable...
I agree that there are significant uncertainties about the specific consequences of AI accelerating bio/medicine R&D, but even without buying into Amodei's specific speculations on life extension, I think we would still see wildly transformative breakthroughs and unforeseen consequences. I also agree it makes sense to be wary of simply extrapolating from past increases in life expectancy.
Time will tell!