I'd like recommendations for articles dealing with slow and hard takeoff scenarios. I already found Yudkowsky's post "Hard Takeoff", I know 'Superintelligence' has a section on it, and I think the Yudkowsky/Hanson debate mostly dealt with it.

Is there anything else?


Besides Superintelligence, the latest "major" publication on the subject is Yudkowsky's "Intelligence Explosion Microeconomics". There are also a few articles related to the topic at AI Impacts.

Both unknown to me, thanks :)

I found Intelligence Explosion Microeconomics less helpful for thinking about this than some older MIRI papers: