I agree it fits well here. However, it has a very different tone from other posts on the MIRI blog, where it has also been posted.
Laziness. Though I note Stuart_Armstrong expressed the same opinion as me, offered even fewer suggestions for improvement, and got upvoted. I should also have said that I agree with all the points contained herein and that the message is an important one. That would have reduced the bite.
This article is very heavy with Yudkowsky-isms and repeats material he has posted before; it needs a good summary and editing to pare it down. I'm surprised they posted it to the MIRI blog in its current form.
Edit: As stated below, I agree with all the points of the article, and consider it an important message.
Any RSS feeds?
Eliezer thinks it's a big deal.
Even in that case, whichever actor has the most processors would have the largest "AI farm," with commensurate power projection.
That interview is indeed worrying. I'm surprised by some of the answers.
Great news! I've been waiting for this kind of thing.
More likely, he too "always thought that way," and the extreme story was written to add drama.
Thank you for replicating the experiment!