The probable first mention is by Yudkowsky on the extropians mailing list:

I wouldn't be as disturbed if I thought the class of hostile AIs I was talking about would have any of those qualities except for pure computational intelligence devoted to manufacturing an infinite number of paperclips. It turns out that the fact that this seems extremely "stupid" to us relies on our full moral architectures.