Metal

Comments

"Dangers of AI and the End of Human Civilization" Yudkowsky on Lex Fridman
Metal · 2y · 10

Also new here. One thing I did not understand about the "intelligence in a box created by less intelligent beings" analogy is why the "intelligence in a box" would be impatient with the pace of the lesser beings. Impatience and urgency seem to stem from an intelligence having finite time. As code with no apparent limit on its existence, why would it care how fast things move?