Bogdan_Butnaru2
Failed Utopia #4-2
Bogdan_Butnaru2 · 16y · 150

That's not the message Eliezer tries to convey, Russell.

If I understood it correctly, it's more like: "The singularity is sure to come, and transhumanists should try very hard to guide it well, lest Nature just step on them and everyone else. Oh, and by the way, it's harder than it looks. And there's no help."

Failed Utopia #4-2
Bogdan_Butnaru2 · 16y · 80

I was just thinking: a rather perverse twist in the story would be if the genie actually could have been stopped and/or improved. That is, its programming allowed it to be reprogrammed (and so to stop being evil, presumably leading to better results), but, due to the (possibly complex) interaction between its 107 rules, it had no actual motivation to reveal that (or to teach someone the necessary theory) before 90% of people decided to kill it.
