"Ideals are like stars." All Schurz is doing is defining, yet again, desire. Desire is metonymic by definition, and I think it is one of the most important evolutionary traits of the human mind. This permanent dissatisfaction of the mind must originally have proven very useful in going after more game than we could consume, and it is still useful in scientific pursuits. How would an AI find its ideals? What would be the origin of the desire that would make an AI spend energy on finding something as apparently useless as general knowledge? If AI evolves, it will be focused on energy problems (how to think more, and faster, with lower energy consumption), and it may find interesting answers, but only in that practical area. If you don't solve the problem of AI desire (and this is the path to solving friendliness), AI will evolve very fast in a single direction and will quickly reach the limits of its own "evolutionary destiny". I still think the way to go is to replace biological mass in humans with replaceable material, not the other way around.