Let's say we were to create a neuromorphic AI (mainly talking about brain emulation) whose goal is to find out everything that is true in the universe, and that has fun and feels good while doing so.
Some may despair that it would lack humanity, but everything good about humanity is good in itself, not because we humans happen to have it.
And so, in time, it would find and adopt everything that we (or it) consider to be of value, either by copying it directly from us or by arriving at it through insight over its eternal life, while discarding every unwanted trait imposed on us by our biological evolution.
And so...
I guess it comes down to what one thinks the goal of all life is.
I would say that seeking all such "values" would be part of it, and you don't need billions of different creatures to do that when one optimal being could do it more efficiently.