I think some of the assumptions here have led you to false conclusions. For one, you seem to assume that because humans share some values, all humans have an identical value system. This is just plain wrong: each human has their own unique value "signature," more or less like a fingerprint. If you weight even one thing more heavily than a person who is otherwise identical to you, your values are different. That said, does your argument still hold once this heterogeneity, minor as it may be in the grand scheme of things, is added to human value systems? I don't think so. There is plenty of reason to think that human values will be much more robust precisely because of this person-to-person differential.

Furthermore, I think the premise of this article comes back to your claim that boredom is an absolute value. After you claim this, you go on to explain how it evolved over time (which is correct), yet you still hold that it is absolute. Can't you see the contradiction here? How can something be absolute if it evolved in humans over time to enhance survival?
Further, who's to say that with the advent of ASI this couldn't be "cured," so to speak? An ASI should be able to detect the cause of human boredom and could thus genetically reprogram us to remove it. How can something that is structural, arising from evolutionary and environmental components of human development, be considered a "human value"? Being a value implies that it somehow transcends biological constraints, the way traditions like religion do. You are painting boredom as a value when it is little more than an instinct. One can argue that something constitutes a value even though it causes a biologically structural change; I can concede that. But how can you insist that the universe will have no "point" if these "values" get adjusted to accommodate the existence of an ASI?

Value is completely subjective to the organism that holds it. The transhuman will have different values, and the universe will not necessarily contain less value for him/her/it at that time. In fact, it will likely be much richer to them.

Lastly: "A paperclip maximizer just chooses whichever action leads to the greatest number of paperclips." I counter with: "A biological system just chooses (through natural selection) whichever action leads to the greatest number of biological systems." How did this argument help you, exactly? Humans are subject to the same kind of subjective valuation that a machine ASI would be. The only way to pretend that human value isn't just another component of this historical process is to attribute some transcendent element to human biology (i.e., a soul or something). I think this is a methodological flaw in your argument.