I don't believe in the existence of morals, which is to say there is no "right" or "wrong" in the universe. However, I still take actions that most people would rate as "moral". The reasons I do this are found in my brain architecture, and they are not simple. Also, I don't care about utilitarianism. One could probably find some extremely complex utility function that describes my actions, but that would make everybody on Earth a utilitarian; I don't consciously make utility calculations. On the other hand, if morality is defined as "the way people make decisions", then of course everybody is moral and morality exists.

I believe that "nothing is right or wrong", but that doesn't affect my choices much. There is nothing inconsistent about that.

Roko, morals are, in the end, arbitrary, and there is no "correct" moral code for the AI to choose. The AI can, however, be programmed to generalize a moral code from all humans.

You can have real X-Men; check out the Discovery special about "real superhumans". There was one man who could withstand cold so well that doctors thought it shouldn't be possible. A single mutation sometimes does create significant changes (and, in his case, advantages).

If you believe that p-zombies are logically impossible, you're claiming that when one runs an atom-level simulation, and those atoms happen to form a human brain, a pathway to the consciousness-stuff is created; and not only that, but that the consciousness-stuff has a precise, causal effect on your atom simulation; and not only that, but that the effect amazingly changes the thought process using a protocol that evolution just happened to choose. That is a pretty remarkable claim, to me.

"3. Intuitively, it sure seems like my inward awareness is causing my internal narrative to say certain things."

Intuitively, maybe, but under epiphenomenalism you only have conscious experience of the "inward awareness"; in reality it is a physical function that creates the experience, so the experience itself does not cause anything.

"4. The word "consciousness", if it has any meaning at all, refers to that-which-is or that-which-causes or that-which-makes-me-think-I-have inward awareness." Your not using the correct definition for the zombie argument, therefore your point is invalid. Consciousness means in this context the sum of sensory experience.

"In worlds where it is impossible to measure a difference in principle, it shouldn't have any impact on what's the correct action to take, for any sane utility function."

Wrong, since it may be possible to estimate the probability of being in a p-zombie world, or, more generally, the probability that such a difference exists.

"However, this will necessarily mean that they're shown to refer to things that are actually measurable."

Things that cannot be measured can still be very important, especially with regard to ethics. One might claim, for example, that it is OK to torture philosophical zombies, since after all they aren't "really" experiencing any pain. If it could be shown that I'm the only conscious person in this world and everybody else is a p-zombie, then I could morally kill and torture people for my own pleasure.

"Actually, currently my brain isn't particularly interested in the concepts some people call "qualia"; it certainly doesn't assume it has them. If you got the idea that it did because of discussions it participated in in the past, please update your cache: This doesn't hold for my present-brain."

Does your brain assume, or think, that it creates sensory experiences (what people often call consciousness)?

"We've already found the flaw."

What exactly is the logical flaw you've found? The zombie argument implies, among other things, that there can be no test that will tell whether a person is really conscious or just a zombie. You might "know" that you're conscious yourself, but there can be no rational argument that proves this.

"What real reasons? I don't see any." If Zombie Worlds are possible, we might be living in it and therefore there can be no argument that proves otherwise. Your brain assumes that you have qualia, but I make no such assumption.
