The intelligence, reflectivity, and 'Lawful'ness of a mind are mostly independent of its capacity to experience emotion. A mind can be very unintelligent, lack anything like a "utility function" or critical OODA-loop components, possess little or no self-awareness, and bear no resemblance at all to an abstract utility maximizer, while still experiencing a wide variety of deep and complex feelings.

There is presumably some necessary core that anything must possess to be a "mind" or to feel feelings, and it may be possible to say with confidence that, e.g., amoebas don't have qualitative experience. What is not currently obvious is that anyone yet has enough information and conceptual understanding to reason conclusively about what those properties are and which minds have them, in practice, without any ability to check their work. People whose important moral conclusions are conditional on whether {ems, animals, babies} experience suffering should keep in mind the possibility that their galaxy-brained reasoning, whichever way it leads, is wrong.


Your opinion on the validity of Orthogonality Thesis #2, or on the larger question of which minds have the machinery necessary for subjective experience, is logically independent of your opinion on whether those minds have moral value. Capacity-to-suffer is not some universal barometer of moral relevance, even for humans. Just as people can be sadistic or altruistic toward others, they can also have instincts toward others that are entirely sideways of these considerations. Humans have been known to keep encrypted backups of video game save files to preserve the lives of fictional characters they are fully aware don't really "exist".

Thus there is not necessarily a contradiction in assigning a third-trimester fetus moral weight similar to a born human's while believing it doesn't have the ability to feel things. Conversely, many meat-eaters will cheerfully explain to vegetarians their entirely consistent position: they do in fact believe animals can suffer, and suffer extensively throughout factory farms; they even agree with the Holocaust analogies; they simply do not care about nonhuman experiences of suffering.


The range of emotions a particular mind experiences as part of a typical natural life, or the range it can experience with the aid of current medical/narcotic technology, is an incidental fact of evolution. It does not reflect the range of all possible emotions minds of that shape can experience, and may be a tiny subset of it. There may be some underlying common component to "bad" experiences like despair, sadness, and pain, but each is still a particular flavor optimized by evolution to direct us toward spreading our genes in the appropriate circumstances.

The intensity of emotions a particular mind experiences as part of a typical natural life, or the intensity it can experience with the aid of current medical/narcotic technology, is likewise an incidental fact of evolution. These emotions come in distinct, finite intensities, but the scale they sit on may be conceptually unbounded. The greatest intensities people reach in the extremes of life might represent the maximum evolution earnestly optimized for, but probably do not.

Animals were built by evolution under different circumstances, and ended up with different mesa-objectives. So if you believe animals have the capacity to experience emotions, then consider: just as humans show pretty extreme within-species variation in the degree and kind of their subjective experiences, other animals may possess far more foreign inner scoring mechanics, qualitatively and quantitatively different from anything described in dictionaries built by humans for human communication.

Knowing this is compatible with any particular opinion about the effectiveness of EA efforts to improve animal welfare, but it's more compatible with some opinions than others.


Having posited orthogonality theses #2, #3, and #4 on the current date does not imply any particular opinion or emotional reaction on the author's part toward any other related musings that have recently occurred.
