Even with a high degree of moral uncertainty and a wide range of possible moral theories, there are still certain actions that seem highly valuable under any theory. Bostrom argues that existential risk reduction is among them, showing that it is not only the most important task given most versions of consequentialism, but is also highly recommended by many other widely accepted moral theories3.
Moral uncertainty (or normative uncertainty) is uncertainty about how to act given the diversity of moral doctrines. For example, suppose that we knew for certain that a new technology would enable more humans to live on another planet, but with slightly less well-being than on Earth1. An average utilitarian would consider these consequences bad, while a total utilitarian would endorse such a technology. If we are uncertain about which of these two theories is right, what should we do?
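The disagreement in the example above is a matter of simple arithmetic, which a toy calculation can make concrete. The population and well-being numbers below are purely hypothetical illustrations, not figures from the literature:

```python
# Hypothetical numbers: Earth has 10 billion people at well-being 10;
# the new technology adds a colony of 5 billion at well-being 8.
earth_pop, earth_wb = 10e9, 10.0
colony_pop, colony_wb = 5e9, 8.0

# Without the technology
total_before = earth_pop * earth_wb
avg_before = total_before / earth_pop

# With the technology
total_after = earth_pop * earth_wb + colony_pop * colony_wb
avg_after = total_after / (earth_pop + colony_pop)

# Total utility rises (the total utilitarian endorses the technology),
# while average utility falls (the average utilitarian objects).
print(total_after > total_before)  # True
print(avg_after < avg_before)      # True
```

The same empirical facts thus yield opposite verdicts depending on the theory, which is exactly the situation moral uncertainty describes.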
Moral uncertainty adds a layer of uncertainty above the more usual uncertainty about what to do given incomplete information, since it also concerns which moral theory is right. Even with complete information about the world, this kind of uncertainty would still remain1. At one level, one can have doubts about how to act because not all the relevant empirical information is available; for example, choosing whether or not to implement a new technology (e.g., AGI, Biological Cognitive Enhancement, Mind Uploading) without fully knowing its consequences and nature. But even if we came to know each and every consequence of a new technology, we would still need to know which ethical perspective is right for analyzing those consequences.
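One common proposal for acting under this second layer of uncertainty is to weight each theory's verdict by one's credence in it and pick the action with the highest expected value across theories (often called maximizing expected choiceworthiness). A minimal sketch, with hypothetical credences and choiceworthiness scores chosen purely for illustration (intertheoretic comparisons of value are themselves contested):

```python
# Hypothetical credences in two moral theories.
credences = {"total_util": 0.6, "average_util": 0.4}

# Hypothetical choiceworthiness each theory assigns to each action,
# echoing the planet example: total utilitarianism favors deploying,
# average utilitarianism opposes it.
choiceworthiness = {
    "deploy_technology":   {"total_util":  1.0, "average_util": -1.0},
    "withhold_technology": {"total_util":  0.0, "average_util":  0.0},
}

def expected_choiceworthiness(action):
    # Credence-weighted sum of each theory's verdict on the action.
    return sum(credences[t] * choiceworthiness[action][t] for t in credences)

best = max(choiceworthiness, key=expected_choiceworthiness)
```

With these illustrative numbers, deploying scores 0.6·1.0 + 0.4·(−1.0) = 0.2 against 0 for withholding, so the credence-weighted agent deploys; shifting enough credence toward average utilitarianism reverses the verdict.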