All of Jagan's Comments + Replies

Circular Altruism

Well, I could qualify my example, saying surveillance ensures only people who provide zero utility are allowed to be murdered, but as I said, the article makes my point much better, even if it doesn't mean to. A single speck of dust, even an annoying and slightly painful one, in the eyes of X people NEVER adds up to 50 years of torture for an individual. It doesn't matter how large you make X, 7 billion, a googolplex, or 13^^^^^^^^41. It's irrelevant.

-1MugaSofer9yIf some people's lives are worth zero utility, then by definition they are worthless. That's what "zero utility" means. Did you mean something else? Because it seems to me that nobody is worthless to me in real life, and that's why your example doesn't work. And you judge it irrelevant based on what? Scope insensitivity is a known bias in humans, so "instinct" is reliably going to go wrong in this case without mindhacking. Two murders are worse than one murder; two groups of people with dust specks in their eyes are worse than one such group; at what point does this stop being true?
1TheOtherDave9yImagine that you find yourself visiting a hypothetical culture that acknowledges two importantly distinct classes of people: masters and slaves. By cultural convention, slaves are understood to have effectively no moral weight; causing their suffering, death, injury etc. is simply a property crime, analogous to vandalism. Slaves and masters are distinguished solely by a visible hereditable trait that you don't consider in any way relevant to their moral weight as people. Shortly after your arrival, a thousand slaves are rounded up and killed. You, as a properly emotional moral thinker, presumably express your dismay at this, and the natives explain that you needn't worry; it was just a market correction, and the economics of the situation are such that the masters are better off now. You explain in turn that your dismay is not economic in nature; it's because those slaves have moral weight. They look at you, puzzled. How might you go about explaining to them that they're wrong, and slaves really do have moral weight? Some time later, you return home, and find yourself entertaining a visitor from another realm who is horrified by the discovery that a million old automobiles have recently been destroyed. You explain that it's OK, the materials are being recycled to make better products, and he explains in turn that his dismay is because automobiles have moral weight. How might you go about explaining to him that he's wrong, and cars really don't have moral weight?
-2MugaSofer9yI don't understand this. Sure, small amounts often have more emotional force ("near mode") than large ones ("far mode"). But that doesn't make it right to let your bias hurt people. OTOH, you said "It doesn't truly become about morality until it's personal", so maybe you mean something unusual when you say "morality". Humans are often unable to conform perfectly to their desires, even when they know what the best choice is. This is known as "akrasia". For example, addicts often want to stop taking drugs. If you couldn't bring yourself to make that sacrifice, that doesn't mean you shouldn't, or that you believe you shouldn't. (Not saying you think it does, just noting for the record.)
-1Peterdjones9yYou're overlooking the disutility to the murdered man. Actually, what you describe is Prudent Predation [http://www.philosophyinaction.com/blog/?p=3295], a famous objection to egoism, not utilitarianism.
7CarlShulman9yErr...effectively legalizing the murder of large classes of the population would tend to increase the murder rate, costing far more lives in aggregate, to say nothing of the dire consequences for social order and cooperation. You should use an example where the repellent recommendation actually increases rather than decreases happiness/welfare.