Look. Simple utilitarianism doesn't have to be correct. It looks like a wrong idea to me. Often, when reasoning informally, people confabulate wrong formal-sounding things that loosely match their intuitions, and then declare them normative.
Is a library of copies of one book worth the same to you? Is a library of books by one author worth as much? Does variety ever truly count for nothing? There's no reason why u("AB") should equal u("A")+u("B"). People pick + because they are bad at math, or perhaps bad at knowing when they are being bad at math. edit: When you try to math-ize your morality, poor knowledge of math serves as Orwellian newspeak: it defines the way you think. It is hard to choose the correct function even if there were one, and years of practice on too-simple problems make wrong functions pop into your head.
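To make the u("AB") ≠ u("A")+u("B") point concrete, here is a minimal sketch (the function names and values are mine, purely for illustration): an additive utility can't tell a library of duplicates from a varied one, while a variety-sensitive utility can.

```python
# Hypothetical illustration: additive vs. variety-sensitive utility.
# Names and numbers are made up for the example.

def additive_utility(library, value_per_book=1.0):
    # u("AB") = u("A") + u("B"): every copy counts fully, duplicates and all.
    return value_per_book * len(library)

def variety_utility(library, value_per_book=1.0):
    # Only distinct titles count; extra copies of the same book add nothing.
    return value_per_book * len(set(library))

duplicates = ["A"] * 100                # 100 copies of one book
varied = [str(i) for i in range(100)]   # 100 distinct books

print(additive_utility(duplicates), additive_utility(varied))  # 100.0 100.0
print(variety_utility(duplicates), variety_utility(varied))    # 1.0 100.0
```

The additive function is indifferent between the two libraries; the variety-sensitive one is not. Neither is "the" right utility function, which is exactly the point.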
Ohh, I agree. I just don't think that there is a corresponding neurological distinction. (Original quote was about evolution).
Propositional logic is made of many very simple steps, though.
What is analytical thinking, but a sequence of steps of heuristics well vetted not to lead to contradictions?
Replied in PM.
If worst comes to worst, refuse to sign any papers whatsoever and you'll go to prison for a few years. Or shoot yourself in the foot by accident; that flips the burden of proof. It's called non-violent resistance. I don't think the US would allow any other form of objection (edit: besides e.g. being Amish). There are 2 types of conscription. Total-war conscription to win an important war where you have a lot to lose; that one would go nuclear within the first hour. And the majority enslaving a minority, the only type of conscription possible in the US.
No one here ever felt distraught over religion? Not even a little? :)
The difference in values is a little overstated, I think. Practically, there's little difference between what people say they'd do in the Milgram experiment, but a huge difference between what they actually do.
A crazy idea reflects badly on the ideology that spawned the crazy idea.
The 1st link is about ambiguity aversion.
Morality is commonly taken to describe what one will actually do when trading off private gains against other people's losses. See this as an example of moral judgement. Suppose Roberts is smarter. He will quickly see that he can donate 10% to charity, and it'll take longer for him to reason about the value of the cash that was not given to him (reasoning that may stop him from pressing the button), so there will be a transient during which he pushes the button, unless he somehow suppresses actions during transients. It's an open-ended problem 'unlike logic' because consequences are difficult to evaluate.
edit: been in a hurry.