Here is a paper in PLOS Biology reconsidering the lessons of some classic psychology experiments often invoked here (via).

Contesting the “Nature” Of Conformity: What Milgram and Zimbardo's Studies Really Show

To me, the crux of the paper is this statement in the abstract:

This suggests that individuals' willingness to follow authorities is conditional on identification with the authority in question and an associated belief that the authority is right.

Plus this detail from the Milgram experiment:

Ultimately, they tend to go along with the Experimenter if he justifies their actions in terms of the scientific benefits of the study (as he does with the prod “The experiment requires that you continue”) [39]. But if he gives them a direct order (“You have no other choice, you must go on”) participants typically refuse. Once again, received wisdom proves questionable. The Milgram studies seem to be less about people blindly conforming to orders than about getting people to believe in the importance of what they are doing [40].

7 comments

Ultimately, they tend to go along with the Experimenter if he justifies their actions in terms of the scientific benefits of the study (as he does with the prod “The experiment requires that you continue”) [39]. But if he gives them a direct order (“You have no other choice, you must go on”) participants typically refuse.

This seems very suspicious, because I remember offhand that the former is an early prompt and the latter is the final prompt, given only after the other prompts have been exhausted.

I would strongly disagree.

My interpretation of these experiments is that they make a lot of sense if you consider morality from a system-1 and system-2 perspective. If we actually sit down and think about it, humans tend to have somewhat convergent answers to moral dilemmas, answers which tend toward the utilitarian (in this case: don't shock the man). That's a system-2 response.

However, in the heat of the moment, faced with a novel situation, we resort to fast, cheap system-1 heuristics for our moral intuitions. Some of those heuristics are 'what is everyone else doing?', 'what is the authority figure telling us to do?', and 'what have I done in similar situations in the past?' Normally, these work pretty well. However, in certain corner cases, they produce behavior that system-2 would never condone: lynch mobs, authoritarian cruelty, and the unfortunate results of Milgram's experiments.

People didn't decide, rationally, that it was morally right to torture a man to death for the sake of an experiment they knew nothing about and were paid a few dollars to participate in, and this paper is silly to suggest otherwise. They did it because they were under stress, and the strongest influence in their heads was the ancestral heuristic of 'keep your head down, do what you're told, they must know what they're doing.'

There are a number of other possible explanations for that detail. For example:

"The experiment requires that you continue" invokes the larger apparatus of Science. It gives the impression that something much larger than you is at foot, and that ALL of it is expecting you to shut up and do what you're told.

"You have no other choice, you must go on" - that rankles. Of course there's a choice. We pattern match it to a moral choice, and system 2 comes in and makes the right call.

The best lesson you can take from these experiments, depressing as they are, is that when you feel rushed, with life and death at stake and no time to breathe, the most valuable thing you can do is stop, sit down on the floor, clear your head, and take a moment to really think about what you're doing.

I feel like a more informative first sentence for this comment might be: "While I agree that there is a distinction in circumstances to be made, which points to the Milgram experiment having poor methodology and questionable results, I disagree with the new interpretation of the circumstances given."

In my mind this is almost agreement, but with a bit of a difference at the end.

zaph

I believe the article the OP points to is actually more about how system 2 is being engaged in these situations, and is therefore not "blind obedience", i.e., a simple heuristic being engaged. From the conclusion:

On the other hand, it ignores the evidence that those who do heed authority in doing evil do so knowingly not blindly, actively not passively, creatively not automatically. They do so out of belief not by nature, out of choice not by necessity. In short, they should be seen—and judged—as engaged followers not as blind conformists.

Equally, what is shocking about Milgram's experiments is that rather than being distressed by their actions, participants could be led to construe them as “service” in the cause of “goodness.”

At root, the fundamental point is that tyranny does not flourish because perpetrators are helpless and ignorant of their actions. It flourishes because they actively identify with those who promote vicious acts as virtuous [49]. It is this conviction that steels participants to do their dirty work and that makes them work energetically and creatively to ensure its success. Moreover, this work is something for which they actively wish to be held accountable—so long as it secures the approbation of those in power.

To put words in their mouths, I believe they are arguing that people's system 2 is overriding the "don't hurt people" heuristic of system 1, as opposed to system 2 analysis being overridden by a simple obedience heuristic.

There are a number of other possible explanations for that detail. For example:

So, design a better test. What prompt would activate the System 1 you describe as 'keep your head down, do what you're told, they must know what they're doing' without activating System 2 the way the "You have no other choice" statement apparently did?

I think nigerweiss is asserting that "The experiment requires that you continue" activates System 1 but not System 2.

The claim made by the OP is "if people believe in what they're doing, they will hurt people"; the claim made by nigerweiss is "if people use system 1 thinking, they will hurt people." To differentiate between them, we need a statement intended to make people use system 1 thinking without relying on their believing in what they are doing.

It's not clear to me that nigerweiss's division is more precise than the OP's division, or that it has significant predictive accuracy. I would have expected "you have no other choice" to evoke 'keep your head down, do what you're told, they must know what they're doing'; that is, the very system 1 thinking that nigerweiss claims would lead people to push the button, when in fact it led to fewer people pushing the button. Why should it be a status attack that awakens system 2, except that we already know which outcome we need to predict?