So I'm going through the sequences (in Rationality: From AI to Zombies) and I get to the bit about Asch's Conformity Experiment.

 

It's a good bit of writing, but I mostly passed by without thinking about it too much.  I'd been taught about the experiment before, and while Eliezer's point about whether or not the subjects were behaving rationally is interesting, it kind of got swallowed up by his discussion of lonely dissent, which I thought was more engaging.

 

Later, after I'd passed the section on cult attractors and gotten into the section on letting go, a thought occurred to me, something I'd never actually thought before.

 

Eliezer notes:

 

Three-quarters of the subjects in Asch's experiment gave a "conforming" answer at least once.  A third of the subjects conformed more than half the time.

 

That result is surprising.  It was surprising to me the first time I learned about the experiment, and I think it's surprising to just about everyone the first time they hear it.  The same goes for a lot of the psychology surrounding heuristics and biases, actually.  Forget the Inquisition - no one saw the Stanford Prison Experiment coming.

 

Here's the thought I had:  Why was that result so surprising to me?

 

I'm not an expert in history, but I know plenty of religious people.  I've learned about the USSR and China, about Nazi Germany and Jonestown.  I have plenty of available evidence of times when people went along with things they wouldn't have on their own.  And not all of the examples are negative: I've gone to blood drives I probably wouldn't have if my friends weren't going as well.

 

When I think about what my prediction would have been, had I been asked before learning the results what percentage of people would dissent, I would have guessed that more than 80% of subjects would consistently dissent.  If not higher.  Asch's actual figure is nearly the mirror image: if three-quarters conformed at least once, then only a quarter consistently dissented.

 

And yet that isn't what the experiment shows, and it isn't even what history shows.  For every dissenter in history, there have to be at least a few thousand conformers.  At least.  So why did I think dissent was the norm?

 

I notice that I am confused.

 

So I decide to think about it, and my brain immediately spits out: you're an American in an individualistic culture.  Hypothesis: you expect people to conform less because of the culture you live in/were raised in.  This raises the question: have there been cross-cultural studies done on Asch's Conformity Experiment?  Because if people in China conform more than people in America, then how much people conform probably has something to do with culture.

 

A little googling brings up a 1996 paper (Bond and Smith's meta-analysis in Psychological Bulletin) covering studies that repeated Asch's experiment, either in a different culture or at a later date.  Their findings:

 

The results of this review can be summarized in three parts.

First, we investigated the impact of a number of potential moderator variables, focusing just on those studies conducted in the United States where we were able to investigate their relationship with conformity, free of any potential interactions with cultural variables. Consistent with previous research, conformity was significantly higher, (a) the larger the size of the majority, (b) the greater the proportion of female respondents, (c) when the majority did not consist of out-group members, and (d) the more ambiguous the stimulus. There was a nonsignificant tendency for conformity to be higher, the more consistent the majority. There was also an unexpected interaction effect: Conformity was higher in the Asch (1952b, 1956) paradigm (as was expected), but only for studies using Asch's (1956) stimulus materials; where other stimulus materials were used (but where the task was also judging which of the three comparison lines was equal to a standard), conformity was higher in the Crutchfield (1955) paradigm. Finally, although we had expected conformity to be lower when the participant's response was not made available to the majority, this variable did not have a significant effect.

The second area of interest was on changes in the level of conformity over time. Again the main focus was on the analysis just using studies conducted in the United States because it is the changing cultural climate of Western societies which has been thought by some to relate to changes in conformity. We found a negative relationship. Levels of conformity in general had steadily declined since Asch's studies in the early 1950s. We did not find any evidence for a curvilinear trend (as, e.g., Larsen, 1982, had hypothesized), and the direction was opposite to that predicted by Lamb and Alsifaki (1980).

The third and major area of interest was in the impact of cultural values on conformity, and specifically differences in individualism-collectivism. Analyses using measures of cultural values derived from Hofstede (1980, 1983), Schwartz (1994), and Trompenaars (1993) revealed significant relationships confirming the general hypothesis that conformity would be higher in collectivist cultures than in individualist cultures. That all three sets of measures gave similar results, despite the differences in the samples and instruments used, provides strong support for the hypothesis. Moreover, the impact of the cultural variables was greater than any other, including those moderator variables such as majority size typically identified as being important factors.

Cultural values, it would seem, are significant mediators of response in group pressure experiments.

 

So, while the paper isn't definitive, it and the studies it draws from provide reasonable evidence that culture has an impact on how much people conform.

 

I thought about that for a little while, and then I realized that I hadn't actually answered my own question.

 

My confusion stems from the disparity between my prediction and reality.  I'm not wondering about the effect culture has on conformity (the territory), I'm wondering about the effect culture has on my prediction of conformity (the map).

 

In other words, do people born and raised in a culture with collectivist values (China, for example), or people who actually do conform beyond the norm (members of a flying-saucer cult, say, or people actually living in a compound), expect people to conform more than I did?  Is their map any different from mine?

 

Think about it - with all the different cult attractors, it probably never feels as though you are vastly conforming, even if you are in a cult.  The same can probably be said of any collectivist society.  Imagine growing up in the USSR - would you predict that people conform at any higher rate than someone born in 21st-century America would predict?  If you were raised in an extremely religious household, would you predict that people conform as much as they actually do?  Less?  More?

 

How many times have I agreed with a majority even when I knew they probably weren't right, and never thought of it as "conformity"?  It took a long time for my belief in god to finally die, even after I could admit that I merely believed that I believed.  And why did I keep believing (or keep trying to believe, or saying that I believed)?

 

Because it's really hard to actually dissent.  And I wasn't even lonely.

 

So why was my map that wrong?

 

What background process or motivated reasoning or...whatever caused that disparity?

 

One thing that I think contributes is that I was generalizing from fictional evidence.  Batman comes far more readily to my mind than Jonestown.  For that matter, Batman comes more readily to my mind than the millions of not-Batmans in Gotham City.  I was also probably not being moved enough by history.  For every Spartacus, there are at minimum hundreds of not-Spartacuses, no matter what the not-Spartacuses say when asked.

 

But to predict that three-quarters of subjects would conform at least once seems to require a level of pessimism beyond even that.  After all, there were no secret police in Asch's experiment; no one had emptied their bank accounts because they thought the world was ending.

 

Perhaps I'm making a mistake by putting myself in the place of the subject of the experiment.  I think I'd dissent, but I would predict that most people think that of themselves, and most people conformed at least once.  I'm also a reasonably well-educated person, but that didn't seem to help the college students in the experiment.

 

Has any research been done on people's predictions of their own and others' conformity, particularly across cultures or in groups that are "known" for their conformity (communists, the very religious, etc.)?  Do people who are genuine dissenters predict that more people will dissent than people who genuinely conform do?

 

I don't think this is a useless question.  If you're starting a business that offers a new solution to a problem where solutions already exist, are you overestimating how many people will dissent and buy your product?


I'm not wondering about the effect culture has on conformity (the territory), I'm wondering about the effect culture has on my prediction of conformity (the map). ... Is their map any different from mine?

Notice that their territory is different from yours. Just that would make you expect their map to be different.

One question that you may ask is whether the bias (the difference between the territory and the map) is a function of the territory: do people in collectivist cultures mis-estimate the prevalent conformity in a different way from people in individualist cultures?
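(To put that in symbols - my notation, not the commenter's: write \(c\) for a culture's actual conformity rate, the territory, and \(\hat{c}\) for the conformity rate its members predict, the map. The question is whether the bias

\[ b(c) = \hat{c}(c) - c \]

is roughly constant across cultures, or varies systematically with \(c\) itself - e.g., whether collectivist cultures mis-estimate by more or less than individualist ones.)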

I don't think this is a useless question.

It is not. Consider, for example, one of the questions in political studies: why do repressive regimes that present a solid and impenetrable facade tend to collapse very rapidly once the first cracks in the facade appear? One of the answers is that it's a consequence of available information: a lot of people might be very unhappy with the regime, but as long as they believe they are a powerless minority, they will hide and do nothing. The first cracks basically tell these people "you're not alone, there are many of you", and the regime's collapse follows soon thereafter.

Note the parallels to estimating the conformity of other people.

I lived in a communist regime until I was 13, and my general impression was that everything was mostly okay and everyone was mostly happy. That was partially childhood naiveté, but also partially an effect of censorship. Even when I was dissatisfied with something, I didn't attribute it specifically to the political regime, but to failures of specific people, and to the failures of bureaucracy, which is a necessary evil of a civilized society. All problems seemed like "first-world problems". (Actually, attributing all failures to individuals was explicitly encouraged by the regime. Assuming the individuals were not powerful communists, of course.)

I had absolutely no idea that there were people around me who had their family members kidnapped by the secret police and tortured, sometimes to death, for "crimes" such as having a different opinion and debating it with other similar "criminals". I didn't understand why all documents about me emphasised that I had "workers' lineage" when in fact both my parents had university education; but I assumed it was just another weird bureaucratic way of speech. (It actually meant that I was free from the hereditary sin of "bourgeois lineage" i.e. having an entrepreneur among my ancestors. People with "bourgeois lineage" were not allowed to study at universities, and couldn't get any good job as long as someone else with "workers' lineage" was available for the same job.)

After learning all this information (and realizing that it actually explained a few weird things that I previously noticed but didn't have a good explanation for), I couldn't see the situation with the same eyes anymore.

The important thing is that this knowledge is now public knowledge, which means that not only "I know" and "you know", but also "I know that you know" and "you know that I know that you know", et cetera.

If there is only one person "making problems", it is easy for the regime to get rid of them, while maintaining the façade. After midnight, a group of men in black coats with guns will knock on their door and take them away. Worst case, the family will never hear about them again, officially. Unofficially, sometimes a stranger on the street will later tell them to not expect their family member back because he's dead; and no, you won't even receive the body for burial, because fuck you, you anticommunist filth! (Also the relatives will get written in their documents "a relative of a suspected traitor", which means: forget ever studying at a university or getting a good job.)

If a group of people "makes problems" publicly, the police will quickly take them away, and no media will ever mention the story. But if a large group of people makes a public demonstration at the center of a big city, and if they refuse to surrender quickly and silently to the police, then too many people will notice that "something happened". The regime now cannot deal with the problem by usual silence. They will probably publish an official explanation, something like: "A small group of traitors paid by Americans was trying to disrupt our peace and prosperity, but don't worry, our brave policemen have eliminated the threat. Please stay calm and don't listen to any rumors; also report all suspicious behavior and rumor spreading to the police." But even this kinda admits that some people have some objections, so the façade of "we are all one big happy family" starts cracking apart. And people are too curious, so various rumors will start spreading anyway.

On the other hand, even this public knowledge can be reversed. One should never underestimate the capacity of motivated people to deny anything. A few years later, when the shock of seeing the true face of the regime has faded, if some sympathizers of the previous regime have remained in power, they can create a synchronized denial. All they have to do is start saying publicly: "This never happened; actually, this is all merely American propaganda." At the beginning everyone knows it's a lie, but now the people who want to turn a blind eye to everything also know that there is a socially acceptable way to deny everything; that they are not a powerless minority. So they start repeating the denial as a way to signal belonging to their tribe. And their children will grow up actually believing that nothing bad happened, and that everything is merely propaganda. And it's just a matter of time until people start saying: "Well, why don't we get rid of the American propaganda now, and return to the glorious old days?"

What you describe are the winding-down days of communism; during its heyday the arrests and torture didn't happen in the middle of the night, but in broad daylight, to cheering crowds. This phenomenon, not limited to communist states, works as follows:

The official line is not that everybody is happy and everything is perfect, but that everything would be perfect if it weren't for the rightists/heretics/sexists/racists/etc. (depending on the society). The insidious thing about this is that anybody who has a different opinion and debates it can be charged with rightism, and is in fact guilty by definition. Heck, anyone arrested, even if he wasn't originally a rightist, has almost no way to defend himself without making the charge true. The only chance he has is to demonstrate his loyalty by being as fanatical as possible at the next rally.

One question that you may ask is whether the bias (the difference between the territory and the map) is a function of the territory: do people in collectivist cultures mis-estimate the prevalent conformity in a different way from people in individualist cultures?

Thank you for putting that so clearly.

Maybe there is a difference between conformity and collectivism; and between non-conformity and individualism. If a culture teaches people to worship individuality, and they do, that's conforming to the cultural rules. A non-conforming move in such a culture could be, e.g., having a group of friends and deciding things together; using the wisdom of crowds even if that is considered a heresy (though instead of "heresy" the culture would use its own low-status label, such as "stupid" or "immature").

The people in Jonestown probably considered themselves heroic rebels against mainstream society. But they were the "dressed-in-black" kind of rebels, not the "clown-suit" kind.

And the people who are "individualist" because society told them that being "individualist" is a virtue, who never questioned it, and who are all "individualist" in the same way... that's like wearing a black uniform in a society where everyone is required to wear a black uniform. You can enjoy watching the horror on their faces when you suggest they do something non-"individualist".

But I admit I wouldn't be able to predict whether being a "dressed-in-black" rebel is or isn't enough to dissent in Asch's Conformity Experiment. I could explain it either way.

Maybe there is a difference between conformity and collectivism; and between non-conformity and individualism.

I think so. You can be a conformist to ideological individualism, and a non-conformist who is an ideological collectivist.

Finally, although we had expected conformity to be lower when the participant's response was not made available to the majority, this variable did not have a significant effect.

That is a really interesting variation.

It's not that they're consciously trying to conform; it's that their truth function is highly dependent on the opinions of others. A heebie-jeebie-inducing idea that I've suspected for a while, but for which I can't recall seeing such clear data before.

For some people, two plus two really does equal five if other people think so.

I think the idea of a single-dimensional "dissenter-ness" measure of people needs much more evidence before I'll believe it's a useful model of behavior across very different areas (politics, buying choices, socializing, religion).

People usually follow general rules or strategies of behavior. Reasoning consciously about things takes effort and motivation. Any strategy or heuristic leads to the wrong answer sometimes, and we tend to focus on those cases without considering the more usual successes of the strategy. Also, most strategies aren't conscious or deliberate, and people often invent wrong rationalizations for their actions that they honestly believe in.

These are all general reasons to be wary of studies showing that "people conform" by testing their behavior in one situation (the Asch test) and then extrapolating to their behavior in another (where they have a reason or desire to be right or to convince others).

Perhaps the people in the study followed a heuristic similar to: "I'm asked to do a task; I don't get paid more if I do it better, so I'll put in the minimum reasonable effort; and a great way to do a novel task is to copy whatever that other guy is doing - he seems sure of himself, and if I get it wrong, at least I won't be blamed as much as for a novel error." Or: "it's polite to pretend to agree with others about questions that clearly don't matter in and of themselves; avoid pointless arguments to focus on the important ones."

I suggest a different experiment: ask the same question as Asch, but tell participants that a monetary prize (large enough that they'd care about it more than the trivial inconvenience of disagreeing with strangers) will be awarded only to those who answer correctly.
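To make that proposal concrete, here's a minimal sketch of the two-condition design in Python. Every number in it is a made-up assumption for illustration - the per-trial conformity rates, the spread across subjects, the hypothesized effect of the prize - not data from Asch or any replication.

```python
import random

N_SUBJECTS = 1000
N_CRITICAL_TRIALS = 12  # Asch's standard design used 12 critical trials

def fraction_conforming_at_least_once(mean_p, seed):
    """Give each simulated subject an individual per-trial conformity
    probability (subjects clearly differ), run the critical trials,
    and report the fraction who conformed at least once."""
    rng = random.Random(seed)
    count = 0
    for _ in range(N_SUBJECTS):
        # Crude heterogeneity: propensity scattered around mean_p, clipped to [0, 1].
        p = min(1.0, max(0.0, rng.gauss(mean_p, 0.25)))
        if any(rng.random() < p for _ in range(N_CRITICAL_TRIALS)):
            count += 1
    return count / N_SUBJECTS

# Hypothetical effect: suppose the prize cuts the average per-trial
# conformity rate from roughly the one-third usually cited for Asch
# down to 10%.
print(f"no prize:   {fraction_conforming_at_least_once(0.32, seed=0):.0%}")
print(f"with prize: {fraction_conforming_at_least_once(0.10, seed=1):.0%}")
```

One thing even this toy model makes visible: "conformed at least once" compounds over twelve critical trials, so a modest per-trial rate still produces a large at-least-once figure.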

I think it probably matters a lot what people are conforming about. If it's about perception (which line is the same, which color is different) and several people all say the same thing, different from what I thought I saw, I can see myself starting to doubt my perception. If it's about memory (what is the capital of Romania?), I'd start thinking I must have misremembered. But if four people all said that 2+2=5, I'd realise the experiment wasn't about what they said it was.

Barring a fault in our visual cortex or optical systems - an optical illusion, in other words - how is determining that black is black, or that two lines are the same length, any different from evaluating mathematical statements? There's a bit in the sequences on why 2+2=4 isn't exactly an unconditional truth. The thought processes that go into both include checking your perceptions, checking your memory, and checking reality.

Maybe 2+2=4 is too simple an example, though; it would be downright Orwellian to stand in a room and listen to a group of people declare that 2+2=5. On the other hand, imagine standing in a room with a bunch of people claiming that there isn't an infinite number of primes - it might be easier to doubt your own reasoning.
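(The primes example is a nice one precisely because you can re-derive it on the spot instead of weighing your perception against the room's. A sketch of Euclid's argument: if \(p_1, \dots, p_n\) were all the primes, consider

\[ N = p_1 p_2 \cdots p_n + 1. \]

Dividing \(N\) by any \(p_i\) leaves remainder 1, so no \(p_i\) divides \(N\); but every integer greater than 1 has a prime factor, so \(N\) has a prime factor missing from the list - contradiction.)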

Anyone else want to weigh in on this? Does Asch's methodology affect conformity?

Perhaps I'm making a mistake by putting myself in the place of the subject of the experiment. I think I'd dissent, but I would predict that most people think that of themselves, and most people conformed at least once.

I think that is exactly your mistake. We have the most access to our own minds, and therefore using them to model the minds of others seems rather natural. What would I do in their place?

Problem is, if you're here, you're likely an odd duck and on the tails of many statistical distributions. You're not a prototypical sample of the general population. Stop using yourself as a prototype for explaining the behavior of others.

Also, now I wonder: do other people grow up building multiple prototypical models of other people's internal thought processes, instead of using their own thought processes as models and setting a few belief and preference variables differently depending on the problem?

Am I just projecting my own intuitions of the "natural" way to solve the problem on others?

Problem is, if you're here, you're likely an odd duck and on the tails of many statistical distributions. You're not a prototypical sample of the general population. Stop using yourself as a prototype for explaining the behavior of others.

Yep, one of the most difficult lessons in my life.

It feels so convenient to imagine that other people are like me. There are so many people saying that people just need to be more open, and then they'll realize how similar everyone is. I guess that advice is most useful for average people, who are the majority of the population; and perhaps useful to everyone in some areas of life. But whenever I start believing that everyone is "secretly just like me, they're just hiding it like I do", it's enough to open my mouth, and I quickly get corrected. (LessWrong meetups are one of the very few exceptions, and even there it depends on who I talk with.)

Do people who are genuine dissenters predict that more people will dissent than people who genuinely conform do?

Genuine dissenters generally predict that most people will conform, largely because it's a lot easier to notice people conforming when you disagree with the thing they're conforming to.

Is there any evidence to support this in general?

Also, a dissenter in one area (religion, for example) might be a conformer in another. I think it's worth looking at whether someone who actively protests racial discrimination (in a non-conforming way, so maybe someone from the early civil rights movement) would dissent in Asch's experiment. Does willingness to dissent in one area of your life transfer over to a larger willingness to dissent in other areas of your life?

The research indicates that most people's response to any social-science result is "that's what I would have expected," although that doesn't actually seem to be true; you can get them to say they expected conflicting results. Have there really been no studies comparing which results people say they find surprising with what people actually predicted beforehand (I know Milgram informally surveyed what people expected before his study, but I don't think he did any rigorous analysis of expectations)? Perhaps people are as inaccurate in reporting what they find surprising as they are in reporting what they expected. It would certainly be interesting to know!

There are studies on hindsight bias, which is what I think you're talking about.

In 1983, researcher Daphna Baratz asked undergraduates to read 16 pairs of statements describing psychological findings and their opposites; they were told to evaluate how likely they would have been to predict each finding. So, for example, they read: “People who go to church regularly tend to have more children than people who go to church infrequently.” They also read, “People who go to church infrequently tend to have more children than people who go to church regularly.” Whether rating the truth or its opposite, most students said the supposed finding was what they would have predicted.

From her dissertation.

(I couldn't find a PDF of the dissertation, but that's its page on WorldCat.)

As for your specific question:

Have there really been no studies comparing which results people say they find surprising with what people actually predicted beforehand

I have no idea, but I want them.