Part of the sequence: The Science of Winning at Life

In 1961, Stanley Milgram began his famous obedience experiments. He found that ordinary people would deliver (what they believed to be) excruciatingly painful electric shocks to another person if instructed to do so by an authority figure. Milgram claimed these results showed that in certain cases, people are more heavily influenced by their situation than by their internal character.

Fifty years and hundreds of studies later, this kind of situationism is widely accepted for broad domains of human action. People can inflict incredible cruelties upon each other in a prison simulation.b Hurried passersby step over a stricken person in their path, while unhurried passersby stop to help.a Willingness to help varies with the number of bystanders, and with proximity to a fragrant bakery or coffee shop.c The list goes on and on.d

Our tendency to underestimate how powerfully situation shapes human action is so well-known that it has a name. Our tendency to over-value trait-based explanations of others' behavior and under-value situation-based explanations of their behavior is called the fundamental attribution error (aka correspondence bias).

Recently, some have worried that this understanding undermines the traditional picture we have of ourselves as stable persons with robust characteristics. How can we trust others if their unpredictable situation may have so powerful an effect that it overwhelms the effect of their virtuous character traits?

But as I see it, situationist psychology is wonderful news, for it means we can change!

If situation has a powerful effect on behavior, then we have significant powers to improve our own behavior. It would be much worse to discover that our behavior was almost entirely determined by traits we were born with and cannot control.

For example, drug addicts can be more successful in beating addiction if they change their peer group - if they stop spending recreational time with other addicts, and spend time with drug-free people instead, or in a treatment environment.e


Improving rationality

What about improving your rationality? Situationist psychology suggests it may be wise to surround yourself with fellow rationalists. Having now been a visiting fellow with the Singularity Institute for only two days, I can already tell that almost everyone I've met who is with the Singularity Institute or has been through its visiting fellows program is a level or two above me - not just in knowledge about Friendly AI and simulation arguments and so on, but in day-to-day rationality skills.

It's fascinating to take part in a conversation with really trained rationalists. It might go something like this:

Person One: "I suspect that P, though I know that cognitive bias A and B and C are probably influencing me here. However, I think that evidence X and Y offer fairly strong support for P."

Person Two: "But what about Z? This provides evidence against P because blah blah blah..."

Person One: "Huh. I hadn't thought of that. Well, I'm going to downshift my probability that P."

Person Three: "But what about W? The way Schmidhuber argues is this: blah blah blah."

Person One: "No, that doesn't work because blah blah blah."

Person Three: "Hmmm. Well, I have a lot of confusion and uncertainty about that."

This kind of thing can go on for hours, and not just on abstract subjects like simulation arguments, but also on more personal issues like fears and dreams and dating.

I've had several of these many-hours-long group conversations already - people arguing vigorously, often 'trashing' others' views (with logic and evidence), but with everybody apparently willing to update their beliefs, nobody getting mad or hurt, and people even making decisions to change something in their life in response to a Bayesian update about something.

The community norms reinforce this behavior, and it has had an obvious effect. All these people have spent time living with at least two other rationalists for many months - most of them, for longer than that. I haven't done an experiment that allows causal inference, but... community seems to be working splendidly for improving rationality. And situationist psychology explains why.



Want to change your behavior, your self? In many cases, one of the most effective things you can do is to change your situation.

Live with rationalists. Stop hanging out with downward-spiral, drug-abusing friends. Move to another state or province or nation. Get a different job. Spend more time at the park, less time at home. Or less time at the park, and more at home. Consider what you want to achieve, and how a change of situation might help you do that. Then change your situation, and change yourself.


Next post: The Power of Reinforcement

Previous post: How to Be Happy




a Darley & Batson (1973).

b Zimbardo et al. (1973).

c Baron (1997).

d Much of the literature is helpfully reviewed in Doris (2005).

e Velasquez et al. (2001); Connors et al. (2004, ch. 6.); Galanter (2010).



Baron (1997). The sweet smell of... helping: Effects of pleasant ambient fragrance on prosocial behavior in shopping malls. Personality and Social Psychology Bulletin, 23: 498-503.

Connors, Donovan, & DiClemente (2004). Substance abuse treatment and stages of change: Selecting and planning interventions. Guilford.

Darley & Batson (1973). From Jerusalem to Jericho: a study of situational and dispositional variables in helping behavior. Journal of Personality and Social Psychology, 27: 100-108.

Doris (2005). Lack of Character. Cambridge University Press.

Galanter (2010). Network therapy. In Marc Galanter and Herbert Kleber (eds.), Psychotherapy for the treatment of substance abuse (pp. 249-276). American Psychiatric.

Velasquez, Maurer, Crouch, & DiClemente (2001). Group Treatment for Substance Abuse: A Stages-of-Change Therapy Manual. Guilford.

Zimbardo, Banks, Haney, & Jaffee (1973). The mind is a formidable jailer: a pirandellian prison. New York Times Magazine, April 8, 1973, pp. 38-60.


Scott Adams made this observation in a blog post:

Years ago I worked with a young intern at Crocker Bank who believed his first step toward success was to find a place to live in a prosperous suburb. His theory was that the external environment would program his brain for the sort of success that his neighbors would have already found. I remember mocking him for his offbeat and naive theory. Now I think he's a genius for understanding at such an early age that his environment was a tool for programming his brain. I lost touch with him, but I'll bet he's a millionaire now.

This is definitely one lesson I should have learned earlier than I did.

Without the follow-up report, this is hardly evidence that the theory works. I guess it counts as evidence that the theory is convincing.

For some evidence, it might be worthwhile to take a look at how agile software development works.

(Or that it works at all.)

At my current workplace, there are teams of around 6-8 people, working together in one big room for each team. The way it works is the following: we get a task every 2 weeks, generate lots of post-its with sub-tasks, then during the 2 weeks, everyone is free to pick and solve these.

The interesting part is that there is no boss telling you what to do (and making you responsible for it). There is a "scrum master" of each team who is there to ensure that everything is democratic enough and we have all that we need. And there is the "product owner" who gives the tasks... but to the whole team, every two weeks.

There is nothing to prevent you from reading Slashdot the whole time. Except... well, there are the post-its. And a big TV screen showing where the code is buggy. And all of your teammates who are working, or talking about work. And the 10 minute meeting every day, where you can show off with what you have done.

End result: somehow, everyone ends up working, without exerting too much willpower, just because... that is the obvious thing to do? (this stuff is really good news for those who otherwise tend to procrastinate a lot... you just have to surround yourself with the things that need to be done and people who do the same...)

This post has convinced me so thoroughly that I'm going to pick up and go to New York City to hang out with a rationalist community... Right Now!

What we need, then, is a rationalist commune. I'd drop everything for something like that.

My husband and I enjoy living with other people, and eventually plan to buy a house to set up as some kind of cooperative. My experience of what makes good housemates has more to do with living style than thinking style. But yeah, finding rationalists whose location/cleanliness/food/decor preferences worked with ours would be great.

My wife and three children are not particularly attached to this job or home, and we are all very well behaved. I actually enjoy cleaning as my way of imposing order on the universe while still achieving something useful. It's hard to find a cooperative that's neither a hippie commune nor a simple business agreement, though. Let me know if we could factor into your plans.

We're planning to buy a house in the Boston area in two or three years. Given my history of changing plans, it's certainly possible the plan will have changed by then. But I'll let you know when/if we get to a point when it looks imminent.

That would be a great area to live. I've always wanted to get back to my native New England roots; barring the Singularity, I look forward to your plan's good fortune.

If our situation controls our behavior (let's try to bracket "to what extent" and "how" it does so), then wouldn't it also control what kind of situation we will go for?

Here's an example from an Orwell essay: "A man may take to drink because he feels himself to be a failure, and then fail all the more completely because he drinks."

And then I've always wondered about the following: If situationism is true, why do the folk have such a robust theory of character traits? Can we provide an error theory for why people have such a theory?

Note that the folk do seem to allow for some 'situationism' - for example, when someone gets drunk, we admit they'll have a different persona and some more than others.

If situationism is true, why do the folk have such a robust theory of character traits? Can we provide an error theory for why people have such a theory?

Jones and Nisbett attempted to answer this question in their classic paper on actor-observer bias. It's an interesting read.

However, beware of falling into an overly strict interpretation of situationism (as I think Jones and Nisbett did) which amounts to little more than behaviorism in new clothes. People do tend to underestimate the extent to which their behavior and the behavior of others is driven by the environment, but there is nevertheless still good evidence that stable dispositions predict a respectable chunk of the variance in a person's behavior (where "respectable" means "similar in size to that for the situational data"). One of the errors of the strict situationist movement that arose in the late 1960s and 1970s is that it relied on an implicit endorsement of Kahneman and Tversky's "law of small numbers": situationist researchers erroneously expected a very small sample of a person's behavior (such as a single encounter with a person in need of help) to be representative of the general population of behaviors from which it was drawn. Not surprisingly, they found that stable dispositions are a modest predictor of these small samples of behavior. However, we have since learned that when behavior is properly aggregated across time, a robust effect of stable dispositions reliably emerges.

So in short, part of the reason that people have robust theories of character traits is almost surely that they actually map pretty well onto the territory. Situationism remains a useful paradigm, but it can easily be taken too far.
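The aggregation point above is easy to check numerically. Here is a minimal sketch (all numbers hypothetical): each person has a stable trait, each observed behavior is that trait plus large situational noise, calibrated so that a single observation correlates only modestly with the trait (around the classic "personality coefficient" of ~0.3). Averaging many observations per person makes the trait's signal reliably emerge.

```python
import random
import statistics

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

n_people, n_occasions = 2000, 20
traits = [random.gauss(0, 1) for _ in range(n_people)]

# Observed behavior = stable trait + situational noise.
# noise_sd = 3 makes a single occasion correlate ~0.3 with the trait.
noise_sd = 3.0
behaviors = [[t + random.gauss(0, noise_sd) for _ in range(n_occasions)]
             for t in traits]

single = [row[0] for row in behaviors]                 # one occasion each
aggregated = [statistics.fmean(row) for row in behaviors]  # 20 occasions each

r_single = pearson(traits, single)
r_agg = pearson(traits, aggregated)
```

Under these assumptions `r_single` lands near 0.3 while `r_agg` climbs above 0.8: the same stable disposition that looks like a weak predictor of any one helping encounter becomes a strong predictor of a person's behavior averaged across time.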

Will our situation affect which situation we will go for? Of course.

One reason the folk may have such a robust theory of character traits is that it successfully predicts behavior. But the reason for this is because we mostly only see people in the same situations, not because they do or would behave reliably in very different situations.

Then again, maybe we theorize in terms of character traits only because we hang out in communities of people who demonstrate the fundamental attribution fallacy.

Interestingly, I've read that the fundamental attribution error is less strong in East Asian cultures, such as China and Japan.

[Insert comment that is itself an instance of the fundamental attribution error here!]

I managed to read Anlamk's comment without this occurring to me. Thanks for saying it.

So the fundamental attribution error could be situational! It may have been a fundamental attribution error for me to have immediately assumed that it needs a "deeper" explanation.

The explanation that I usually read is that it's a cultural phenomenon, that within Chinese culture in particular, people are more inclined to describe others as inhabiting various roles instead of having persistent character traits (with this being reflected in some older Chinese literature and philosophical traditions) - but this is mostly just a vague impression I have that was probably formed by reading blog posts by people who don't really know what they're talking about, so take this with a grain of salt. ;)

An amusing bit of trivia: among the Japanese nobility at the time The Tale of Genji was being written, referring to someone by their name was a privilege reserved for family and very close acquaintances (and not something that would be appropriate to do in public), so all the characters in the story are referred to by titles and descriptions of various kinds - and these "names" change when the characters end up in different life circumstances.

within Chinese culture in particular, people are more inclined to describe others as inhabiting various roles instead of having persistent character traits.

Fascinating. Makes me want to do some research on this to see whether Chinese-raised people would behave differently because of this.

Luke, I just want to make sure we're keeping distinct things distinct. Here are three things to keep distinct:

  1. The immediate situation in which we find ourselves

  2. Our character, by which I mean that distinctive aspect of us which persists over a significant length of time (this may include skills)

  3. Our immediate behavior

The Milgram torture experiment(s) showed that our immediate behavior is greatly influenced by our immediate situation. It did not show, going by what I have read of it, that our character is shaped by our immediate situation. I'm not saying it isn't! I'm sure that a long succession of immediate situations will, over time, shape our character. All I'm saying is that the Milgram experiment, from what I've read of it, was not about that.

Similarly, the fundamental attribution error, as I understand it, is to blame immediate behavior on character on occasions when in truth the immediate situation has a greater influence on our immediate behavior. So this, too, is not about the role of our immediate situation in shaping our character.

The question of whether and how much our immediate situation - a succession of immediate situations - shapes our character over time in a lasting way is the question of the effect of environment on character.

You discuss people who you call "really trained rationalists". That very fact - that they are "really trained", and they are "a level or two above me" paints a picture of a lasting change on a person's character brought about by training. You say furthermore that "all these people have spent time living with at least two other rationalists for many months - most of them, for longer than that," which reinforces a picture of a long-lasting change in character brought about by immersion in an environment over an extended period of time.

I don't think separating character from situation is useful; rather than say "they are a giving person" and using the situation to explain an overly generous or stingy action, I try to think "they are the kind of person who is generous in this situation and stingy in that situation" - and I've noticed improvements in my ability to predict coworkers' behaviours since I made that change.

The other thing to remember is that, a la no perfect philosophy student of emptiness, there is no character outside of situation: you can't examine a person's character sans situational modifiers. You can describe people as having this or that character trait, but only insofar as it applies to situations.

Shokwave's read on this is my takeaway from experiments like Milgram's: the situational context is not truly separable from character. Zimbardo's book The Lucifer Effect really drove this point home to me. He argued strongly against the idea that the abuses at Abu Ghraib were the result of "bad apples" (i.e. the result of people with poor character), arguing instead that the situation itself led to the abuses, and further (and more controversially), that the situation itself was created in order to bring about those behaviors. I don't mean to say that there is necessarily only one behavioral outcome for a given situation, only that the situation weighs very heavily on the outcome, to the point where finding an unchangeable "character" across situations doesn't seem feasible.

I can second the visiting fellow experience. I could notice a pretty clear and quick shift in my thinking when I got back home from my own visit in the program. I'd like to think that I managed to keep a part of it, but I do get the feeling that my thinking is far less strategic and rigorous than it was while I was there. It's one of the things that I miss about the program.

An aside: I read this:

but with everybody apparently willing to update their beliefs

-- and my first thought was "No really, I've updated myself."

Luke, you know what belief in belief feels like. How does one practically be sure that one has updated, rather than merely believing one has updated? Particularly when one cares a lot about the answer.

Luke, you know what belief in belief feels like. How does one practically be sure that one has updated, rather than merely believing one has updated? Particularly when one cares a lot about the answer.

You think a different thing than what you remember thinking and, more importantly, what you are on record as thinking.

Great quote on this:

Your environment will eat your goals and plans for breakfast.

Steve Pavlina

I've noticed this too. If I hang out with a group of people for a long time, my style of thinking starts to mimic theirs. If I'm with a group of school peers who think a given task is easy and fun, it literally feels easier to me than if the group of people around me is complaining about the boringness and difficulty of the task. Moods are contagious. Memes are contagious. Attitudes are contagious... And from an evo-psych perspective, being influenced heavily by your situation (especially your social situation) makes sense, because a group of people that all mirror each other in attitude and life goals will work better as a team.

a group of people that all mirror each other in attitude and life goals will work better as a team.


There's no contradiction -- you're both right. From an evolutionary psychology perspective, groupthink would be adaptive for small tribes living in an extremely familiar environment for generations. In the ancestral environment it was possible to know everything, and nearly everyone did. Truly new problems would be rare or nonexistent. In the modern era ... things have changed a bit.

OK, rereading the OP with that proviso mentally tacked on ("a group that, etc. will fare better in the ancestral environment") makes more sense.

Aren't there equally obvious reasons to consider viewpoint diversity adaptive, though, even in the EEA?

Exactly what I was about to write. At some point I'm sure it was adaptive, when groups of humans were small and faced harsh but relatively simple threats. Now, it's sometimes adaptive but sometimes not.

"Situationist psychology," as I understand it, doesn't imply that the situation changes behavior because it changes us, and you don't cite any evidence to show this connection. Situationist psychology holds that behavior is determined by the immediate situation. If individual character is changed by association, then the fact remains that you are using character to predict behavior, not the situation. In other words, you have created a kind of hybrid between situationist and personality psychology, where personality does play a big role, but is subject to change in the immediate term (rather than in the long term, which is the usual personality position). But it is also different from the usual situationist position in that change in situation doesn't determine behavior by changing a disposition. In your model, the situation is important because it determines dispositions. According to the situationist, a prison environment, for example, doesn't create a mean character. Its tenets favor the prediction that prisoners return to their old character, more or less immediately, if they return to the same situation as before their imprisonment.

You could take this point as carping, and it might be so classified were it the case that situationists have evidence that important dispositions do change (substantially) because of association. While I don't have hard evidence either, it seems to me that both long-term, stable dispositions AND the immediate situation each determine more behavior than any intermediate-term dispositions. (Sometimes living among people whom you want to be different from strengthens the opposed tendency to practice behavior that's different from those surrounding you.)

The drug addict doesn't want to change his disposition towards drug use; he wants to stop using drugs. Behavior begets character begets the person--lukeprog argues that you can change your behavior (and therefore yourself) by changing your situation.

The End of My Addiction is by a man who's a cardiologist and was an alcoholic. He considers craving alcohol to be a problem in itself, and found that taking Baclofen (a muscle relaxant) caused him to not have the craving.

Only a little research has been done to test the usefulness of Baclofen for that purpose.

Hmm. I wonder what situationism says about living alone and not interacting with anyone. Does it mean no influence, or feedback from your own traits, or what?

There's very little data about such people. (I assume you're thinking in terms of hermits, not just people who are embedded in a social context but happen to live alone and not have many friends.)

In 1961, Stanley Milgram began his famous obedience experiments. He found that ordinary people would deliver (what they believed to be) excruciatingly painful electric shocks to another person if instructed to do so by an authority figure.

lukeprog: After listening to the latest Radiolab episode "The Bad Show", I think you should reconsider Stanley Milgram's findings.

Most people seem to be focusing on your point about situationist psychology. Your description of the community at the Singularity Institute was more interesting to me. Is there a concrete example argument that impressed you while at the Singularity Institute that you would feel comfortable sharing?

That resonates well with me... when I decided I liked the way the people in the Bay Area did things, I moved here. It has made my life much better.

It's not confirmation bias when two people agree. It's confirmation bias when one refuses to take into account contradictory evidence. Do you have any?

It is more or less what khafra stated. I'm not saying it is true in your case (hint: winking smiley), but it is very common for people to evaluate their life choices as you did without regard for the evidence. To put it another way, your statement would only be distinguishable from the ubiquitous life-choice confirmation bias if you stated it had made your life much worse.

I can imagine several places worse than the Bay Area for many people (and several places better), so it is not as though your statement was not plausible on its face. :-)

I read Rogers as saying "that does not sound materially different from what someone under the influence of confirmation bias would say, so I'm not updating my beliefs based on your story."

The Bayes-worshippers at LessWrong would be shocking people to death with the best of Milgram's blue-blooded conformists (unless they simply recognized the study). What's needed to avoid being the tool of sociopaths is emotional intelligence, and the willingness to buck authority, and I'm sorry, but this crowd doesn't have that special sauce.

All that would be necessary would be for Eliezer Yudkowsky to tell them to "shock the non-bayesian," and they'd stumble over each other to turn the dial up to 450. Or, "11" if he comically and perversely used the "Spinal Tap" version of Milgram's fake EST machine.