An important, ongoing part of the rationalist project is to build richer mental models for understanding the world. To that end I'd like to briefly share part of my model of the world that seems to be outside the rationalist canon in an explicit way, but which I think is well known to most, and talk a bit about how I think it is relevant to you, dear reader. Its name is "normalization of deviance".

If you've worked a job, attended school, driven a car, or even just grown up with a guardian, you've most likely experienced normalization of deviance. It happens when your boss tells you to do one thing but all your coworkers do something else and your boss expects you to do the same as them. It happens when the teacher gives you a deadline but lets everyone turn in the assignment late. It happens when you have to speed to keep up with traffic to avoid causing an accident. And it happens when parents lay down rules but routinely allow exceptions, such that the rules might as well not exist.

It took a much less mundane situation for the idea to crystallize and get a name. Diane Vaughan coined the term as part of her research into the causes of the Challenger explosion, where she described normalization of deviance as what happens when people within an organization become so used to deviant behavior that they don't see the deviance, even if that deviance is actively working against an important goal (in the case of Challenger, safety). From her work the idea has spread to considerations in healthcare, aeronautics, security, and, where I learned about it, software engineering. Along the way the idea has generalized from being specifically about organizations, violations of standard operating procedures, and safety to any situation where norms are so regularly violated that they are replaced by the de facto norms of the violations.

I think normalization of deviance shows up all over the place and is likely quietly happening in your life right now, just outside where you are bothering to look. Here are some ways I think this might be relevant to you, and I encourage you to mention more in the comments:

  • If you are trying to establish a new habit, regular violations of the intended habit may result in a deviant, skewed version of the habit being adopted.
  • If you are trying to live up to an ideal (truth telling, vegetarianism, charitable giving, etc.), regularly tolerating violations of that ideal draws you away from it in a sneaky, subtle way: you may still claim to be upholding the ideal when in fact you are not, and are not even really trying to.
  • If you are trying to establish norms in a community, regularly allowing norm violations will result in different norms than those you intended being adopted.

That said, my purpose in this post is to be informative, but I know that some of you will read this and make the short leap to treating it as advice that you should aim to allow less normalization of deviance, perhaps by being more scrupulous or less forgiving. Maybe, but before you jump to that, I encourage you to remember the adage about reversing all advice. Sometimes normalized "deviance" isn't so much deviance as an illegible norm that is serving an important purpose, and "fixing" it will actually break things or otherwise make things worse. And not all deviance is normalized deviance: if you don't leave yourself enough slack you'll likely fail from trying too hard. So I encourage you to know about normalization of deviance, to notice it, and to be deliberate about how you choose to respond to it.

9 comments

Specifically, the normalization of deviance in The Challenger Launch Decision followed a five-step cycle:

  1. Signal of potential danger
  2. Official act acknowledging potential danger
  3. Review of the evidence
  4. Official act indicating the normalization of deviance (i.e. accepting the risk)
  5. Shuttle launch

The key thing that Vaughan identifies is that in every iteration of the above cycle, the standard that was compared against was the output of the previous iteration. Because of this, the notion of what counted as "acceptable" joint rotation subtly shifted over time, from a conservative standard in 1977 to a very risky one in 1986. The problem was that NASA was updating its beliefs about what was acceptable O-ring performance but, as an organization, was not realizing that it had updated. As a result, it drifted in an uncontrolled manner from its original standards, and thus signed off as safe a system that was, in retrospect, a disaster waiting to happen.

Normalization of deviance is a difficult problem to combat, because the process that leads to normalization of deviance is also the process that leads to helpful and beneficial updates about the state of the world. I would suggest that some normalization of deviance, within limits, is acceptable. The world is not always going to be what your model says it will be, and you have to have some leeway to adapt to circumstances that aren't what you were expecting. However, when doing so, it's important to ensure that today's exception remains an exception, and that the next time deviance occurs, it's checked against the original standard, not an updated standard that results from the exception process.
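To make that drift mechanism concrete, here is a minimal sketch in Python (with made-up numbers; nothing here comes from Vaughan's data) comparing the two decision rules: judging each new exception against the last accepted value versus judging it against the original standard.

```python
import random

random.seed(0)

ORIGINAL_STANDARD = 1.0   # hypothetical "acceptable" amount of deviation (arbitrary units)
TOLERANCE = 0.3           # how far past the baseline a single exception is allowed to go


def run(iterations, ratchet):
    """Simulate repeated 'accept the risk' decisions.

    If ratchet is True, each decision is judged against the previously
    accepted value (the output of the last iteration); if False, every
    decision is judged against the original standard.
    """
    baseline = ORIGINAL_STANDARD
    accepted = ORIGINAL_STANDARD
    for _ in range(iterations):
        observed = baseline + random.uniform(0, TOLERANCE)  # a new, slightly worse deviation
        if observed <= baseline + TOLERANCE:                # looks like a small exception, so accept it
            accepted = observed
            if ratchet:
                baseline = observed                         # the exception quietly becomes the new normal
    return accepted


print(f"judged against the last exception:    ends near {run(20, ratchet=True):.2f}")
print(f"judged against the original standard: ends near {run(20, ratchet=False):.2f}")
```

Under the first rule every individual decision looks like a small, reasonable exception, yet the accepted value ratchets far past the original standard; under the second rule the drift stays bounded.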

I was shocked to hear about doctors in hospitals not washing their hands (from a medical student who was shocked to see it during his internship), and when I discussed it privately with some doctors, they told me it all depends on the boss. When the boss in the hospital washes his hands religiously, and insists that all employees must wash their hands all the time, they will. But when the boss ignores this norm, then... ignoring the norm becomes a local symbol of status. So the norm within the same hospital may change dramatically in a short time, in either direction, when the boss is replaced.

I saw a similar thing in software projects. You almost always have a list of "best practices", but it makes a big difference whether the highest-status developer is like "we do this all the time, no exceptions", or -- much more frequently -- he is like "of course, sometimes it doesn't make much sense to ... ", and of course the scope of "sometimes" gradually expands, and it becomes a symbol of high status to not write unit tests. You can have two projects in the same company, with the same set of "best practices" on paper, with the same tools for automatically checking conformance (only, in one team, sending of the error messages is turned off), and still dramatically different code quality.

(And actually this reminds me of a time period when making fun of "read the Sequences" was kinda high-status here. I don't hear it recently, and I am not sure what it means: maybe everyone read the Sequences, or everyone forgot about them so that the joke is no longer funny because no one would know what it refers to, or maybe both sides just agreed to not discuss this topic publicly anymore.)

This is important. We don't spend enough time thinking about WHY values drift or rule-violations get normalized. Use of "deviance" as a descriptor implies that the initial stated rule is correct and the deviations are undesirable. I'd argue that this is rarely the case - most rules and procedures are both oversimple (don't cover many common situations) and incorrect (are not optimal even for common situations).

IMO, if you're trying to establish a habit or live up to an ideal, there are two types of exception: those that actually improve the outcome (whatever reasons you have for trying those things), and those that recognize tradeoffs not considered in the initial establishment. The first kind of exception is good, IMO - you can meet your goal better with a more complicated model. The second kind can be good or bad, depending on your weighting of the tradeoffs.



I thought it might be useful to give an example of when normalisation of deviance is functional. Let's suppose that a hospital has to treat patients, but because of short-staffing there would be no way of filling out all of the paperwork properly whilst treating all the patients, so the doctors don't fill out all of the fields.

It's also important to mention the possibility of scapegoating - perhaps the deviance is justified and practically everyone is working in that manner, but if something goes wrong you may be blamed anyway. So it's very important to take this small chance of an extremely harsh punishment into account.

Interestingly, I think this could also be an example of normalization of deviance being adaptive and yet still proving to be a problem. For example, suppose that by failing to fill out all the paperwork something bad happens, so bad that someone bothers to investigate the causes (maybe not on the order of the Challenger explosion, but more than a typical postmortem). They would (hopefully) identify normalization of deviance as part of the reason the bad thing happened, although hopefully they won't stop there, and will notice that the short staffing, and not something else, caused the normalization of deviance. This points, in my mind, to one of the tricky things about normalization of deviance in real organizations: it, like anything, can be used to rationalize whatever outcome is politically expedient, such that even if it's happening, it might not be an ultimate cause, but acting as if it's an ultimate cause might allow shifting the blame to a more convenient location.

I think this has obvious implications if you are thinking about normalization of deviance in broader contexts, including normalization of deviance within a single person.

As long as you remain explicitly aware of the difference between emergency medicine and normal operations. If the hospital is just understaffed compared to its case load, then, by accepting that situation and not following accepted practices, they need to realize that they are accepting the trade-off of treating more patients at a lower standard of care.

And the analogy to software teams is clear. If you accept the declaration of an emergency for your development team, and you don't clearly go back to normal operation when it's done, then you are accepting the erosion of standards.


I just wonder if it might be worth distinguishing between personal and social modes of this behavior. Not sure here though. Initially my thought on your first two examples was that they are not really normalized deviations but simply poor discipline -- and I somewhat still view them as that. The point about allowing some slack, however, is important to keep in mind here too. (Plus there are other aspects here -- like is the habit to be formed really something one wants, or just thinks they should want because it's some general consensus or it works for other people).

Much of your view here does seem to apply to the environment in which I work, and I always find myself oscillating between thinking I need to try to help enforce the stated rule/goal/behavior and realizing it is not to be taken at face value and should be interpreted in a slightly different way (which would be a deviation from the ostensible policy). I find it very difficult though; it creates a lot of frustration and a sense of cognitive dissonance for me.

I just wonder if it might be worth distinguishing between personal and social modes of this behavior.

Probably. By the time the idea of normalization of deviance reached me it had gone through a generalization process such that it was no longer about organizations, but systems generally, and individual people are also systems. The general phenomenon is worth noting, but I think how it manifests and how it is dealt with differ a lot based on the details, and there are likely important differences between the individual and group cases in terms of how it arises and what responses to it are tenable.

I find it very difficult though; it creates a lot of frustration and a sense of cognitive dissonance for me.

Ah, right, I forgot about the cognitive dissonance aspect at play in normalization of deviance (for taboo reasons). A common narrative around normalization of deviance in organizations is that things would have worked out better if only management/leadership/others had listened to workers/smart people/me, but interestingly this narrative mostly serves to inhibit action rather than correct deviance, because it gives a reason why the deviance can't be corrected, excusing the failure in advance such that there's no attempt to correct it (if that would be the beneficial thing to do). Other narratives are also likely, but this is just the one that came to mind first.

A good article I recently came across about normalization of deviance in safety settings, where it can produce a "drift towards danger".