The Bias You Didn't Expect

by Psychohistorian · 14th Apr 2011 · 92 comments


There are few places where society values rational, objective decision making as much as it does in judges. While there is a rather cynical discipline called legal realism that says the law is really based on quirks of individual psychology, "what the judge had for breakfast," there's a broad social belief that the decisions of judges are unbiased. And where they aren't unbiased, they're biased for Big, Important, Bad reasons, like racism or classism or politics.

It turns out that legal realism is totally wrong. It's not what the judge had for breakfast. It's how recently the judge had breakfast. A new study (media coverage) on Israeli judges shows that, when making parole decisions, they grant parole about 65% of the time right after meal breaks, with the rate falling almost all the way to 0% right before breaks and at the end of the day (i.e. as far from the last break as possible). The decline between the two points is relatively linear.
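To make the shape of that finding concrete, here is a small illustrative simulation. It is not the study's data or method; it simply assumes a grant probability that falls linearly from 65% just after a break to near 0% just before the next one, and shows what grant rates would look like at each position in a session. The session length and number of sessions are made-up parameters.

```python
import random

random.seed(0)

def grant_probability(position, session_length):
    """Assumed linear decline: 0.65 right after a break, near 0 just before the next."""
    return 0.65 * (1 - position / session_length)

def simulate(session_length=20, sessions=10000):
    """Average grant rate at each case position, across many simulated sessions."""
    grants = [0] * session_length
    for _ in range(sessions):
        for pos in range(session_length):
            if random.random() < grant_probability(pos, session_length):
                grants[pos] += 1
    return [g / sessions for g in grants]

rates = simulate()
print(f"first case after break: {rates[0]:.2f}")
print(f"last case before break: {rates[-1]:.2f}")
```

Under these assumptions, the first case in a session is granted roughly twenty times as often as the last, even though nothing about the cases themselves differs.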

Think about this for a moment. A tremendously important decision, determining whether a person will go free or spend years in jail, appears to be substantially determined by an arbitrary factor. Also, note that we don't know if it's the lack of food, the anticipation of a break, or some other factor that is responsible for this. More interestingly, we don't know where the optimal result occurred. It's probably not the near 0% at the end of each work period. But is it the post-break high of 65%? Or were judges being too nice? We know there was bias, but we still don't know when the bias occurred.

There are at least two lessons from this. The little, obvious one is to be aware of one's own physical limitations. Avoid making big decisions when tired or hungry - though this doesn't mean you should try to make decisions right after eating. For particularly important decisions, consider contemplating them at different times, if you can. Think about one thing Monday morning, then Wednesday afternoon, then Saturday evening, going only to the point of getting an overall feel for an answer, and not to the point of really making a solid conclusion. Take notes, and then compare them. This may not work perfectly, but it may help you realize inconsistencies, which could help. For big questions, the wisdom of crowds may be helpful - unless it's been a while since most of the crowd had breakfast.

The bigger lesson is one of humility. This provides rather stark evidence that our decisions are not under our control to the extent we believe. We can be influenced by factors we don't even suspect. Even knowing we have been biased, we may still be unable to identify what the correct answer was. While using formal rules and logic may be one of the best approaches to minimizing such errors, even formal rules can fail when applied by biased agents. The biggest, most condemnable biases - like racism - are in some ways less dangerous, because we know we need to look out for them. It's the bias you don't even suspect that can get you. The authors of the study think they basically got lucky with these results - if the effect had been to make decisions arbitrary rather than to increase rejections, this would not have shown up.

When those charged with making impartial decisions that control people's lives are subject to arbitrary forces they never suspected, it shows both how important it is, and how much more we can do, to be less wrong.