Nick Bostrom asks:

One sign that science is not all bogus is that it enables us to do things, like go to the moon. What practical things does debiasing enable us to do, other than refraining from buying lottery tickets?

It seems to me that how to be smart varies widely between professions.  A hedge-fund trader, a research biologist, and a corporate CEO must learn different skill sets in order to be actively excellent - an apprenticeship in one would not serve for the others.

Yet such concepts as "be willing to admit you lost", or "policy debates should not appear one-sided", or "plan to overcome your flaws instead of just confessing them", seem like they could apply to many professions.  And all this advice is not so much about how to be extraordinarily clever, as, rather, how to not be stupid.  Each profession has its own way to be clever, but their ways of not being stupid have much more in common.  And while victors may prefer to attribute victory to their own virtue, my small knowledge of history suggests that far more battles have been lost by stupidity than won by genius.

Debiasing is mostly not about how to be extraordinarily clever, but about how to not be stupid.  Its great successes are disasters that do not materialize, defeats that never happen, mistakes that no one sees because they are not made.  Often you can't even be sure that something would have gone wrong if you had not tried to debias yourself.  You don't always see the bullet that doesn't hit you.

The great victories of debiasing are exactly the lottery tickets we didn't buy - the hopes and dreams we kept in the real world, instead of diverting them into infinitesimal probabilities.  The triumphs of debiasing are cults not joined; optimistic assumptions rejected during planning; time not wasted on blind alleys.  It is the art of non-self-destruction.

Admittedly, none of this is spectacular enough to make the evening news.  It's not a moon landing - though the moon landing did surely require thousands of things to not go wrong.

So how can we know that our debiasing efforts are genuinely useful? Well, this is the worst sort of anecdotal evidence - but people do sometimes ignore my advice, and then, sometimes, catastrophe ensues of just the sort I told them to expect.  That is a very weak kind of confirmation, and I would like to see controlled studies... but most of the studies I've read consist of taking a few undergraduates who are in it for the course credit, merely telling them about the bias, and then waiting to see if they improve.  What we need is longitudinal studies of life outcomes, and I can think of few people I would name as candidates for the experimental group.

The fact is, most people who take a halfhearted potshot at debiasing themselves do not get huge amounts of mileage out of it.  This is one of those things you have to work at for quite a while before you get good at it, especially since there's currently no source of systematic training, or even a decent manual.  If for many years you practice the techniques and submit yourself to strict constraints, it may be that you will glimpse the center.  But until then, mistakes avoided are often just replaced by other mistakes.  It takes time for your mind to become significantly quieter.  Indeed, a little knowledge of cognitive bias often does more harm than good.

As for public proof, I can see at least three ways that it could come about.  First, there might be founded an Order of Bayescraft for people who are serious about it, and the graduates of these dojos might prove systematically more successful even after controlling for measures of fluid intelligence.  Second, you could wait for some individual or group, working on an important domain-specific problem but also known for their commitment to debiasing, to produce a spectacularly huge public success.  Third, there might be found techniques that can be taught easily and that have readily measurable results; and then simple controlled experiments could serve as public proof, at least for people who attend to Science.


"One sign that science is not all bogus is that it enables us to do things, like go the moon."

I was wondering if engineers were less biased than other scientific types? They deal with the practical and concrete all day long, and they see their ideas either succeed or fail before their eyes--such as landing on the moon or exploding on the launch pad. Unlike social or psychological researchers who have the option of clinging to their theories through thick and thin, engineers are trained to identify and abandon incorrect ideas as quickly as possible.

I studied engineering as an undergrad, and I believe it taught a form of objectivism. Or perhaps it simply revealed it.

I was also a fighter pilot for a number of years. Clinging to incorrect assessments of one's own abilities, strengths, and weaknesses, or of others', could get one killed pretty quickly even in peacetime. Or perhaps overconfidence is necessary even to begin such a dangerous career. I think it's a wonder I'm still alive after all I've read here on your blog.

I was wondering if engineers were less biased than other scientific types?

Apparently not.

There's a surprising correlation between studying engineering and being a terrorist. I don't know if the correlation holds up for people who actually work in engineering rather than just having studied it.

I also haven't seen anything that looks solid about why the correlation exists.

There's a surprising correlation between studying engineering and being a terrorist. I don't know if the correlation holds up for people who actually work in engineering rather than just having studied it.

I wonder how much of that is due to possible cognitive traits or simple competence. It's only natural to be more inclined to sign up for things if you're going to be good at them. And apart from commandos, engineers seem to be the ideal recruits.

Sorry, no source, but from what I've read, people with engineering degrees are recruited because they're recruitable as suicide bombers, not because they're wanted as bombmakers or because they'd be good at positioning themselves for maximum damage.

Danger, wild speculation ahead: I'd assume it has something to do with the saying "Engineers can't lie." I can imagine that constantly experiencing how doing things in violation of reality leads to failure, while at the same time hearing politicians lie pretty much every time they open their mouths and get elected again and again (or otherwise fail to fail), would make quite a few of them seriously fed up with the current government in particular and humanity in general. Some less stable personalities might just want to watch the world burn at that point. Which should make them recruitable as terrorists, if you use the right sales pitch.

Might it be that engineering teaches you to apply a given set of rules to its logical conclusion, rather than questioning whether those rules are correct? To be a suicide bomber, you'd need to follow the rules of your variant of religion and act on them, even if that requires you to do something that goes against your normal desires, like killing yourself.

I'd figure questioning things is what you learn as a scientist, but apparently the current academic system is not set up to question generally accepted hypotheses, or, more generally, to do things the funding providers don't like.

Looking at myself, studying philosophy while also having an interest in fundamental physics, computer science, and cognitive psychology helps, but how many people do that?

It's hard to say, especially since terrorists are a tiny proportion of engineers; it would be good to study engineers rather than guessing about them.

Engineer-terrorists mystify me. Shouldn't engineers be the people least likely to think that you can get the reaction you want from a complex system by giving it a good hard kick?

As another datapoint (though I don't have sources), I heard that among evangelical church leaders you also find a relatively higher proportion of engineers.

Isn't that assuming the reaction they want isn't for part of the system to break? We wouldn't have inducing Armageddon as a goal, but other optimization systems don't work exactly as we do.

The reaction they want is the one their religious system directs, and the process that created those rules was different than the process that created humans, so I think you're indirectly anthropomorphising the religious system.

Some religious people do want to set off Armageddon, and some terrorists want to start ethnic/racial wars. However, my impression is that some terrorists have more specific political goals.

Shouldn't engineers be the people least likely to think that you can get the reaction you want from a complex system by giving it a good hard kick?

Anyone who's done much repair knows the value of percussive maintenance.

Engineers would be much more used to received abstract rules being useful than other people would be.

Studies have shown chess to teach some forms of debiasing relating to looking for disconfirmation rather than confirmation.

As for public proof

Should we want a public proof? Would that not attract lots of people who are more interested in signaling than in actually overcoming bias?

People should be aware of the advantages that de-biasing can bring, but we should let them know of it - quietly.

This is in response to Brian's comment, above.

Some years ago, I had the misfortune of being a member of a faculty senate, which gave me regular opportunities to hear highly intelligent people saying stupid things. At the university at which I then taught, some faculty senate members were elected on a university-wide basis, so one had to choose candidates from a group of people one didn't know and couldn't learn much about. One of my colleagues voted strictly according to department affiliation, using this system, which seemed good to me: engineers and business-school people, yes. Scientists, yes, except for physicists. Veterinarians, yes. Economists, yes. Everybody else, no. (We were lawyers, so we didn't have to make a decision about them as a group, as we knew the candidates, and the medical school wasn't part of the process.) Looking back on the list, it does boil down roughly to a distinction between fields in which ability, and the lack of it, lead to real-world consequences and fields in which they don't. I'm not at all sure how that would apply to academic lawyers. Most of us tend to be litigators (I'm not), and that's a profession much like that of confidence tricksters.

I agree for the most part with Brian and Alan, but on the other hand Razib's Gene Expression post Nerds Are Nuts also rings true.

"One sign that science is not all bogus is that it enables us to do things, like go the moon."

Does this mean that "One sign that the humanities are at least partially bogus is that they don't really enable us to do things, like go to the moon"?

Cynical prof, I don't think the humanities are meant to enable us to go to the moon. They're meant to output good literature. Expecting otherwise is like castigating physicists who can't play the bongos the way Feynman did. So, the proper reason to label postmodernist poseurs as "bogus" is that they cannot output good literature, not that they are incapable of producing more "practical" benefits.

Actually, the humanities people I know used to argue vehemently that the humanities were NOT supposed to output good literature (I seem to remember someone commenting that Nabokov stands almost alone amongst writers in also having been an academic). Rather, they would wave their hands and produce some version of the following:

1) Allowing people to better appreciate existing literature

2) Something about knowledge for its own sake (fair enough, I guess)

3) Preserving critical thought (the kind of critical thought you write long essays with, not the kind you can measure in any way)

4) Defending society from the vulgarity/arrogance of Scientific Thought - think Foucault or Derrida

On public proof...

1) "Order of Bayescraft" not likely to be seen as anything other than a self-help cult, like Scientology or Landmark.

2) A single spectacularly huge public success will be unconvincing and considered just a normal scientific breakthrough or luck.

3) If there existed techniques that could be taught easily, would they not already have been discovered? And what about the more-harm-than-good landmines that exist everywhere?

To call a de-biasing program a success, one or more individuals would have to show repeated scientific breakthroughs and be able to document how, in each case, known biases got in the way of the breakthrough, and how the de-biaser(s) got around those biases and saw the truth more clearly. A dojo would probably help, but "systematically more successful" would not be a sufficient criterion to distinguish it from a self-help school. Systematic breakthroughs would.


The link to Nick's post is broken. Here's a working one: http://www.overcomingbias.com/2007/04/overcoming-bias-what-is-it-good-for.html