Terrorist baby down the well: a look at institutional forces

by Stuart_Armstrong · 1 min read · 18th Mar 2014 · 23 comments



Two facts "everyone knows", an intriguing contrast, and a note of caution.

"Everyone knows" that people are much more willing to invest into cures than preventions. When a disaster hits, then money is no object; but trying to raise money for prevention ahead of time is difficult, hamstrung by penny-pinchers and short-termism. It's hard to get people to take hypothetical risks seriously. There are strong institutional reasons for this, connected with deep human biases and bureaucratic self-interest.

"Everyone knows" that governments overreact to the threat of terrorism. The amount spent on terrorism dwarfs other comparable risks (such as slipping and falling in your bath). There's a huge amount of security theatre, but also a lot of actual security, and pre-emptive invasions of privacy. We'd probably be better just coping with incidents as they emerge, but instead we cause great annoyance and cost across the world to deal with a relatively minor problem. There are strong institutional reasons for this, connected with deep human biases and bureaucratic self-interest.

And both these facts are true. But... they contradict each other. One is about a lack of prevention, the other about an excess of prevention. And there are more examples of excessive prevention: the war on drugs, for instance. In each case we can come up with good explanations as to why there is not enough/too much prevention, and these explanations often point to fundamental institutional forces or human biases. This means that the situation could essentially never have been otherwise. But the tension above hints that these situations may be a lot more contingent than that, more dependent on history and particular details of our institutions and political setup. Maybe if the biases were reversed, we'd have equally compelling stories going the other way. So when predicting the course of future institutional biases, or attempting to change them, take into account that they may not be nearly as solid or inevitable as they feel today.



Seems like politicians are willing to invest in prevention if the prevention is fighting against a human enemy, because then it moves from the "prevention" category to the "war" category.

War against terrorism = soldiers go and kill some foreigners. War against drugs = policemen go and kill some dealers, or arrest some users.

War against flood = ???. Not gonna happen.

And probably the "prevention" aspect is completely irrelevant. You can get votes for being tough on drugs or terrorism, even if your policies do not in fact reduce drug usage or terrorism. The war itself is the product you sell; prevention is just an excuse.

If people believed that a Flood Fairy existed and caused floods, you could score points as a politician by assembling a special team of super fighters to kill the Flood Fairy. That would be exciting. Other ways of preventing floods are boring.

The average voter does not care about rationality, only about killing.

fighting against a human enemy

You've identified where the distinction lies, but missed the reason why there is a distinction.

It is entirely appropriate to take different actions against an agent vs. a force of nature. One can't deter nature, and nature shows no intent. Agency matters. The most dangerous threats to humans are other humans.

(Except for aging, about which people are particularly crazy, but that's a special case.)

The most dangerous threats to humans are other humans.

In what sense? Other humans are certainly not very high on the list of top causes of death.

They are somewhat high on the list of top black swan events, however.

Not as in "murder" but as in omission or acting in self-interest (and tragedy of the commons).

It's not just that. A lot of our ethical injunctions need to be suspended during wartime, so it makes sense to be suspicious of attempts to exploit this loophole by expanding the definition of "war".

I think this is an excellent distinction, +1

That would be exciting

I disagree with the psychology a bit here. It's not that the war is exciting. It's that prevention has costs: taxes, inconvenience, etc. When people feel they're in a state of war, they're very much willing to overlook the inconveniences; when there is no enemy in sight, this does not happen. It feels like "state of war" is a basic (evolutionarily developed) psychological state.

Seems to be a name thing.

The popularity of the War on Poverty waned after the 1960s.

The War on Cancer leads to people getting cancer "prevention" screenings that produce unnecessary operations and don't increase life expectancy.

It's about using violence to cut out the cancer in the early stages. It fits well into the pattern of the other examples.

"Governments overreact to the threat of terrorism" is an overly charitable way of describing it. It's like describing DRM as "software companies overreacting to the threat of piracy". No, it's not. Most of their motive for DRM is other things, such as preventing sales of used product. Piracy is just the official excuse; it's not their real reason.

Governments don't "overreact to terrorism". Governments seize power and divert money to the politically well-connected. Terrorism is just how they justify it to the public.

You may as well claim that Russia in Crimea is "overreacting to the threat of Nazis in Ukraine".

Yep. To quote Rahm Emanuel, Obama's former chief of staff and currently mayor of Chicago,

You never let a serious crisis go to waste. And what I mean by that is, it's an opportunity to do things you think you could not do before.

When we talk about prevention in health care, we are talking about doing something about the root cause.

In terms of terrorism, that means having a foreign policy that doesn't make citizens of other countries want to bomb your country, and internal politics such that no secessionist movement wants to bomb your country.

And both these facts are true. But... they contradict each other.

No. If you have to invest in strong security, that's a sign that you didn't do anything to prevent the threat from arising in the first place.

One is about a lack of prevention, the other about an excess of prevention. And there are more examples of excessive prevention: the war on drugs

The war on drugs doesn't prevent anything. Decriminalization in Portugal didn't increase the amount of drug use. The war on drugs is about punishing people who take drugs or deal drugs; it's not about prevention at all.

There's not enough money spent to prevent the social conditions that lead to drug use.

Humans overestimate risks of hostile action, and underestimate risks of natural phenomena. The base rate of terrorism is tiny compared to that of natural disasters. So yes, we deploy excessive preparation in one case and insufficient preparation in the other.

It's also worth noting that some forms of preparation work against both natural and artificial disasters. For instance, emergency medical training is useful in treating people who've been injured by a terrorist bomb or an earthquake.

Natural phenomena develop more slowly, are more random, usually show no exponential growth, and can be altered less.

Whereas the exponential growth effect of human action mandates a quick reaction.

And both these facts are true. But... they contradict each other.

No, they don't. First, people and institutions are different. Second, institutions have their sets of incentives (not necessarily coherent) which you must take into account when analysing their behaviour. In particular, the reasons they do things and the reasons they say they do things are often different.

For example, I tend to see the war on terror (as well as the war on drugs, by the way) as a government's way of extending and solidifying its control of the population.

For example, I tend to see the war on terror (as well as the war on drugs, by the way) as a government's way of extending and solidifying its control of the population.

And let's not forget the many security contractors getting paid, to achieve the 10% growth that every industry wants for itself.

I made a related argument recently:

A theory that doesn’t account for detailed behavior is an approximation, and even in scientific domains, you can find conflicting approximations. When that happens—and if you’re not doing science, it’s “when,” not “if”—if you want to keep using your approximation, you have to use the details of the situation to explain why your approximation is valid. Your best defense against reductio ad absurdum, against Proving Too Much, is casuistry. Expect things to be complex, expect details to matter. Don’t ascribe intention or agency to abstract concepts and institutions. Look for chains of cause and effect. Look at individual moving parts and the forces acting on them. Make empirical predictions, and look for unintended empirical predictions. Ask what the opposite principle explains, and find the details that make those explanations compatible.

A penny of prevention could be worth a pound of cure, but that doesn't actually mean it's worth it to prevent unless the prevention lowers the chances of the adverse event by at least an absolute probability of (1 penny / 1 pound).
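The break-even condition above can be sketched numerically (the function name and the penny/pound figures are illustrative, assuming 100 pence to the pound):

```python
# Expected-value check: prevention pays off only if the risk reduction it buys
# exceeds (prevention cost) / (cure cost).

def prevention_worthwhile(prevention_cost, cure_cost, risk_reduction):
    """True iff the expected cure cost avoided exceeds the prevention cost."""
    return risk_reduction * cure_cost > prevention_cost

# Break-even absolute risk reduction: 1 penny / 1 pound (100 pence) = 0.01.
break_even = 1 / 100

print(prevention_worthwhile(1, 100, 0.02))   # cuts risk by 2 points: worth it
print(prevention_worthwhile(1, 100, 0.005))  # cuts risk by 0.5 points: not worth it
```

So even a hundredfold cost advantage for prevention is wasted if the prevention only shaves a tiny sliver off the probability of the disaster.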

People are more afraid of things which could kill anyone at any time.

Point to any particular natural disaster, and either the majority of people can say that you'll see it coming hours, days, or weeks in advance, or the majority are not subject to the risk of that disaster at all.

Terrorism on the other hand could technically strike anyone, at any time.

People also give extra weight to human actions. If one person per day in a big city is killed in road accidents, that's no big deal. If one utterly random person a day is shot by a sniper, the whole city will shut down.

Seems like a special case of facts being invertible. Generalize carefully.

In a second-best sort of way, we don't spend too much on terrorism prevention if there are strong political forces that compel politicians to overreact to successful terrorist attacks.

It seems like the War on Terror, etc, are not actually about prevention, but about "cures".

Some drug addiction epidemic or terrorist attack happens. Instead of it being treated as an isolated disaster like a flood, which we should (but don't) invest in preventing in the future, it gets described as an ongoing War which we need to win. This puts it firmly in the "ongoing disaster we need to cure" camp, and so cost is no object.

I wonder if the reason there appears to be a contradiction is just that some policy-makers take prevention-type measures and create a framing of "ongoing disaster" around it, to make it look like a cure (and also to get it done).