Two facts "everyone knows", an intriguing contrast, and a note of caution.
"Everyone knows" that people are much more willing to invest in cures than in prevention. When a disaster hits, money is no object; but trying to raise money for prevention ahead of time is difficult, hamstrung by penny-pinchers and short-termism. It's hard to get people to take hypothetical risks seriously. There are strong institutional reasons for this, connected with deep human biases and bureaucratic self-interest.
"Everyone knows" that governments overreact to the threat of terrorism. The amount spent on terrorism dwarfs spending on comparable risks (such as slipping and falling in your bath). There's a huge amount of security theatre, but also a lot of actual security and pre-emptive invasions of privacy. We'd probably be better off just coping with incidents as they emerge; instead we impose great annoyance and cost across the world to deal with a relatively minor problem. There are strong institutional reasons for this, connected with deep human biases and bureaucratic self-interest.
And both these facts are true. But... they contradict each other. One is about a lack of prevention, the other about an excess of prevention. And there are more examples of excessive prevention: the war on drugs, for instance. In each case we can come up with good explanations for why there is too little or too much prevention, and these explanations often point to fundamental institutional forces or human biases. Taken at face value, they imply the situation could essentially never have been otherwise. But the tension above hints that these situations may be far more contingent than that: more dependent on history and on the particular details of our institutions and political setup. Maybe if the biases were reversed, we'd have equally compelling stories going the other way. So when predicting the course of future institutional biases, or attempting to change them, take into account that they may not be nearly as solid or inevitable as they feel today.