First, a scope-limiting definition: "The sunk-cost fallacy occurs when prior investments, rather than future returns, influence decisions about future investments."

Does anyone on this forum know of psychology research around sunk cost fallacies and how to persuade people out of them? It seems straightforward enough to persuade people into them ("We've lost thousands of lives and spent billions of dollars already! If we end the [unwinnable] war now it will all have been for nothing!") but very hard to persuade people out of them.

If you are not aware of any psychological research, please feel free to provide ideas or anecdotes about your own success/lack thereof in persuading others out of their sunk cost thinking.


I'd expect the Kahneman/Tversky work on System 1 and System 2 thinking to apply. Like many cognitive biases, the sunk-cost fallacy tends to be reactive rather than analytical. Try framing the question not as "should we abandon all this work and start on something new?", but as "given these options for future investment, and the state of things today (which includes the impact of past investment), which future world do we prefer to find ourselves in?"
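That reframing can be made mechanical: evaluate every option purely on its future costs and returns, so sunk costs never enter the comparison. A minimal sketch, with entirely made-up numbers for illustration:

```python
# Hypothetical illustration of the reframing: score each option only on
# what happens from today onward. Sunk costs are the same in every
# future world, so they cancel out and never appear in the comparison.

def future_value(expected_return, remaining_cost):
    """Net value of an option counted from today onward."""
    return expected_return - remaining_cost

sunk = 800  # already spent; identical under every option, hence irrelevant

options = {
    "continue":    future_value(expected_return=500, remaining_cost=200),
    "start fresh": future_value(expected_return=900, remaining_cost=650),
}

best = max(options, key=options.get)
print(best, options[best])  # the 800 already spent never entered the math
```

The point is not the arithmetic but the bookkeeping discipline: if a number describes the past, it appears in every branch and therefore cannot change the ranking.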

You should be prepared, though, to find out that they're right. If you seriously consider additional factors (reputation for abandonment, teardown costs, and greater uncertainty in the new investment), it often turns out that the sunk-cost fallacy is an incorrect justification for correct behavior.

That greater uncertainty is a HUGE one. You already know a lot about the old plan; you've made adjustments and have a much clearer view of its trajectory if you continue. You just don't have as much knowledge about the new one, and it's extremely easy to compare the best case of a new investment against the actual case of the previous one. This leads to novelty bias, which can be far worse.

This complexity is what makes the sunk-cost heuristic useful - there's a reason it's built into our System 1 (fast) thinking. It's actually a great heuristic for many things, and it gives the right answer more often than you might expect.

The existence of the "don't throw good money after bad" idiom is indirect evidence that this kind of reframing helps persuade people out of the fallacy, at least in some contexts.
