(Update: Patrick points out the subject of this post is already well-known as the gambler's fallacy. I really should have read Tversky and Kahneman before posting.)

You're flipping a coin 100 times, and the first five throws came up heads. What do you expect on the next throw? If you believe the coin to be fair, you allocate 0.5 credence to each face coming up. If your Bayesian prior allowed for biased coins, you update toward heads and answer something like 0.6 for heads and 0.4 for tails. So far it's all business as usual.
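One standard way to sketch that biased-coin update is a conjugate Beta prior over the coin's bias; the Beta(10, 10) prior here is my assumption, chosen because it happens to reproduce the 0.6/0.4 figure above, not anything the post specifies.

```python
from fractions import Fraction

def predictive_heads(a, b, heads, tails):
    """Posterior predictive P(next flip is heads) under a Beta(a, b)
    prior on the coin's bias, after observing the given counts."""
    return Fraction(a + heads, a + b + heads + tails)

# Beta(10, 10): a prior fairly confident the coin is near fair (assumed).
# After five heads and no tails: (10 + 5) / (20 + 5) = 3/5.
p = predictive_heads(10, 10, heads=5, tails=0)
print(p)  # 3/5, i.e. credence 0.6 for heads
```

A flatter prior such as Beta(1, 1) would update much more aggressively (to 6/7); how far you move toward heads depends entirely on how much prior mass you put near fairness.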

There exists, however, a truly bizarre third possibility that assigns reduced credence to heads. The reasoning goes like this: at the outset we expected about 50 heads and 50 tails; the first five throws have used up some of the available heads, while all 50 tails are still waiting ahead. When presented so starkly, the reasoning sounds obviously invalid, but here's the catch: people use it a lot, especially when thinking about things that matter to them. Happy days are viewed as payback for sad days, rich times for poor times, poor people as suffering because rich people wallow, and of course all of that vice versa.

I initially wanted to dub this the "fallacy of fate" but decided to leave that lofty name available for some equally lofty concept. "Fallacy of scarcity", on the other hand, is actively used but doesn't quite cover all the scenarios I had in mind. So let's call this way of thinking the "fixed sum fallacy", or maybe "counterbalance bias".

Now contrarians would point out that some things in life are fixed-sum, e.g. highly positional values. But other things aren't. Your day-to-day happiness obviously resembles repeatedly throwing a biased coin more than it resembles withdrawing value from a fixed pool: being happy today doesn't decrease your average happiness over all future days. (I have no sources on that besides my common sense; if I'm wrong, call me out.) So we could naturally hypothesize that fixed-sum thinking, when it arises, serves as some kind of coping mechanism. Maybe the economists or psychologists among us could say more; sounds like a topic for Robin?


4 comments

It's known as the Gambler's Fallacy; the representativeness heuristic is thought to be responsible for it.

Great! Thanks. Somehow I managed to miss it when searching.


I was just going to say the same thing. Gambler's Fallacy is a nice name too.

This is related to probability matching: you intuitively expect sequences featuring typical frequencies to play a role in the correct answer. This is also one reason the doomsday argument goes wrong: you expect the future to correct the atypicality of your personal observations.