I don't generally like evolutionary psychology explanations for things, for a variety of reasons, but as long as we're listing them, the obvious one to me seems likely to be crab mentality. Living in a tribe, you would rather nobody have too much power, so people can gain support by pointing out that the other guy has a ton of power, until everyone has roughly equal power. See also Balance of power, and evolutionary explanations for egalitarianism.
I think this is the right response to the piece, but it invites a more explicit challenge of the conclusion that underdog bias is maladaptive (@Garrett Baker offers both pre-modern tribal life and modern international relations as spheres in which this behavior is sensible).
One ought to be careful of the "anti-bias bias": accepting evolutionary explanations for biases, but then making up reasons why they're maladaptive, to fit the (speculative) narrative that the world can be perfected by increasing the prevalence of objectively true beliefs.
I don't think it's due to evolution or material conditions. I think it's cultural and goes back to the rise of Christianity. Pre-Christian stories, like the Greek myths, glorified the strong. Now we glorify the weak.
As an aside, it's a bit of a miracle that the pro-weak worldview became so strong and in many places won outright. It's inherently strange, and the result wasn't a given at all. I'll never stop recommending "The Girl in a Swing" by Richard Adams, which examines this conflict maybe better than any other work, despite being fiction.
Thinking of the pre-Christian body of thought I know best, Plato definitely depicted Socrates as an underdog, and we were always meant to root for him. So I'm skeptical of this Nietzschean dichotomy.
Edit: Hell, if the ancient Greeks were proud of one thing it was how an army of just 300 Spartans, despite all odds, defeated the Persian invasion of the peninsula. If that's not "rooting for the underdog" I don't know what is.
I'm not sure I agree with cousin_it's premise, but I lack any good knowledge to justify a strong disagreement. However, I would note that the 300 Spartans story might be read both ways, and perhaps at the same time. That 300 Spartans could fight off the entire Persian army also suggests they were incredibly strong fighters with strong strategy and tactics.
I think one can make both cases and neither necessarily refutes the other.
The point of an underdog story is that the underdogs do ultimately win (in some sense) at the end. Socrates spreads reason, the Spartans defend Greece, and the poor go to heaven. So the question should be whether the story as presented by the Greeks claims that the Spartans' winning was surprising. Otherwise, we've just completely dissolved what an "underdog story" could possibly mean.
Thermopylae is not a good example here. The Greeks started with ~7,000 troops and absolutely expected to win due to the overwhelming advantage of their defended position.
The rearguard, who stayed behind to cover the retreat, are where "the 300" comes from (though there were actually over 1,200 of them); they were wiped out, except for about 400 who surrendered on the final day.
So not underdogs and didn't win.
I was making a claim about how the story was typically presented by Greeks to Greeks. It could be an underdog story or not in reality, but the fact that they made their position seem more dire in their rhetoric, I think, actually supports my point more.
This conversation uses "underdog" in different ways, giving rise to confusion. Yes, the point of an underdog story is indeed that the underdog wins, but this just makes the heroes of the story more awesome. Ultimately, you empathize with somebody who is super strong.
The OP, however, describes a phenomenon where groups see themselves as weaker and in fact unlikely to win. cousin_it attributes this to weakness being seen as desirable due to Christianity. Socrates is a good counterexample, but the 300 are less so.
> I don't think it's due to evolution or material conditions. I think it's cultural and goes back to the rise of Christianity...
I think basically every time someone has a story like this it's wrong. I don't understand why people seem so eager to blame cultural forces for ubiquitous behavior in this fashion. I guess it makes humans seem more interesting.
I'll try defending his view: We're rewarding victimhood and humility more than ever before, and in the West, the main reason behind this change in values has been Christianity.
The leap from "we're rewarding weakness" to "we see others as stronger than they are" is not trivial.
I'm not saying this view is necessarily true, but I don't think it's unreasonable either. It's also my understanding that strength was much more valued in the past, but I don't know enough ancient history to judge the extent to which this is true. It might fluctuate or vary between continents.
Are trickster myths a type of underdog narrative? They typically show cleverness, courage, and ingenuity winning out over brute strength and established authority. Could they be viewed as a form of cognitive training? My impression is that they are widespread in non-Christian cultures.
Greek mythology also shows underdog preference. Sure, the heroes are demigods, but they're battling against gods, monsters, and mighty forces of nature. The Greek heroes are strong but usually the forces they contend with are even stronger.
Generally speaking, "the hero starts with bad odds" makes for better stories. Perseus is the underdog more in the sense that he hasn't yet proven himself - nobody knows what being Perseus means before he's defeated a legendary monster.
But then again, it's not like those stories disappeared from Christian culture either. I guess the knights from the epic chansons de geste are humble in the sense that they submit to God, but they are still strong and brave warriors who put down all sorts of supernatural evil beings.
> I think it's cultural and goes back to the rise of Christianity.
This seems testable with a cross-cultural analysis. Not just the pre-Christian Greek stories that Garrett mentioned, but Chinese, Japanese, Indian, and Middle Eastern cultures should have plenty of non-Christian stories.
Isn't there a bit of a false equivalence tucked into the logic here? Two sides could be equally scared of one another and both feel like underdogs, but that says nothing about who is correct to think that way. Sometimes people just are the underdog. People unable to use democracy to enact change versus elites who consider them dangerous is a good example. The masses in that case are definitely the underdog, as they threaten the status quo of every major power centre (often state, corporations, politicians, and elite institutions all at once). In many European countries, certainly, it is unclear that the masses can influence policy very much at all right now. They feel like underdogs because they are. I am sure the elites also feel that they are underdogs... they're just wrong.
There's always the third option: everyone thinks the other side is more in control than they are, no one is actually in control in anything like a satisfactory way. Each individual feels part of a system that narrows their choices so much they have basically no agency at all. The system churns on all the same, while everyone seemingly hates it (or at least would like something better).
In a stereotypical old-west gunfight, one fighter is more experienced and has a strong reputation; the other fighter is the underdog and considered likely to lose. But who's the underdog of a grenade fight inside a bank vault? Both sides are overwhelmingly likely to lose.
At least one side of many political battles believes they're in a grenade fight, where there's little or nothing they can do to prevent the other side from destroying a lot of value, and could reasonably feel like an underdog even if they have a full bandolier of grenades and the other side has only one or two.
This feels like a good example of the exact point being made by the essay.
The rise to power of populist politicians and the historic presence of violent revolutions could be a strong counterpoint to your assertion. Yes, sometimes it feels like democracies are the underdog when stacked up against powerful lobbyists, but ultimately there's a big power imbalance here that the elites are absolutely correct to fear: lobbyists are absolutely dependent on democratic institutions to leverage their wealth into political power, while 50,000 angry people with pitchforks are not. When the mob, or a mob-empowered leader, decides to bypass democratic institutions in the exercise of power, this asymmetry matters.
Whether or not the revolting populace actually gets what they want out of rebelling (historically this would be unexpected), it's a difficult case to make that they don't have some significant advantages in the games elites actually care about.
I would disagree fairly strongly with this:

> lobbyists are absolutely dependent on democratic institutions to leverage their wealth into political power, while 50,000 angry people with pitchforks are not
They are, I think. If they are angry that democracy is ignoring them, then their pitchforks will likely not manage to enact some complicated change to legislation needed to fix the problem, as you point out. If we care about the power to actually change the things people want changed, that power is vested almost entirely in the hands of the elite, not in pitchforks. Pitchforks could maybe scare elites into acting, but more likely they just generate chaos, because pitchforks are not the tool for the job. The tools for the job are held by the elites, and they refuse to use them accordingly.
I'm living through this day by day here in Britain. People protest all over the country every day and the government, despite knowing which positions have majority support, just do the opposite continuously and use every mechanism available to delay or obfuscate meaningful change.
Interesting, I wonder what you think of this:
My sense is that there is an underlying psychological dynamic going on, where people tend to judge their current state relative to some "absolute victory" state:
* Two countries who are in total war with each other are both in a precarious state. Each of them judges the current state relative to "I have completely won and subjugated the opponent". Relative to this total victory state they feel very precarious.
* Islam is doing quite well globally. Islam "has" 1/4th of the global population, and it is increasing in members the fastest out of the major religions. Yet I suspect that, for at least some Muslims, this feels very unsatisfactory because according to them Islam "should" have ~100% of followers worldwide (of course there are many other factors going on here). Relative to this total victory state, the current situation of merely 1/4th of the global population seems extremely weak.
* People in the AI x-risk community want ~everyone or ~every-serious-person to take AI x-risk seriously, rather than mock or ignore it. People who think it's all a big dangerous distraction want ~no-one to take it seriously and it to never get any serious news coverage. Relative to these diametrically opposed total victory conditions, both sides feel the other side has "too much" influence, and their own side seems too weak.
What do you think of this idea, that the sense of being an underdog is (often) downstream of a prior feeling of your side being weak/threatened relative to a total-victory condition? And that this causes a distorted picture in which your side appears objectively weak relative to the other side?
The purpose of "underdog bias" is nearly the opposite of your best guess. It is because conflicts are too complicated for most people to model, and optional to get into. Even after several million years of evolution making brains smarter, humans still usually fail to see more than zero turns ahead in very simple games like Risk (e.g., if I break his bonus, and he goes right after me... well I can break his bonus now! Let's do it!). If you can't accurately model the effects of starting a conflict, but you're also prone to getting into conflicts you think you can win (thanks evolution), the best hack is to make you believe you won't win.
Why do I believe this? Well I've seen this evolution in Risk. Newer players will overattack, using all their troops on the first few turns to take territories and break bonuses. They slowly evolve into turtles, keeping all their troops in one stack blocked by their own territories so they couldn't do anything even if they wanted to, and only ever attacking one territory at a time. This is where most players stop their evolution, because after learning zeroth-order heuristics like, "the world is scary, better be super conservative," the only way to further progress is to start modelling conflicts more than zero turns ahead.
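To make the "zero turns ahead" point concrete, here's a minimal sketch in Python. The payoff numbers and the "worst counterattack" model are invented for illustration (this is not a Risk engine); it just contrasts a zero-turn evaluation with one that looks a single reply ahead:

```python
# Toy model of the over-attacking mistake: each attack has an immediate
# gain, plus a set of possible counterattack losses it exposes you to.
# All numbers are invented for illustration.

def zero_turn_value(attack):
    # How a new player evaluates: only the immediate payoff counts.
    return attack["gain"]

def one_turn_value(attack):
    # One step deeper: subtract the opponent's best reply to this attack.
    return attack["gain"] - max(attack["counter_losses"], default=0)

attacks = [
    {"name": "break his bonus", "gain": 5, "counter_losses": [8]},
    {"name": "quiet expansion", "gain": 2, "counter_losses": [1]},
]

print(max(attacks, key=zero_turn_value)["name"])  # break his bonus
print(max(attacks, key=one_turn_value)["name"])   # quiet expansion
```

The zero-turn player breaks the bonus (gain 5 beats gain 2) and gets hit back for 8; the one-turn player sees that and expands quietly instead.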
Seems weird to posit that evolution performed a hack to undermine an instinct that was, itself, evolved. If getting into conflicts that you think you can win is actually bad, why did that instinct evolve in the first place? And if it's not bad, why did evolution need to undermine it in such a general-purpose way?
I can imagine a story along the lines of "it's good to get into conflicts when you have a large advantage but not when you have a small advantage", but is that really so hard to program directly that it's better to deliberately screw up your model of advantage just so that the rule can be simplified to "attack when you have any advantage"? Accurate assessment seems pretty valuable, and evolution seems to have created behaviors much more complicated than "attack when you have a large advantage".
I agree that humans aren't very good at reasoning about how other players will react and how this should affect their own strategy, but I don't think that explains why they would have evolved one strategy that's not that vs another strategy that's not that.
(Also, I don't think Risk is a very good example of this. It's a zero-sum game, so it's mostly showing relative ability, not absolute ability. Also, the game is far removed from the ancestral environment and sends you a lot of fake signals (the strategies appropriate to the story the game is telling are mostly not appropriate to the abstract rules the game actually runs on), so it seems unsurprising to me that humans would tend to be bad at predicting the behavior of other humans in this context. The rules are simple, but that's not the kind of simplicity that would make me expect humans-without-relevant-experience to make good predictions about how things will play out.)
Millions of years ago, the world was pretty much zero sum. Animals weren't great at planning, such as going back for reinforcements or waiting months to take revenge, so fights were brief affairs determined mostly by physical prowess, which wasn't too hard to predict ahead of time. It was relatively easy to tell when you can get away with bullying a weaker animal for food, instead of hunting for your own.
When humans come along, with tools and plans, there is suddenly much less common knowledge when you get into a fight. What allies does this other human have to call upon? What weapons have they trained in? If they're running away, are they just weaker, or are they leading you into a trap? If you actually can win the fight, you should take it, but the variance has shot up due to the unknowns, so you need a higher expected chance of winning if you don't want an unlucky roll to end your life. If you enter fights whenever you instinctively feel you can win, then you will evolve to lower this instinctual confidence.
Agree that other players having tools, social connections, and intelligence in general all make it much harder to judge when you have the advantage. But I don't see how this answers the question of "why create underdog bias instead of just increasing the threshold required to attack?"
Strong disagree on the ancient world being zero-sum. A lion eating an antelope harms the antelope far more than it helps the lion. Thog murdering Mog to steal Mog's meal harms Mog far more than it helps Thog. I think very little in nature is zero-sum.
Underdog strategies often involve innovation. (OP mentions David vs Goliath as a classic underdog narrative; another commenter points out that David's sling & stones were a superior strategy in that combat.) Favoring underdogs could increase the larger group's exposure to innovation, indirectly helping it learn and adapt to new conditions.
You usually don't hear about the underdogs who are clearly losing. And if you do, it's either someone laughing at them, or someone crying at their funeral.
So the fact that you are hearing about the underdog, and you are invited to support their cause, suggests that their chances are good. Seems like a good moment to join the future winning side while you can still get some credit for doing so.
When you're budgeting resources, conflicts with adversaries are a little different from other categories of expense, which are largely determined by your own consumption habits or, if put at risk by unexpected changes in nature or the economy, are more or less random rather than actively trying to thwart you. In a conflict, you always want to be conservative in estimating the resources you need (something obvious in any book on military logistics), and being conservative requires overestimating what your opponent can do and underestimating how far your current resources will actually go. If you weren't conservative, you could put more resources towards other things (guns vs. butter debates), but being conservative is probably more evolutionarily fit than being more accurate in that estimation, as the conservative planner will be more prepared in unexpected situations.
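As a minimal simulation sketch of that fitness claim (Python, with invented numbers: opponent capability drawn from a normal distribution, "survival" meaning you budgeted at least that much), the conservative planner survives unexpected draws far more often, at the cost of over-budgeting on average:

```python
import random

random.seed(0)

def survival_rate(budget, trials=100_000):
    # Opponent capability is uncertain: here, normal with mean 10, sd 3.
    # You "survive" a conflict iff your budgeted resources cover it.
    return sum(random.gauss(10, 3) <= budget for _ in range(trials)) / trials

print(survival_rate(budget=10))  # accurate planner (budgets the mean): ~0.50
print(survival_rate(budget=13))  # conservative planner (mean + 1 sd):  ~0.84
```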
A few thoughts on this:
One issue with underdog narratives nowadays is that they tend to be applied to large groups of hundreds of thousands (or millions) of people. Even if there are general statistical truths, by their very nature those large groups still tend to be very diverse and dynamic at the individual level. And the most salient members of those groups tend to be the rich and powerful elites, whom the average Joe is comparing themselves to.
"My group" = all the normal hard working people in my personal life
"Their group" = the rich and powerful elites I see on Television or in the news
But of course the perspective is the exact same for the average Joe of the other group! Their group is all the normies in their life while your group is the elites of your side mentioned on TV and in the news. My left wing father would talk about the Koch Brothers and other right wing millionaires/billionaires/elites and some of the right wing adults in my life would mention people like George Soros and other left wing millionaires/billionaires/elites.
I don't know if this is a big part of the explanation, but I do think it's a meaningful part at least.
> The hostile media effect is particularly striking because it arises in settings where there’s relatively little scope for bias. People watching media clips and sports are all seeing exactly the same videos. And sports in particular are played on very even terms, where fairness just means enforcing the rules impartially.
If your beliefs about the world = base reality, then any straying away from your beliefs is inherently going to be interpreted as biased against (your) reality. We can all generally agree on the obvious stuff, like when a tennis ball is clearly outside the lines, but if it just skirted the paint and it's hard to really tell, then motivated reasoning starts to kick in, and your reality is whatever you want it to be.
And you don't see "ok, your tennis ball skirted the line, but I think you got it in" as biased towards you; you just see it as them making the obviously correct acknowledgment of the world. Each ruling in your favor is a ruling that's just going with obvious truth, and each ruling against you is a biased denial of facts.
> Say to yourself: “they're just as scared of us as we are of them.” It’s true far more often than you think.
Also applicable to many interpersonal conflicts.
Isn’t everyone the underdog at the same time? Whatever core ideology or religion you follow in life, if you are at all power-seeking (like the people you’ve listed) then your ideal world might include literally everyone following the same ideology, and the people crowning you world leader as a result. Relative to this, everyone has way less power and influence than they wish they had.
I'm very uncomfortable with this piece.
> cultural elites (e.g. top university graduates) control almost all major institutions
> the most elite groups (like billionaires or Jews)
I thought we were done with blaming Jews for controlling almost all major institutions. I understand you are trying to give a two-sided view of things, but this paragraph comes across as propagating old libels that should have been laid to rest by now.
If you want a specific example of why this portrayal of Jews as elites is problematic and painting a false picture of reality, there are no poor powerless billionaires. There are plenty of poor powerless Jews.
> The most elite groups (like billionaires or Jews)
This quote looks pretty bad, but...
> The most elite groups (like billionaires or Jews) are often the ones it’s most socially acceptable to blame for problems, or even call for violence against.
Now, you could maybe still critique this quote, but it reads very differently than when you cut the sentence off immediately!
Not a criticism but I thought this post was going to be about people having a bias towards supporting the underdog, whoever they perceive that to be. I think this bias also exists, though it's not universal.
These go together. People support the underdog, therefore everyone claims to be the underdog to get support. But there's a contradiction there. "Help us, we're losing!" isn't much of an appeal, especially when you have to keep on professing to be losing even when you're winning.
Underdogs lose. If you win, you weren't the underdog. David with ranged weaponry was stronger than Goliath who knew only close combat.
> Underdogs lose. If you win, you weren't the underdog.
Is it not more like, p(underdog_loses) > 0.5? Sometimes the thing with lesser probability happens even if the prediction was well-calibrated.
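A quick simulation of the calibration point (Python; the win probability is invented): a correctly rated underdog still wins a substantial minority of the time, so a win doesn't retroactively prove the label was wrong.

```python
import random

random.seed(1)

P_UPSET = 0.3  # the underdog's true, well-calibrated win probability (invented)
matches = 100_000
upsets = sum(random.random() < P_UPSET for _ in range(matches))
print(upsets / matches)  # ~0.3: the underdog wins roughly 3 matchups in 10
```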
Boiling this down for myself a bit, I want to frame this as a legibility problem: we can see our own limitations, but outsiders' successes are much more visible than their limitations.
Curated! I found the evopsych theory interesting but (as you say) speculative; I think the primary value of this post comes from presenting a distinct frame by which to analyze the world, one which I, and probably many readers, either didn't have distinctly carved out or didn't have as part of our active toolkit. I'm not sure if this particular frame will prove useful enough to make it into my active rotation, but it has the shape of something that could, in theory.
I found this very thought provoking. With the big caveat that I don't know a lot about this, a question came to mind by the end of the post:
At a personal/individual level, how do you distinguish your underdog bias from your imposter syndrome?
Some other reasons this happens:
In pluralistic alliances (the strongest kind) or in egalitarian societies (common in the evolutionary background), the senate conspires to prevent any faction from getting too much power, or a kind of power that can be used to permanently entrench itself. So, how do you get power, in that kind of situation? By arguing that you have too little power and that others have too much. So naturally everyone spends all of their time and energy doing that.
There was probably also an evolutionary pressure to be paranoid about your opponents' hidden advantages: they always have more of them than you can understand.
There's a tendency for powerful people to appoint meek people as their successors, or to key positions, because the meek are non-threatening. Then the higher-ups die, the new appointees stop being so meek, and people forget that they ever were meek. But they were.
So at high positions, everyone has an incentive to be meek, and maybe that adds up to an effect where the organisation as a whole becomes meek. I.e., an aspiring leader is eager to commit to binding their prospective power, making it unable to reverse the decisions of the previous leader, knowing that this will make the leader more willing to appoint them. They get their wish. Now the organisation as a whole is actually less powerful; there are things it can't do. Over successive transitions, this will lead to organisations that are only able to do things the dominant morality (or at least the morality shared by a long succession of leaders) already wants them to do. If this goes on for long enough (so far it never has, afaik), they will have roughly no power; they'll just be executors of a contingent set of social principles. And I guess that was one hope as to where social progress could have come from.
Regarding where underdog bias comes from, I think this study may hold a clue:
https://paulbays.com/pdf/SheBayFri03.pdf
It shows that people consistently underestimate the physical forces they apply as opposed to the ones applied to them, or, in the researchers' words "self-generated forces are perceived as weaker than externally generated forces of the same magnitude, which arises from a predictive process in which the sensory consequences of the movement are anticipated and partially removed from the perception".
Predictive Processing theory explains this finding and others (such as people not generally being able to tickle themselves) by proposing that the brain cancels out stimuli it can predict.
I think a general version of this may be at work here. People underestimate the impact of their own actions and strongly feel the impact of others' actions. They can each truthfully say, "from my own frame of reference, I was perfectly stationary when that guy crashed into me".
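Here's a minimal sketch of that cancellation mechanism (Python; the attenuation factor and force values are invented for illustration, not taken from the paper):

```python
def perceived_force(actual, predicted, attenuation=0.7):
    # Predictive-processing toy model: the brain subtracts a fraction of
    # whatever it predicted from the incoming signal before perception.
    return actual - attenuation * predicted

# A self-generated force is fully predicted, so it feels attenuated...
print(perceived_force(actual=10, predicted=10))  # 3.0
# ...while an external force of the same magnitude is felt in full.
print(perceived_force(actual=10, predicted=0))   # 10.0
```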
It is unclear to me that the described phenomenon exists to the degree assumed. If two equally powerful countries or sports teams battle each other, each group of supporters will, on average, believe their side is likelier to win.
I think another example of both sides thinking they are the underdog are environmentalists versus nuclear/agricultural (GMOs, pesticides, and artificial fertilizers)/fossil fuel companies.
I'm not convinced that there is that deep a phenomenon here. Many of the examples are current, and may be explained just as well by current ideologies that focus on "power dynamics" or want everyone to have equal power. Or by the somewhat plausible heuristic that groups are amoral and will therefore use their power unfairly.
Also, while the pro-Israel map is plausibly about being the underdog, I'm not sure that that is the point of the pro-Palestinian one. I think that one tries to say that they were robbed of what was rightfully theirs, not that they are weaker. And even the pro-Israel map is very likely about "Arabs already have enough land"; if it were about power, Iran and other non-Arab Muslim countries would be included.
A more trivial explanation for the 'trait', which I find plausible:
Past: The world was simple; we used to know our tribesmen's options almost as intimately as our own. Hence our bias: 'The constraints that apply are those we know.'
Today: The world is too complex! There is a huge epistemic asymmetry between our own and others' constraints: a ton of types of things we know about our own limitations but not about others'. Yet we still implicitly believe: 'The constraints that apply are (constrained to) those we know.' So we feel like the underdog simply because we're not evolutionarily trained to see 'the devil in the detail'.*
(*This seems in line with the even broader phenomenon of failing to imagine the 'devil in the detail' before we actually experience it; whether this fits by chance or because it actually has a related origin, I dare not speculate too much.)
I think it's appropriate to look for an evo psych explanation, but if you're going to do so, try to observe how other animals react to underdogs. Chimps probably can't construct narratives, but they can still identify underdogs and overdogs. I don't know what you'd learn by studying them, but I think it's where you'd want to begin. Not with first principles speculation.
People very often underrate how much power they (and their allies) have, and overrate how much power their enemies have. I call this “underdog bias”, and I think it’s the most important cognitive bias to understand in order to make sense of modern society.
I’ll start by describing a closely-related phenomenon. The hostile media effect is a well-known bias whereby people tend to perceive news they read or watch as skewed against their side. For example, pro-Palestinian students shown a video clip tended to judge that the clip would make viewers more pro-Israel, while pro-Israel students shown the same clip thought it’d make viewers more pro-Palestine. Similarly, sports fans often see referees as being biased against their own team.
The hostile media effect is particularly striking because it arises in settings where there’s relatively little scope for bias. People watching media clips and sports are all seeing exactly the same videos. And sports in particular are played on very even terms, where fairness just means enforcing the rules impartially.
But most possible conflicts are much less symmetric, both in terms of what information each side has, and even in terms of what game each side is playing. Consider, for instance, an argument about whether big corporations have too much power. The proponent might point to corporations’ wealth, employee talent, and lobbying ability; their opponent might point to how many regulations they have to follow, how much corporations compete between themselves, and how strong anti-corporate public sentiment is. In order to evaluate a question like this, people need to decide both how to draw coalition boundaries (to what extent should big corporations be counted as a single unified group?) and how to weigh different types of power against each other.
I think that biases in how these weightings and boundaries are evaluated are a much bigger deal than biases in evaluating fairness in isolated contexts. Specifically, I think that people typically underrate the types of power they have, and overrate the types of power their opponents have. You’re intimately familiar with the limitations of your own abilities—you run into them regularly, often in deeply frustrating ways. You track all the fractures inside your own coalition, and they often seem fundamental and intractable. Conversely, it’s easy to forget about the things which are much easier for you than for your opponents, and to view their internal rivalries as temporary and easily-resolved.
These effects are exacerbated by information asymmetries, aka the “fog of war”. You know who’s working with you; you don’t know who’s working against you. When outside observers sympathize with your side, you know that they’re not actually contributing very much to your cause; when outside observers sympathize with your opponents, you don’t know if that’s a sign of enmity. Similarly, you know how your own plans are progressing, but you don’t know what your opponents are scheming. To see how strong this effect can be, just look at fiction, where villains often implement arbitrarily-complicated schemes offscreen without breaking suspension of disbelief.
In addition to the hostile media effect, underdog bias is related to a number of other biases (like hostile attribution bias, siege mentality, the fundamental attribution error, and simple tribalism). But hopefully the description above conveys why I think it’s fundamental enough to be worth separating out.
In this section I give six examples of conflicts where each side thinks (with some justification) of themselves as the underdog, and rejects the idea of their opponents being the underdogs. I won’t try to defend them in detail, but they hopefully convey the pervasiveness of underdog bias.
The two maps below are sometimes shown by supporters of each side as a way of conveying how much of an underdog their side is. (To be clear, I’m not endorsing either of them, just illustrating the rhetorical strategies being used.)
Underdog bias doesn’t imply that any of these groups are wrong to be scared of their opponents’ power. But it does suggest that they’ll tend to underestimate their own power and, crucially, underestimate how scared their opponents are of them. (The applications of this principle to the politics of AI are left as an exercise for the reader.)
If underdog bias makes us so wrong about the world, why is it such a strong psychological effect? Some cognitive biases are just clear-cut mistakes, but we should expect that the strongest “biases” were evolutionarily adaptive in some way.
The descriptions I gave above suggest that there are qualitative differences in the types of reasoning about ourselves and our enemies—roughly corresponding to near vs far mode. In near mode we focus on concrete, nuanced details about our local situation. In far mode, we construct larger-scale narratives, in more black-and-white terms, often for the sake of signaling to others.
Why might signaling that you’re the underdog be more important than having accurate beliefs? One possibility is to gain allies. Vandello et al. have a few studies on the effects of appearing to be the disadvantaged side. Scott Alexander summarizes their conclusions as follows: “if you get yourself perceived as the brave long-suffering underdog, people will support your cause and, as an added bonus, want to have sex with you”. And in this post he points to the longstanding prevalence of underdogs in narratives (stretching back to myths like that of David and Goliath).
But he also recognizes that there’s a big difference between reported and actual support. All else equal, underdogs are pluckier and their victories more impressive—so it makes sense that we support them on a narrative level, when there’s no cost to doing so. But in real life supporting the underdog means that you’re on the side most likely to lose. Whether or not underdog bias is beneficial for gaining allies will therefore depend on whether those allies are more concerned about being (or looking) virtuous, or more concerned about actually winning. And while the modern era is dominated by virtue signaling, that was much less true in the ancestral environment, where resources were much scarcer. In that setting you’d instead expect people to have “overdog bias” which makes them overestimate their own side’s strength (which is one way that tribalism, patriotism, etc. can be interpreted).
Another possibility is that underdog bias is most valuable as a way of firing up your own supporters. I see this in action whenever I accidentally give my email to a political candidate, and get bombarded with emails about how I need to donate because the other side is on the verge of an overwhelming victory. But again, there’s a missing link: why should fear make you fight harder? On a rational agent model, being the underdog could make you decide fighting isn’t worth it—or even make you defect to the enemy. And on an emotional level, being scared makes it much harder to think clearly or navigate complicated situations.
So my best guess is that underdog bias was useful because ancestral conflicts were simple and compulsory. In other words, our political intuitions are calibrated for a world where alliances are more tribal—where we don’t have freedom of movement or freedom of association. People used to be stuck with their family/tribe/ethnic group whether they liked it or not; if they tried to ally with another, they were often rejected, or at best permanently viewed as an untrustworthy outsider. So the only rational response to being in a worse position would be to fight harder, using fairly straightforward and intuitive strategies.
In one way, this response is maladaptive in the modern world—where fewer battle lines are based on immutable characteristics or irreconcilable differences, and the best way to approach conflict is less intuitive. Yet as I mentioned above, the modern world is also much more sympathetic to victims. This suggests that underdog bias may have gradually transitioned from a way of firing up one’s supporters, to something closer to a victim complex aimed at evoking sympathy from onlookers.
This whole section has been very speculative, and I’m still not confident in my answer to where underdog bias comes from. But we don’t need an explanation of underdog bias to believe that it corrupts many people’s thinking about complex issues. How can you actually reduce your underdog bias, though? The best approach I’ve found is simple in theory (though devilishly difficult in practice). Say to yourself: “they're just as scared of us as we are of them.” It’s true far more often than you think.