Author's note: This essay was written as part of an effort to say more of the simple and straightforward things loudly and clearly, and to actually lay out arguments even for concepts which feel quite intuitive to a lot of people, for the sake of those who don't "get it" at first glance. If your response to the title of this piece is "Sure, yeah, makes sense," then be warned that the below may contain no further insight for you.
Premise 1: Deltas between one’s beliefs and the actual truth are costly in expectation
(because the universe is complicated and all truths interconnect; because people make plans based on their understanding of how the world works and if your predictions are off you will distribute your time/money/effort/attention less effectively than you otherwise would have, according to your values; because even if we posit that there are some wrong beliefs that somehow turn out to be totally innocuous and have literally zero side effects, we are unlikely to correctly guess in advance which ones are which)
Premise 2: Humans are meaningfully influenced by confidence/emphasis alone, separate from truth
(probably not literally all humans all of the time, but at least in expectation, in the aggregate, for a given individual across repeated exposures or for groups of individuals; humans are social creatures who are susceptible to e.g. halo effects when not actively taking steps to defend against them, and who delegate and defer and adopt others’ beliefs as their tentative answer, pending investigation (especially if those others seem competent and confident and intelligent, and there is in practice frequently a disconnect between the perception of competence and its reality); if you expose 1000 randomly-selected humans to a debate between a quiet, reserved person outlining an objectively correct position and a confident, emphatic person insisting on an unfounded position, many in that audience will be net persuaded by the latter and others will feel substantially more uncertainty and internal conflict than the plain facts of the matter would have left them feeling)
Therefore: Overconfidence will, in general and in expectation, tend to impose costs on other people, above and beyond the costs to one’s own efficacy, via its predictable negative impact on the accuracy of those other people’s beliefs, including further downstream effects of those people’s beliefs infecting still others’ beliefs.
I often like to think about the future, and how human behavior in the future will be different from human behavior in the past.
In “Might Disagreement Fade Like Violence?”, Robin Hanson posits an analogy between the “benefits” of duels and fights, as described by past cultures, and the benefits of disagreement as presently described by members of modern Western culture. He points out that foreseeable disagreement, in its present form, doesn’t seem particularly aligned with the goal of arriving at truth, and envisions a future where the other good things it gets us (status, social interaction, a medium in which to transmit signals of loyalty and affiliation and intelligence and passion) are acquired in less costly ways, and disagreement itself has been replaced by something better.
Imagine that we saw disagreement as socially destructive, to be discouraged. And imagine that the few people who still disagreed thereby revealed undesirable features such as impulsiveness and ignorance. If it is possible to imagine all these things, then it is possible to imagine a world which has far less foreseeable disagreement than our world, comparable to how we now have much less violence than did the ancient farming world.
When confronted with such an imagined future scenario, many people today claim to see it as stifling and repressive. They very much enjoy their freedom today to freely disagree with anyone at any time. But many ancients probably also greatly enjoyed the freedom to hit anyone they liked at any time. Back then, it was probably the stronger better fighters, with the most fighting allies, who enjoyed this freedom most. Just like today it is probably the people who are best at arguing to make their opponents look stupid who most enjoy our freedom to disagree. Doesn’t mean this alternate world wouldn’t be better.
Reading Hanson’s argument, I was reminded of a similar point made by a colleague, that the internet in general and Wikipedia in particular had fundamentally changed the nature of disagreement in (at least) Western culture.
There is a swath of territory in which the least-bad social technology we have available is “agree to disagree,” i.e. each person thinks that the other is wrong, but the issue is charged enough and/or intractable enough that they are socially rewarded for choosing to disengage, rather than risking the integrity of the social fabric trying to fight it out.
And while the events of the past few years have shown that widespread disagreement over checkable truth is still very much a thing, there’s nevertheless a certain sense in which people are much less free than they used to be to agree-to-disagree about very basic questions like “is Brazil’s population closer to 80 million or 230 million?” There are some individuals who choose to plug their ears and deny established fact, but even when these individuals cluster together and form echo chambers, they largely aren’t given social license by the population at large—they are docked points for it, in a way that most people generally agree not to dock points for disagreement over murkier questions like “how should people go about finding meaning in life?”
Currently, there is social license for overconfidence. It’s not something people often explicitly praise or endorse, but it’s rarely substantively punished (in part because the moment when a person reaps the social benefits of emphatic language is often quite distant from the moment of potential reckoning). More often than not, overconfidence is a successful strategy for extracting agreement and social support in excess of the amount that an omniscient neutral observer would assign.
(Evidence? [Gestures vaguely at everything.] I confidently assert that clear and substantial support for this claim exists and is not hard to find (one extremely easy example is presidential campaign promises; we currently have an open Guantánamo Bay facility and no southern border wall), but I'm leaving it out to keep the essay relatively concise. I recommend consciously noting that the assertion has been made without being rigorously supported, and flagging it accordingly.)
Note that the claim is not “overconfidence always pays off” or “overconfidence never gets punished” or “more overconfidence is always a good thing”! Rather, it is that the pragmatically correct amount of confidence to project, given the current state of social norms and information flow, is greater than your true justified confidence. There are limits to the benefits of excessively strong speech, but the limits are (apparently) shy of e.g. literally saying, on the record, “I want you to use my words against me, [in situation X I will take action Y],” and then doing the exact opposite a few years later.
Caveat 1: readers may rightly point out that the above quote and subsequent behavior of Lindsey Graham took place within a combative partisan context, and is a somewhat extreme example when we’re considering society-as-a-whole. Average people working average jobs are less likely to get away with behavior that blatant. But I’m attempting to highlight the upper bound on socially-sanctioned overconfidence, and combative partisan contexts are a large part of our current society that it would feel silly to exclude as if they were somehow rare outliers.
Caveat 2: I've been equivocating between epistemic overconfidence and bold/unequivocal/hyperbolic speech. These are in fact two different things, but they are isomorphic in that you can convert any strong claim such as Graham’s 2016 statement into a prediction about the relative likelihood of Outcome A vs. Outcome B. One of the aggregated effects of unjustifiably emphatic and unequivocal speech across large numbers of listeners is a distortion of those listeners’ probability spread—more of them believing in one branch of possibility than they ought, and than they would have if the speech had been more reserved. There are indeed other factors in the mix (such as tribal cohesion and belief-as-attire, where people affirm things they know to be false for pragmatic reasons, often without actually losing sight of the truth), but the distortion effect is real. Many #stopthesteal supporters are genuine believers; many egalitarians are startled to discover that the claims of the IQ literature are not fully explained away by racism, etc.
In short, displays of confidence sway people, independent of their truth (and often, distressingly, even independent of a body of evidence against the person projecting confidence). If one were somehow able to run parallel experiments in which 100 separate pitches/speeches/arguments/presentations/conversations were each run twice, the first time with justified confidence and emphasis and the second with 15% "too much" confidence and emphasis, I would expect the latter set of conversations to be substantially more rewarding for the speaker overall. Someone seeking to be maximally effective in today’s world would be well advised to put nonzero skill points into projecting unearned confidence—at least a little, at least some of the time.
This is sad. One could imagine a society that is not like this, even if it’s hard to picture from our current vantage point (just as it would have been hard for a politician in Virginia in the early 1700s to imagine a society in which dueling is approximately Not At All A Thing).
I do not know how to get there from here. I am not recommending unilateral disarmament on the question of strategic overconfidence. But I am recommending the following, as preliminary steps to make future improvement in this domain slightly more likely:
0. Install a mental subroutine that passively tracks overconfidence...
...particularly the effects it has on the people and social dynamics around you (since most of my audience is already informally tracking the effects of their own overconfidence on their own personal efficacy). Gather your own anecdata. Start building a sense of this as a dynamic that might someday be different, à la dueling, so that you can begin forming opinions about possible directions and methods of change (rather than treating it as something that shall-always-be-as-it-always-has-been).
1. Recognize in your own mind that overconfidence is a subset of deceit...
...as opposed to being in some special category (just as dueling is a subset of violence). In particular, recognize that overconfidence is a behavioral pattern that people are vulnerable to, and can choose to indulge in more or less frequently, as opposed to an inescapable reflex or inexorable force of nature (just as violence is a behavioral pattern over which we have substantial individual capacity for control). Judge overconfidence (both in yourself and others, both knowing and careless) using similar criteria to those you use to judge deceit. Perhaps continue to engage in it, in ways that are beneficial in excess of their costs, but do not confuse "net positive" with "contains no drawbacks," and do not confuse "what our culture thinks of it" with "what it actually is." Recognize the ways in which your social context rewards you for performative overconfidence, and do what you can to at least cut back on the indulgence, if you can't eschew it entirely ("if you would go vegan but you don't want to give up cheese, why not just go vegan except for cheese?"). Don't indulge in the analogue of lies-by-omission; if you can tell that someone seems more convinced by you than they should be, at least consider correcting their impression, even if their convinced-ness is convenient for you.
2. Where possible, build the habit of being explicit about your own confidence level...
...the standard pitch here is "because this will make you yourself better at prediction, and give you more power over the universe!" (which, sure, but the degree matters: does ten hours of practice make you .01% more effective or 10% more effective?). I want to add to that motivation "and also because you will contribute less to the general epistemic shrapnel being blasted in every direction more or less constantly!" Reducing this shrapnel is a process with increasing marginal returns—if 1000 people in a tight-knit community are all being careless with their confidence, the first to hold themselves to a higher standard scarcely improves the society at all, but the hundredth is contributing to a growing snowball, and by the time only a handful are left, each new convert is a massive reduction in the overall problem.
Practice using numbers and percentages, and put at least a one-time cursory effort into calibrating that usage, so that when your actual confidence is "a one-in-four chance of X" you can convey that confidence precisely, rather than saying largely contentless phrases like "a very real chance." Practice publicly changing your mind and updating your current best guesses. Practice explicitly distinguishing between what seems to you to be likely, what seems to you to be true, and what you are justified in saying you know to be true. Practice explicitly distinguishing between doxa, episteme, and gnosis, or in more common terms, what you believe because you heard it, what you believe because you can prove it, and what you believe because you experienced it.
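For the "one-time cursory effort" at calibration, one cheap method is to log explicit probability estimates and later score them against outcomes—both per-confidence-bucket hit rates and an overall Brier score. A minimal sketch of what that tracking might look like (the prediction log below is invented purely for illustration):

```python
# Minimal calibration check: log (stated probability, outcome) pairs,
# then compare stated confidence against the actual hit rate per bucket.
from collections import defaultdict


def calibration_report(predictions):
    """predictions: list of (stated_probability, outcome_was_true) pairs.

    Returns {bucket: (count, actual_hit_rate)}, bucketed to the nearest 10%.
    A well-calibrated forecaster's hit rate roughly matches each bucket.
    """
    buckets = defaultdict(list)
    for p, outcome in predictions:
        buckets[round(p, 1)].append(outcome)
    return {
        p: (len(outcomes), sum(outcomes) / len(outcomes))
        for p, outcomes in sorted(buckets.items())
    }


def brier_score(predictions):
    """Mean squared error of stated probabilities.

    0.0 is perfect; 0.25 is what always saying 50% earns you.
    """
    return sum((p - o) ** 2 for p, o in predictions) / len(predictions)


# Hypothetical log of an overconfident forecaster: claims made at 90%
# that only came true 60% of the time.
log = [(0.9, True), (0.9, False), (0.9, True), (0.9, False), (0.9, True),
       (0.25, False), (0.25, False), (0.25, True), (0.25, False)]
```

Running `calibration_report(log)` surfaces the gap directly: the 0.9 bucket shows a hit rate of 0.6, a concrete "you say 90% when you should say 60%" signal that vague self-assessment rarely produces.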
3. Adopt in your own heart a principle of adhering to true confidence...
...or at least engaging in overconfidence only with your eyes open, such that pushback of the form "you're overconfident here" lands with you as a cooperative act, someone trying to help you enact your own values instead of someone trying to impose an external standard. This doesn't mean making yourself infinitely vulnerable to attacks-in-the-guise-of-feedback (people can be wrong when they hypothesize that you're overconfident, and there are forms of pushback that are costly or destructive that you are not obligated to tolerate, and you can learn over time that specific sources of pushback are more or less likely to be useful), but it does mean rehearsing the thought "if they're right, I really want to know it" as an inoculation against knee-jerk dismissiveness or defensiveness.
4. Don't go around popping bubbles...
...in which the local standards are better than the standards of the culture at large. I have frequently seen people enter a promising subculture and drag it back into the gutter under the guise of curing its members of their naïveté, and forearming them against a cruel outside world that they were in fact successfully hiding from. I've also witnessed people who, their self-esteem apparently threatened by a local high standard, insisted that it was all lies and pretense, and that "everybody does X," and who then proceeded to deliberately double down on X themselves, successfully derailing the nascent better culture and thereby "proving their point." I myself once made a statement that was misinterpreted as being motivated primarily by status considerations, apologized and hastened to clarify and provide an alternate coherent explanation, and was shot down by a third party who explicitly asserted that I could not opt out of the misinterpretation while simultaneously agreeing that the whole status framework was toxic and ought to go.
When society improves, it's usually because a better way of doing things incubated in some bubble somewhere until it was mature enough to germinate; if you are fortunate enough to stumble across a fledgling community that's actually managed to relegate overconfidence (or any other bad-thing-we-hope-to-someday-outgrow) to the same tier as anti-vax fearmongering, maybe don't go out of your way to wreck it.
To reiterate: the claim is not that any amount of overconfidence always leads to meaningful damage. It's that a policy of indulging in and tolerating overconfidence at the societal level inevitably leads to damage over time.
Think about doping, or climate change—people often correctly note that it's difficult or impossible to justify an assertion that a given specific athletic event was won because of doping, or that a given specific extreme weather event would not have happened without the recent history of global warming. Yet that does not weaken our overall confidence that drugs give athletes an unfair edge, or that climate change is driving extreme weather in general. Overconfidence deals its damage via a thousand tiny cuts to the social fabric, each one seeming too small in the moment to make a strong objection to (but we probably ought to anyway).
It's solidly analogous to lying, and causes similar harms: like lying, it allows the speaker to reap the benefits of living in a convenient World A (that doesn't actually exist), while only paying the costs of living in World B. It creates costs, in the form of misapprehensions and false beliefs (and subsequent miscalibrated and ineffective actions) and shunts those costs onto the shoulders of the listeners (and other people downstream of those listeners). It tends to most severely damage those who are already at the greatest disadvantage—individuals who lack the intelligence or training or even just the spare time and attention to actively vet new claims as they're coming in. It's a weapon that grows more effective the more desperate, credulous, hopeful, and charitable the victims are.
This is bad.
Not every instance of overconfidence is equally bad, and not every frequently-overconfident person is equally culpable. Some are engaging in willful deception, others are merely reckless, and still others are trying their best but missing the mark. The point is not to lump "we won the election and everyone knows it" into the same bucket as "you haven't seen Firefly? Oh, you would love Firefly," but merely to acknowledge that they're both on the same spectrum. That while one might have a negative impact of magnitude 100,000 and the other of magnitude 0.01, those are both negative numbers.
That is an important truth to recognize, in the process of calibrating our response. We cannot effectively respond to what we don't let ourselves see, and it's tempting to act as if our small and convenient overconfidences are qualitatively different from those of Ponzi schemers and populist presidents.
But they aren't. Overconfidence can certainly be permissible and forgivable. In some strategic contexts, it may be justified and defensible. But every instance of it is like the cough of greenhouse gases from starting a combustion engine. Focus on the massive corporate polluters rather than trying to shame poor people who just need to get to work, yes, but don't pretend that the car isn't contributing, too.
It's unlikely that this aspect of our culture will change any time soon. We may never manage to outgrow it at all. But if you're looking for ways to be more moral than the culture that raised you, developing a prosocial distaste for overconfidence (above and beyond the self-serving one that's already in fashion) is one small thing you might do.
Author's note: Due to some personal considerations, I may not actively engage in discussion below. This feels a little rude/defecty, but on balance I figured LessWrong would prefer to see this and be able to wrestle with it without me, than to not get it until I was ready to participate in discussion (which might mean never).