I apparently possess some sort of aura of competence. Some say I'm confident, others say I'm arrogant, others remark on how I seem very certain of myself (which I have been told both as compliment and critique).
I was surprised, at first, by these remarks from friends and family — from my perspective, I'm usually the first person in the conversation to express uncertainty in the form of probability estimates and error bars. I'm often quick to brainstorm alternative explanations of the data I use to support my claims. And, of course, I'm certain of nothing.
In fact, I had a conversation with a friend about this phenomenon once, which went something like this:
Me: Hey, have you noticed how everyone thinks I have an aura of confidence and certainty, sometimes arrogance? I don't know how to shake it, nor how it works. What's up with that?
Him: Well, you always seem to have a solid grasp on every situation. When you're explaining things, you answer questions quickly, deftly, and with precision.
Me: I don't think that's it, though. I'm rarely confident in the claims I'm making, and I tend to highlight that fact. Earlier, when we were talking with [other friend] about tools society can use to break monopolies, I was very explicit about where my uncertainty lies, and what assumptions my models relied upon, and where they might be flawed.
Him: Yeah, but even then you were confident in what you were saying — maybe not confident in any particular claim you made, but confident in your overall analysis.
Me: I don't think that's it either. I'll be the first to admit that the probabilities I put on my propositions are pulled out of thin air, and I'll also be the first to admit that my hypothesis space is decrepit and that I'd be able to find better models if I could think better. In fact, I'm aware of a bunch of flaws in the ways I think, and I dedicate a decent amount of effort to improving my own reasoning methods.
Me: … I'm doing the thing right now, aren't I?
Him: Yes, yes you are.
There definitely is something of "confidence" to this pattern of speech and thinking, but it's not an empirical confidence. The confidence people notice in me isn't in the content of my claims, for I'm quick to couch my claims with probability estimates and error bars. Most of the confidence isn't in my analysis, either; I'm quick to note the ways my analyses could be flawed.
Some of the confidence does reside in the ways I reason; I do admit that I am much better equipped to answer questions of the form "but why are you so much more confident in your own reasoning than their reasoning, when they actually have more credentials?" than most. But even there, I can note plausible biases and judgement errors in my own reasoning processes with alacrity.
Why, then, do I come off as so confident? Why do I seem so self-assured while listing the ways I know my brain is flawed?
On reflection, I've concluded that (at least part of) the answer is something I call "confidence all the way up". Insofar as I'm uncertain of my content, I'm confident in my analysis — except, I'm not fully confident in my analysis. But insofar as I'm uncertain of my analysis, I'm confident in my reasoning procedures — except, I don't put faith there, either. But insofar as I'm uncertain of my reasoning procedures, I'm confident in my friends and failsafe mechanisms that will eventually force me to take notice and to update. Except, that's not quite right either — it's more like, every lack of confidence is covered by confidence one meta-level higher in the cognitive chain.
The result is something that reads socially as confidence regardless of how much empirical uncertainty I'm under.
Where does it bottom out? Well, insofar as my friends and failsafe mechanisms aren't sufficient to raise errors to my attention, I expect to reason poorly in an irredeemable fashion and then fail to achieve my goals. It bottoms out at the point where I say "yeah, if I'm that far gone, then I fail and die."
(And somehow, I'm able to say even this while maintaining my aura of self-assuredness and confidence.)
I have encountered many people who seem paralyzed by their uncertainties. They hit a question (such as "what methods can a society use to break up monopolies?") and they are pretty sure that they won't be able to generate the right answers, and so they generate no answers.
And this may be a better failure mode than the failure mode of someone who has too much confidence and self-assuredness, who makes up a bunch of bad answers and then believes them with all their heart.
Someone with Confidence All The Way Up, though, can achieve the third alternative: generate a bunch of bad answers, understand why they're bad and where their limitations are, and use that information as best they can.
I have found this mindset to be very useful throughout my life. Confidence all the way up is what has me dive into the fray to try new things, while others stand on the sidelines bemoaning a high degree of uncertainty. It's part of the technique of treating recurring failures as data and training, rather than as a signal that it's time to feel guilty. It's part of the technique of knowing you're deeply limited without letting that interfere with your progress towards the goal. Of the top ten most competent people I've met in person (by my estimation), eight seem to have some variant of confidence all the way up running. If the mindset seems foreign to you, I suggest finding a way to practice it for a while.
Confidence all the way up is about working with what you have. It's about knowing your limitations. It's about knowing that you don't have perfect models of "what you have" nor "your limitations", and proceeding anyway, with an even stride.
It's about knowing that there are going to be curveballs, and trusting your ability to handle curveballs, but not all the time; and trusting your ability to get back up when you're knocked down by a curveball you couldn't handle, but not all the time; and coming to terms with the fact that you might be hurt so badly you can't get up.
Yes, we're limited. All humans are limited. There are important, decision-relevant facts that we don't know. Our reasoning processes run on compromised hardware. But the correct response to uncertainty is not to proceed at half speed!
No matter how hard you try to justify your beliefs, if you're being honest with yourself, they won't ground out into "and therefore, no matter what I do, everything is going to be OK." No matter how hard you try to justify your reasoning, the meta-reasoning tower does not terminate at "and thus, eventually you will become capable of success." Both terminate at "I may be so wrong that I can never be corrected; I may fail and all value may be lost." You will find no objectively stable perch from which to launch your reasoning.
But you were created already in motion. You don't need to ground out all your beliefs and justify all your reasoning steps before you can start moving. You don't need to have plans for every contingency before you can act. You don't need to be highly confident in your analyses before you present a model. If you sit around awaiting certainty, you will be waiting a long while.
Better, I say, to cover each lack of confidence on one level with confidence on the next level, and to come to terms with the fact that if you're so irredeemable that even your best meta-reasoning cannot save you, then you've already lost.