Me: “I don’t know what I should do next? Lots of things seem good, but I can’t think of particularly good reasons for them to cash out to what I care about.”

Hallucinated Frankenstein interlocutor: “People in your position tend to undervalue judgment and gut thinking relative to explicit reasoning; you should go with what your gut says is good.”

Me: “I live inside my head and I can definitely say that much of the time my gut is saying what it thinks high-status people would want to hear, or is thinking about the last three examples and none before, or is taking into account at most two out of the whole space of considerations. Why am I trusting it again?”

HFI: “Sure, but if you just go with explicit reasoning, you’ll miss a bunch of unquantifiable intuitions that contain a lot of data, and those are valuable.”

Me: “Yes, but again, from inside my head, I need you to hear me when I say my gut is like the people who answer ‘should there be homosexuals in the military’ differently from ‘should there be gays or lesbians in the military’, and I don’t think I’m like unusual; this is like the whole point of the rationality project.”

This is a kind of conversation I have a lot (if this confuses you, see Appendix 1), which at times has left me feeling like maybe I’m the only person in EA/rationality who thinks they have a brain that makes mistakes by default, that can’t be trusted to make good decisions without actually thinking about them. This is definitely not the case! 
 

There are a bunch of ways in which HFI and I above are not understanding each other (Appendix 2), but one is that, like half of a cow, there are two guts in play.

Sometimes people in the world are like “trust your gut instinct about people” and they mean “trust your split-second first impression.” And sometimes they’re like “in the end, you have to go with your gut” and they mean “after marinating in all the relevant ideas, thinking for a long time, sleeping on it, probably doing a lot of explicit reasoning, if you have a deep sense of what you should do, it’s worth trusting.”

And I often heard people as saying “trust your snap gut”, and I was like “my snap gut judgment is that that is insane. Do you know that I don’t know anything about this?” But at least in one case, and likely others, they meant “trust your reflective gut”, the gut that’s had time to sit with everything and digest it (the metaphor pays dividends).

[They might also mean that there’s a valuable exchange to be had between gut and explicit models, where your explicit reasoning informs your intuition and vice versa, and you can interrogate your sense of what’s reasonable or valuable to mine it for interesting information, or to find evidence that it’s based on what Melissa in third grade told you one time, and maybe it’s time to let go of the notion that every fifth American highway mile is perfectly straight for planes to land on in wartime.] (More in Appendix 3)

And sometimes I think they are saying “you’ve ingested more in this category than you give yourself credit for” and/or “you’ve had more time to digest this than you give yourself credit for”.

But at least this is something I can make sense of, because I know my snap judgment changes based on what tweet I read right before you asked me, but I also know that good judgment is of deep, deep value, since we are making decisions all the time about what to do and how to act, and we don’t have time to do all of that explicitly (though see Appendix 4).

Appendix 1: 

Sometimes people aren’t so much talking as trying to win the last argument they were in. If you’re like “what? EAs *love* explicit reasoning”, you’re both right and you’ll have to take my word for it that there are subsections where indeed I’m under the impression (mistaken or not) that I need to fight on the margin for the glory of explicit reasoning. I find a lot of human behavior is more comprehensible if you adopt the frame in which people are reacting on the margin, perceiving themselves as a valiant minority surrounded on all sides by a stronger enemy on their pet thing. I’m here being like “explicit reasoning!” in a sea of “hone your judgment” in a larger continent of “explicit reasoning!” in a larger ocean of something else, so everyone gets to feel very brave.

Appendix 2: 

  • I think HFI often doesn’t take my own sense of my weaknesses seriously, which is ironic because it’s my carefully considered inside view based on being myself for a while
    • It feels like surely it matters how good the judgment of the people involved is, and I don’t always feel like people are assessing me in particular to say “yes, Chana, you have good judgment”, so I’m in this awkward position where I feel like I have to be like “just FYI, I think I’m worse than whatever the median person you’re talking to about this is, maybe?”
    • Some people’s judgment is bad, yo!
  • There are other good reasons to go with your judgment call, perhaps to test it and help it be better in the future (though I think you can sometimes get this by not going with your judgment, but noting what it was and checking later)
  • I think HFI could help me out by noting specific examples in which they were happy they went with their gut judgment over explicit reasoning (though often the reason they were happy is not very comprehensible to me in concrete terms precisely because of the nature of the topic here)

Appendix 3:

Figuring out why you believe what you believe is, I think, a great exercise (for which Focusing might be helpful), where you try to access the actual reasons your brain came to output this thing; some of them are going to be good and some useless, and then you have a better sense of what to hold on to. (Though sometimes it will also be very unclear!)

And when you look at an argument and it makes sense to you and you see the logic, you’ll sometimes feel your gut sense change, because you have internalized and taken seriously what it means and what it points to.


Appendix 4:

I feel like I repeated that paragraph a bit because it feels virtuous in the minds of people who think “judgment is good.” In actuality, I think you could reflect once a year on what your core goals are, check in every quarter on whether you’re aimed at them correctly, and every week on whether you’re moving towards them, in such a way that you’re mostly working off of very intense explicit reasoning you did at some point, even if in each moment you’re trusting your past self. Would be weird if judgment just didn’t matter here, though, and the whole point of this is that “deep thought” and “good judgment” aren’t in tension.

 

Appendix 5:

For what it’s worth, I want to stand up for not always having a judgment or a gut sense, or not knowing what it is, and for that not necessarily being some horrifying pathology where you’ve excised your humanity in service of the Numbers God, but instead just what it can feel like to be uncertain or to have instincts that are pretty reliably warped by something or other (e.g. mental illness).