[Written September 02, 2022. Note: I'm likely to not respond to comments promptly.]

Sometimes people defer to other people, e.g. by believing what they say, by following orders, or by adopting intents or stances. In many cases it makes sense to defer: other people know more than you about many things, it's useful to share eyes and ears, coordination and specialization are valuable, and one can "inquisitively defer" to opinions by taking them as challenges to investigate further by trying them out for oneself. But there are major issues with deferring, among which are:

  • Deferral-based opinions don't contain the detailed content that generated the opinions, and therefore can't direct action effectively or update on new evidence correctly.
  • Acting based on deferral-based opinions is discouraging, because it's especially hard for the whole of you to see why the action is good.
  • Acting based on deferral-based opinions to some extent removes the "meaning" of learning new information; if you're just going to defer anyway, it's sort of irrelevant to gain information, and your brain can kind of tell that, so you don't seek information as much. Deference therefore constricts the influx of new information to individuals and groups.
  • A group with many people deferring to others will amplify information cascades by double-triple-quadruple-counting non-deferral-based evidence.
  • A group with many people deferring to others will have mistakenly correlated beliefs and actions, and so will fail to explore many worthwhile possibilities.
  • The deferrer will copy beliefs mistakenly imputed to the deferred-to that would have explained the deferred-to's externally visible behavior. This pushes in the direction opposite to science because science is the way of making beliefs come apart from their pre-theoretical pragmatic implications.
  • Sometimes the deferrer, instead of imputing beliefs to the deferred-to and adopting those beliefs, will adopt the same model-free behavioral stance that the deferred-to has adopted to perform to onlookers, such as pretending to believe something while acting towards no coherent purpose other than to maintain the pretense.
  • If the deferred-to takes actions for PR reasons, e.g. attempting to appear from the outside to hold some belief or intent that they don't actually hold, then the PR might work on the deferrer so that the deferrer systematically adopts the false beliefs and non-held intents performed by the deferred-to (rather than adopting beliefs and intents that would actually explain the deferred-to's actions as part of a coherent worldview and strategy).
  • Allocating resources based on deferral-based opinions potentially opens up niches for non-epistemic processes, such as hype, fraud, and power-grabbing.
  • These dynamics will be amplified when people choose who to defer to according to how much the person is already being deferred to.
  • To the extent that these dynamics increase the general orientation of deference itself, deference recursively amplifies itself.

Together, these dynamics make it so that deferral-based opinions are under strong pressure to not function as actual beliefs that can be used to make successful plans and can be ongoingly updated to track reality. So I recommend that people

  • keep these dynamics in mind when deferring,
  • track the difference between believing someone's testimony vs. deferring to beliefs imputed to someone based on their actions vs. adopting non-belief performative stances, and
  • give substantial parliamentary decision-weight to the recommendations made by their expectations about facts-on-the-ground that they can see with their own eyes.

Not to throw away arguments or information from other people, or to avoid investigating important-if-true claims, but to think as though thinking matters.


7 comments

Null pointer exceptions and segfaults are the most obvious risks. Oh wait - those are dangers of dereference.

While I don't disagree that delegating your beliefs (my preferred framing; deference is passive and hierarchy-based, delegation is active and implies an intentional choice of who and what) has some cost, it also has massive benefits in efficiency and probably correctness, as long as you delegate to mainstream thinkers rather than contrarians.

The deferrer will copy beliefs mistakenly imputed to the deferred-to that would have explained the deferred-to's externally visible behavior. This pushes in the direction opposite to science because science is the way of making beliefs come apart from their pre-theoretical pragmatic implications.

Clarification request: does this mean that, in addition to the stuff that the deferred-to opines, learners will take as advice stuff the author didn't mean to be opining?

I don't know whether the high-mindedness magisteria matters. I question whether that activity is actually philosophy rather than science (I guess there is a link through "natural philosophy"). Seems I don't know da way.

What I mean is, suppose the deferred-to has some belief X. This X is a refined, theoretically consilient belief to some extent, and to some extent it isn't, but is instead pre-theoretic; intuitive, pragmatic, unreliable, and potentially inconsistent with other beliefs. What happens when the deferred-to takes practical, externally visible action, which is somehow related to X? Many of zer other beliefs will also play a role in that action, and many of those beliefs will be to a large extent pre-theoretical. Pre-theoreticalness is contagious, in action: theoretical refinement, to be expressed in action, asks for a rethinking of previously used protocols, so the easiest way to act on X is to use what is functionally a more pre-theoretical version of X.

So if the deferrer is imputing beliefs based on action, they'll in general impute a more pre-theoretical belief; and they'll place extra drag on their own processes of theoretical refinement. Like, when they notice contradictions, instead of rethinking their concepts and assumptions, they'll avoid doing so, because that would contradict the apparent belief implied by the deferred-to's behavior.

(Sorry this isn't more clear or concrete. I think the history of phlogiston is an example of some of this, where two theories are nearly identical in terms of pre-theoretic behavioral implications / expectations (e.g., both theories say that fire will be snuffed out by being in an enclosed space); but then by drawing out more implications, the threat of inconsistency forces one theory to become more and more complicated.)

I was about to request clarification on this too. I don't get 

"science is the way of making beliefs come apart from their pre-theoretical pragmatic implications."

And I would like to get it.

(See my response to the parent comment.)

Endorsed. I think what you should do about deferral depends on what role you wish to play in the research community. Knowledge workers intending to make frontier progress should be especially skeptical of deferring to others on the topics they intend to specialise in. That may mean holding off on deferring on a wide range of topics, because curious scientists should keep a broad horizon early on. Deferring early on could lead to habits-of-thought that can be hard to override later on (sorta like curse of knowledge), and you might miss out on opportunities to productively diverge or even discover a flaw in the paradigm.

Explorers should mostly defer on value of information, not object-level beliefs. When someone I trust says they're confident in some view I'm surprised by, I'm very reluctant to try to tweak my models to output what I believe they believe; instead I make a note to investigate what they've investigated, using my own judgment of things all the way through.

Yeah, VoI seems like a better place to defer. Another sort of general solution, which I find difficult but others might find workable, is to construct theories of other perspectives. That lets there be sort of unlimited space to defer: you can do something that looks like deferring, but is more precisely described as creating a bunch of inconsistent theories in your head, and deferring to people about what their theory is, rather than what's true. (I run into trouble because I'm not so willing to accept others' languages if I don't see how they're using words consistently.)
