A Taxonomy of Cruxes

by elityre · 7 min read · 27th May 2020 · 4 comments


Rationality
Frontpage

[Crossposted to Musings and Rough Drafts]

This is a quick theoretical post. I have little idea if this is interesting to others. Plus this is super finicky to read, but I don't know what to do about that.

In this post, I want to outline a few distinctions between different kinds of cruxes. Sometimes folks will find what seems to be a crux, but they feel some confusion, because it seems like it doesn’t fit the pattern that they’re familiar with, or it seems off somehow. Often this is because they’re familiar with one half of a dichotomy, but not the other.

Conjunctive, unitary, and disjunctive cruxes

As the Double Crux method is typically presented, double cruxes are described as single propositions such that, if you changed your mind about one of them, you would change your mind about some other belief.

But as people often ask,

What if there are two propositions, B and C, and I wouldn’t change my mind about A if I just changed my mind about B, or if I just changed my mind about C? I would change my mind about A only if I shifted on both B and C.

This is totally fine. In this situation we would just say that your crux for A is a conjunctive crux of B and C.

In fact, this is pretty common, because people often have more than one concern in any given situation.

Some examples:

  • Someone is thinking about quitting their job to start a business, but they will only pull the trigger if a) they thought that their new work would actually be more fulfilling for them, and b) they knew that their family wouldn’t suffer financial hardship.
  • A person is not interested in signing up for cryonics, but offers that they would if a) it was inexpensive (on the order of $50 a month) and b) the people associated with cryonics were the sort of people that he wanted to be identified with. [These are the stated cruxes of a real person that I had this discussion with.]
  • A person would go vegetarian if, a) they were sure it was healthy for them and b) doing so would actually reduce animal suffering (or "how elastic is the supply curve for meat?").

In each of these cases there are multiple considerations, none of which is sufficient to cause one to change one’s mind, but which together represent a crux.

As I said, conjunctive cruxes are common. That said, sometimes folks are too quick to assert that they would only change their mind if they turned out to be wrong about a large number of conjunctive terms.

When you find yourself in this position of only changing your mind on the basis of a large number of separate pieces, this is a flag that there may be a more unified crux that you’re missing.

In this situation I would back up and offer very “shallow” cruxes: instead of engaging with all the detail of your model, look for a very high-level / superficial summary, and check if that is a crux. Following a chain of many shallow cruxes is often easier than trying to get into the details of complicated models right off the bat. (More on this in another post.)

(Alternatively, you might move into something more like consideration factoring.)

As a rule of thumb, the number of parts to a conjunction should be small: two is common, three is less common. A ten-part conjunction is implausible. Most people can’t hold that many elements in their head all at once!

I’ve occasionally seen on-the-order-of-ten-part disjunctive arguments / conjunctive cruxes in technical papers, though I think it is correct to be suspicious of them. They’re often of the form “argument one is sufficient, but even if it fails, argument two is sufficient, and even if that one fails…”

I'm often skeptical because errors are often correlated, and the arguments are likely not as independent as they may at first appear. It behooves you to identify the deep commonality between your lines of argument, the assumptions that multiple arguments are resting on, because then you can examine it directly. (Related to the “multiple stage fallacy”.)

Of course, one could, in principle, have a disjunctive crux, where if a person changed their mind about B or about C, they would change their mind about A. But in that case there’s no need to bundle B and C into a disjunctive proposition. I would just say that B is a crux for A and also C is a crux for A.
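To make the distinction concrete, here is a minimal sketch of the two structures as boolean functions. The function names and the propositions are hypothetical illustrations, not part of the Double Crux method itself:

```python
# Hypothetical sketch: belief A modeled as a function of propositions B and C.

def conjunctive_belief(b: bool, c: bool) -> bool:
    """A is held so long as at least one of B, C still stands:
    you would only abandon A if BOTH B and C fell."""
    return b or c

def disjunctive_belief(b: bool, c: bool) -> bool:
    """A is held only while BOTH B and C stand:
    flipping EITHER one alone flips A."""
    return b and c

# Conjunctive crux: flipping only B leaves A intact...
assert conjunctive_belief(False, True) is True
# ...but flipping both B and C flips A.
assert conjunctive_belief(False, False) is False

# Disjunctive case: flipping B alone already flips A,
# so B and C are each, individually, cruxes for A.
assert disjunctive_belief(False, True) is False
```

This also shows why the disjunctive case needs no bundled proposition: B alone and C alone are each cruxes for A.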

Causal cruxes vs. evidential cruxes

A causal crux back-traces the causal arrow of your belief structure. Causal cruxes are found by answering the question “why do I believe [x]?” or “what caused me to think [x] in the first place?” and checking if the answer is a crux.

For instance, someone is intuitively opposed to school uniforms. Introspecting on why they feel that way, they find that they’re expecting (or afraid) that that kind of conformity squashes creativity. They check if that’s a crux for them (“what if school uniforms actually don’t squash creativity?”), and find that it is: they would change their mind about school uniforms if they changed their mind about the impact on creativity.

Causal cruxes trace back to the reason why you believe the proposition.

In contrast, an evidential crux is a proxy for your belief. You might find evidential cruxes by asking a question like “what could I observe, or find out, that would make me change my mind?”

For instance (this one is from a real double crux conversation that happened at a training session I ran), two participants were disagreeing about whether advertising destroys value on net. Operationalizing, one of them stated that he’d change his mind if he came to believe that beer commercials, in particular, didn’t destroy value.

It wasn’t as if he believed that advertising is harmful because beer commercials destroy value. Rather it was that he thought that advertising for beer was a particularly strong example of the general trend that advertising is harmful. So if he changed his mind in that instance, where he was most confident, he expected that he would be compelled in the general case.

In this case “beer commercials” is serving as a proxy for “advertising.” If the proxy is well chosen, this can totally serve as a double crux. (It is, of course, possible that one will be convinced that they were mistaken about the proxy, in a way that doesn’t generalize to the underlying trend. But I don’t think that this is significantly more common than following a chain of cruxes down, resolving at the bottom, and then finding that the crux that you named was actually incomplete. In both cases, you move up as far as needed, adjust the crux (probably by adding a conjunctive term), and then traverse a new chain.)

Logically, these two kinds of cruxes both have the structure “If not B, then not A” (“if uniforms don’t squash creativity, then I wouldn’t be opposed to them anymore.” and “if I found that beer commercials in fact do create value, then I would think that advertising doesn’t destroy value on net”). In that sense they are equivalent.
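That equivalence can be checked mechanically. This is a small illustrative sketch (not from the original post) verifying that “if not B, then not A” is the contrapositive of “if A, then B”, for every truth assignment:

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    # Material conditional: "if p then q".
    return (not p) or q

# "B is a crux for A" has the form: if not B, then not A.
# By contraposition this is equivalent to "if A, then B",
# regardless of whether the crux was found causally or evidentially.
for a, b in product([True, False], repeat=2):
    assert implies(not b, not a) == implies(a, b)
```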

But psychologically, causal cruxes traverse deeper into one’s belief structure, teasing out why one believes something, and evidential cruxes traverse outward, teasing out testable consequences or implications of the belief.

Monodirectional vs. Bidirectional cruxes

Say that you are the owner of a small business. You and your team are considering undertaking a major new project. One of your employees speaks up and says “we can’t do this project. The only way to execute on it would bankrupt the company.”

Presumably, this would be a crux for you. If you knew that the project under consideration would definitely bankrupt the company, you would definitely think that you shouldn’t pursue that project.

However, it also isn’t a crux, in this sense: if you found out that that claim was incorrect, that actually you could execute on the project without bankrupting your company, you would not, on that basis alone, definitively decide to pursue the project.

This is an example of a monodirectional crux. If the project bankrupts the company, then you definitely won’t do it. But if it doesn’t bankrupt the company, then you’re merely uncertain. This “bankrupting” consideration dominates all the other considerations: it is sufficient to determine the decision when it points in one direction, but it doesn’t necessarily dominate when it points in the other direction.

(Oftentimes, double cruxes are composed of two opposite monodirectional cruxes. This can work totally fine. It isn’t necessary that, for each participant, the whole question turns on the double crux, so long as, for each participant, flipping their view on the crux (from their current view) would also cause them to change their mind about the proposition in question.)

In contrast, we can occasionally identify a bidirectional crux.

For instance, if a person thinks that public policy ought to optimize for Quality Adjusted Life Years, and they’ll support whichever health care scheme does that, then “maximizing QALYs” is a bidirectional crux. That single piece of information (which plan best maximizes QALYs), completely determines their choice.

A “single-issue voter” is a person voting on the basis of a bidirectional crux.

In all of these cases you’re elevating one of the considerations over and above all of the others.
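The asymmetry between the two kinds can be sketched as follows. This is a hypothetical illustration using the examples above, where `None` stands for “this consideration alone doesn’t settle the question”:

```python
from typing import Optional

def monodirectional(bankrupts_company: bool) -> Optional[bool]:
    """'The project would bankrupt us' dominates in one direction only:
    if True, the decision is settled (don't do the project); if False,
    the question stays open (None = undetermined by this consideration)."""
    if bankrupts_company:
        return False  # definitely don't pursue the project
    return None       # other considerations must decide

def bidirectional(plan_maximizes_qalys: bool) -> bool:
    """For the single-issue QALY maximizer, this one consideration
    settles the choice in BOTH directions."""
    return plan_maximizes_qalys

assert monodirectional(True) is False   # settled: don't do it
assert monodirectional(False) is None   # merely uncertain
assert bidirectional(True) is True      # settled either way
```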

Pseudo cruxes

[This section is even more esoteric, and is of little practical relevance, except for elucidating a confusion that folks sometimes encounter.]

Because of the nature of monodirectional cruxes, people will sometimes find pseudo-cruxes: propositions that seem like cruxes, but are nevertheless irrelevant to the conversation.

To give a (silly) example, let’s go back to the canonical disagreement about school uniforms. And let’s consider the proposition “school uniforms eat people.”

Take a person who is in favor of school uniforms. The proposition that “school uniforms eat people” is almost certainly a crux for them. The vast majority of people who support school uniforms would change their mind if they were convinced that school uniforms were carnivorous.

(Remember, in the context of a Double Crux conversation, you should be checking for cruxy-ness independently of your assessment of how likely the proposition is. The absurdity heuristic is insidious, and many claims that turn out to be correct, seem utterly ridiculous at first pass, lacking a lot of detailed framing and background.)

This is a simple crux. If the uniform-preferring person found out that uniforms eat people, they would come to disprefer uniforms.

Additionally, this is probably a crux for folks who oppose school uniforms as well, in one pretty specific sense: were all of their other arguments to fall away, knowing that school uniforms eat people would still be sufficient reason for them to oppose school uniforms. Note that this doesn’t mean that they do think that school uniforms eat people, nor does it mean that finding out that school uniforms don’t eat people (duh) would cause them to change their mind and think that school uniforms are good. We might call this an over-determining hypothetical crux. It’s a monodirectional crux that points exclusively in the direction of what a person already believes, and which, furthermore, the person currently assumes to be false.

A person might say,

I already think that school uniforms are a bad idea, but if I found out they eat people, that would be further reason for me to reject them. Furthermore, now that we’re discussing the possibility, “school uniforms don’t eat people” is such an important consideration that it would have to be a component of any conjunctive crux that would cause me to change my mind and think that school uniforms are a good idea. But I don’t actually think that school uniforms eat people, so it isn’t a relevant part of that hypothetical conjunction.

This is a complicated series of claims. Essentially, this person is saying that in a hypothetical world where they thought differently than they currently do, this consideration, if it held up, would be a crux for them (that would bring them to the position that they actually hold, in reality).

Occasionally (on the order of once out of 100?), a novice participant will find their way to a pseudo crux like that one, and find themselves confused. They can tell that the proposition “school uniforms eat people” if true, matters for them. It would be relevant for their belief. But it doesn’t actually help them push the disagreement forward, because, at best, it pushes further in the direction of what they already think.

(And secondarily, it isn’t really an opening for helping their partner change their mind: the uniform-dispreferring person doesn’t actually think that school uniforms eat people, and so would only try to argue that they do if they had abandoned any pretense of truth-seeking in favor of trying to persuade with whatever arguments work, regardless of their validity.)

So this seems like a crux, but it can’t do work in the Double Crux process.

There is another kind of pseudo crux stemming from monodirectional cruxes. This is when a proposition is not a crux, but its inverse would be.

In our school uniform example, suppose that in a conversation someone boldly, and apropos of nothing, asserted “but school uniforms don’t eat people.” Uniforms eating people is a monodirectional crux that dominates all the other considerations, but school uniforms not eating people is so obviously true that it is unlikely to be a crux for anyone (unless the reason they were opposed to school uniforms was kids getting eaten). Nevertheless, there is something about it that seems (correctly) cruxy. It is the ineffectual side of a monodirectional crux: it isn’t a crux, but its inverse is. We might call this a crux shadow or something.

Thus, there is a four-fold pattern of monodirectional cruxes, where one quadrant is a useful, progress-bearing crux, and the other three contain different flavors of pseudo cruxes.

In the general case, note that the basic double crux pattern avoids accidentally landing on pseudo cruxes.
