I reject the framing of truth vs winning. Instead, I propose that only winning matters. Truth has no terminal value.

It does, however, have enormous instrumental value. For this reason, I support the norm of always telling the truth even when the consequences appear net negative, on the reasoning that they probably aren't, at least not in expectation. This holds in part because truth feels extremely important to many of us, which makes having such a norm in place highly beneficial.

The other response is much more interesting, arguing that appeals to consequences are generally bad, and that meta-level considerations mean we should generally speak the truth even if the immediate consequences are bad. I find this fascinating because it is ultimately about infohazards: those rare cases where there is a conflict between epistemic and instrumental rationality. Typically, we believe that having more truth (via epistemic rationality) is a positive trait that allows you to “win” more (thus aligning with instrumental rationality). But when more truth becomes harmful, which do we prioritize: truth, or winning?

The keyword here is "immediate" [emphasis added], which you drop by the end. I agree with the first part of this paragraph but disagree with the final sentence. Instead, my question would have been: "when more truth appears to become harmful, how do we balance the immediate consequences against the long-term/fuzzy/uncertain but potentially enormous consequences of violating the truth norm?"
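
To make that balancing act concrete, here is a minimal sketch of the expected-value comparison I have in mind. Every probability and utility in it is a made-up placeholder for illustration, not an estimate of anything about the CDC or SARS-CoV-2.

```python
# Toy expected-value comparison: the immediate harm of publishing a true claim
# vs. the diffuse long-term cost of weakening the truth norm.
# All numbers are hypothetical placeholders, not empirical estimates.

def expected_value(outcomes):
    """Sum probability * utility over (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Publishing the true-but-alarming claim: some chance of immediate harm,
# but it preserves the norm and the community's long-run epistemics.
ev_tell_truth = expected_value([
    (0.30, -10.0),   # immediate harm (people distrust useful advice)
    (0.70,  +2.0),   # immediate benefit (people act on accurate information)
    (1.00,  +5.0),   # long-run value of keeping the truth norm intact
])

# Withholding it: no immediate harm, but a small probability of a very large
# long-run cost: the "fuzzy but potentially enormous" term above.
ev_withhold = expected_value([
    (1.00,   0.0),   # nothing happens in the short term
    (0.05, -200.0),  # small chance of large long-run damage to the norm
])

print(f"tell truth: {ev_tell_truth:+.1f}, withhold: {ev_withhold:+.1f}")
# With these made-up numbers the truth norm wins in expectation; it loses only
# when the immediate harm is both severe and near-certain.
```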

I read jimrandomh's comment as reasoning from this framework (rather than arguing that we should assign truth terminal value), but this might be confirmation bias.

I also support the general norm of defaulting to truth. But I do believe there are cases where the negative consequences of truth become so severe and immediate that it is reasonable to set that norm aside in favour of winning. The bar for that should be very high, but not unreachable.

Winning vs Truth – Infohazard Trade-Offs

by eapache, 7th Mar 2020

[Cross-posted from Grand Unified Crazy.]

This post on the credibility of the CDC has sparked a great deal of discussion on the ethics of posts like it. Some people claim that the post itself is harmful, arguing that anything which reduces trust in the CDC will likely kill people as they ignore or reject important advice for dealing with SARS-CoV-2 and (in the long-run) other issues like vaccination. This argument has been met with two very different responses.

One response has been to argue that the CDC’s advice is so bad that reducing trust in it will actually have a net positive effect in the long run. This is ultimately an empirical question that somebody should probably address, but I have neither the skills nor the interest to attempt it.

The other response is much more interesting, arguing that appeals to consequences are generally bad, and that meta-level considerations mean we should generally speak the truth even if the immediate consequences are bad. I find this fascinating because it is ultimately about infohazards: those rare cases where there is a conflict between epistemic and instrumental rationality. Typically, we believe that having more truth (via epistemic rationality) is a positive trait that allows you to “win” more (thus aligning with instrumental rationality). But when more truth becomes harmful, which do we prioritize: truth, or winning?

Some people will simply decide to value truth more than winning as an axiom of their value system. But for most of us, I think this ultimately boils down to an empirical question of just how bad “not winning” will end up being. It’s easy to see that in sufficiently severe cases, natural selection takes over: any meme/person/thing that prefers truth over winning in those cases will die out, to be replaced by memes/people/things that choose to win. I personally will prefer winning in those cases. It’s also true that most of the time, truth actually helps you win in the long run. We should probably reject untrue claims even if they provide only a small amount of extra short-term winning, since in the long run an untrue belief is likely to prevent us from winning in ways we can’t predict.
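
As a rough illustration of that long-run claim, here is a toy calculation; the one-off gain, the per-decision loss, and the decision rate are all assumed numbers, chosen only to show how a small short-term win gets swamped over time.

```python
# Hypothetical illustration of "a small amount of extra short-term winning"
# vs. the repeated, hard-to-predict losses from holding an untrue belief.
# The parameters are assumptions chosen only to show the shape of the trade-off.

ONE_OFF_GAIN = 5.0        # immediate benefit of adopting the untrue belief
PER_DECISION_LOSS = 0.1   # small expected cost each time the belief misleads us
DECISIONS_PER_YEAR = 100  # how often the belief feeds into a decision

def net_value(years):
    """Net value of holding the untrue belief after a given number of years."""
    return ONE_OFF_GAIN - PER_DECISION_LOSS * DECISIONS_PER_YEAR * years

for years in (1, 5, 20):
    print(f"after {years:>2} years: {net_value(years):+.1f}")
# after  1 years: -5.0 / after  5 years: -45.0 / after 20 years: -195.0
# The one-off gain is swamped quickly: truth usually helps you win in the long run.
```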

Figuring out where the cut-over point lies between truth and winning seems non-trivial. Based on my examples above, we can derive two simple heuristics to start with (a rough sketch in code follows the list):

  • Prefer truth over winning by default.
  • Prefer winning over truth if the cost of not winning is destruction of yourself or your community. (It’s interesting to note that this heuristic arguably already applies to SARS-CoV-2, at least for some people in at-risk demographics.)
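
As a very loose sketch, the two heuristics can be written as a tiny decision rule; the function and its single flag are hypothetical simplifications, not a real decision procedure.

```python
# A minimal sketch of the two heuristics as a decision rule. The function and
# its single "existential cost" flag are hypothetical simplifications.

def prefer_truth(not_winning_is_existential: bool) -> bool:
    """Return True to prefer truth, False to prefer winning.

    Heuristic 1: prefer truth over winning by default.
    Heuristic 2: prefer winning when losing means destruction of yourself
    or your community.
    """
    if not_winning_is_existential:
        return False  # the bar is very high, but not unreachable
    return True       # the default case

print(prefer_truth(not_winning_is_existential=False))  # True: default to truth
print(prefer_truth(not_winning_is_existential=True))   # False: survival first
```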

What other heuristics do people use for this question? How do they come out on the CDC post and SARS-CoV-2?
