That which can be destroyed by the truth should *not* necessarily be

by alexflint, 24th Oct 2010


I've been throwing some ideas around in my head, and I want to throw some of them half-formed into the open for discussion here.

I want to draw attention to a particular class of decisions that sound much like beliefs.

- There is no personal god that answers prayers.
- I should badger my friend about atheism.

- Cryonics is a rational course of action.
- To convince others about cryonics, I should start by explaining that if we exist in the future at all, then we can expect it to be nicer than the present on account of benevolent super-intelligences.

- There is an objective reality.
- Postmodernists should be ridiculed and ignored.

- If I encounter a person about to jump unless he is told "1+1=3", I should not acquiesce.

I've thrown ideas from a few different bags onto the table, and I've perhaps chosen unnecessarily inflammatory examples. There are many arguments to be had about these examples, but the point I want to make is the way in which questions about the best course of action can sound very much like questions about truth. This is dangerous because the way in which we choose amongst decisions is radically different from the way in which we choose amongst beliefs. For a start, evaluating decisions always involves evaluating a utility function, whereas evaluating beliefs never does (unless the utility function is explicitly part of the question). By appropriate changes to one's utility function, the optimal decision in any given situation can be modified arbitrarily whilst simultaneously leaving all probability assignments to all statements fixed. This should make you immediately suspicious if you ever make a decision without consulting your utility function. There is no simple mapping from beliefs to decisions.
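The point can be made concrete with a toy expected-utility calculation. In this sketch (all the numbers, outcomes, and utility functions are hypothetical, invented for illustration), two agents share identical probability assignments but hold different utility functions, and so arrive at different optimal decisions:

```python
# Toy illustration: fixed beliefs, varying utilities, varying decisions.
# All probabilities and utilities here are made-up numbers for illustration.

def expected_utility(outcome_probs, utility):
    """Expected utility of one action, given outcome probabilities and a utility function."""
    return sum(p * utility[outcome] for outcome, p in outcome_probs.items())

def best_action(actions, utility):
    """The action that maximizes expected utility."""
    return max(actions, key=lambda a: expected_utility(actions[a], utility))

# Shared beliefs: P(outcome | action), identical for both agents.
actions = {
    "badger": {"converts": 0.1, "strained": 0.6, "status_quo": 0.3},
    "silent": {"converts": 0.0, "strained": 0.0, "status_quo": 1.0},
}

# Two different utility functions over the same outcomes.
utility_truth_seeker = {"converts": 100, "strained": -10, "status_quo": 0}
utility_friendship   = {"converts": 5,   "strained": -50, "status_quo": 0}

print(best_action(actions, utility_truth_seeker))  # -> badger
print(best_action(actions, utility_friendship))    # -> silent
```

Nothing about the agents' beliefs differs; only the utilities do, and that alone flips the decision. This is why a decision cannot be read off from probability assignments alone.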

I've noticed various friends and some people on this site making just this mistake. It's as if their love for truth and rational enquiry, which is a great thing in its own right, spills over into a conviction to act in a particular way, which itself is of questionable optimality.

In recent months there have been several posts on LessWrong about the "dark arts", which have mostly concerned using asymmetric knowledge to manipulate people. I like these posts, and I respect the moral stance implied by their name, but I fear that "dark arts" is coming to cover the much broader case of any action that departs from the simple rule that decisions are good whenever they sound like true beliefs. I shouldn't need to argue explicitly that there are cases in which lying or manipulating constitute good decisions; demanding such an argument would privilege a very particular hypothesis (namely, that decisions are always good when they sound like true beliefs).

This brings me all the way back to the much-loved quotation, "that which can be destroyed by the truth should be". There are several ways to interpret the quote, but at least one interpretation implies the existence of a simple isomorphism from true beliefs to good decisions. Personally, I can think of lots of things that could be destroyed by the truth but should not be.