ETA: As stated below, criticizing beliefs is trivial in principle: either they were arrived at by an approximation to Bayes' rule, starting with a reasonable prior and updating on actual observations, or they weren't. Subsequent conversation made it clear that criticizing behavior is also trivial in principle, since someone is either taking the action they believe will best suit their preferences, or not. Finally, criticizing preferences became trivial too -- the relevant question is "Does/will agent X behave as though they have preferences Y?", and that's a belief, so go back to Bayes' rule and a reasonable prior. So the entire issue this post was meant to solve has evaporated, in my opinion. Here's the original article, in case anyone is still interested:
Pancritical rationalism is a fundamental value in Extropianism that has only been mentioned in passing on LessWrong. I think it deserves more attention here. It's an approach to epistemology -- that is, to the question "How do we know what we know?" -- that avoids the contradictions inherent in some of the alternative approaches.
The fundamental source document for it is William Bartley's Retreat to Commitment. He describes three approaches to epistemology, along with the dissatisfying aspects of the first two:
- Nihilism. Nothing matters, so it doesn't matter what you believe. This path is self-consistent, but it gives no guidance.
- Justificationism. Your belief is justified because it is a consequence of other beliefs. This path is self-contradictory. Eventually you'll go in circles trying to justify the other beliefs, or you'll find beliefs you can't justify. Justificationism itself cannot be justified.
- Pancritical rationalism. You have taken the available criticisms for the belief into account and still feel comfortable with the belief. This path gives guidance about what to believe, although it does not uniquely determine one's beliefs. Pancritical rationalism can be criticized, so it is self-consistent in that sense.
Read on for a discussion about emotional consequences and extending this to include preferences and behaviors as well as beliefs.
"Criticism" here basically means philosophical discussion. Keep in mind that "criticism" as a hostile verbal interaction is a typical cause of failed relationships. If you do nothing but criticize a person, the other person will eventually find it emotionally impossible to spend much time with you. If you want to keep your relationships and do pancritical rationalism, be sure that the criticism that's part of pancriticial rationalism is understood to be offered in a helpful way, not a hostile way, and that you're doing it with a consenting adult. In particular, it has to be clear to all participants that there every available option will, in practice, have at least one valid criticism, so the goal is to choose something with criticisms you can accept, not to find something perfect.
We'll start by listing some typical criticisms of beliefs, and then move on to criticizing preferences and behaviors.
Criticizing beliefs is a special case in several ways. First, you can't judge the criticisms as true or false, since you haven't decided what to believe yet. Second, the process of criticizing beliefs is almost trivial in principle: apply Bayes' rule, starting with some reasonable prior. Neither of these special cases applies to criticizing preferences or behaviors, so pancritical rationalism provides an especially useful framework for discussing them.
Criticizing beliefs is not trivial in practice, since there are nonrational criticisms of belief, there is more than one reasonable prior, Bayes' rule can be computationally intractable, and in practice people have preexisting non-Bayesian belief strategies that they follow.
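The "trivial in principle" updating procedure can be made concrete with a small sketch. The numbers here (an even prior, and evidence twice as likely if the hypothesis is true) are illustrative assumptions, not anything from the post:

```python
def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Posterior P(H|E) given prior P(H), P(E|H), and P(E|~H)."""
    numerator = likelihood_h * prior
    return numerator / (numerator + likelihood_not_h * (1 - prior))

# Start from an uninformative prior and update on three independent
# observations, each twice as likely under the hypothesis than not.
p = 0.5
for _ in range(3):
    p = bayes_update(p, 0.8, 0.4)
print(round(p, 3))  # 0.889
```

Each update is a one-line computation; the hard parts in practice are choosing the prior and the likelihoods, which is exactly the point of the paragraph above.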
With that said, a number of possible criticisms of a belief come to mind:
- Perhaps it contains self-contradictions.
- Perhaps it cannot be arrived at by starting with a reasonably unbiased prior and doing updates according to Bayes' rule. (As a special case, perhaps it is contradicted by available evidence.)
- Perhaps it is so structured that it is invulnerable to being changed after it is adopted, regardless of the evidence observed.
- Perhaps it does not make predictions about the world.
- Perhaps it is really a preference or a behavior. ("I believe in free speech" or "I believe I'll have another drink.")
- Perhaps it is unpopular.
- Perhaps it is inconsistent with some ancient religious book or another.
The last two of these illustrate that the weight one gives to a criticism is subjectively determined. Those last two criticisms are true for many beliefs discussed here, and the last one is true for essentially every belief if you pick the right religious book.
Once you accept the idea that beliefs can be criticized, it's a small step from there to adopting a similar approach to preferences and behavior. Here are some plausible criticisms of a preference:
- Perhaps it is not consistent with your beliefs about cause-and-effect. That is, the preference prefers X over Y and also prefers the expected consequences of Y over the expected consequences of X.
- Perhaps it cannot be used to actually decide what to do. There are several subcases here:
- Perhaps it has mathematical properties that break some decision theories, such as an unbounded utility. Known breakage and merely conjectured breakage are two different criticisms.
- Perhaps it is defined in such a way that what you prefer depends on things you cannot know.
- Perhaps it gives little guidance, that is, it considers many pairs of outcomes that you expect to actually encounter as equally preferable.
- Perhaps the stated preference is ineffective or counterproductive as a social signal. There are several subcases here:
- Perhaps it is psychologically implausible. That is, perhaps it is so unlikely that a human would hold such a preference that stating the preference to others will lead the others to reasonably conclude that you're a liar or confused, rather than leading them to conclude that you have the given preference.
- Perhaps it does not help others to predict your behavior. For example, it may require complicated decisions based on debatable guesses about the remote consequences of one's actions.
- Perhaps it is not something that anybody else would want to cooperate with.
- Perhaps it is at cross-purposes with the specific people you want to signal to.
- Perhaps the preference does not weight your own survival heavily enough, so one would expect the preference to select itself out given enough time and selection pressure. ("Selection" here might mean biological evolution or some sort of technological process; take your pick based on your beliefs.)
- Perhaps the preference does not include preferring that you accumulate enough power to actually do anything important.
- If you believe in objective morality, perhaps the preference is inconsistent with objective morality. Someone who does believe in objective morality should fill in the details here.
- Perhaps a preference is likely to have problems because it is held by only a non-controlling minority of the person's mind. This can happen in several ways:
- Perhaps a preference is likely to be self-deception because it is being claimed only because of a philosophical position, and not as a consequence of introspection or generalization from observed behavior.
- Perhaps a preference is likely to be self-deception because it is being claimed only because of introspection, and we expect introspection to yield socially convenient lies.
- Perhaps a claimed preference is likely to be poorly thought out because it arose nonverbally and has not been reflected upon.
- Perhaps a preference is an overt deception, that is, the person claiming it knows they do not hold it. This criticism can be used by a person against themselves if they know they are lying and want clarity, or used by others against a person if the person is a poor liar.
- Perhaps a preference has short-term terminal values that aren't also instrumental values.
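One of the criticisms above was that a preference may have mathematical properties, such as unbounded utility, that break some decision theories. A standard minimal illustration (my example, not from the original post) is the St. Petersburg lottery: win 2^k units of utility if the first heads appears on flip k. Every term of the expected-utility sum equals 1, so the sum diverges and naive expected-utility maximization gives no finite answer:

```python
# Partial sums of the St. Petersburg expected utility.
# Each term is (1/2**k) * 2**k = 1, so the partial sum after n terms is n,
# and the full expectation is unbounded.
def expected_utility_partial_sum(n_terms):
    return sum((0.5 ** k) * (2 ** k) for k in range(1, n_terms + 1))

print(expected_utility_partial_sum(10))   # 10.0
print(expected_utility_partial_sum(100))  # 100.0 -- grows without bound
```

An agent with such a preference would pay any finite price to play, which is the kind of breakage the criticism points at; bounding the utility function removes the divergence.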
We can also criticize behavior in at least the following ways:
- Perhaps the behavior is not consistent with any reasonable guess about your preferences.
- Perhaps the behavior is not consistent with your actual statements about your preferences.
- Perhaps the behavior does not promote personal survival.
- Perhaps the behavior is undesired by others, that is, others would prefer that you not do it.
- Perhaps you did not take into account your own preferences about the outcome for others at the time you did the behavior.
- Perhaps the behavior leads to active conflict with others, that is, in addition to it being against the preferences of others, it motivates them to act against you.
- Perhaps the behavior will lead others to exploit you.
- Perhaps you didn't take into account some of the important consequences of the behavior when you chose it.
In all cases, if you're doing or preferring or believing something that has a valid criticism, the response does not necessarily have to be "don't do/prefer/believe that". The response might be "In light of the alternatives I know about and the criticisms of all available alternatives, I accept that".
Of course, another response might be "I don't have time to consider any of that right now", but in that case you are at a level of urgency where this article won't be directly useful to you. You'll have to get yourself straightened out when things are less urgent and make use of that preparation when things are urgent.
Assuming this post doesn't quickly get negative karma, a reasonable next step would be to put a list of criticisms of beliefs, preferences, and behaviors on a not-yet-created LessWrong pancritical rationalism Wiki page. Posting them in comments might also be worthwhile. If someone else could take the initiative to update the Wiki, it would be great. Otherwise I would like to get to it eventually, but that probably won't happen soon.
Question for the readers: Is criticizing a decision theory a useful separate category from the three listed above (beliefs, preferences, and behaviors)? If so, what criticisms are relevant?