Today's post, Your Rationality is My Business, was originally published on April 15, 2007. A summary (from the LW wiki):

As humans, we have an interest in the future of human civilization, including the human pursuit of truth. That makes your rationality my business. However, calling out others as wrong can be a dangerous action. Some turn to relativism to avoid it, but this is too extreme. Disagreements should be met with experiments and arguments, not ignored or met with violence and edicts.


Discuss the post here (rather than in the comments of the original post).

This post is part of a series rerunning Eliezer Yudkowsky's old posts so those interested can (re-)read and discuss them. The previous post was New Improved Lottery, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it, posting the next day's sequence reruns post, summarizing forthcoming articles on the wiki, or creating exercises. Go here for more details, or to discuss the Sequence Reruns.

7 comments

Playing the lottery has nothing to do with truth. If someone is biased, you can call them objectively wrong: a bias is an epistemic confusion. But if one recognizes the cognitive factors leading to certain decisions and chooses to embrace them as valuable, one is not biased.

I believe that it is right and proper for me, as a human being, to have an interest in the future, and what human civilization becomes in the future.

It's not right or wrong. Your interests are not subject to epistemic criticism. If you care more strongly about playing the lottery than about saving human beings, that's neither right nor wrong, as long as you are not confused about what you are doing and its consequences.

If someone else does not care about the future of humanity while you do, you might call their values instrumentally wrong in relation to your own goals, or even those of most humans. But if you do not mention the context in which someone's values are wrong, you engage in malicious persuasion by signaling that their goals are wrong in an absolute, epistemic sense.

If they were really selfish, they would get on with making money instead of arguing passionately with others.

That sentence is completely confused. If selfishness meant not caring to influence other people in any way, how could one earn money, and how would one spend it? Selfishness means caring more strongly about one's own well-being and goals than about those of others.

If it could be shown that caring about others is instrumental in reaching selfish goals, then even the rationally selfish would engage in altruism.

If one's goal is to correct other people, arguing is no less a selfish activity than earning money is for someone who cares about money. And even if one doesn't value correcting other people in and of itself, doing so can still be instrumental.