Today's post, The Moral Void, was originally published on 30 June 2008. A summary (taken from the LW wiki):


If there were some great stone tablet upon which Morality was written, and you read it, and it was something horrible, that would be a rather unpleasant scenario. What would you want that tablet to say, if you could choose it? What would be the best case scenario?
Why don't you just do that, and ignore the tablet completely?


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was What Would You Do Without Morality?, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.


What IS morality? Is there a definition somewhere around here, somewhere in the sequences?

As the term is used here, it seems to be nothing more (or less) than a list of your preferences, rationally processed to be internally consistent and, especially, to resolve conflicts between immediate preferences and longer-term preferences.

The question "What would you do without morality?" can then have at least two meanings I can think of: 1) What would you do without preferences? Well, without preferences I would do nothing. 2) What would you do with only your preferences, with no rational processing to resolve conflicts between them? I'd imagine that would look largely like what I do now, with the possible exception that I would be even more self-indulgent when it came to immediate gratification, and so might wind up with a broken marriage, credit card debt, a higher category of obesity, and venereal diseases.

I certainly wouldn't recommend "maximizing genetic fitness" for most males. It seems likely that maximizing genetic fitness for most men would mean trying to get the men you respect most to fuck your wife for you. A human breeding program that limited male breeding to the top 10% of males would seem to be superior to the one we have now. I have seen it stated somewhere that genetic statistics suggest that 80% of women and 40% of men who ever lived have contributed to the genetics of modern humanity. Presumably the 40% figure for men was a concession to the lack of mobility that the human race had until recently. With the kind of information and matter flow we are capable of now, it would presumably be trivial to do a better job of breeding based on a number like 10% of men, even using highly imperfect filters for picking that 10%. In any case, I and most of my fellow males would NOT be in the top 10%, even though we would all irrationally think we are.

The term gets used in a few different ways throughout the Sequences.

IIRC, the post you're referring to is trying to point out some of the contradictions in the way people think about morality. When it asks "what would you do without morality?", "morality" refers to something other than preferences — something more like objective morality — in order to make the point that even in the absence of that thing, one still has preferences; and that if one's preferences conflict with that thing, one would still prefer to follow one's preferences, so clearly what actually matters is one's preferences.

My summary is perhaps unfair; the whole thing struck me as question-begging, though I happen to agree with the conclusion.

At other places "morality" and related words are used in a broader sense, to refer to an as-yet-unspecified set of values. At still other places "morality" and related words are used more narrowly to refer to the result of some unspecified function that takes as input the preferences of various brains and aggregates that input to generate a coherent set of preferences.

I found it helpful, when trying to make sense of what the author is saying in the Metaethics sequence, to keep track explicitly of what "morality" (and related words like "right", especially when italicized) are probably being used to refer to. (In local jargon: Taboo "morality.") He often doesn't use them to refer to what I would ordinarily use them to refer to, so it's easy for me to misunderstand what he's saying.