Poll: Compressing Morality

by Randaly
7th Oct 2010
1 min read
Ethics & Morality · Personal Blog

Two related questions:

1. Suppose that you had to do a lossy compression of human morality. How would you do it so as to maintain as much of our current morality as possible?

2. Alternately, suppose that you had to lose one module of your morality (defined however you feel like). Which one would you lose?

3 comments, sorted by top scoring

Spurlock · 15y

If we're really only interested in maintaining as much of our "current morality" as possible, then it's not really a moral question, just a question of compression methods. So the answer would be something like "analyze a large set of moral decisions, find prominent themes and typical variations, and make these correspondingly inexpensive to represent". Hamming codes seem very applicable, LZW less so.

For the question to be a moral one, you have to ask something more like question 2: "which parts do you deem it most important that we not lose?". Maybe you know this already, but I thought I'd point it out.

I can't figure out a good way to answer (2) because (perhaps unlike most LWers) I don't consider my morality to be modular. Also, if there were something in it that I didn't think should be there, I would have scrapped it by now.
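For concreteness, here is a minimal sketch of the "make prominent themes inexpensive to represent" idea, using frequency-based prefix coding (Huffman coding, named plainly as a stand-in for the codes mentioned above) over an entirely made-up table of moral decisions; every label and count is invented for illustration:

```python
import heapq
from collections import Counter

def huffman_code(frequencies):
    """Build a prefix code in which frequent symbols get short codewords."""
    # Heap entries: (total weight, tiebreak id, {symbol: partial codeword})
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(frequencies.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        # Prefix the two subtrees' codewords with 0 and 1 and merge them.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (w1 + w2, next_id, merged))
        next_id += 1
    return heap[0][2]

# Invented frequency table of "moral decisions" in some hypothetical dataset.
decisions = Counter({
    "keep promises": 40,
    "don't steal": 30,
    "help strangers": 20,
    "return a lost wallet": 7,
    "blow the whistle on your employer": 3,
})

for decision, code in sorted(huffman_code(decisions).items(), key=lambda kv: len(kv[1])):
    print(f"{decision}: {code}")
```

The common decisions come out with one- or two-bit codewords while the rare ones get longer codes, which is the sense in which prominent themes become cheap to represent.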

[anonymous] · 15y

Make a bunch of data points for all the things that people do. Add some kind of weighting function for when people think of an action of their own as moral, or an action of someone else's as moral. Prune for internal contradictions: average the weighting function across persons (so that if I say it's great if I do A, score 1000, but horrible if you do A, score 0, we declare that it's score 500 for people to do A). You'd come up with something interesting, at least.
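A minimal sketch of that averaging step, with made-up ratings that mirror the 1000/0 example (the data layout and names are assumptions, not anything specified above):

```python
from collections import defaultdict

# Invented ratings: each rater scores (actor, action) pairs. The 1000/0
# numbers mirror the example in the comment above.
ratings = {
    "me":  {("me", "A"): 1000, ("you", "A"): 0},
    "you": {("you", "A"): 1000, ("me", "A"): 0},
}

# Average each action's score across all raters and actor roles, which
# washes out the "great when I do it, horrible when you do it" asymmetry.
scores_by_action = defaultdict(list)
for rater_scores in ratings.values():
    for (actor, action), score in rater_scores.items():
        scores_by_action[action].append(score)

averaged = {action: sum(s) / len(s) for action, s in scores_by_action.items()}
print(averaged)  # {'A': 500.0} -- "score 500 for people to do A"
```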

Bongo · 15y

I'd compress morality into utilitarianism.

I'd choose to stop caring about anticipated experience and "continuity of subjective experience".
