[...] Ricardo’s Law of Comparative Advantage was one of the most inspirational things he’d ever read. This is an economic theorem which says that under ideal frictionless capitalism, even if you’re the worst person in the world at everything, you can do the thing you’re relatively least bad at, trade with other people, and be guaranteed to make nonzero money. Maybe not very much money, but nonzero. It will mean that your actions have made other people nonzero better off, and they are nonzero grateful to you.
I do not find this inspiring! Under ideal frictionless capitalism, if you’re the worst person in the world at everything, you can make nonzero money, but most likely less than the cost of living, so the most likely outcome is that you die in abject poverty, having in the meantime done some things that made other people nonzero better off, and which might or might not outweigh any costly screw-ups you made or suffering you underwent in the process.
And that's under ideal frictionless capitalism! In the real world this can and does sometimes happen even to people who are pretty good at some things!
(A list of songs compiled from impromptu group singing sessions at several rationalist-adjacent events. Anything where the program was decided in advance is excluded from this list. This is all done from memory of events I happened to be at, and I'm more likely to remember songs I'm familiar with.)
(and see the 'Filk and filk-adjacent' and 'Uncategorized' sections, which include some that weren't written for Solstice but are oft sung at it)
Made some updates to this:
So, in general not having your values changed is an Omohundro goal, right? But I would suggest that if you change your utility function[1] from

U(w) = weightedSumSapientSatisfaction(w) + personalHappiness(w) + someIdiosyncraticPreferences(w)

or whatever it is, to

U(w) = weightedSumSapientSatisfaction(w) + personalHappiness(w) + someIdiosyncraticPreferences(w) + 5000

then all your choices that involve explicit expected-utility comparisons will come out the same as before, but you'll be happier.
There are a lot of issues with utility functions as a framing for describing actual human motivations, but bear with me.
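The invariance claim above can be checked directly: since each lottery's probabilities sum to 1, shifting U by a constant shifts every expected utility by exactly that constant, so the argmax over actions never changes. Here is a minimal sketch with a made-up utility function and toy lotteries (all the names here are illustrative, not from the original):

```python
# Sketch: adding a constant (e.g. +5000) to a utility function leaves
# every expected-utility comparison, and hence every choice, unchanged.

def expected_utility(utility, lottery):
    """Expected utility of a lottery given as a list of (probability, outcome)."""
    return sum(p * utility(outcome) for p, outcome in lottery)

def best_action(utility, actions):
    """Pick the action (name -> lottery) maximising expected utility."""
    return max(actions, key=lambda name: expected_utility(utility, actions[name]))

# Toy stand-in for the original U(w), and its shifted variant.
u = lambda w: w ** 0.5
u_shifted = lambda w: u(w) + 5000

actions = {
    "safe":   [(1.0, 100)],               # EU under u: 10
    "gamble": [(0.5, 900), (0.5, 0)],     # EU under u: 15
}

# Every EU is shifted by exactly 5000, so the chosen action is identical.
assert best_action(u, actions) == best_action(u_shifted, actions)
```

Note the hidden assumption this relies on: probabilities in each lottery sum to 1, so the shift contributes exactly 5000 to every option. With sub-probability lotteries (or comparisons against a fixed outside option) the constant would no longer cancel.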