There are also people who genuinely prefer others' well-being over a marginal increase in their own (mostly wealthy or ascetic folks), and I think this is the target audience of EA evangelism. However, a lot of people don't genuinely prefer others' well-being over a marginal increase in their own (or at least, the margin is pretty small), but these people still end up caught up in Singer's thought experiment, not realizing that the conclusions it leads them to (e.g. that they should donate to GiveWell) are inconsistent with their more fundamental values.
The ellipsis is, "genuinely prefer others' well-being over a marginal increase in their own," from the previous sentence.
They have to be smarter to recognize their actual beliefs and investigate what is consistent with them. They have to be more honest, because there is social pressure to think things like "oh, of course I care about others," and to hide how much (or how little) they actually care.
I think the title is fine. The post mostly reads, "if you want a quantum analogue, here's the path to take".
Yeah, that was about the only sentence I read in the paper. I was wondering if you'd seen a theoretical justification (logos) rather than just an ethical appeal (ethos), but didn't want to comb through the maths myself. By the way, fidelity won't give the same posterior. I haven't worked through the maths whatsoever, but I'd still put >95% probability on this claim.
Is there a reason they switched from divergence to fidelity when going quantum? You'd want to recover the classical Bayes' rule in the limit where your density matrices become classical (i.e. diagonal), and fidelity definitely doesn't give you that.
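To spell out the classical side (a sketch, assuming the classical construction being generalized is the usual KL-divergence one; I haven't checked the paper's notation): the Bayesian posterior is exactly the variational solution

$$P(h \mid d) \;=\; \operatorname*{arg\,min}_{Q}\;\Big[\,\mathrm{KL}\big(Q \,\|\, P\big) \;-\; \mathbb{E}_{h\sim Q}\big[\log P(d\mid h)\big]\,\Big],$$

because the bracketed objective equals $\mathrm{KL}\big(Q \,\|\, P(\cdot\mid d)\big) - \log P(d)$, which is minimized precisely at $Q = P(\cdot\mid d)$. For commuting (diagonal) density matrices, fidelity (with the squared convention) reduces to the squared Bhattacharyya coefficient $\big(\sum_h \sqrt{p_h\,q_h}\big)^2$, which is a different functional of the two distributions, so I wouldn't expect optimizing it to reproduce the $Q(h)\propto P(h)\,P(d\mid h)$ reweighting.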
Please avoid the abbreviation "MWI" until you've at least written "many-worlds interpretation" once. I had to ctrl+f and jump to the eighth occurrence of "MWI" before I could read your post, because all the information I had was that this is something like UDASSA, and that MWI is some information- or decision-theory term I don't know but need to, just to make sense of the first paragraph.
Then why is it too difficult for you to write down one of those definitions or theories where your criticism makes any sense?
Words demarcate the boundaries of meanings. You seem to be claiming there is some undefinable quality to the word "truth" that is useful to us, i.e. some unmeaningful meaning. Believe in ephemeral qualities all you like, but don't criticize me for missing out on some "truths" that are impossible to discover anyway.
Millions of years ago, the world was pretty much zero-sum. Animals weren't great at planning (things like going back for reinforcements or waiting months to take revenge), so fights were brief affairs determined mostly by physical prowess, which wasn't too hard to predict ahead of time. It was relatively easy to tell when you could get away with bullying a weaker animal for food instead of hunting for your own.
When humans come along, with tools and plans, there is suddenly much less common knowledge when you get into a fight. What allies does this other human have to call upon? What weapons have they trained in? If they're running away, are they just weaker, or are they leading you into a trap? If you can actually win the fight, you should take it, but the variance has shot up due to the unknowns, so you need a higher expected chance of winning if you don't want an unlucky roll to end your life. If you enter fights whenever you instinctively feel you can win, then you will evolve to lower that instinctual confidence.
If you "want to stop smoking" or "want to donate more" but do not, you are either deluding yourself, lacking intelligence, or preferring ignorance. Deluding yourself can make you feel happier about yourself. "I'm the kind of person who wants to help out other people! Just not the kind who actually does [but let's not think about that]." Arguably, this is what you really prefer: to be happy, whether or not your thoughts are conistent with your behavior. If you are smart enough, and really want to get to the bottom of any inconsistencies you find yourself exhibiting, you will, and will no longer be inconsistent. You'll either bite the bullet and say you actually do prefer the lung cancer over the shakes, or actually quit smoking.
Are the majority of rationalists deluded or dishonest? Absolutely. As I said in my post, utilitarianism is not well-defined, but most rationalists prefer running with the delusion.