Comments

I don't know of any real-life analogue, though I would bet that some exist. I do recall a fictional example in "The Adventures of Huckleberry Finn": Huck believes that helping Jim escape slavery would be stealing, since that is what he was taught growing up in the antebellum South, and he concludes that he will go to Hell for it. But he decides to help Jim anyway, even if it is the "wrong" thing to do.

Seems relevant in the wake of the FTX scandal. I've seen people blaming effective altruism for it, as if FTX's fraudulent practices prove that the philosophy of giving to the charities that demonstrably do the most good is flawed. Even if the entire EA movement were cult-like and misguided, that wouldn't mean the principle it's based on is wrong. I do think the modern EA movement is misguided to some extent, but only in that it has misjudged which causes are the most effective, and that shouldn't stop anyone else from donating to causes they believe are more effective.

I actually think it is possible for someone's beliefs to anti-correlate with reality without them being smart enough to figure out the truth and then deliberately reverse it. I can think of at least three ways this could happen, beyond extremely unlikely coincidences. The first is that a person could be systematically deceived by someone else, until they hold more false beliefs than true ones. The second is that systematic cognitive biases could reliably distort their beliefs. The third is the most interesting: if someone has a belief that many of their other beliefs depend on, and that belief is wrong, all of those downstream beliefs could be wrong as well. Plenty of people base a large portion of their beliefs on a single belief or cluster of beliefs, the most obvious example being the devoutly religious, especially members of a cult or fundamentalist group.

Basically, since beliefs are not independent, people can hold large sets of connected beliefs that stand or fall together. Of course, this still wouldn't affect the probability that any of their beliefs outside those clusters are true, so it doesn't change the conclusion of this essay much, but I think it is interesting nonetheless. At the very least, it is a warning against having too many beliefs that all depend on a single idea.
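To make the dependence point concrete, here is a minimal Monte Carlo sketch (all numbers are hypothetical, chosen only for illustration) comparing someone whose beliefs are independent with someone whose beliefs all inherit their truth value from a single foundational claim:

```python
import random

N_BELIEFS = 20       # downstream beliefs per person (hypothetical count)
N_TRIALS = 100_000   # Monte Carlo trials
P_TRUE = 0.5         # chance any single claim is true (hypothetical)

def independent_beliefs():
    # Each belief is right or wrong on its own.
    return [random.random() < P_TRUE for _ in range(N_BELIEFS)]

def dependent_beliefs():
    # Every belief inherits its truth value from one foundational claim.
    foundation_is_true = random.random() < P_TRUE
    return [foundation_is_true] * N_BELIEFS

for label, sampler in [("independent", independent_beliefs),
                       ("dependent", dependent_beliefs)]:
    # Count trials where every single belief came out false.
    all_false = sum(not any(sampler()) for _ in range(N_TRIALS))
    print(f"{label}: P(every belief is false) ~ {all_false / N_TRIALS:.4f}")
```

The independent believer almost never gets everything wrong (0.5^20 is roughly one in a million), while the dependent cluster fails wholesale about half the time, which is exactly the "stand or fall together" behavior described above.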