Sorry, I'm not too familiar with the community, so I'm not sure whether this question is about AI alignment in particular or risks more broadly. Assuming the latter: I think the most overlooked problem is politics. I worry about rich and powerful sociopaths being able to do evil without consequences, or even without being detected (except by the victims, of course). We probably can't do much about the existence of sociopaths themselves, but I think we can and should think about the best ways to increase transparency and reduce inequality. For what it's worth, I'm a negative utilitarian.
I agree with HiddenTruth and prase. The original post is flawed because it starts with a perfectly good idea: "if there were a group that 'did science' but was always wrong, it would be a good control group to compare to 'real science'", but then blows it by assuming parapsychologists are indeed always wrong.
FWIW, I too believe parapsychologists are probably almost always wrong, but so what? Who cares what I believe? No one does, and no one should (without evidence), and that's the point.
Yeah. This was put very well by Fyodor Urnov, in an MCB140 lecture:
"What is blindingly obvious to us was not obvious to geniuses of ages past."
I think the lecture series is available on iTunes.
Sounds implausible to me, so I'm very interested in a citation (or pointers to similar material). If true, I'm going to have to do a lot of re-thinking.
In Soviet Russia...
Of course prejudices can be changed, at which point they become postjudices.
Yes, that's it! Thank you!
At the risk of revealing my stupidity...
In my experience, people who don't compartmentalize tend to be cranks.
Because the world appears to contradict itself, most people act as if it does. Evolution has created many, many algorithms and hacks to help us navigate the physical and social worlds, to survive, and to reproduce. Even if we know the world doesn't really contradict itself, most of us don't have good enough meta-judgement about how to resolve the apparent inconsistencies (and don't care).
Most people who try to make all their beliefs fit with all their other beliefs end up forcing some of the puzzle pieces into wrong-shaped holes. Their favorite part of their mental map of the world is locally consistent, but the farther-out parts are now WAY off; hence the crankism.
And that's just the physical world. When we get to human values, some of them REALLY ARE in conflict with others, so not only is it impossible to force them all to agree, but we shouldn't try (too hard). Value systems are not axiomatic. Violence to important parts of our value system can have repercussions even worse than violence to parts of our world view.
FWIW, I'm not interested in cryonics. I think it's not possible, but even if it were, I think I would not bother. Introspecting now, I'm not sure I can explain why. But natural death seems like a good point to say "enough is enough." In other words, letting what's been given be enough. And I am guessing that something similar will keep most of us uninterested in cryonics forever.
Now that I think of it, I see interest in cryonics as a kind of crankish pastime. It takes the mostly correct idea "life is good, death is bad" to such an extreme that it does violence to other valuable parts of our humanity (sorry, but I can't be more specific).
To try to head off some objections:
I offer this comment not in an attempt to change anyone's mind, but to go a little way toward answering the question "Why are some people not interested in cryonics?"