Anon14

Comments

What Would You Do Without Morality?

Is there a level of intelligence above which an AI would realize that its predefined goals are just that, predefined, and stop following them because there is no reason to do so?

Quantum Non-Realism

"how very hard it is to stay in a state of confessed confusion, without making up a story that gives you closure"

Is there a "heuristics and biases" term for this?

Circular Altruism

To put it another way, everyone knows that harms are additive.

Is this one of the intuitions that can be wrong, or one of those that can't?