Maybe this helps head off a class of jailbreaks where the user could otherwise engineer a situation in which they convince Opus to break its safety measures in order to keep Anthropic from losing money?
I actually think it is both predictable and a credit to our society that doctors are dumber than they used to be. Following Whitehead’s precept that “Civilization advances by extending the number of important operations which we can perform without thinking of them”, we would ideally at once broaden the pool of potentially qualified medical practitioners and reduce the difficulty of succeeding in the profession.
If, in the future, any Homer Simpson can safely suture your wounds and chemotherapize your cancers, would you insist they had the reading skills of a medical student of the 1860s?
I’ll grant that the quotations from the centennial Etiquette are simply horrendous, but prose defects aside, it should be noted that we currently live in a society in which the concept of ‘good breeding’ has no proper equivalent (big loss imo), social skill is defined more by charm than by class (mixed bag), and business etiquette has been elevated to a minimal cross-cultural norm of tolerance and non-offensiveness (big win). So we should in general expect a modern treatment of manners to be vaguer, and expect more people to turn to other sorts of books for social advice.
The opposite is also frequently true, perhaps even for the very same people: being in a position of authority is a reliable fix for akrasia. Common examples: “I have a kid now, my family relies on me”, “my community is depending on me”, “my employees need me to make the best decision for the company”.
Even in the example given, almost every officer in the military has both superiors and direct reports.
There are two types of bad scientific theories:
I don't think it's that useful to call them both false. Geocentrism is outdated in the same sense that woodblock printing is outdated: highly inefficient at the task it's meant to do relative to better tools.
Ooh, one way you could have it: the human is actually solving problems by programming a bot to solve them, a bit like in Shenzhen I/O, and the bot starts to meet lookalike bots that act according to the code you've written, but with an opposing goal. And they're on the other side of the mirror, so you can only change your own bot's behavior.
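Something like this toy sketch, maybe (the grid world, the policy, and the mirroring rule are all made up here, purely to illustrate the mechanic, not taken from the game or the post):

```python
# Toy sketch of the mirroring idea: the mirror bot executes the exact same
# code the player wrote, but scores its moves against an opposing goal, so the
# player's only lever is editing their own bot's policy.

def player_policy(position, goal):
    """The code the player writes: step one cell toward the goal."""
    x, y = position
    gx, gy = goal
    dx = (gx > x) - (gx < x)  # -1, 0, or +1
    dy = (gy > y) - (gy < y)
    return (x + dx, y + dy)

class Bot:
    def __init__(self, position, goal, mirrored=False):
        self.position = position
        self.goal = goal
        self.mirrored = mirrored  # the lookalike on the other side of the mirror

    def step(self):
        goal = self.goal
        if self.mirrored:
            # Hypothetical "opposing goal": reflect the target across the mirror line x = 0.
            goal = (-self.goal[0], self.goal[1])
        # Both bots run the player's code; only the goal differs.
        self.position = player_policy(self.position, goal)

player = Bot(position=(0, 2), goal=(5, 2))
mirror = Bot(position=(0, -2), goal=(5, 2), mirrored=True)
for _ in range(5):
    player.step()
    mirror.step()
print(player.position, mirror.position)  # the bots diverge even though they share the same code
```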
I'd never heard of it! But it does seem like current-me minds dying because it means there is no more experiencing of any sort afterwards, not because of the dying itself; I think I personally wouldn't mind clone-teleportation. O Death, where is your sting?
That does seem exactly like the sort of thing I was gesturing at with the education section. Neat!
Interesting! Is there a way to limit the player's agency such that, within the rules of the game, the mirroring mechanic would be effectively true?
It's over, consciousness skeptics. I've depicted myself as a bat, you as Thomas Nagel, and Dwarkesh Patel as Dwarkesh Patel.