LESSWRONG
Coordination / Cooperation · Rationality · World Modeling
Your Clone Wants to Kill You Because You Assumed Too Much

by Algon
15th Nov 2025
3 comments
Tomás B.:

There is also the immediate, irresistible desire to have sex with yourself and the consequent shame afterwards. 

Algon:

"consequent shame afterwards"

Speak for yourself.

aphyer:

Doris wasn’t self-sacrificing. Amaryllis, now there was a woman who would be able to make a suicide bomber clone. If Doris Finch tried it, her clone would, at best, start up an argument about which of them should be the one to do it, or try to spawn another clone to do the work for her.


My friend @Croissanthology is puzzled by why it's such a common trope for fictional clones to turn on their creators. There's the Doylist answer that it's a cheap way to produce a mind divided against itself, which in my view explains a lot of the drama, e.g. pure betrayal. But there's also a Watsonian answer, which I learnt from that great teacher, Mother of Learning.

In the novel Mother of Learning, perhaps the most useful spell is simulacrum. It creates an ectoplasmic shell that looks like you, has a copy of your mind, and shares your mana pool. Naturally, the main characters abuse the heck out of this. So useful is it that they wonder why on earth every great mage doesn't use it.

A grizzled old battle mage supplies part of the answer to the question posed by the MC, and, in turn, by our dear Croissant. He says:

"[...] if you don’t like doing something, your simulacrum won’t like doing it either… so it’s a bad idea to foist things you hate upon your simulacrums. This also means that if you can’t bring yourself to sacrifice your life for another, chances are your simulacrum won’t want to sacrifice itself for your sake either.”

Why would someone assume their clone is willing to do things they would not? Because they're thinking in far mode: since we can't clone ourselves yet, people apply their idealized far-mode models of themselves to their clones. E.g. they might assume their clone will be willing to sacrifice itself for the greater good, as that's what their idealized self would do. And we're not just talking about people, we're talking about people in stories, whom we reason about in far mode by default. So that's a double whammy of crooked reasoning.

Mother of Learning kind of lampshades this. Later in the novel, simulacra of the MC realize that, once they're actually in that position, they feel differently about being a temporary clone that can be dismissed at will. Suddenly, they're thinking in near mode, occupying the headspace of a soon-to-die entity. Pranking of the original ensues, along with some mild existential dread.

The fact that the clones notice they're in a substantially different situation from the original also highlights another part of the answer to Croissanthology's question. Namely, it is hard to predict exactly how you'll feel and act in a novel situation. The more novel the situation, the harder it gets to predict how you'll act in it.

For instance, I thought I'd hate managing people. Turns out, I actually kind of enjoy it. There were new kinds of problems to solve, such as managing org culture, which I found intellectually stimulating. Sure, there were issues like learning to delegate and so on, but that wasn't the sort of problem I expected to have.

So if you haven't inhabited the headspace your clone will be in, there are good odds they won't act like you think they will. 

More generally, the problem is that people incorrectly model their clones, which makes co-operation harder.

Which brings us to another trope about clones: they're not perfect copies. Usually, stories will assume clones are flawed in some way. Perhaps they're insane, or their bodies rapidly break down, or they lack some key power of the original, etc. The upshot is, they're even harder to predict than a perfect clone would be, and so are harder to co-ordinate with. 

Finally, you could just be a selfish dick who doesn't want to co-operate with yourself. Ever think of that, eh, Mr. Croissant?