A Long Time From Now, in a Galaxy Far, Far Away:

Episode 1: The Phantom Mugger

One night, on your way home from work, you find yourself face to face with a mysterious shadowy figure. “Your money or your life,” he hisses through his teeth. The tip of a knife pokes uncomfortably into your jacket, backing up his threat. You quickly hand over your phone and wallet. Once you get home you cancel your credit cards, but he’s already spent hundreds of dollars around town. After reporting the crime, you can only hope the police catch up to him.

 

Episode 2: Revenge of the Fists

A few weeks later, in similar circumstances, you’re accosted once again. “This is a mugging,” he whispers in slightly nervous tones. “Give me your wallet.” 

You pause. “Where’s your weapon?”

“Well…I’m part of a school of non-murderous muggers. Did you know the penalty for attempted murder is 9 years? I have a future to think about! No, if you refuse I’ll just beat you up.”

He doesn’t look so tough to you. Frustrated with this inexplicable spike in crime and conscious of the money in your wallet, you decide not to take this lying down. Your fist cannonballs into his chest, a surprised “Oomph” escaping his lips. He swings in retaliation, hitting you across the face. You elbow him in the gut. He falls over, and with an extra kick at his prone form you stagger back home. Examining your black eye in the mirror, you nonetheless feel a rush of exhilaration at having escaped with your wallet - and your pride.

 

Episode 3: A New Hoax

One day leaving the grocery store, you’re approached by a distinguished-looking older gentleman on the street. “Excuse me, sir,” he says meekly. “A moment of your time?”

“I have a compelling proposition for you. You see, I teach as a professor at the local University of Game-Theoretic Equilibria. My university has had some difficult years recently, and we may have to declare bankruptcy. That’s why I’ve come to you. You look like a nice young man - and not one immune to reason. I ask 100 dollars from you to help save our university.” 

You apologize, and turn to walk away - mumbling that you really can’t help him. 

“Sir, wait! This isn’t just a request - it’s an important offer. I know you care about animal welfare: I saw you drop some money into the ‘Stop Factory Farming!’ tin as you exited the grocery store. I’m not a violent man, but I promise you now: if you refuse my offer, I’ll give 1,000 dollars to the most horrendous farm I know of.”

You pause, stunned. 

“You may think me despicable, sir, and maybe rightly so; but our university is in dire straits. And besides, you should clearly accept - for what’s a hundred dollars compared to a thousand, and to a cause you abhor?”

You think for a moment. “I can’t deny the force of your argument. But why would I trust your word? Maybe if I refuse, you’ll go support the university with that money, instead of wasting it on a factory farm.”

“Ah, a reasonable concern, as I would expect from a sharp young man such as yourself. But your fears are groundless - for I am famed deontologist professor Frank Candor, author of The Words that Bind. If I failed to follow through on a promise like this one, my reputation would be ruined! One thousand dollars wouldn’t be worth that.”

“So you see, it’s very simple. Given your regard for animals, the only logical choice is to pay up.” 

With extreme reluctance, you fork over the cash, resolving to avoid any further passers-by. As you depart, you see the professor accosting another victim - a woman who dropped some money into an “End World Hunger” donation box. Feeling even more chagrined, you head home, wishing there had been some better way.

 

Episode 4: The Victim Strikes Back

The next day, you avoid that street. But on your walk back from work, almost home, you spot Dr. Candor lurking in an alley. He sees you too, and comes sprinting forward. “Excuse me, young man! Excuse me!”

You take off at a run toward your block, desperate to avoid receiving his “offer”. Pounding up the stairs to your apartment, you frantically insert your keys into the lock, open the door, and start to shut it - as his arm thrusts its way into your hall. You try to push him back outside and slam the door, but he gives a strong heave, and you’re shoved bodily back into the hallway as your door bangs fully open. He stands panting against the door frame.

“Now then…young man...you know the score. One…hundred dollars…or a thousand…to factory farms. What…do you say?”

You glare at him. “I thought we were done. Now that you know where I live, what’s to stop you from coming back every day until I’m flat broke?”

“Well…nothing really…but I didn’t want to scare you off.”

You sit down and close your eyes, rubbing your temple. “Actually, I’ve had a thought.”

“You’re only threatening me because you’re hoping I’ll give in to your threat. And it’s bad enough that I don’t want you to carry it out. But you don’t want to either - that’s just money out of your pocket. So here’s what we’ll do.”

“From now on, I’ll randomize my response. Since you’re demanding a tenth of what you’re threatening me with, I’ll pay you 10 out of 11 times - but one time out of 11, I’ll refuse. That means you’ll lose a thousand dollars once for every ten times you get a hundred, making you no better off than if you’d never threatened me at all.”

The professor protests. “But this plan hurts you! Nearly 10% of the time I’ll have to carry out my threat, leaving you (and me) $1,000 worse off than before. That means following this strategy will cost you about $182 in expectation per offer - over $80 more than simply paying each time.”

“Sure, but the cost cuts both ways - since it also leaves you about $100 worse off in expectation than if I just paid every time, it should discourage you from making the offer at all. And I can afford to pay it, because it’s the cheapest way to get you off my back. Whereas if I simply refused to pay every time, you’d have to follow through on your threat at least once - and that would cost us both $1,000.”

You add: “Wait a second. Just to be sure you’re not tempted to gamble with this offer, I’ll actually refuse to pay once every ten times. That way you can expect to lose 10 dollars on average every time you try this again.”

“Now let’s roll the dice! On a 10, we both lose.” Producing a ten-sided die from your extensive collection of board games, you roll a 6. “It looks like today is our lucky day! I have your money right here,” you say, handing Candor a bill from your wallet. He seems disheartened nonetheless, and moves to leave.

Satisfied - at least as much as one can be in your shoes - you wave the professor a cheeky goodbye and settle in for the night. He doesn’t come back.
 

Conclusion - The Last Mugging

The first two episodes illustrate the basic game theory of threats. The first story, the armed mugger, is the only one in which the threatener can reliably extort us. His knife makes the costs asymmetric: refusing and getting stabbed would cost us a great deal, while following through would cost him almost nothing. So mugger and victim “compromise” - we give up the wallet, a small cost to us and a small benefit to him.
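To make the asymmetry concrete, here’s a sketch of the armed-mugger game with made-up dollar values (the specific numbers are illustrative assumptions, not from the story):

```python
# Illustrative payoff matrix for Episode 1's armed mugger.
# Assumed values: the wallet is worth $200 to each party, and being
# stabbed costs the victim the equivalent of $100,000, while carrying
# out the stabbing costs the mugger only a small amount.
payoffs = {
    # victim's choice -> (victim payoff, mugger payoff)
    "capitulate": (-200, 200),
    "resist": (-100_000, -200),
}

# The victim's best response is to hand over the wallet, which is
# exactly what makes the armed mugger's threat reliably profitable.
best_response = max(payoffs, key=lambda choice: payoffs[choice][0])
print(best_response)  # capitulate
```

Once the victim can impose symmetric costs - as in the later episodes - this matrix, and the mugger’s business model, stops working.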

In the second story we see how important this asymmetry was. Self-interested agents will rationally resist coercion even at some cost to themselves, because this means other agents are incentivized not to threaten them. So evolution has implemented a hacky version of this logic in humans, and people who have never thought about game theory often find themselves hating their enemies and plotting to avenge their wrongs, even if that means costly fighting rather than an unequal peace (cf. many wartime examples, including Ukrainian resistance to Russia). That’s why we fight the mugger the second time: from a purely selfish perspective, handing over some money is probably better than getting into a fight and plausibly still losing it - but our intrinsic motivation to resist coercion tips the balance.

The last two stories are analogous to acausal trade, specifically acausal threats. Without the advantage of asymmetric costs, Professor Candor was initially able to extort us simply by threatening to spend money on things we didn’t like. But even though he has more money to spend, the professor has to use his own money to enforce any threat, which is ultimately his undoing. The acausal version of this is some sort of actor which threatens humanity with negative consequences (e.g. torturing simulated copies of us) for refusing to give it resources, but must spend its own valuable resources to do so. 

Our randomized response is one way of shifting the payoffs of Candor’s threat downward for both parties, so that he bears part of the cost of the negative-sum interaction. This works because our situations are symmetrical: fundamentally, it’s hard to extort us because we can use our resources, even if they’re much more limited, in the same ways he’s using his resources on us. We could spend money to hurt his values (say, by supporting efforts to shutter his university), or to negate the damage he does to our own values - by refusing his threats and donating more money to our preferred charitable causes. At that point both parties are left worse off, both from spending their own resources and from the other party’s efforts.
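The arithmetic behind the randomized strategy in Episode 4 can be checked directly. A quick sketch, using exact fractions to avoid floating-point noise:

```python
from fractions import Fraction

# Numbers from the story: the professor demands $100, and his threat
# costs $1,000 - to him out of pocket, and to the victim in damaged values.
demand, threat = 100, 1000

def professor_expected_gain(p_refuse):
    # Professor collects `demand` when the victim pays, and spends
    # `threat` when the victim refuses and he must follow through.
    return (1 - p_refuse) * demand - p_refuse * threat

def victim_expected_loss(p_refuse):
    # Victim pays `demand` when capitulating, and loses `threat` worth
    # of value when refusing triggers the factory-farm donation.
    return (1 - p_refuse) * demand + p_refuse * threat

print(professor_expected_gain(Fraction(1, 11)))  # 0   -> threatening breaks even
print(professor_expected_gain(Fraction(1, 10)))  # -10 -> each threat loses him $10
print(victim_expected_loss(Fraction(1, 10)))     # 190 -> vs. 100 for always paying
```

Refusing one time in eleven makes threatening exactly break even for the professor; refusing one time in ten turns each threat into an expected $10 loss for him, at the price of raising the victim’s expected loss per offer from $100 to $190.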

I’ll probably write up a more rigorous version of this line of argument later, but for now I just wanted to point at some general ideas re: both standard and acausal threats.

1 comment

Self-interested agents will rationally resist coercion even at some cost to themselves, because this means other agents are incentivized not to threaten them. So evolution has implemented a hacky version of this logic in humans, and people who have never thought about game theory often find themselves hating their enemies and plotting to avenge their wrongs, even if that means costly fighting rather than an unequal peace

I think that Alex M. has done some pretty good research on this; e.g. even just over the last 1000 years in Europe, there was selection for people who were good at ideological conformity in the religious environment (e.g. quickly grasping Pascal's-wager-like arguments without the aid of literacy), but who either didn't develop the traits that let clever speakers use Pascal's-wager-like arguments to take their resources or dissuade them from maximizing offspring, or who developed counter-traits (e.g. ones causing them to just maximize offspring anyway). But it's pretty clear that the last 10,000 years of civilization probably had something to do with people's tendency to choose social reality over objective reality, since locking up and refusing to cooperate with a line of logic is a great way to survive in a gene pool where some people are smarter than others and think up galaxy-brained coercion strategies.

some sort of actor which threatens humanity with negative consequences (e.g. torturing simulated copies of us) for refusing to give it resources, but must spend its own valuable resources to do so.

This is one of the big reasons why what the DoD was doing with the flying saucer videos was so ridiculous: by leaving it open-ended whether they think the craft were human-made, they showed they clearly have no idea what they're doing (e.g. maybe they brought in a rando from SETI or something). The impression I get is that they and others in the natsec community are generally competent enough to understand that if there were aliens, there would be quadrillions of ships, and maybe that the aliens would be as smart relative to humans as humans are to ants. But they probably aren't competent enough to understand that whether aliens would torture humans in order to maximize compliance is a math question - let alone matters of realityfluid and simulated copies, or the other things that would actually happen if you encountered an opposing force that outclassed you that badly (Yudkowsky recently put the odds of friendliness/benignness at 5%). The cluelessness on the DoD's part was just embarrassing; this isn't rocket science, and they don't just have tons of rocket scientists - their organization is the direct, deliberate cause of why so many rocket scientists were produced in the first place.