Postdocs are used to disappointment. When Doctor Susan Connor was told she would be taken to the "volcano lair" she thought it was yet another hyperbolic buzzword like "world class", "bleeding edge" and "living wage". She hadn't expected a private jet to fly her to a tropical island complete with a proper stratovolcano.

A regular private jet flight cost as much as Dr Connor earned in a year. If—as Dr Connor suspected—it was a stealth aircraft then that would add an order of magnitude. The VTOL[1] landed on the short runway. Career academic Susan Connor wasn't used to such white glove treatment but she wasn't complaining either.

Dr Connor was greeted by a tall balding man in a long white labcoat. That broke Dr Connor's credulity. She was a bioinformatician. She had worn a labcoat a handful of times in her entire life—and only when handling toxic materials. This was obviously a psychological experiment. Someone was continuing Stanley Milgram's work. Dr Connor stepped down the airstair as if nothing was amiss.

"Doctor Connor," said the man with his hands spread wide, "I'm Douglas Morbus, Division Chief of the CDP (Center for Disease Proliferation). I enjoyed your recent work on applying entropy-based analysis to junk DNA. It's a pleasure to finally meet you."

"It's nice to meet you Mr Morbus. Or should I call you Dr Morbus?" said Dr Connor.

"Doug, please. We don't bother too much about formalities here, except when welcoming guests of course," said Doug. His freshly-ironed lab coat was bright white. Spotless. Formal attire, apparently, "Full ceremonial dress uniform includes a white fluffy cat but if I brought mine out here she might run off into the jungle and get hurt."

Dr Connor tried to imagine a room full of people with their formal animals. "Hosting a formal ceremony must be like herding cats," said Dr Connor.

"Meetings waste time. We disincentivize them by imposing extraordinary cost," said Doug. He eyed the VTOL.

This was too expensive to be a scientific experiment. Dr Connor was on television. In 2005, a British television station convinced its reality show contestants that they would go into low Earth orbit. (That was many years before the real space tourism industry existed.) They built a fake Russian military base where, for weeks, they taught the contestants fake physics so they wouldn't be surprised at the lack of weightlessness in their fake spaceship.

If this was just a big practical joke then Dr Connor wasn't about to ruin it right away. She wanted to see where it went. Even worse, a part of her wished it was real. Dr Connor wanted to live the harmless supervillain fantasy for just a few minutes longer, if that was all it lasted.

"Follow me," Doug guided Dr Connor down a jungle path, "Effective Evil hires only the best and brightest. We make it easy to get exercise because we want to keep you at peak performance. Hence the network of trails around the island."

Another benefit of the trails would be low production expense. Strolling along a preexisting trail is cheaper than touring a fake laboratory. A free tropical vacation wasn't a fake spaceship but it was still a free tropical vacation. Dr Connor would take a free tropical vacation over a fake space vacation any day. She'd take a free tropical vacation over a real space vacation too. Space travel sucks.

If the television producers were too cheap to invest in a real fake laboratory then the pranksters would have to earn her compliance in some other way. Dr Connor would test their improvisation.

"I'm curious," said Dr Connor innocently, "Who pays for all this?"

"Is that really the first thing you want to know? Here we are, changing history, and you want to look at our accounting practices? You wouldn't rather hear about all the horrible things we're doing around the world?" said Doug.

Nice try, but you're not changing the subject to something you have scripted answers for. "Imagine the funding disappeared. How would I get off the island?" she said.

"We have many escape routes. Aircraft, rockets, ships, submarines. But I'm guessing what you really want to know is if your future funding is secure," said Doug.

Dr Connor nodded. She was a little short of breath. The trail switched back and forth up the volcano.

"Many years ago there was a very rich tech entrepreneur. Founder and CEO of let's-not-talk-about-it," said Doug.

"Was he evil?" said Dr Connor. We are all the heroes of our own story. No real human being would donate money to evil for the sake of evil. Evil is a means to another end. It is not a terminal objective.

"Not at all. There wasn't a selfish bone in his body. He invented cheap medical technologies for developing regions. He'd go undercover in his own company just to check that his employees were being treated well," said Doug.

"Please don't tell me he made a deal with the Devil," said Dr Connor.

"Well…," said Doug.

"I'm sorry. You're telling this story. Go on," said Dr Connor. She hadn't been outside among real living things for a long time. She had forgotten how long it took to get places by foot.

"Philip Goodman put lots of effort into helping other people but not enough into himself. He was obese. It was causing health problems. His doctor told him that if he didn't start exercising he wouldn't live very long," said Doug.

"Oh no," said Dr Connor.

"I mean yeah. He wrote a blockchain contract stipulating that if he didn't put in an average of at least two hours of cardio exercise in every day for a year then his entire fortune would be donated to Effective Evil," said Doug.

Smart contracts are dangerous. Dr Connor didn't know where this was going but it couldn't possibly be good.

"Philip Goodman exercised hard. Harder than he ever had exercised before in his life. His sixty-five-year-old body couldn't handle it. He had a heart attack," said Doug.

Dr Connor gasped.

"That initial investment is what got us started. We still have a steady stream of people who threaten to donate money to Effective Evil unless they accomplish some personal goal—and then they fail—but self-threats are no longer our sole supporters. Government intelligence agencies subsidize us when we destabilize their adversaries. But one of our biggest sources of income is prediction markets. You can make a lot of money from a pandemic prediction market when you're the organization releasing artificial pandemics. We don't do it just for the money, of course, but there's no reason to leave the money lying on the table," said Doug.

"And that's why you need a bioinformatician like me," said Dr Connor.

"That's one of the reasons why we need a bioinformatician," said Doug, "We do human genetic engineering too."

Dr Connor couldn't hold it in any longer. She burst out laughing so hard she nearly lost her balance. She bent over with her hands on her knees until she caught her breath.

"What's so funny?" said Doug with a straight face.

"This whole operation! It's the funniest practical joke anybody has ever played on me," said Dr Connor.

"Dr Connor, I assure you this whole operation is completely legitimate. Well, not legitimate per se. We are behind numerous illigitimate activities. But I assure you Effective Evil is completely real," said Doug.

"You're kidding," said Dr Connor.

"I'm not," said Doug.

The forest path ended. They reached a concrete wall laced with barbed wire. Doug scanned them through the checkpoint.

"Most people want to see our breeder reactor first, but in my opinion the hardware related to our cyber-ops is cooler," said Doug.

"The bioengineering facilities please," said Dr Connor. As her area of expertise, it would be the hardest to fake.

It was a real bioengineering facility, complete with mouse cages and cloning vats.

"You're not kidding," said Dr Connor.

"Nope," said Doug.

"What is wrong with you? This is evil," said Dr Connor.

"Thank you," said Doug.

"Why?" said Dr Connor.

"Why what?" said Doug.

Dr Connor gestured frantically at the surrounding facility.

"We're 'Effective Evil'. Not 'Theoretical Evil'. I have no patience for the armchair sociopaths who pontificate about villainy without getting their hands dirty," said Doug.

"You're literally wearing lab gloves," said Dr Connor.

"It's a figure of speech," said the supervillain.

Doug escorted the young scientist along the steel catwalk. Chemical engineers labored below.

"I can't join you," said Dr Connor.

"Why not?" said Doug.

"You're evil. With a literal capital 'E'," said Dr Connor.

"So?" said Doug.

"I don't want to make the world a worse place," said Dr Connor.

"You don't have to. There is an oversupply of postdocs. You're a great scientist but (no offence) the marginal difference between hiring you and hiring the next bioinformatician in line is (to us) negligible. Whether or not you (personally) choose to work for us will produce an insignificant net effect on our operations. The impact on your personal finances, however, will be significant. You could easily offset the marginal negative impact of working for us by donating a fraction of your surplus income to altruistic causes instead," said Doug.

"You're proposing I work for you to spread malaria and use my income to subsidize malaria eradication," said Dr Connor.

Doug shrugged. "It's your money," he said.

Dr Connor rested her head on the steel railing. "I think the fumes are getting to my head. Can we go back outside?"

They walked along the forest path back to the VTOL landing pad.

"I became a scientist because I wanted to change the world," said Dr Connor.

"There are no better opportunities to change the world than here at Effective Evil," said Doug.

"I meant 'change the world for the better'," said Dr Connor.

"Then you should have been more specific," said Doug.

Dr Connor stepped back onto the airstair to re-enter the VTOL.

"What about you?" said Dr Connor.

"What about me?" said Doug.

"Why do you run this place?" said Dr Connor.

"Because I want to change the world," said Doug.


  1. Vertical take-off and landing [aircraft]. ↩︎

12 comments

Accept the job. 

For complex high tech projects, if there is a will to fail, obstacles can be found. 

Any high skill employee of that type has many opportunities to make subtly bad decisions. To skew the numerical estimates. To warp the incentives. To "accidentally" place a leaking coffee cup on the most expensive and sensitive piece of equipment. To leave a slight residue on a test tube that influences the results of the next experiment. To write buggy spaghetti code. To leak vital data. To foster dissent and sabotage amongst the other employees. 

She should read this: https://www.gutenberg.org/files/26184/page-images/26184-images.pdf

[...] the marginal difference between hiring you and hiring the next bioinformatician in line is (to us) negligible. Whether or not you (personally) choose to work for us will produce an insignificant net effect on our operations. The impact on your personal finances, however, will be significant. You could easily offset the marginal negative impact of working for us by donating a fraction of your surplus income to altruistic causes instead,"

Double standard: when considering the negative effect of her work, he compares her with the next in line, but when considering the positive effect of her donations, he doesn't.


At any given moment, usually an organization wants a particular set of employees. If she doesn't take the job, they'll hire a different person for the role that would have been hers rather than just getting by with one person fewer.

At any moment, usually a charitable organization wants as much money as possible. If she doesn't make the donations, the Against Malaria Foundation (or whatever) will just have that much less money.

It's not quite that simple: maybe Effective Evil has trouble hiring (can't imagine why) and so on average if she doesn't take the job they have 0.3 bioinformaticians fewer in expectation; maybe the AMF works harder on fundraising if they get less than they hoped for and so on average if she doesn't make the donations they're only down by 0.9x what she would have given. But I would strongly expect that taking-the-job-or-not has much more of a substitution effect than giving-the-money-or-not.

Another problem is that he doesn't account for the positive (less evil) effect of her donations as a reason to not hire her. EE would only hire her if the value she would provide in service of their goals exceeds the disvalue of her donations by at least as much as the next available candidate would. Likewise she would only work for them if the value of her donations for altruism exceeds the disvalue of her service to EE by at least as much as if she took a job at a normal organization. There's no way her employment at EE is a +EV proposition for both of them.

Yeah, if it's a net goal, then they can't both be right. But strictly speaking he never says he wants to make the world worse on net. She says she wants to change the world for the better, but he just says he wants to change the world, period. They could, in a deontological or virtue-ethics-esque way, value the skillful doing of evil and the changing of the world from what it would otherwise have been, which is completely consistent with unleashing giant mice to rampage through NYC even as malaria is cured by donations from guilty employees: everyone gets what they want. Effective Evil gets to do evil while changing the world (all those skyscrapers ruined by overgrown rodents are certainly evil, and a visible change in the world), and the employees know they offset the evil with good done elsewhere while keeping a handsome salary for themselves.

They could also easily just desire different things ("have different utility functions"). This is the basis for gains from trade, and, more germane to this example, political parties.

If Effective Evil thinks the most efficient way to do evil is assaulting people's eyeballs with dust specks, and I think the most effective way to do evil would be increasing torture, I can take the money they pay me to engineer aeroplane dust-distribution technology and use it to reduce torture. If they think 1000 specks equals 1 minute of torture, but I think 10e9 specks equals 1 minute of torture, there is wide latitude for us to make a trade where I reduce 10 minutes of torture in expectation and they get more than 10,000 specks-in-eyes. Their conception of evil is maximized, and mine is minimized.
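To spell out that latitude with the comment's own exchange rates, here is a quick sketch; the quantities actually traded below are purely hypothetical, chosen only to show the shape of the deal:

```python
# Sketch of the gains-from-trade point above; all numbers are illustrative.
# Effective Evil's exchange rate: 1,000 dust specks are as bad as 1 minute of torture.
# Mine: 1e9 dust specks are as bad as 1 minute of torture.

torture_minutes_prevented = 10   # torture I reduce by donating my salary
specks_inflicted = 20_000        # specks EE gets to inflict via my work for them

# EE's ledger, in speck-equivalents of evil (they want this to be positive):
ee_net_evil = specks_inflicted - torture_minutes_prevented * 1_000   # +10,000

# My ledger, in torture-minute-equivalents of harm (I want this to be negative):
my_net_harm = specks_inflicted / 1e9 - torture_minutes_prevented     # ~ -10

print(ee_net_evil, my_net_harm)  # each party comes out ahead by its own lights
```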

So it's an Evil argument?

He is evil, so he makes it look like she could compensate for it, but in fact he sets up incentives such that she doesn't. At least in expectation, which would be effective.

I think Doug is making the assumption that the next in line is less likely to donate than Dr. Connor would be.

Probably, and it's not a bad assumption. I'd imagine that donations to charities would vary wildly between candidates. But it's still an assumption, and his argument is not as airtight as he makes it appear.

I'm just surprised that she isn't dead at the end of the story but maybe that comes later.