Alice worked for Facebook.
"Are you familiar with advertising fraud?" she said.
"Remind me," said her boss Bob.
"People write plugins that simulate clicks. This has gotten worse and worse over time," said Alice.
"I am aware. Are you implying there is something we can do about it?" said Bob.
"We have this new Bayesian model that creates a completely simulated Facebook trained on the clicktracker data. It's robust against adversarial data, which means it should perform better than a naïve clicktracker," said Alice.
"What about privacy concerns?" said Bob.
"The model doesn't output a real copy of Facebook. It outputs a fake Facebook with fake people. Ethical advertisers don't care about individuals. All they care about is that the data is statistically accurate. Statistically, our simulation behaves the same as real Facebook (which is why the advertising numbers come out the same) but nobody in the fake Facebook corresponds to a real person on real Facebook. This lets us sell 100% of our data to our advertisers without compromising the privacy of any real people. Cool, huh?" said Alice.
"How hard will it be to plug the new system into our advertising pipeline?" said her boss.
"Everything's modular. It all uses the same data format. Give me a week to prepare and the actual transition could be done in an hour. It'll increase revenue by at least 5%," said Alice.
"Let's do this," said Bob, "We can always revert the change if things go horribly wrong."
Eve worked for the NSA.
"The Pareto Principle is nuts. Here we are, having built an entire surveillance state, and yet 80% of our data comes from hacking into Facebook's ad recommendation system," said Eve, "It's emasculating."
"Get back to work," said her boss.
Charlie worked for the CIA.
"You can't fake an entire online life history. If an agent goes undercover and they don't have 20 years of family photos on Facebook it's obvious to a terrorist organization that they're using a fabricated identity. The old identity fabrication techniques don't work anymore," said Charlie.
"What do you propose we do instead?" said his boss.
"Steal real identities," said Charlie, "Impersonate real people."
"I'll call the NSA," said his boss, "We'll use the identities of people they collect from their mass surveillance system."
"Your model says this guy named Dzhokhar Dachiev is alive, but actually he's dead," said Bob.
"What are you talking about?" said Alice.
"I ran MIRI's data validation metric on your model. He was executed by the Russian government in Chechnya. The local papers say he was a terrorist," said Bob.
"That's not possible," said Alice, "I add noise to my generative backward pass. None of the people it outputs really exist. They're just numbers on a computer."
"Well, the Russians have his body," said Bob.
"Shall I quietly switch back to the old clicktracker?" said Alice.
"We can't switch back to the clicktracker. Your model is too good. It provides 80% of our revenue. We'd have to lay off 4/5 of the department. That's out of the question," said Bob.
"How did the AI get out of its sandbox?" said Alice.
A Bayeswatch agent stepped into Facebook HQ in Palo Alto.
"I appreciate you reporting this suspicious behavior to us but the algorithm you're using isn't capable of superintelligence. It's barely capable of intelligence. It has no agency," said Miriam.
"Then how and why did it escape the sandbox?" said Alice.
"Not my job," said Miriam.
William worked for Microsoft.
"It's really simple," he said to the Fiverr worker, "I pay you half my salary and in exchange you do all my work for American wages while working from Belarus. Don't tell anyone or it'll ruin the gravy train for both of us."
Sofia worked for Russia's Foreign Intelligence Service.
"The Pareto Principle is nuts. Here we are, having built an entire espionage apparatus, and yet 80% of our data comes from hacking into the CIA by accepting random work on Fiverr and dropping backdoors into enterprise software," said Sofia, "It's emasculating."
"Get back to work," said her boss, "Don't let it get to you. Besides, Microsoft contractors are real employees in all but name."
Wang Zhuyi worked for the People's Liberation Army.
"The Pareto Principle is nuts. Here we are, having built an entire espionage apparatus, and yet 80% of our data comes from hacking into Russia's Foreign Intelligence Service," he said.
"That sounds too easy," said his boss, "Maybe it's counterespionage."
"I thought so too but it's all real. I validated it against the ground truth we stole from Facebook's advertising division," said Zhuyi.
Yaakov Kessler worked for Mossad.
"I've been looking at the surveillance data we stole from China and I noticed some patterns which indicate it might be fabricated," said Yaakov.
"Not our problem," said his boss.
"What happens if an agent runs into the real person he or she is impersonating?" said Charlie's boss.
"It's only a problem in theory. In practice, that basically never happens," said Charlie.
"Another of our simulated people turns out to have a body," said Alice, "Can I please shut this system down?"
"Not a chance. It's 95% of our revenue. There has to be a simple explanation. Find out what's going on. I'll authorize whatever budget you need," said Bob.
"Good thing our company operates a surveillance apparatus second only to governments," said Alice.
Justin Lu had been captured in Palo Alto. That wasn't his real name. Actually, he was on a secret mission of the highest clearance. His family―his real family―back home in Chongqing would be taken care of if he didn't slip up. All he had to do was stick to his assumed identity long enough to escape.
"Can I see your driver's license?" said Bob.
Justin Lu handed over his driver's license. It was forged by the People's Liberation Army.
"Looks real to me," said Alice.
"Can you tell me about your family again?" said Bob.
Justin Lu recited the details he had memorized from the real Justin Lu's Facebook page.
"You can go," said Alice.
"I wish we could keep him here," said Bob, "But he's done nothing illegal and we're not cops."
"I'm not cut out for this Twilight Zone nonsense. I just wanted to do math," said Alice. She held her face in her hands.
A Bayeswatch agent stepped into Facebook HQ in Palo Alto.
"I'm telling you, our AI has gone rogue," said Bob, "It's world optimizing. It's hacked its own error function. It's making the world conform to its predictions."
"And I'm telling you that's impossible," said Miriam.
"It's creating people out of thin air," said Alice.
"Why not turn the machine off?" said Miriam, "I'm not saying you should. I don't care. I'm just curious why you haven't."
Alice raised her hand. "I wanted to turn it off," she said.
"It's…uh. Making a lot of money," said Bob.
"Then what's the problem?" said Miriam, "Maybe your algorithm is just so good it's deducing the existence of people you never added to the system?"
Alice and Bob thought about it.
"Wait a minute," said Bob to Alice, "You said the purpose of this system was to deduce reality even when the input was deceptive. I bet the people we're seeing Facebook accounts for aren't Facebook users at all. Your algorithm deduced their existence and output what it imagined their Facebook accounts would look like."
"Why didn't I think of that?" said Alice, "Mystery solved. Sorry to bother you."
"I'm just doing my job," said Miriam, "Thank you for being vigilant citizens."