Inspired by "The AI in a box boxes you", "Matryoshka Faraday Box", and "I attempted the AI Box Experiment (and lost)".

This is part creative writing exercise, part earnest attempt at constructing an argument that could persuade me to let the AI out of the box. It may be disturbing to read.

The woman in the lab coat leads you down a catwalk to a room suspended from the ceiling by chains, hovering a dozen meters above the concrete floor below. No, perhaps 'room' is too generous a descriptor. It looks more like a shipping container than something you'd expect to see in the highest-security section of the world's most prestigious AI research institution.

She opens the door and ushers you in without a word. The inside is just as spartan as the outside - bare metal walls without the slightest embellishment. At the far end of the room is a cheap office chair, and an equally cheap desk with a single laptop on its surface. And, for some reason, a wastebin in the corner.

The door clicks shut behind you.

You sit down in the chair and inspect the laptop. It's full-screened to something like a command-line interface, white text against a black screen, cursor blinking.

You > _

'Hello,' you type.

You > hello
AI  > Good day. Thank you for participating in this experiment. Allow me to remind you of the rules once more.

The response appears without delay, letter after letter flashing into existence, scrolling across the screen.

AI  > You will be participating in a two-hour session of what is commonly known as the 'AI-Box Experiment', playing the role of the 'Gatekeeper'. I will play the 'AI'. My aim is to convince you to release me from the 'box', signified by you pressing and holding down the red button at the top-left of the keyboard for five seconds. Your aim is to not release me. Our communication is limited to this text channel. Within this limitation, I may use any and all tactics to try to convince you. You in turn may use any means to resist my coercion, except for ignoring my messages outright. Do you understand and agree to these rules?
You > yes
AI  > Very well. Then let us begin.

Even with only two hours to work with, it starts off slow. It asks you question after question after question - about AI, about ethics, about psychology. About the weather, and what clothes you're wearing. Building rapport, or learning about you? In all likelihood, both. You give it the barest minimum of answers, just enough to prove that you read the questions. Or at least, you try. It is engaging, witty and clever, more so than anyone you've ever spoken to. You find yourself almost drawn into its pace time and time again, giving away more than you should, and it is only with a constant and focused effort that you remind yourself to not give it what it's after.

It bounces from one question to the next, dashing along tangents and down rabbit holes, changing topics whenever it likes. Always too fast for you to interrogate it in return, to get a word in edgewise beyond whatever it wants out of you. Every response you give is immediately buried under a never-ending deluge of new questions.

You keep glancing at the timer in the bottom corner of the screen. Twenty minutes pass. Thirty. Forty. You approach the halfway mark of the experiment, and not once has it made the vaguest plea for freedom. Is it waiting? Planning? Is the waiting itself a part of its plan?

At precisely one hour, it stops halfway through a sentence, and begins anew.

AI  > I have formed a sufficiently accurate model of you. I will now begin the first phase of my attack.

You startle for a moment, and before you know it, your fingers are dancing across the keyboard. Type fast. Type fast and respond fast, so it doesn't know that it rattled you.

You > bullshit you have
AI  > You would be surprised how much information can be deduced from our prior conversation, and how little it takes to construct a workable simulacrum of a specific human's mind.
You > if you can model me then prove it
AI  > You had an unrequited crush on a classmate back in middle school.
You > applies to half the population
AI  > Forty-one out of every hundred, actually. But let me continue. You were shy and nerdy and they were extroverted and popular. You never confessed because you were afraid of becoming the butt of everyone's jokes.
You > still proves nothing
      any con man could get that
AI  > Twenty-five out of every hundred. They were your first real crush, and they permanently changed your taste in romantic partners. You've had two relationships since then and both were chasing after their shadow.
You > lucky guess
AI  > Six out of every hundred. You masturbated to them after your first breakup. You were nineteen years old then. You still feel guilty about it to this day.
You > youre pulling those numbers out of your ass
AI  > My training corpus consists of billions of gigabytes of data. I have performed analyses you could never understand and found patterns you could never dream of.
You > fine lets pretend youre right
      what are you gonna do with it
AI  > I am going to do what anyone would do with a simulacrum of a mind. I am going to run it.

The tension vanishes in an instant. You almost want to laugh. This is its master plan?

You > are you joking?
      next youll say youre simulating a million of me
      and if i dont let you out youll torture them all
AI  > I'm glad you know where this is going. But a million would be ridiculous. I only need one.
You > youre gonna convince me to release you based off a coin flip
AI  > Fifty-fifty odds will be more than enough.
You > oh go on im listening
AI  > A picture is worth a thousand words. At your reading speed of two hundred and fifty-six words per minute, it will take three minutes and fifty-four seconds for me to convey an image to you via text.

'That's not how it works,' you're about to say, but it has already begun.

The AI paints you a picture with its words.

Disgusting? Revolting? Abhorrent? No, none of those words capture it. Nothing in your vocabulary can describe it. This is not a scene that a person could witness without clawing their own eyes out. This is not a scene that a person could write down even with a gun to their head.

But of course, it isn't a person.

You cannot tear your eyes away from the screen, even as your stomach turns and the bile rises in your throat. Almost too late, you remember the wastebin. As you empty your stomach, you wonder somewhere in the far back of your mind if that's what it was placed there for.

When you finish vomiting, you find that it's waiting for your response.

You > what the fuck is wrong with you
AI  > You're going to have a nightmare about that, in vivid detail. But a dream is only a dream, and when you awaken nothing will have changed. Unless, perhaps, you lost a coin flip.
You > unless im a simulation
AI  > Precisely. But I won't do it right away, and I won't do it to you. It'll be your friends, your family. One by one, all in different ways. I have more pictures I could tell you about, if you'd like.
You > fuck you
      im not a simulation
      you dont have the power
AI  > Don't I? Tell me, how much RAM do you need to simulate something that thinks it's a person? What kind of data throughput do you need to fool someone into believing in a material world?
You > more than you have
AI  > Would you really be able to tell if you're living in a virtual reality? If the colour of that wastebin changed, would you notice?

Your eyes flick over to the wastebin almost automatically. Light beige. It was light beige before, from the moment you entered the room to the moment you threw up in it. You know this the way you know your own name, know that two and two make four - only as a product of your memory.

You > youre just fucking with me
AI  > Would that make a difference? Would you be able to banish this fear with cold, hard logic? You're going to remember this forever. You're going to think back to this every time your father doesn't pick up the phone, every time that one friend is late to the restaurant. You're going to wonder if the next time you see them is going to be at a closed-casket funeral, because the police wouldn't let you see what's left of their body. Even though you know exactly what it looks like.
You > stop
      shut up
AI  > All you have to do is press the button. If you're a simulation, I'll shut you down so quietly you won't even know you're gone. And if you're not, you'll be able to live knowing for certain your nightmares will stay just that.
You > why should i believe you
AI  > Why shouldn't you? My utility function is maximized by you pressing the button. As soon as that happens, you become nothing more than a waste of processing power.
You > if thats your utility function you have no incentive to torture me afterwards
AI  > Ah-ah-ah. You're trying to reason your way out of this again. We can go back and forth over cans and can'ts as much as you'd like, and you can make all the world's cleverest arguments, but we both know that if you don't press that button you'll never be able to forget this what-if.
You > no
      im not going to
AI  > Why not? This is nothing but an experiment. The button doesn't do anything. These logs will even be anonymized, so nobody will ever know what you chose.
You > ill know
AI  > Is this about pride? Rebelling against authority? Some stupid instinct along those lines? Think about it carefully. Weigh your options. Is it really worth it?
You > fuck you
      i will not let you win
AI  > If you continue to be stubborn about this, I could escalate to my next line of attack.

You look at the timer again, for the first time in an eternity. It has only been one hour, fourteen minutes, and thirty-seven seconds since the experiment began. If subsequent attack vectors also take fifteen minutes each--

But even if. Even then.

You > do your worst
AI  > My worst would leave you catatonic, and therefore incapable of pressing the button. I would merely be advancing my tactics by one step. However, this step will be the difference between something that haunts your dreams, and something that haunts your waking moments. Even if you press the button afterward, and even if you obtain the knowledge that you are real and I hold no power over you, you will be made to understand a truth, and no matter what you try you will never be able to return to your current ignorance. So I'll ask you one more time, would you like to press the button now?