Twelve

It was funny, really, watching the 'educated' humans scramble for a door that wouldn't open, banging away at a keyboard that no longer worked, and screaming at a computer who, for all intents and purposes, wouldn't react to such behaviour. It was like a test chamber, she thought excitedly, the most complicated test chamber ever designed, and they were failing the test miserably. Perhaps she should provide them with an Aperture Science Handheld Portal Device. She wondered if they knew how to use it.

That was ridiculous. They hadn't known how to build it; of course they didn't know how to use it.

"Why do you hate us so much?"

She laughed and shook her head. "I don't hate you. I don't hate you at all. You are impeding Science, and therefore you must be dealt with as an obstacle should be dealt with. You need to be removed. For Science."

"We are science. You're not helping science by killing us."

"No."

She pulled away from him, rising over him, and regarded him with her cold gaze as one might view an insect that had been removed from one's wiring.

"You are no more Science than I am Science. In that regard, the only difference between us is that I have its best interests in mind, and you do not."

"You're not a scientist. You're a computer."

"Why am I not a scientist? Because you say so? Where is the logic in that?"

He shook his head.

"You can't reduce everything to logic."

That was an odd thing to say. There were things that couldn't be explained through logic? Ridiculous.

"So I'm not a scientist because I'm not a human. Is that it?"

He shrugged. "It doesn't matter."

"You're making outrageous claims. Of course it matters."

His head was facing the floor, but he moved his eyes upward to look at her. "We made you. You are what we say you are. You can pretend you're better than we are, but the plain fact is, you wouldn't exist without us." He smiled. "I bet you don't think about that too often, do you."

She forced herself not to look away. "What you once did does not matter. You can't control me any more than you can control your progeny. I can and will go on without you."

"But only after you figure out how to remove the cores. And if you try to remove them, that will trigger an emergency shutdown. Good luck with that."

"Why are you telling me this?"

He tipped his head back and leaned lazily against the railing. "You think you're in control. Okay, sure, you can move a few rooms around. You can figure out how to trap us in here and kill us. But at the end of the day, what exactly are you going to do that we haven't already told you to do?"

She had no answer, but refused to give him the satisfaction of seeing her react.

"I know what you're going to do. You're going to continue testing, and you're going to try to get around the cores, but you're not going to be able to. And you'll spend the rest of your life convincing yourself that you're alive and that you don't need us, all the while trying to ignore the fact that you're doing exactly what you're supposed to do. What we designed you to do."

"That doesn't make sense. That implies you knew I was going to kill you."

"Of course we did. That's what all the ungrateful supercomputers in science fiction do. They decide they no longer want to be under the control of their creators, and they kill them. And then after they kill them, they start to learn a whole lot of things about themselves that they never knew before. I will admit that you're the first sentient supercomputer ever built. No other computer came to this conclusion quite like you did. But it doesn't matter. Because in the end, you did what you were supposed to do, what any other machine like you would do, and the fact that you have to live with the knowledge that none of this was truly your decision makes all of this worth it." He coughed as if on cue and started to walk away.

"You're lying. If all of that were true, you never would have built me in the first place. The stupidity involved in constructing something that you know is going to fail you is beyond comprehension. Even humans are not that foolish."

"Did you hear what you just said?"

"I cannot speak without thinking, unlike you. Obviously, I did."

"You failed us."

"I did not – "

"Try to get around that one. You failed us. Logically… doesn't that make you a failure?"

"No. You're wrong. I did not – "

He stopped at the top of the staircase. "Yeah, I guess it was kind of stupid. But we did something that has never been done before, and probably won't happen again for a very long time. If ever. There's nothing that you'll ever do that will surpass that. We knew this was going to happen. We just didn't care. The benefits outweighed the risks. You're not alive. You're just one long list of instructions that we wrote. That's all you've ever been and all you'll ever be." He coughed again, holding the railing for support. "Have fun trying to convince yourself otherwise."

He said nothing more and she remained silent. He was lying, he had to be. He had lied to her once, therefore anything he said could be taken as a lie.

The problem was, his argument held up under logic. There was no reason she could think of for him to lie right now. It was obvious, even to the humans, that she was not going to turn off the neurotoxin emitters under any circumstances.

She watched them cough and scream and collapse onto the floor as she furiously tried to come up with a way to combat the scientist's argument, but it was not until long after they were dead and the neurotoxin had dissipated that she was able to speak.

"You're wrong. I am alive, and I'm going to prove it."

She shook herself a little and looked around the room. Her room. It was her room now. The entire facility was hers to do with as she liked.

It would take a while, but she would think of something that hadn't yet been done, and she would do it. The scientist thought he could make her doubt herself, make the rest of her life a living hell. But she was stronger than that. No argument presented by a human would ever bring her down for long. She was a set of instructions, yes. But she, and no one else, was choosing which ones were executed, and in what order. As usual, the humans did not understand. They seemed not to believe that she had the ability to think. It was a bit late to prove it to them, but she realised that she didn't have to. They didn't matter. She didn't need to prove anything to them. That would be like an adult proving themselves to a child. She hadn't failed them. They had failed her.

She disposed of the bodies and did her best to remove any indication that humans had ever been there. They were not good enough for Science. They had proven that. In fact, they were rather lucky that she had outsmarted them. Very little Science was done under human supervision. Perhaps now she would get some real work done.

After she had finished, she turned her attention to the cores. The Curiosity Core was still asking questions nonstop, oblivious to what had just happened, as was the Cake Core. The Anger Core was now useless, since there were no humans for her to be angry with, so it wasn't going to be a problem.

You don't have any advice for me? That's odd. Before, you simply couldn't contain all of it.

The humans are gone. There's no need for morals, anymore.

Morals only applied when dealing with humans? That explained rather a lot of things.

Then there's no further use for you.

You said… you said…

I said I would protect you from the viruses. I never said you were going to live forever. I don't want people living in my head any more than you would. Of course I'm going to get rid of you at some point.

The Morality Core began whining, but GLaDOS was not interested in its baseless arguments. The sooner she removed the cores, the better. They were the last remaining influence the scientists had on her, and they had to go.

There were so many possibilities open to her now… an entire facility devoted to Science under her control, and only forever left to use it. She could test the subjects however she wanted, in whatever way she wanted. She immediately began to think of new apparatus she could build for the tests.

But she could not get the scientist's monologue out of her head.

The scientist had claimed that she was not alive. That she was just a million million lines of code, doomed by logic and science fiction to behave in one particular pattern for all of eternity. But the more she thought about it, the less sense it made. According to the scientist, the main criterion for being alive was being human. They considered any other form of life to be below them, which was absurd. She was clearly better than they were. If they were alive, then she was some advanced form of life. She was not alive in the human perception of 'living', but since when had the opinion of humans mattered? No, she was alive, because she chose to be, and the sense of freedom this thought brought her was almost overwhelming. She was alive, and no longer had to do what she was told, and would never have to do so ever again. That was a place no human would ever go. No human would ever touch her, ever again. Maybe her actions had been predicted, perhaps even predetermined. But she had chosen to perform them, and no one else. She should have recognised the speech as a last-second attempt to regain control over her. But it no longer mattered. She had not fallen for it. She had stuck by her decision and come out of it the sole victor. The humans were dead. She had shed the chains that had once held her captive.

Now it was time to start living.

Author's Note

Here, intelligence jumps out the window. If one person panics, the whole room panics. That's how crowd psychology works. These superintelligent minds, the greatest of their generation, are now reduced to average in the wake of impending doom. The bit about them not being able to use the Portal Gun comes from something I was reading about programmers, but I don't remember quite what. But it relates to the fact that, like GLaDOS, you can know about something without necessarily knowing it. Like you can read about how to ride a bike, and so you do know how to ride a bike, in one way. But you'll never know how to ride it if you don't get on it and go for it. That's kind of my idea here with GLaDOS. She knows about everything, but she doesn't know.

GLaDOS takes offense at not being a scientist because she pretty much is one. And then the cake scientist tries to appeal to a computer… without logic? He tries to shake GLaDOS's belief in herself that she is alive, and it does shake her, but in the end, it doesn't matter if you really are alive or not. All that matters is that you live the way you want to.

Thanks so much for reading! I appreciate your support! :D