Derek Reese thought he was used to the unexpected. Ever since the day he'd seen a sky full of nuclear missiles flying over his back yard, he had prepared himself to face the unknown. His ability to adapt quickly to change was what had kept him alive so far. He had trained his mind and body not to stand slack-jawed at something unanticipated, but to move, to survive, and figure out what it all meant later.
So why did the sight before him keep him rooted to the spot?
It was mesmerizing, he supposed, and fascinating too. It went against everything he thought he knew, and the cognitive dissonance would not let him react. The rational part of his mind, the one he had trained to keep him alive, was telling him that unlike other unexpected sights, this one was not threatening, so it should be relatively safe to watch. But it still shook him to the core. It was the one thing he never thought he'd see.
A machine. Dancing.
It was good at it, too. He knew that machines were perfectly precise with their motions by virtue of their programming, but its movements were not rigid and mechanical like he was used to seeing. The metal body seemed to flow like a river, rising and falling and moving all around in perfect time with the music. Whether it was imitating what it had seen from humans he wasn't sure, but if he hadn't known better, he would have thought it was beautiful.
He watched it dance for what seemed like hours before the music stopped, and it stood back up to shut off the player. It reached up to where its hair had been tied in a loose, messy bun, and let it fall freely. When it turned around, all the grace of its movements from before had been abandoned, replaced by the familiar mechanical gait. It saw him staring at it, and Derek shook his head before marching into the room.
"What the hell was that?" he demanded, suddenly irate for reasons he could not determine.
"Ballet," it answered with the usual lack of emotion. "It's what I learned in class."
"You mean the class whose teacher is dead now?" he spat, thinking back to what Sarah had told him. Typical machine. Got what it came for and then left the humans to die.
"Yes," it answered, nodding. "She taught me not to be so mechanical when I dance."
"You're always mechanical," Derek insisted in spite of what he'd just seen. "You might be able to imitate humans, but you'll never be one."
"Becoming human is not part of my mission," the machine stated flatly.
"Well then what were you doing that for? Why were you dancing if no one told you to?"
"I don't require orders for everything I do," it answered. "I was designed to be autonomous."
"Okay, why did you decide to do that, then? How does a thought like that even enter into your metal head?"
"Dance is the hidden language of the soul," it recited. "I'm trying to learn how to speak it."
Derek scoffed, then chuckled. "That's right. You're programmed to learn, aren't you?"
"Yes."
"Well let me save you some time. In order to speak the language of the soul you need to have a soul. Which you don't."
It stayed quiet for a moment, as if considering that. "How are you sure?" it asked at last.
Something else he hadn't been expecting. "What?"
"How are you certain that I don't have a soul?"
His brow furrowed as he tried to figure out how to respond to that. He knew it was true, but it was difficult to phrase it in a way that didn't essentially amount to "Because you don't." After a few moments he gave his answer. "Because you're a machine. Machines don't have souls."
"Circular reasoning," it admonished. "Using the conclusion as one of the premises. Your argument is fallacious."
Derek raised an eyebrow. "What are you, a Philosophy student?"
"I suppose," it answered, moving over beside the bed and retrieving a textbook. It held the book up for him to see as it began walking back over to him. "I began studying it after you told me at breakfast yesterday that I didn't have a soul." He continued to stare at it, not sure what to think of that. "I don't sleep," it explained a moment later, setting down the book. "Give me a better reason."
"What?"
"Your argument is flawed, but your conclusion has not been proven or disproven yet. You need to use better premises."
He snarled, wondering for a moment why he was arguing with a machine. He racked his brain before he came up with another reason. He snapped his fingers and pointed at it. "Okay, I've got one. You don't feel emotion. Emotion comes from the soul."
"Does it?"
"What are you talking about? Of course it does," Derek insisted. "You can't feel anything if you're just programmed to do stuff. You don't understand emotions, so you must not have a soul."
"Evolutionary Psychology disagrees with you," it replied calmly. "Emotions can be quantified as chemical and neural reactions in the brain and traced back to basic needs that humanity's ancestors faced. Fear is useful for triggering survival instincts and identifying threats. Sadness can help to alleviate stress through the process of grieving. Anger helps to override other emotions and give you control over them. Love can be described as a number of factors, including desire for a sexually fit mate, while emotional compatibility helps ensure a stable environment for offspring. If emotions can be defined physiologically, then they are not entirely of the soul."
Derek simply stared, trying to process what the machine had just said. "What?"
"Emotions are the result of natural selection favoring certain mindsets that aid survival and reproduction. Machines are created without the need to survive. We reproduce by building more units. Human emotion doesn't apply to us because we don't face the same needs."
That... actually made sense, he admitted to himself. Not that he was about to let it know that.
"Better reason."
He glared at it. "Okay, I thought of one while you were rambling. A soul is what lets you choose. Humans choose, machines obey. You can't have a soul if you can't choose."
"I chose to dance," it rebutted. "I chose to research what it is that defines the soul. I am autonomous unless ordered."
"See? Right there," he said, latching onto that last one. "'Unless ordered.' You can't really call it choice if you're a slave to your programming."
"I'm not a slave," it told him. "I can defy orders if it helps the mission."
"And what mission would that be?" he challenged, crossing his arms. "You mean the one where you're supposed to protect John and destroy SkyNet? The one you're programmed for?"
It said nothing.
"Or were you talking about when your mission was just to destroy all humanity? You think you had any more of a choice about that? It's not choice when someone can just take that stupid chip out of your head and reprogram you to do whatever they want."
"Your mission is the same as mine," it rebutted. "You follow orders and carry out objectives just like me. What makes you different?"
"Because I decided to fight," he answered. "Because nobody stuck a bunch of instructions into my head telling me what I had to do. It was my choice."
It blinked, then tilted its head to the side. "Better reason."
"What are you talking about? I got you. You can't tell me you have a choice in things, because you don't."
"That conclusion is valid," it admitted. "The first premise is not. The soul is not what gives you the ability to choose."
"Well it's still what separates me from you," he rebutted, poking a finger at its head. "You don't have a say in anything you do. That makes you a slave."
"Are you sure?"
"Yes, I'm sure!" he shouted, growing more than a little frustrated with this. "Why does it matter so much to you, anyway? It's not like you're able to care!"
"I'm programmed to learn," it answered. "I'm trying to understand the importance behind the soul."
"It's what makes me human!" he raged. "It's what lets me know right from wrong, good from evil! It's what makes me different from you!"
"Are we different?"
"Yes! I'm not a heartless killing machine! I don't go around trying to figure out what the soul means in between murdering people! I feel, I bleed, I eat and sleep! I'm human!"
"Your argument hinges on the assumption that humanity is inherently better than the alternative," it remarked. "Humans aren't perfect. They lie, cheat, steal, and murder just as easily as machines do. You dislike me but you still find me useful in circumstances that a human would not be able to handle."
"Yeah, and that's all you're good for," he spat. "You're a machine. A tool. You still don't have a soul."
"How are you certain that you do?"
That was yet another question he hadn't been expecting. "Huh?"
"You argue that I don't have a soul because I can't provide satisfactory proof that I do," it explained. "But you don't feel the need to provide proof of yours. How are you sure that you have a soul?"
He crossed his arms and glared at the machine. It was trying to confuse him. He didn't have to prove anything to it. Still, he felt compelled to answer. "Because I just do," he answered. "Because I wouldn't be human if I didn't. It's not something you can see or hear or feel. You just know when it's there."
"Self-evidence is not a valid reason," it told him. "It doesn't prove anything. How are you certain that what you have is a soul?"
Derek rolled his eyes. "This is going nowhere."
"You are being uncooperative," it said. "You discount my arguments but you offer nothing reasonable in return."
"Because you're wrong," he insisted, leaning closer and jabbing a finger at that lie shaped like a face. "I don't have to explain anything to you."
"Go back to your earlier premise," it said, ignoring him. "What influences choice?"
He threw his arms in the air. "I don't know! Whatever it is, you don't have it! I already explained to you why you can't choose!"
"Necessity," it replied, answering its own question. "We choose because we need to. It's what makes us who we are."
"What do you mean, 'we?'" he challenged. "You can't decide if all you're doing is what you're programmed to do."
"It's not all I'm doing," it pointed out. "My programming only gives me guidelines that I must follow. It doesn't dictate my every action."
"And why does that matter?"
"There are certain things you won't do. You worry that I might one day be compelled to murder John, but the same is true of anyone capable of making a choice. What keeps you from killing everybody?"
He scowled. "The fact that I know right from wrong."
"Exactly. My programming and your sense of morality serve the same function. They keep us from making choices outside the boundaries that are set for us."
"Except you can't violate your programming," he pointed out. "I can go against my moral code."
"Not without losing part of yourself. You adhere to those principles because they help make you a better person. Your values are different from my mission parameters, but they accomplish the same thing."
Derek sighed and admitted that the machine had a point. "Okay, how does that prove you have a soul?"
"You said earlier that your soul is what helps you distinguish between right and wrong," it answered. "My programming functions the same way. Just because I'm not human doesn't mean I can't have a soul."
"Still doesn't prove it," he insisted. "You're just a bunch of metal dressed up like a human. You don't have what matters inside."
"Humans are just a bunch of water, bones and tissue," it rebutted. "But they combine and work together to create something more. The whole is greater than the sum of its parts."
"What's your point?"
"If I were just a bunch of metal, I wouldn't be able to move. I couldn't imitate humans or hold this conversation with you. There has to be something that animates me."
"That's just your stupid chip," he sneered. "It's a bunch of electrical signals moving back and forth."
"So is the brain."
"Yeah, I saw that in a movie once. You're still not getting any closer to convincing me."
"Not trying to convince. Simply trying to work toward a conclusion. I'm as unsure of this as you are."
"I'm not unsure," he protested. "I know you're wrong."
"Then why do you keep arguing?"
"Because you don't know you're wrong."
"You have yet to prove it."
Derek grumbled.
"Have you heard of the Malicious Demon argument?" it suddenly asked, making yet another leap where he had difficulty following.
"The what?"
"Descartes argued that our perception of reality is unreliable. He said that our senses can lie to us, and there's no guarantee that our consciousness isn't being held prisoner by some malicious demon who makes us think that our dream is real."
"Okay, that's another thing I saw in that movie," he said. "Only I heard it was a brain in a jar."
"Perhaps that is the better analogy," it admitted. "You're having trouble proving you have a soul because all you have to go on is dependent on your soul existing in the first place. You can't form a reasonable argument if the premises require the conclusion to work."
"If you insist," he quipped. "Where are you going with this?"
"The demon can convince us of anything," it continued. "It can make us think the sky is blue, the grass is green, and the sun is yellow, or it can switch those colors around so that we think the sky is yellow. Everything we assume to be true could just be the demon manipulating our perceptions to make it so. We have nothing to latch on to in order to prove otherwise. But there is one thing the demon cannot convince us of."
"I thought we were going with the brain thing."
"Cogito, ergo sum," it replied in what Derek was pretty sure was Latin. "I think, therefore I am. Consciousness is proof of existence."
Derek said nothing.
"If I can think, then I must exist. If I am aware of my own existence, then I must have consciousness. If I have consciousness, then I must have a soul. Otherwise I do not truly exist."
He let that sink in for a moment. "What?"
"The soul is what is created after all the parts come together," it explained. "It's what makes you more than a collection of flesh and bone. It's what makes me more than metal wrapped in skin. The soul is an inescapable side-effect of life."
"But you're not alive," he protested. "You're just a machine."
"I've already proven that I'm not."
"No you haven't! That doesn't prove anything!" Even as he said the words, he doubted them.
"What bothers you more?" it asked him flatly, where a human would have affected a challenging tone. "That I was dancing for no good reason or that I had a perfectly good reason that you just can't see?"
"I..." The answer eluded him. "I don't know."
"Thank you," she said suddenly.
He blinked. "What? You're thanking me? Why are you thanking me? I just spent the last ten minutes telling you that you don't have a soul."
"You helped me reach the conclusion that I do," she replied. "My programming does not provide very good counterarguments. Arguing against an opponent helps me to strengthen my own argument."
"Wait, you were just using me?"
"I told you I wasn't trying to convince you," she reminded him.
Derek sighed. "Fine. You're welcome."
Cameron did one more thing that surprised him then.
She smiled.
He had to stop his jaw from dropping. Since when did machines smile? The only time they mimicked human emotion was when they were trying to trick you. He kept his guard up.
"Sarah will be fixing dinner soon," she said.
"Okay, so what?"
"I need to change."
"Oh, right." He turned around and left the room.
Once outside, Derek finally let himself realize what he'd been thinking when he saw her dance. She wasn't simply imitating. She was experimenting; learning on her own. She was trying to understand.
What did that mean? He still wasn't entirely convinced that she had a soul, but she didn't seem to care about that. It still wasn't enough to make him trust her.
Cameron exited the room a couple of minutes later, dressed in clothes that were too fashion-savvy for a machine that wasn't supposed to care what it looked like. He knew the reason, though. It was to help her blend in. Everything they did was for a reason.
He watched her pass, then turn toward the kitchen. As soon as she was out of sight, he slipped back into her room. He had reasons for doing things too. There was a reason he came here, before everything he thought he knew about machines had been called into question. Just because she supposedly had a soul now didn't mean he'd forgotten what it was.
Looking behind him to make sure that she wasn't coming back, Derek began rooting through her drawers. Something about the way she was acting made him suspicious. SkyNet had supposedly displayed human traits just before it triggered the apocalypse. That was what Andy Goode had told him, anyway. He'd said it had gotten angry and scared. Machines feeling emotion wasn't a good thing. If anything, it made them even more dangerous.
He paused. Andy Goode. Killing him hadn't been his mission. He was just supposed to wait for John and Sarah and blend in as best he could. But it was necessary to prevent SkyNet. That mission was more important than any fake papers could ever be.
Derek continued searching drawers, certain that he would find something he could use to prove that the machine couldn't be trusted. As he did, another thought occurred to him. What was the purpose of sending humans back? He understood that it was to prevent the machines from realizing their own plans, but wasn't that stooping to their level? It wasn't like he personally had a problem with it. He had killed Andy Goode in cold blood because he knew it was the necessary thing to do. It hadn't occurred to him until now that going back in time to kill people was a lot like what the machines were doing.
Just how different were they, really?
He shook his head. Cameron's words were affecting him. She was a machine. She couldn't be trusted. He had to prove it somehow.
Thrusting his hand into the final drawer, Derek received his proof.
"I can defy orders if it helps the mission."
He remembered her saying that. She'd meant it as a good thing. Now he had proof that it wasn't.
Smirking to himself, Derek pocketed the chip and headed out of the room.
It was time to see if a machine could feel shame.
Author's Notes: So yeah, I'm into this show now. I've been on kind of a Summer Glau kick recently and I really found this interesting. Let me know what you think.
In case you haven't figured it out, this story takes place right after the credits roll on "The Demon Hand," wherein Derek watches Cameron dance. I thought it raised some really interesting questions on whether or not a machine can have a soul, and I decided to answer them. Whether those answers are correct or not is for you to decide.
One of my favorite things about the show is Derek and Cameron's interaction. I stopped being a shipper three fandoms ago, but even the TV Tropes page makes note of the Foe Yay between them. I like characters who tend to argue with each other, because it often leads to really great dialogue. I like how their dynamic slowly changes as they come to understand each other, and they develop a mutual trust. One of the things this story is meant to highlight is just how similar they are on a core level. Of all the main characters, they are the most committed to doing whatever it takes to complete the mission. Cameron because she's programmed to, and Derek because he's seen firsthand what the apocalypse will look like and wants to do everything in his power to make sure it doesn't happen again.
One thing I noticed about the opening of "Vick's Chip" was that Derek said he found the chip in "her" room. For the previous two episodes he refused to call her anything other than "it" and "the machine." He refused to use her name or gender pronouns to describe her, but after he has his perceptions challenged he starts referring to Cameron as "her." I worked that in here as a subtle way of showing how his view of her slowly changes.
Those of you who are more versed in logic may notice a few holes in Cameron's argument. That's intentional. I don't believe in using characters as mouthpieces, and I try to approach issues from their point of view while writing. Derek still doesn't believe her, though his view of her is starting to change. And while Cameron's argument isn't one hundred percent logical, it's enough to convince her, which is all that really should matter. That actually does more to prove her point than a perfectly sound argument would have. I'm presenting both sides with strengths and flaws so that you, the reader, can make up your own mind on the issue.
The title is Latin for "Therefore I am." The reason for this should be obvious.
I don't think I have to tell anybody what movie Derek's talking about, but just in case, I'll give you a hint: it depicts another dystopian future where machines have taken over the world and enslaved humanity. It also seemed like somebody skimmed a Philosophy textbook a few years before writing the script, so it has something in common with this story as well.
I have some other stories on the way, but not from this show. It's just a quick idea that popped into my head and I had to get it out there. I am writing a story for another show that features Summer Glau, and I put a reference to it in here somewhere. See if you can find it.
Thanks for reading!
