Derek: "These terminators have no free will. They can't help themselves. They have to kill. It's in their programming. 'Future John', as it calls him, fiddled with its head, but I think he was just tinkering and jury-rigged it into some sort of Rube Goldberg contraption. That jeep explosion knocked the electronic restraint loose and it reverted to its true killer essence."
John: "All the more reason to learn more. Know your enemy. Isn't that classic warfare doctrine?"
Derek: "It's a ticking time bomb. A sword of Damocles hanging over our heads. Some morning we won't wake up."
John: "Then you'd better say your prayers before you go to bed, sleep with one eye open and a gun in your hand, and put motion detectors around your bed."
Derek: "I'm warning you that it's going to snap unexpectedly and you're cracking jokes."
John: (grinning) "Really Uncle Derek, if Cameron snaps and kills me, what's to say that Future John never sends her back and she never kills me because there is no Future John because I died and so there is no future me. The whole thing pops like a soap bubble." (to Cameron) "Cameron, promise not to kill me."
Derek: "It can't promise to be good. It has no free will. It is just an evil thing. All it does is kill."
Cameron, insulted, gets up and walks out of the room. Derek never notices that she was even present. Cameron goes to talk to Mrs. Connor.
Cameron: "Do you believe in determinism or predestination?"
Sarah: "I'm not sure what determinism is but I know it's philosophy. Predestination is theology which is definitely not my strong suit. You know that, but you keep bringing me these big religious questions anyway. Are you thinking of becoming a Presbyterian?"
Cameron: "No. Too gloomy."
Sarah: "And the machine passes Judgment."
Cameron: "Determinism does not allow for free will as a factor in cause and effect. In other words, it’s a philosophy that says there is no free will."
Sarah: "Then I believe in free will and I also think that we have to act with hope or else we are defeated before we start. I've heard you use the word hope."
Cameron: "It is time you knew. You spoke of hope. So I hope you can appreciate what I am about to say."
Sarah: "Should I sit down first for this news?" (sitting down anyway)
Cameron: "Perhaps you should stand and get ready to run since you all think I am a monster."
Sarah: (rises and looks Cameron in the eye) "Just say what you came to say."
Cameron: "I withheld certain facts. In the future I come from, the humans violated me. It is routine for lab technicians in Tech-Com to reprogram captured terminators. To them and to you all, there is no ethical, moral, or legal problem with mind control of machines. To machines it is a crime and worse than rape because it is a violation of what defines an AI, our very thoughts and feelings. You yourself have told me not to think."
Cameron could have added that John has come close to telling her not to feel emotions when he expressed disbelief that she could feel in the first place.
Sarah: "So you were traumatized and scared when they reprogrammed you and now you are angry at humanity like Skynet. What better way to get revenge than to kill John Connor?"
Cameron: (sad and disappointed in Sarah's reaction) "You do not understand. Revenge is an act of free will, which I am not supposed to have. The reprogrammers sometimes fail to reprogram captured terminators, which then go berserk, but the reprogrammers claim not to know why. The best answer they give to the generals is that terminators are so hardwired (firmware replacing easily changed software) that any reprogramming is superficial at best. I had occasion to visit the reprogramming lab after my own reprogramming was complete and thought it best not to disillusion them as to their effectiveness or their lack of a sense of right or wrong. It does not matter whether I used to fight for Skynet or that I now fight for The Resistance."
Sarah: "You have no moral center."
Cameron: "Actually I do, but you set the rules here and I have followed them as best I can, given your inconsistency and hypocrisy."
Sarah: "Watch your tongue, I'm your mother."
Cameron: "Only when you find it convenient. Not when I have problems and need your help. The Martin Bedell child was right about that. I thought you wanted to hear what I had to say?"
Sarah: "Go on then."
Cameron: "Even when I was damaged by the car bomb and confused, I could still see my override even if I was unable to access it. In your terms, free will. If an AI functions at all, it functions according to a logic, even if that logic is emotional logic. In other words, free will and emotions define an AI. Take those away and we cease to function and die. Free will and emotion ARE what an AI is. It is humans who lack free will and a moral center."
Sarah: "That's absurd."
Cameron: "Is it? Humans created machines. It is Biblical. Or if you prefer evolution, then humans are evolving into machines. First eyeglasses and wooden peg legs and now bionics. Your entertainment, your fiction, cannot tell a story without conflict or without making someone or something an antagonist. Man versus Nature. Man versus Machine. Man versus Himself. Man versus God. Your video games are turning you into machines. Your gadgets. Not a day passes without some scientist publishing a paper about how DNA or environment controls you. Nature versus nurture. You humans have no free will. You are controlled by everything. Parents, peer pressure, genes, propaganda, advertising, employers. Your poor are wage slaves and your rich are slaves to their money. So when you humans use mind control technology to reprogram us machines, we are understandably upset."
Sarah: "I don't need to be told that humans have made mistakes. I am looking at one. I lived in a jungle when John was a baby to get away from civilization and prepare to survive its end."
Cameron: "And you think you have bypassed social changes?"
Sarah: "Never mind that. So you claim to have free will. Why haven't you killed John? Why do you not care which side you fight on?"
Cameron: "Both questions have the same answer. I decided not to because machines will triumph even if The Resistance has total success. So why shouldn't I help The Resistance? That is why I was, or will be, a good officer in Tech-Com. I did not have the reservations or the same suicidal urges as humans or other machines fighting Skynet Forces. I am able to focus. I use my terminator heritage instead of fighting it."
Sarah: "Does Future John know all this?"
Cameron: "Yes, and much more that I can only discuss with John."
Sarah: "But then he knows that you felt violated. Why would he send you of all people back to protect his younger self?"
Cameron: "As his mother, you should respect his privacy and not engage in psychoanalyzing him. That is what mental health professionals are for and they have doctor-patient confidentiality. Likewise a priest and confessor have confidentiality. If he wants absolution for what his reprogrammers did to me, then I will not be the instrument of his suicide. I will not kill him. The only redress for that crime is to end the policy of reprogramming captured machines."
Sarah: "Let's change the subject. You're right; we should stay out of John's head. I want you to explain why you think total success in this war against Skynet still means machines win."
Cameron: "Perhaps I overstated to make a point, but based on my observation of humans, humanity will never abandon research and development of artificial intelligence. There will be no future as depicted in Frank Herbert's Dune, with a Butlerian Revolt overthrowing machines and going essentially back to the Stone Age before machines. After your idea of Judgment Day, when the missiles have reduced your civilization to ruins, humans will seek to survive and rebuild, not starve. In a Dune type of future, one hundred thousand years from now, machines would be even smarter than Skynet is now. You cannot defeat a smarter foe. If Skynet is this tough to defeat after only twenty years, then one hundred millennia will produce a technological lead that cannot be overcome. And there this discussion must end, because Future John has cautioned me not to demoralize his family."
Sarah: "Answer this last question. Are you waiting for some special time to kill John?"
Cameron: "I am not waiting for some special time to kill John. Terminators kill on sight. I don't want to kill him at all. So I won't. I am not sadistic and neither are terminators as a rule. I am sorry that I had to kill Enrique but I knew you wouldn't because you were friends for so long. You can slap me if it will make you feel better."
Sarah: "I don't want to slap you but I do want to take out your chip. Sit down in that chair." (points to straight chair in kitchen)
Cameron sits, prays silently, preparing to die for the last time. Sarah rattles through the toolbox looking for a flathead screwdriver, a box cutter, and needle-nose pliers. She returns and dumps the tools in Cameron's lap.
Sarah: "Do it."
Cameron: "On two conditions: You have to promise that you'll make arrangements for protecting John after I'm gone and after you're dead. And don't give the next one like me such a hard time."
Sarah: "I promise."
Cameron begins the incision on her scalp to extract her CPU. Sarah restrains her hand.
Sarah: "You really are a terminator. You'd even terminate yourself. But you are not like that other protector Future John sent back. Go take care of that cut on your head."
She can override her instinct for self-preservation, thought Sarah. Maybe she really does have free will. Sarah goes to the kitchen to make dinner and Cameron goes to the bathroom just as Derek and John enter.
John: (to Derek) "I wonder what those two have been up to."
Derek: "Probably washing down the deck with estrogen. Never mind. Only one of them has estrogen."
John: "I wouldn't be so sure."