[Private to Chromie, audio]
I was told by Agent K that you found it acceptable for me to start work in the library. I'd like to know more about the specifics--what are my duties and what is my schedule?
[Private to Kay, text]
I've nearly completed the design for a game.
[Then there's a short delay and a blinking cursor, because that little tidbit wasn ( ... )
[He'll wait patiently for Kay's arrival by doing more work on the game.
It's become a pleasant distraction as of late.]
The room is a comfortable reflection of the Grid; the walls are sleek and smooth and cut with circuit lines, the edges outlined with a faint and warm glow. The furniture is rather stark and sparse, and CLU gestures for Kay to take any of the seats.] If you want to sit. [Then he settles neatly into his own favorite chair, and regards Kay for a moment.]
I wanted to ask a question.
Go ahead.
Why am I here?
Well, he was so excited over them, he failed to notice something else: that you and other programs, you changed too. You evolved beyond your programming. You didn't just have the directive to serve--you had developed a real drive and need for approval. You'd developed actual emotions, as real as his.
He didn't think about that. He was paying so much attention to something else he completely missed it.
Nonetheless, he takes the time to process what Kay says and furrows his brow.]
So I am here because of Flynn's failures? Because I am [pause, and the corner of his mouth tightens] how I am?
I followed my programming. I enacted the duty assigned to me, my directive.
You're here because he never acknowledged those emotional developments and didn't teach you to integrate them into your activities and efficiencies soon enough--which is something parents should do. Because of that, when you followed your directive, you also destroyed the system and its purpose in your efforts to create a new and perfect one. One that he didn't want, and never intended.
Let me ask you something. Are you more upset with the changes in Rinzler because he no longer approves of you, or because he no longer obeys you? ...Think about it for a while. You don't have to answer it now.
And so normally a statement with that amount of weight would've received a sharp and harsh denial from him, an immediate dismissal. Instead, now, CLU leans his elbow against the chair's arm, buries one hand in his hair, and gives Kay a hollow, drained look.]
I didn't--the system, it wasn't destroyed [said in a tone that sounds very close to 'I don't understand']. It was completely stable, it was efficient. It was perfect, like it was created to be. Like it should have been.
And Flynn--[there's a short pause, a creak from the material of his suit as his fingers curl into his palm. A muscle jumps in his jaw. What Kay's saying is skipping dangerously close to the last sentiment that Flynn had tried to get across to CLU, that the way he'd perfected the system had been incorrect]--that was what he wanted [hadn't it been?], what he created me to do ( ... )
[He thinks about it, very carefully, how to approach it.] Flynn might have changed the purpose of his system, but it was still his system. You were to represent him and the system with it. His biggest fault was not being able to see that you were capable of change. It might have been an efficient system, but it wasn't what he wanted.
[He taps his fingers against his knee and pulls out his communicator.] Can I see your game? I'd like to take it for analysis.
[And then his mouth twitches with a frown. He'd thought of the Grid as his system for so long that the statement about it being Flynn's is hard to acknowledge.] Then what did Flynn want from me? Why did he give me a specific task, a specific purpose to perfect it, only to disapprove of it when I performed my function? Only to wrongly change what he wanted and go back on his word? I did what he told me to, but then I am told I am wrong for it.
[Then, at the request, he shifts in his seat and leans to the side to take the small computer from nearby, then settles it in his lap.] It isn't finished. Do you still want it?
He probably thought that you understood the implication that he wanted it changed to conform to the ISOs. But at the same time, he didn't quite understand that he needed to tell you that, and it probably made you feel bad that he was putting so much stock in them, and so much in you.
[The Rinzler thing he would find a way to address later.]
There was no--[brief pause] emotional component to it. But. Flynn understood very little, particularly about the ISOs. Conformity of the system to the ISOs' presence was impossible. Their integration into the system did not work, would not work, and he repeatedly failed to see that.
The failure was his, not mine.
[And then he'll move to hand Kay the computer.] I'd like to work more on it, when your analysis is complete.
[He takes the computer.] I'll bring it back before long. But, do you really think it was impossible for a program of your capacity to adjust the existing system to integrate the ISOs? Isn't that doing you a disservice too?
I--Flynn--[that catches him a moment, as he's not sure exactly how to answer that.] I didn't [lengthy pause] see how that could work. If it could work. Or that it was necessary. Their presence conflicted with my directive, their presence was horribly incompatible with a perfect system. The system was worse off because the ISOs existed, even after the attempts I made to integrate them.
Elimination became the best and only option.