Deja Q, the anti-deus-ex-machina remix. What if the Continuum had not intervened to save Q from the Calamarain? TNG, ensemble cast, gen. Written for the
Finish-A-Thon; I had planned to do 30,000 words for it, but between massive headaches, vacation planning and craziness at work, I've only gotten 14,000. But I have the next several scenes in my head, and now that I'm on vacation and my internet access is seriously restricted, maybe I will be able to write the rest of it and have it ready to post when I come back.
Rest of the story is here.

To the extent that an android could feel such a thing, Data was troubled by the meeting, and its outcome.
He returned to engineering with Geordi, to work on the question of how best to destroy the three moons of Bre'el III without destroying the planet. Q was already there, and had begun work with Wesley's assistance. When Geordi attempted to make an overture to Q, saying, "Hey, uh, I'm sorry-" Q cut him off, his tone cold: "I'm not interested in your apologies, LaForge. We've got work to do and not much time to do it in." Geordi went rigid in response, a sign that he was experiencing anger, and thereafter neither of them made any reference to the meeting at all.
Data participated in the project, devoting as many of the resources of his positronic net as were required, but the truth was that his brain processed at such speed that no physical activity he could engage in would ever utilize all the resources of his mind, and in a collaborative project where he required the input of humans as he worked, that left a considerable amount of processing power remaining to him. As he worked with Geordi, Wesley and Q, Data considered why it was that this situation was troubling him.
When he had been nearly killed by the Calamarain, he had overheard Geordi conversing with Commander Riker about the status of the Bre'el IV moon. Riker had pointed out that the Calamarain would come back for Q once the moon again reached perigee, and Geordi had said, "Commander, he's not worth it." From a strictly utilitarian perspective, this was obviously true -- Q's life was not worth the lives of everyone on Bre'el IV -- but, although he had not yet had a chance to consult with Geordi to clarify his understanding of his friend's meaning, Data believed that that was not actually what Geordi meant... or at least, only part of his meaning. When Data had heard Geordi say those words, he had believed that what Geordi meant included the concept that Q's life was not worth Data's life... that the fact that Data had been critically injured, and could have died, in trying to save Q had made Geordi feel negatively toward the concept of saving Q at all. This was understandable. Geordi's friend had nearly died saving someone he did not like very much. Although Data did not experience emotion as humans understood it, he knew from the sense of loss he had experienced when Tasha had died, and from Captain Picard's assurance that the human sensation of grief was similar, that he, too, would prefer not to perform an action that risked the life of a friend for the sake of a person he was not friendly with.
However, while Geordi's perspective was understandable, it was wrong.
Data did not undertake actions impulsively. He could not. His body moved much more slowly, even at his top speed, than his brain. Any actions he performed, he had carefully thought through the impact and the consequences before undertaking them. He had understood that there was a possibility that saving Q's life could end his own, but the calculation he had prioritized in making the decision was not the chance of his death but the chance of his success -- the odds that, if he undertook to anchor Q so the Calamarain could not fling him to his death or crush him against a bulkhead or drop him in the warp core, he would retain sufficient control over his artificial musculature, despite the likely disruption the Calamarain's ionized energy field would cause to his positronic net, to prevent Q's death before Geordi got the shields back up, regardless of whether doing so would cause him irreparable damage. And this had not been an altruistic or selfless action on Data's part; in fact Data did not believe that any action he himself ever undertook was truly selfless. He had acted to spare himself... perhaps not pain, not as humans understood it... but discomfort.
Data's own motives were transparent to him, because he was able to perceive his own programming. He knew the exact algorithms that were causing him discomfort. It was less clear to him what Dr. Soong's motivations had been in giving him any particular set of programming, but in general he tended to assume that his father had known what he was doing, and his studies of human philosophy and moral theory generally indicated that his father had in fact put considerable thought and research into structuring his ethical programs. The problem was that if Data stood by and allowed a sentient being to die, when that sentient being was not an actively threatening hostile whose death would aid immediately in Data's own defense or the defense of others, and when Data had a clear means of saving that sentient being, it would violate the tenets of his ethical program, which would create profound discomfort for him. And unlike humans, Data's memory was perfect and his lifespan was theoretically unbounded. If he suffered discomfort for a violation of his ethical programming today, he would continue to experience that exact discomfort, unabated by the passage of time, any time he thought of the experience, for the rest of what might well be a very long life. Had he stood by and allowed Q to die, he would have suffered the most intense unpleasant sensation he was actually capable of experiencing... a sensation that was probably significantly less profoundly painful than the human emotion of guilt, but on the other hand, as Data could not experience human pain or human emotions, the quasi-emotions he did feel were extremely significant to him.
He spoke of himself as having no emotions, because he was aware that his subjective internal sensations were considerably less profound and complex than the human equivalents, and when he had tried to express his sensations in terms of emotion, when he had been under study by Starfleet researchers or when he had been at the Academy, humans had generally responded in a less than positive way to the comparison. So he stated that he had no emotions because, in human terms, it was apparently true. But it was not true that he had nothing analogous to an emotional state; Data's own study of artificial intelligence indicated that it was not possible to create a being capable of independent action who lacked any ability to make value judgements, and the mechanism that mediated value judgement was, in fact, emotion, or something like it. It felt right when Data did things that conformed acceptably to his programming; it felt wrong, and unpleasant, when he did not. He was capable of finding pleasure in the company of individual sentient beings, of seeking to ensure that their lives were as pleasant as possible -- friendship, by any other name. He was capable of loyalty, and of desiring to be obedient to lawful authority so long as it did not contradict his ethical programming. He was able to find enjoyment in experiences, particularly those experiences that brought him closer to his ideal of correctly emulating human behavior -- an ideal that itself was mediated by a state similar to emotion, after all, for if he were truly incapable of emotion, he would be incapable of desire, and therefore incapable of having ideals to aspire to. And he was able to experience pain, or something like it, when he did things that violated the algorithms of his ethical programming.
Dying at the hands of the Calamarain would have felt less unpleasant than standing by and watching Q, or any sentient being who was not a current and active danger to himself or the ship, die.
This was not a restriction Data was interested in trying to overcome. It was part of his ethical programming, part of his basic nature. It hurt, to the extent that anything could hurt him, when he stood by and was helpless to save lives. It was why he had plotted to violate the Prime Directive, though he had sworn to uphold it and was also programmed to desire to keep his word, when Sarjenka had faced death from a natural disaster that the Enterprise could have averted if only they were willing to expose themselves to a pre-warp culture... a clear violation of the Prime Directive, a philosophy he believed in, and yet. His algorithms didn't compel him to save everyone in the universe; when the death of sentient beings was a statistical probability or an aggregate concept, the death of myriad people he did not know and had never spoken to, he was able to balance it against abstract ideals such as the Prime Directive and accept the necessity of obeying his ideal of non-interference without suffering discomfort. But as soon as he had heard Sarjenka's voice, she had been a defined being, a specific sentient consciousness who would suffer, who was in fact suffering right then... and the weighted values his ethical algorithms placed on different actions had shifted, in a way that was similar to the way the human brain's evolved programming seemed to compute similar algorithms, such that it became impossible for him to bear the thought of simply letting Sarjenka and her world die.
It was unlikely that he would like Q very much if he had emotions; no one else who had emotions seemed to, and Q's actions were inconsistent with behavior that inspired likeability. But that did not matter. He would not prioritize Q above people he considered friends, or people he had sworn his loyalty to, but Q was a specific sentient being who had spoken to Data, who was known to Data as an individual, who plainly experienced suffering, and in the face of that suffering Data had had little choice. He could have acted to save Q's life, risking death, or he could have done nothing, and the contradiction with his ethical programming would most certainly have caused him considerable discomfort. Death was an unpleasant abstract concept -- the cessation of Data's existence would prevent Data from experiencing enjoyable sensations and the pleasure of knowing that he had achieved his goals -- but in itself, death would not cause him to suffer. He would no longer exist to experience discomfort after death. Whereas the discomfort of violating his own ethics would be with him for a very long time. So Data had acted to save Q's life because he would rather die than live with himself if he had not done it.
And now, that presented him with a conundrum. Because if he and the rest of the crew could not derive a method of saving Q without leading the Calamarain to attack the ship, then he would, in fact, be compelled to stand by and let Q die. His ethical programming distinguished between a situation in which he literally could do nothing -- a situation in which there was no action he could take that would most likely result in the immediate saving of a life, in which case he would not feel discomfort for failing to save lives, because it had not been possible for him to do so -- and a situation in which he should do nothing because the consequences of acting to save a life might result in a negative outcome in the future, and unfortunately this was the latter. It was very likely that, if Data were again to intervene to prevent the Calamarain from killing Q, the Calamarain would attack the ship, and quite possible that people would die as a result. This was not an acceptable outcome. But it was also not an outcome that his ethical algorithms were weighting as heavily as the immediate impact of allowing a person to die when he could take an action and save that person, even if it were unwise to do so. And this meant that if Q could not be saved, and Data subsequently took the correct action for the safety of the ship and allowed his death, Data would suffer for it.
Humans would call that particular type of suffering "guilt". Data was not comfortable with assigning a word that was so closely aligned with a defined human emotional state to his own experiences, which were not emotions as humans understood them.
Wesley went off shift. The work continued with just himself, Geordi and Q. It was likely that they would successfully derive a method by which the Bre'el III moons could be destroyed and/or brought down to the planetary surface in a controlled way to manage the damage they would cause. Whether it would be possible to derive a method to save Q as successfully -- and spare Data the discomfort of allowing a person to die when he had the power to save him -- was unknown at this time, but Data could not personally think of a strategy to do so, aside from the one Q had proposed. As they worked, he considered the ethical ramifications of Q's plan.
Obviously, if Q were proposing to create a duplicate of himself to send to his death in his place, this would resolve nothing. An innocent person -- or rather a person who did not deserve death, since Q was not innocent but did not deserve death by Federation law either, and had never agreed to abide by Calamarain law except under the duress caused by a threat to others -- would still die. If that was the nature of Q's proposal, then Captain Picard was correct.
However, Data did not believe that that was what Q was proposing. It had sounded to him as if Q's plan was more akin to backing himself up before death.
If there were an empty Soong-type android body, a shell made by his father but never animated into life and consciousness, available to Data, and he were to download his engrams into that body to cause it to animate into a replica of himself, a separate iteration of the program that was Data -- that entity would be a separate person from himself, not a disposable decoy. Sending the newly animated android with a copy of his consciousness to die in his place would be a deplorable action. But backing himself up, by animating an android shell body with no consciousness such that it contained a copy of his consciousness, before going to his own death... that was an action Data would consider fully ethical, so long as doing so didn't destroy the consciousness that the android body might have had on its own. And Q's situation did not involve an existing body that might or might not awaken to its own consciousness; it involved creating a copy of his body as well as his mind. There was, to Data's mind, no ethical problem with making a backup of yourself to live on after you died, and Q had clarified that that was in fact the nature of his proposal.
Of course, from Data's perspective, part of the problem was that Q would still be dead. It didn't solve anything. Whether there were two of him and one died, or there was only one of him and he died, either way one individual died, and Data would have to allow it, and this would cause him a sensation somewhat akin to human guilt.
But perhaps it made a difference from the perspective of reducing suffering.
Data was aware that there was a certain hypocrisy in his programming, in that he considered it fully acceptable for himself to commit self-sacrifice to save others, but was unwilling to accept another person's self-sacrifice without guilt. He did not know why Dr. Soong had programmed his ethical routines with such an imbalance -- perhaps because his self-preservation programming could be counted on to balance out a willingness to self-sacrifice on his part, but would not act against a willingness to sacrifice others, or allow them to sacrifice themselves. However, hypocrisy, itself, was a factor he had been programmed to identify in himself and attempt to root out, as it caused him discomfort as well. To resolve the difficulty caused by the fact that he was willing to sacrifice his own life but not to let Q sacrifice his -- not without discomfort, anyway -- Data had to consider the role of suffering.
When he had made the decision to risk his own death, he had not suffered for it. His algorithms had been in alignment, the weighted value of preventing death outweighing self-preservation. He had not experienced fear, or grief, or pain... or even the states he was capable of that were analogous to such sensations. He had been resolved, and free of regrets.
This did not appear to be the case with Q. The outburst in the conference room, where Q had stormed out because he had believed that the crew would not take action to ameliorate the impact of his death, and the moodiness and irritability that Q was demonstrating right now, indicated to Data that Q had regrets about his decision, and that he was suffering. He might choose to go through with his planned self-sacrifice because, like Data, he might find the emotional consequences of refusing too painful to bear, but he was not calm, not resolved, not accepting -- he might be resigned, but that was not the same as acceptance. Q appeared to be experiencing considerable negative emotion associated with his decision to sacrifice himself, and had expressed that while carrying out his plan would not actually prevent his death, it would greatly alleviate the suffering that the thought of his own death was causing him.
Captain Picard was applying human ethical constructs to an alien being from an alien culture, because that being had a human body now. From the Captain's perspective, either a copy of oneself was not a separate sentient being at all -- a prejudice still so powerful in the Federation that even Commander Riker had once killed clones of himself and Dr. Pulaski before they were done developing, because he did not perceive them to be independent people -- or it was a completely separate person, and whether it lived or died had no bearing on one's own life or death. The Captain considered the more ethical perspective to be the latter, and if those were the only choices, then Data agreed with him. But human beings could not, generally, be precisely copied -- even a DNA copy such as a clone would not typically have the thoughts and memories that the genetic donor had had. So humans had no framework for comprehending the idea of a separate being who was nonetheless an exact copy of oneself.
Data did. And having such a framework, he believed Q when Q said that he would be more willing to die if he knew there was another him who would live on after his death. The scenario Captain Picard had described, where the individual who was destined for death rejected his fate and desired that the other should die instead, was not likely if Q was accustomed to a perspective where two identical copies of the same person were the same person, and if his purpose in duplicating himself was, as he said, to "create a copy to live in his place" rather than a copy to die in his place.
He intended to investigate the technological feasibility of Q's proposal, despite the fact that creating a duplicate of Q would be against Federation law and the fact that he had essentially been ordered not to help Q in this matter. He was not nearly as confident as Q had been that there would be a way to successfully get the transporter to make a duplicate; there were too many unknowns, and simply knowing that it had happened once did not in fact imply that it would be possible to do it again on short notice. But if it proved to be possible to do it, then Q's suffering at the prospect of his own death would be alleviated, and that in turn would alleviate Data's own discomfort at the prospect of standing by and doing nothing as Q was executed by the Calamarain.
Having made his decision, he waited until Geordi needed to relieve himself. After Geordi had excused himself to the lavatory, Data spoke quietly. "Q, I have considered your proposal."
Q, bent over the console, didn't even look up. "Which one?"
The confusion was understandable; Q had discussed several proposals for the project at hand over the course of the last hour and fifty-seven minutes. "Your proposal to create a transporter duplicate of yourself, to remain behind and survive after you die."
Q stiffened, still not looking up. "Picard says it's not happening, Data. Don't worry about it."
"I will never achieve my goal of becoming as much like the ideal of humanity as possible if I cannot apply my own ethical judgement to problems," Data said, still very quietly, almost whispering in Q's ear. "I will not discuss this issue with anyone else until I have determined whether the process is feasible at all. But I thought that you should know that I intend to investigate whether your suggestion is technologically possible, and if it is, I will help you carry it out."
Finally Q looked at Data. He shook his head rapidly. "No, Data. I'm not... I'm not going to have you get yourself, I don't know, court-martialed or something for my sake. I don't... I'm just going to die, that's all. There's no way out. I'm not... I won't sacrifice you for my sake. Not again."
His eyes were bright, and Data detected moisture in them. Perhaps Q was on the verge of crying. This seemed odd, given that Data was proposing to aid him with what he perceived as his only escape from certain death, but Data did not always understand human emotion very well. "You are not sacrificing me in any sense. I chose to risk my life to save you before, Q. I am neither a child nor a programmed automaton incapable of exercising free will, and if I choose to risk legal consequences to help you save yourself, that is a choice I may make freely."
Q swallowed. The moisture in his eyes increased. He wiped at it with his arm. "Data, have you got any idea how much they'll all hate me if you throw your career away for me? It's just... it's just better if I don't involve you. If I just die."
"If my friends respect that I am a self-aware being with free will, then they will respect my decision. I do not typically disagree with Captain Picard in matters of ethics, but I believe that as an android I have a perspective the humans lack, and I believe that if it is possible to accomplish what you desire at all, it is possible to do so ethically. However, I must warn you that I am not at all certain it will prove to be technically feasible. I will review the literature on transporter accidents and duplications at the earliest possible opportunity to determine the likelihood that we would be able to replicate the incident that duplicated Commander Riker within the next forty-six hours, and I will explain my findings to you as soon as I have completed my assessment." He cocked his head slightly. "It is not actually under your control as to whether I undertake this study, Q. Should we determine that we are capable of performing this, technologically, you will be able to make the decision then as to your willingness to accept my help with your plan. I do believe, however, that I may be able to make Captain Picard understand my motivations in helping you, and that if it is technologically feasible and we simply do it, before informing others of our plans... once a second version of you has been created, Captain Picard will most likely be swayed if the two of you agree as to which of you will remain behind and do not appear to be distressed at the prospect that one will die."
"Easier to ask forgiveness than permission, huh, Data?" Q smiled weakly. His voice was hoarse, with tone variations in it that signified human distress, and his breathing had developed a certain arrhythmia. The moisture in his eyes spilled out onto his cheek. "Thank you. I can't -- I'd never be able to repay you enough, even if I do survive this, but... thank you."
"You are welcome," Data said. "But I must reiterate, I do not yet know if the project is even technologically possible, given the constraints we face."
"It's... just the fact that you're willing to try-" Geordi was approaching them, returning from the lavatory. Q wiped at his face again and said, more loudly, "Now that LaForge is back, I'm going to take a break and use the bathroom myself. Which, by the way, is utterly disgusting, and I can't believe humans have to put up with doing this several times a day."
He left before Geordi reached the console. "What was wrong with Q?" Geordi asked Data.
"He said he wished to use the lavatory, now that you were back."
"That's not what I meant. It looked like he was really upset about something. I didn't get close enough to see clearly, but it almost looked like he was... I don't know, crying. Or something."
Of course, Geordi could see such things at a greater distance than most humans, because the temperature variation between saline liquid and normal human skin would be obvious to him. "I did not observe any sign that he was upset," Data said, truthfully, because if he understood the nature of human emotion correctly, Q's emotional state would have been better described as overwhelmed than upset. "Perhaps you should ask him when he returns."
Geordi sighed. "Yeah, if he hasn't got it under control. I guess... if I was going to be killed in a couple of days, I might get upset for no immediate reason too. I just... as long as he doesn't make it our problem, I'd rather let him deal with it on his own. He's hard enough to handle when you keep it professional."
"Yes," Data said. In his mind, at the same time as he concentrated on the problem at hand, he was constructing a research methodology to investigate Q's proposal.