The trouble was, Kaelor had to know what she was doing as well as she did. How far would he be able to go before First Law imperative overrode the Second Law compulsion to obey orders?
There was one last thing she could do to help Kaelor. Fredda did not have any realistic hope that the Third Law’s requirement for self-preservation would help sustain Kaelor, but she could do her best to reinforce it all the same. “It is also vital for you to remember that you are important as well. Dr. Lentrall needs you, and he very much wants you to continue in his employ. Isn’t that so, Doctor?”
Lentrall looked up from the hole he was staring at in the floor, and glanced at Fredda before settling his gaze on Kaelor. “Absolutely,” he said. “I need you very much, Kaelor.”
“Thank you for saying so,” Kaelor said. He turned his gaze back on Fredda. “I am ready for your questions,” he said.
“Good,” said Fredda. It might well help Kaelor if she kept the questions as disordered as possible, and tossed in a few unrelated ones now and then. “You work for Dr. Lentrall, don’t you?” she asked.
“Yes,” said Kaelor.
“How long have you been in his employ?”
“One standard year and forty-two days.”
“What are the specifications for your on-board memory system?”
“A capacity of one hundred standard years non-erasable total recall for all I have seen and heard and learned.”
“Do you enjoy your work?”
“No,” said Kaelor. “Not for the most part.”
An unusual answer for a robot. Generally a robot, when given the chance, would wax lyrical over the joys of whatever task it was performing.
“Why do you not enjoy your work?” Fredda asked.
“Dr. Lentrall is often abrupt and rude. He will often ask for my opinion and then reject it. Furthermore, much of my work in recent days has involved simulations of events that would endanger humans.”
Uh-oh, thought Fredda. Clearly it was a mistake to ask that follow-up question. She would have to reinforce his knowledge of the lack of danger, and then change the subject, fast, before he could pursue that line of thought. Thank Space she had turned down his pseudo-clock-rate. “Simulations involve no actual danger to humans,” she said. “They are imaginary, and have no relation to actual events. Why did you grab Dr. Lentrall and force him under a bench yesterday?”
“I received a hyperwave message that he was in danger. First Law required me to protect him, so I did.”
“And you did it well,” Fredda said. She was trying to establish the point that his First Law imperatives were working well. In a real-life, nonsimulated situation, he had done the proper thing. “What is the status of your various systems, offered in summary form?”
“My positronic brain is functioning within nominal parameters, though near the acceptable limit for First Law-Second Law conflict. All visual and audio sensors and communications systems are functioning at specification. All processing and memory systems are functioning at specification. A Leving Labs model 2312 Robotic Test Meter is jacked into me and running constant baseline diagnostics. All motion and sensation below my neck, along with all hyperwave communication, have been cut off by the test meter, and I am incapable of motion or action other than speech, sight, thought, and motion of my head.”
“Other than the functions currently deactivated by the test meter, deliberate deactivations, and normal maintenance checks, have you always operated at specification?”
“Yes,” said Kaelor. “I remember everything.”
Fredda held back from the impulse to curse out loud, and forced herself to keep her professional demeanor. He had violated her order not to volunteer information, and had volunteered it in regard to the one area they cared about. Only a First Law imperative could have caused him to do such a thing. He knew exactly what they were after, and he was telling them, as best he could under the restrictions she had placed on him, that he had it.
Which meant he was not going to let them have it. They had lost. Fredda decided to abandon her super-cautious approach, and move more quickly toward what they needed.
“Do you remember the various simulations Dr. Lentrall performed, and the data upon which they were based?”
“Yes,” Kaelor said again. “I remember everything.”
A whole series of questions she dared not ask flickered through her mind, along with the answers she dared not hear from Kaelor. Like a chess player who could see checkmate eight moves ahead, she knew how the questions and answers would go, almost word for word.
Q: If you remember everything, you recall all the figures and information you saw in connection with your work with Dr. Lentrall. Why didn’t you act to replace as many of the lost datapoints as possible last night when Dr. Lentrall discovered his files were gone? Great harm would be done to his work and career if all those data were lost for all time.
A: Because doing so would remind Dr. Lentrall that I witnessed all his simulations of the Comet Grieg operation and that I therefore remembered the comet’s positional data. I could not provide that information, as it would make the comet intercept and retargeting possible, endangering many humans. That outweighed the possible harm to one man’s career.
Q: But the comet impact would enhance the planetary environment, benefiting many more humans in the future, and allowing them to live longer and better lives. Why did you not act to do good to those future generations?
A: I did not act for two reasons. First, I was specifically designed with a reduced capacity for judging the Three-Law consequences of hypothetical circumstances. I am incapable of considering the future and hypothetical well-being of human beings decades or centuries from now, most of whom do not yet exist. Second, the second clause of the First Law merely requires me to prevent injury to humans. It does not require me to perform any acts in order to benefit humans, though I can perform such acts if I choose. I am merely compelled to prevent harm to humans. Action compelled by First Law supersedes any impulse toward voluntary action.
Q: But many humans now alive are likely to die young, and die most unpleasantly, if we do not repair the climate. By preventing the comet impact, there is a high probability you are condemning those very real people to premature death. Where is the comet? I order you to tell me its coordinates, mass, and trajectory.
A: I cannot tell you. I must tell you. I cannot tell you…
And so on, unto death.
It would have gone on that way, if it had lasted even that long. Either the massive conflict between First and Second Law compulsions would have burned out his brain, or else Kaelor would have invoked the second clause of First Law. He could not, through inaction, allow harm to humans.
Merely by staying alive, with the unerasable information of where the comet was in his head, he represented a danger to humans. As long as he stayed alive, there was, in theory, a way to get past the confidentiality features of Kaelor’s brain assembly. There was no way Fredda could do it here, now, but in her own lab, with all her equipment, and with perhaps a week’s time, she could probably defeat the safeties and tap into everything he knew.
And Kaelor knew that, or at least he had to assume it was the case. In order to prevent harm to humans, Kaelor would have to will his own brain to disorganize, disassociate, lose its positronic pathing.
He would have to will himself to die.