The aircar eased itself to a halt in midair, then began to sink lower. They had arrived at Jomaine’s house, hard by Leving Labs, close to where it had all begun. The car landed on his roof and the hatch sighed open. The cabin light came gently up. Jomaine stood and reached out to Fredda across the narrow cabin, took her hand and squeezed it. “There is a great deal you have to think about, Fredda Leving. But no one can protect you anymore. Not now. The stakes are far too high. I think you had best start asking yourself what sort of answer Caliban is likely to come up with.”
Fredda nodded. “I understand,” she said. “But remember that you are as deeply involved as I am. I can’t expect you to protect me, but remember: we will sink or swim together.”
“That’s not strictly true, Fredda,” Jomaine said. His voice was quiet, gentle, with no hint of threat or malice. His tone made it clear that he was setting out facts, not trying to scare her. “Remember that you, not I, designed the final programming of Caliban’s brain. I have the documentation to prove it, by the way. Yes, we worked together, and no doubt a court could find me guilty of some lesser charge. But it was your plan, your idea, your experiment. If that brain should prove capable of assault, or murder, the blood will be on your hands, not mine.”
With that, he looked into her eyes for the space of a dozen heartbeats, and then turned away. There was nothing left to say.
Fredda watched Jomaine leave the car, watched the door seal itself, watched the cabin light fade back down to darkness. The aircar lifted itself back up into the sky and she turned her head toward the window. She stared sightlessly out onto the night-shrouded, slow-crumbling glory that was the city of Hades. But then the car swung around, and the Leving Labs building swept across her field of view. Suddenly she saw not nothing, but too much. She saw her own past, her own folly and vaulting ambition, her own foolish confidence. There, in that lab, she had bred this nightmare, raised it on a steady diet of her own disastrous questions.
It had seemed so simple back then. The first New Law robots had passed their in-house laboratory trials. After rather awkward and fractious negotiations, it had been agreed they would be put to use at Limbo. It was a mere question of manufacturing more robots and getting them ready for shipment. That would require effort and planning, yes, but for all intents and purposes, the New Law project was complete insofar as Fredda was concerned. She had time on her hands, and her mind was suddenly free once again to focus on the big questions. Basic, straightforward questions, obvious follow-ons to the theory and practice of the New Law robots.
If the New Laws are truly better, more logical, better suited to the present day, then won’t they fit a robot’s needs more fully? That had been the first question. But more questions, questions that now seemed foolish, dangerous, threatening, had followed. Back then they had seemed simple, intriguing, exciting. But now there was a rogue robot on the loose, and the city was so on edge that riots could break out.
If the New Laws are not best suited to the needs of a robot living in our world, then what Laws would be? What Laws would a robot pick for itself?
Take a robot with a wholly blank brain, a gravitonic brain, without the Three Laws or the New Laws ingrained into it. Imbue it instead with the capacity for Laws, the need for Laws. Give it a blank spot, as it were, in the middle of its programming, a hollow in the middle of where its soul would be if it had a soul. In that place, that blank hollow, give it the need to find rules for existence. Set it out in the lab. Create a series of situations where it will encounter people and other robots, and be forced to deal with them. Treat the robot like a rat in a maze, force it to learn by trial and error.
It will have the burning need to learn, to see, to experience, to form itself and its view of the universe, to set down its own laws for existence. It will have the need to act properly, but no clear knowledge of what the proper way might be.
But it would learn. It would discover. And, Fredda told herself most confidently, it would end up conferring on itself the three New Laws she had formulated. That would be a proof, a confirmation that all her philosophy, her analysis and theory, was correct.
The car reached its assigned altitude. The robot pilot swung the aircar around, pointed its nose toward Fredda’s house, and accelerated. Fredda felt herself pressed back into the cushions. The gentle pressure seemed to force her down far too deep into the seat, as if some greater force were pressing her down. But that was illusion, the power of her own guilty imagination. She thought of the things she had told her audience, the dark secrets of the first days of robotics, untold thousands of years before.
The myth of Frankenstein rose up in the darkness, a palpable presence that she could all but see and touch. There were things in that myth that she had not told her audience. The myth revolved around the sin of hubris, of presuming on the power of the gods. The magician in the story reached for powers that could not be his, and, in most versions of the tale, received the fitting punishment of complete destruction at the hands of his creation.
And Caliban had struck her down in his first moment of awareness, had he not? She had given him that carefully edited datastore, hoping that coloring the facts with her own opinions would help form a link between the two of them, make him more capable of understanding her.
Had he understood her all too well, even in that first moment? Had he struck her down? Or was it someone else?
It was impossible for her to know, unless she tracked him down, got to him before Kresh did, somehow, and asked Caliban herself.
That was a most disconcerting idea. Would it be wise to go out looking for the robot that had seemingly tried to kill her?
Or was that the only way she could save herself? Find him and establish his innocence? Besides, it was not as if Caliban was the only threat she faced, or that simple physical attack was the only way to destroy a person.
The whole situation was spiraling out of control. It would not need to go much further in order to destroy her reputation altogether. Perhaps it was too late already. If her reputation collapsed, she would not be able to protect the New Law robots for the Limbo Project. There was a great deal of infighting left to do before the NLs would be safe. Rebuilding Limbo would require robot labor; there simply weren’t enough skilled people, Spacer or Settler, available to do the work. But Tonya Welton had made it clear that it was New Law robots or nothing for Limbo. Without the New Law robots, the Settlers would pull out; the project would die.
And so would the planet.
Was it sheer egotism, hubris on a new, wider, madder plane, to imagine herself as that important? To think that without her there to protect the New Law robots, the planet would collapse?
Her emotions told her it must be so, that no one person could be that important. But reason and logic, her judgment of the political situation, told her otherwise. It was like the game she had played as a child, setting up a whole line of rectangular game pieces balanced on their ends. Knock one down, and the next would fall, and the next, and the next.
And she could hardly save the New Law robot project from inside a prison cell.
There were other versions of the old Frankenstein myth that she had found in her researches. Rarer, and somehow feeling less authentic, but there just the same. Versions where the magician redeemed himself, made up for his sins against the gods, by protecting his creation, saving it from the fear-crazed peasants that were trying to destroy it.