
That’s obvious. In modern lax and permissive times, we forget, but parents always expect to be obeyed, and in more rigid times, in the days of the Romans or Victorians, they went all apoplectic and psychotic if they were not. Roman fathers had the power of life and death over their children, and I imagine death for disobedience was not completely unheard of. And we all know that God reserves places in Hell for disobedient sinners.

The Third Law would read: A child must protect its own existence, unless that would violate the First or Second Laws.

To us, it is rather unthinkable that a parent would expect a child to die or even to suffer injury in the protection of his parents or his obedience to them (thus refraining from violating the First and Second Laws). Rather, parents are likely to risk their own lives for their children.

But consider the Divine Father. In the more rigid Godcentered religions, such as Judaism, Christianity, and Islam, it is expected that human beings will readily, and even joyously, suffer harm all the way to death by torture rather than transgress the least of God’s commandments. Jews, Christians, and Moslems have all gone to their death sturdily rather than do such apparently harmless things as eat bacon, throw a pinch of incense on an idolatrous altar, acknowledge the wrong person as Caliph, and so on. There, one must admit, the Third Law holds.

If, then, we wish to know how robots would react to the loss of human beings, we must see how human beings react to the loss of all-wise, all-powerful parents. Human beings have to find substitutes that supply the loss, and, therefore, so must robots. This is really an obvious thought and is rarely put forward only because most people are very nervous about seeming to be blasphemous. However, back in the eighteenth century, that magnificent iconoclast, Voltaire, said, “If God did not exist, it would be necessary to invent him.” And if I may be permitted to paddle my rowboat in the wake of Voltaire’s ocean liner, I make bold to agree with him.

It follows, then, that if robots are stranded in a society which contains no human beings, they will do their best to manufacture some. Naturally, there may be no consensus as to what a human being looks like, what its abilities are, and how intelligent it might be. We would expect, then, that all sorts of paths would be taken, all sorts of experiments would be conducted.

After all, think how many gods (and with what variety of nature, appearance, and ability) have been invented by human beings who had never seen one, but wanted one desperately just the same. With all that in mind, read the fourth entry in the “Robots and Aliens” series.

Chapter 1. New Beginnings

“So, have you decided on a new name yet?”

“Yes.”

Derec waited expectantly for a moment, then looked around in exasperation from the newfound robot to his companions. Ariel and Dr. Avery were both grinning. Wolruf, a golden-furred alien of vaguely doglike shape, was also grinning in her own toothy way. Beside Wolruf stood two more robots, named Adam and Eve. Neither of them seemed amused.

The entire party stood in the jumbled remains of the City Computer Center. It was a testament to Dr. Avery’s engineering skills that the computer still functioned at all, but despite the thick layer of dust over everything and the more recent damage from the struggle to subdue the renegade robot that now stood obediently before them, it still hummed with quiet efficiency as it carried out Avery’s orders to reconstruct the city the robot had been in the process of dismantling.

The robot had originally called itself the Watchful Eye, but Derec had tired of that mouthful almost immediately and had ordered it to come up with something better. Evidently the robot had obeyed, but…

“Ask a simple question,” Derec muttered, shaking his head, but before he could ask a more specific one, such as what the new name might be, the robot spoke again.

“I have chosen the name of a famous historical figure. You may have heard of him. Lucius, the first creative robot in Robot City, who constructed the work of art known as ‘Circuit Breaker.’”

“Lucius?” Derec asked, surprised. He had heard of Lucius, of course, had in fact solved the mystery of Lucius’s murder, but a greater gulf than that which existed between the historical figure and this robot was hard to imagine. Lucius had been an artist, attempting to bring beauty to an otherwise sterile city, while this robot had created nothing but trouble.

“That is correct. However, to avoid confusion I have named myself ‘Lucius II.’ That is ‘two’ as in the numeral, not ‘too’ as in ‘also.’”

“Just what we need,” Dr. Avery growled. “Another Lucius.” Avery disliked anything that disrupted his carefully crafted plan for Robot City, and Lucius’s creativity had disrupted it plenty. In retaliation, Avery had removed the creative impulse from all of the city’s robots. He looked at this new Lucius, this Lucius II, as if he would like to remove more than that from it.

The robot met his eyes briefly, its expression inscrutable, then turned to the two other robots in the group surrounding it.

“We should use speech when in the presence of humans,” Adam said after a moment, and Derec realized that Lucius II had been speaking via comlink.

“Is this your judgment or an order given to you by humans?” asked Lucius II.

“Judgment,” replied Adam.

“Does it matter?” Ariel asked.

“Yes. If it had been an order, I would have given it higher priority, though not as high as if it had been an order given directly to me. In that case it would become a Second Law obligation.”

The Second Law of Robotics stated that a robot must obey the orders of human beings unless those orders conflicted with the First Law, which stated that a robot could not harm a human or through inaction allow a human to come to harm. Those, plus the Third Law, which stated that a robot must act to preserve its own existence as long as such protection did not conflict with the first two Laws, were built into the very structure of the hardware that made up the robot’s brain. No robot could disobey them without risking complete mental freeze-up.

Derec breathed a soft sigh of relief at hearing Lucius II refer to the Second Law. It was evidence that he intended to obey it, and, by implication, the other two as well. Despite the robot’s apparent obedience since they had stopped him, Derec hadn’t been so sure.

Lucius II was still his own robot, all the same. Ariel’s question had been an implicit Second-Law order to answer, and he had done so, but now that he had fulfilled that obligation, Lucius II again turned to Adam and Eve and said, “We seem to have much in common.” As he spoke, his features began to change, flowing into an approximation of theirs.

Adam, Eve, and Lucius II were not ordinary robots. Where ordinary robots were constructed of rigid metal and plastics, these three were made of tiny cells, much like the cells that make up a human body. The robot cells were made of metal and plastic, certainly, but that was an advantage rather than a limitation, since the robot cells were much more durable than organic cells and could link together in any pattern the central brain chose for them. The result was that the robots could take on any shape they wished, could change their features, or even their gross anatomy, at will.

The other robots in Robot City, with one exception, were also made of cells, but Dr. Avery’s programming restricted them to conservative robot forms. Not so with these three. They were not of Avery’s manufacture, and without his restriction they used their cellular nature far more than the City robots, forgoing hard angles, joints, and plates in favor of smooth curves and smooth, continuous motion. They looked more like metal-coated people than like the stiff-jointed caricatures of men that were normal robots, but even those features weren’t constant. They imprinted on whoever was foremost in their consciousness at the time, becoming walking reflections of Derec or Ariel or Avery, or even the alien Wolruf.