His remark had made her furious at the time, but time had softened the edges of her anger. Avery might be a conceited, egocentric ass, but there had been some truth in what he’d said. She’d looked in that mirror too often and seen herself backing away from contact with other people to be with her robots. Surely she’d been content here on this ship for the last few years, with only Basalom and a few other robots for company.

Avery she missed not at all; her son sometimes she missed terribly. Basalom and the others had become her surrogate children.

“Gently,” she cautioned Basalom. A spheroid of silvery-gray metal approximately two meters in diameter sat on the workbench before her, its gleaming surface composed of tiny dodecahedral segments. She’d just finished placing the delicate, platinum-iridium sponge of a positronic brain into a casing within the lumpy sphere. Now Basalom draped the sticky lace of the neural connections over the brain and sealed the top half of the casing. The geometric segments molded together seamlessly.

“You can put it in the probe,” Janet told the robot, then added: “What’s this about being uneasy?”

“You have built me very well, Doctor; that is the only reason I sense anything at all. I am aware of a millisecond pause in my positronic relays due to possible First Law conflicts,” Basalom replied as he carefully lifted the sphere and moved it to the launching tube. “While there is no imminent danger of lock-up, nor is this sufficient to cause any danger of malfunction or loss of effectiveness, it’s my understanding that humans feel a similar effect when presented with an action that presents a moral conflict. Thus, my use of the human term.”

Janet grinned, deepening the lines netting her eyes. “Longwinded, but logical enough, I suppose.”

Basalom blinked again. “Brevity is more desired than accuracy when speaking of human emotions?”

That elicited a quick laugh. “Sometimes, Basalom. Sometimes. It’s a judgment call, I’m afraid. Sometimes it doesn’t matter what you say so long as you just talk.”

“I am not a good judge when it comes to human emotions, Doctor.”

“Which puts you in company with most of us, I’m afraid.” Janet clamped the seals on the probe’s surface and patted it affectionately. LEDs glowed emerald on the launching tube’s panel as she closed the access.

“What does a human being do when he or she is uneasy, Doctor Anastasi?”

Janet shrugged, stepping back. “It depends. If you believe in something, you go ahead with it. You trust your judgment and ignore the feeling. If you never have any doubts, you’re either mad or not thinking things through.”

“Then you have reservations about your experiment as well, but you will still launch the probe.”

“Yes,” she answered. “If people were so paralyzed by doubt that they never did anything without being certain of the outcome, there’d never be children, after all.”

As Janet watched, Basalom seemed to ponder that. The robot moved a step closer to the controls for the launching tube; its hand twitched, another idiosyncrasy. The robot seemed to be on the verge of wanting to say more. The glimmer of a thought struck her. “Basalom?”

“Yes, Doctor?”

“Would you care to launch this probe?”

Blink. Twitch. For a moment, the robot didn’t move. Janet thought perhaps it would not, then the hand reached out and touched the contact. “Thank you, Doctor,” Basalom said, and pressed.

Serried lights flashed; there was a chuff of escaping air, and the probe was flung into the airless void beyond. Basalom turned to watch it on the viewscreen; Janet watched him.

“You never said what your reservations were exactly, Basalom,” she noted.

“These new robots. With your programming, so much is left for them to decide. Yes, the Three Laws are embedded in the positronic matrix, but you have given them no definition of ‘human.’”

“You wonder what will happen?”

“If they one day encounter human beings, will they recognize them? Will they respond as they are supposed to respond?”

Janet shrugged. “I don’t know. That’s the beauty of it, Basalom. I don’t know.”

“If you say so, Doctor. But I don’t understand that concept.”

“They’re seeds. Formless, waiting seeds coded only with the laws. They don’t even know they’re robots. I’m curious to see what they grow up to be, my friend.”

Janet turned and watched the hurtling probe wink in sunlight as it tumbled away from the ship. It dwindled as it fell into the embrace of the world’s gravity and was finally lost in atmospheric glare. Janet sighed.

“This one’s planted,” she said, and took a deep breath. “Now let’s get out of here.”

Chapter 2. The Doppelganger

The probe lay encased in mud halfway down a hillside. The once-silvery sides were battered and scorched from the long fall through the atmosphere; drying streamers of black earth coated the dented sides. Ghostly heat waves shimmered, and the metallic hull ticked as it cooled and contracted. The echoes of its landing reverberated for a long time among the hills.

Inside the abused shell, timed relays opened and fed power to the positronic circuitry of the robot nestled in its protective cradle. The neophyte mind found itself in total darkness. Had it been a living creature, its birth instincts would have taken over like a sea turtle burrowing from the wet sand to find the shimmering sea. The robot had its own instinct-analogue: the Three Laws of Robotics. Knowledge of these basic rules flooded the robot’s brightening awareness.

First Law: A robot may not injure a human being, or, through inaction, allow a human being to come to harm.

Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

Third Law: A robot must protect its own existence as long as such protection does not interfere with the First or Second Laws.

This was the manner in which most of known human space defined the Laws. Any schoolchild of Aurora or Earth or Solaria could have recited them by rote. But to the fledgling, there was one important, essential difference. To the fledgling, there were no words involved, only deep, core compulsions. The fledgling had no sense that it had been built or that it was merely a constructed machine.

It didn’t think of itself as a robot.

It only knew that it had certain instructions it must obey.

As survival instincts, the Laws were enough to spark a response. Second Law governed the fledgling’s first reactions, enhanced by Third Law resonance. There were imperious voices in its mind: inbuilt programming, speaking a language it knew instinctively. The robot followed the instructions given it, and more circuits opened.