R. Daneel considered that for a long time. Then he held out his hand. “I must leave now, friend Elijah. It was good to see you. May we meet again soon.”
Baley gripped the robot’s hand warmly. “If you don’t mind, R. Daneel,” he said, “not too soon.”
Lenny
United States Robots and Mechanical Men Corporation had a problem. The problem was people.
Peter Bogert, Senior Mathematician, was on his way to Assembly when he encountered Alfred Lanning, Research Director. Lanning was bending his ferocious white eyebrows together and staring down across the railing into the computer room.
On the floor below the balcony, a trickle of human beings of both sexes and various ages looked about curiously, while a guide intoned a set speech about robotic computing.
“This computer you see before you,” he said, “is the largest of its type in the world. It contains five million three hundred thousand cryotrons and is capable of dealing simultaneously with over one hundred thousand variables. With its help, U. S. Robots is able to design with precision the positronic brains of new models.

“The requirements are fed in on tape which is perforated by the action of this keyboard - something like a very complicated typewriter or linotype machine, except that it does not deal with letters but with concepts. Statements are broken down into symbolic logic equivalents and those in turn converted to perforation patterns.
“The computer can, in less than one hour, present our scientists with a design for a brain which will give all the necessary positronic paths to make a robot…”
Alfred Lanning looked up at last and noticed the other. “Ah, Peter,” he said.
Bogert raised both hands to smooth down his already perfectly smooth and glossy head of black hair. He said, “You don’t look as though you think much of this, Alfred.”
Lanning grunted. The idea of public guided tours of U. S. Robots was of fairly recent origin, and was supposed to serve a dual function. On the one hand, the theory went, it allowed people to see robots at close quarters and counter their almost instinctive fear of the mechanical objects through increased familiarity. And on the other hand, it was supposed to interest at least an occasional person in taking up robotics research as a life work.
“You know I don’t,” Lanning said finally. “Once a week, work is disrupted. Considering the man-hours lost, the return is insufficient.”
“Still no rise in job applications, then?”
“Oh, some, but only in the categories where the need isn’t vital. It’s research men that are needed. You know that. The trouble is that with robots forbidden on Earth itself, there’s something unpopular about being a roboticist.”
“The damned Frankenstein complex,” said Bogert, consciously imitating one of the other’s pet phrases.
Lanning missed the gentle jab. He said, “I ought to be used to it, but I never will. You’d think that by now every human being on Earth would know that the Three Laws represented a perfect safeguard; that robots are simply not dangerous. Take this bunch.” He glowered down. “Look at them. Most of them go through the robot assembly room for the thrill of fear, like riding a roller coaster. Then when they enter the room with the MEC model - damn it, Peter, a MEC model that will do nothing on God’s green Earth but take two steps forward, say ‘Pleased to meet you, sir,’ shake hands, then take two steps back - they back away and mothers snatch up their kids. How do we expect to get brainwork out of such idiots?”
Bogert had no answer. Together, they stared down once again at the line of sightseers, now passing out of the computer room and into the positronic brain assembly section. Then they left. They did not, as it turned out, observe Mortimer W. Jacobson, age 16 - who, to do him complete justice, meant no harm whatever.
In fact, it could not even be said to be Mortimer’s fault. The day of the week on which the tour took place was known to all workers.
All devices in its path ought to have been carefully neutralized or locked, since it was unreasonable to expect human beings to withstand the temptation to handle knobs, keys, handles and pushbuttons. In addition, the guide ought to have been very carefully on the watch for those who succumbed.
But, at the time, the guide had passed into the next room and Mortimer was tailing the line. He passed the keyboard on which instructions were fed into the computer. He had no way of suspecting that the plans for a new robot design were being fed into it at that moment, or, being a good kid, he would have avoided the keyboard. He had no way of knowing that, by what amounted to almost criminal negligence, a technician had not inactivated the keyboard.
So Mortimer touched the keys at random as though he were playing a musical instrument.
He did not notice that a section of perforated tape stretched itself out of the instrument in another part of the room - soundlessly, unobtrusively.
Nor did the technician, when he returned, discover any signs of tampering. He felt a little uneasy at noticing that the keyboard was live, but did not think to check. After a few minutes, even his first trifling uneasiness was gone, and he continued feeding data into the computer.
As for Mortimer, neither then, nor ever afterward, did he know what he had done.
The new LNE model was designed for the mining of boron in the asteroid belt. The boron hydrides were increasing in value yearly as primers for the proton micropiles that carried the ultimate load of power production on spaceships, and Earth’s own meager supply was running thin.
Physically, that meant that the LNE robots would have to be equipped with eyes sensitive to those lines prominent in the spectroscopic analysis of boron ores and the type of limbs most useful for the working up of ore to finished product. As always, though, the mental equipment was the major problem.
The first LNE positronic brain had been completed now. It was the prototype and would join all other prototypes in U. S. Robots’ collection. When finally tested, others would then be manufactured for leasing (never selling) to mining corporations.
LNE-Prototype was complete now. Tall, straight, polished, it looked from outside like any of a number of not-too-specialized robot models.
The technician in charge, guided by the directions for testing in the Handbook of Robotics, said, “How are you?”
The indicated answer was to have been, “I am well and ready to begin my functions. I trust you are well, too,” or some trivial modification thereof.
This first exchange served no purpose but to show that the robot could hear, understand a routine question, and make a routine reply congruent with what one would expect of a robotic attitude. Beginning from there, one could pass on to more complicated matters that would test the different Laws and their interaction with the specialized knowledge of each particular model.
So the technician said, “How are you?” He was instantly jolted by the nature of LNE-Prototype’s voice. It had a quality like no robotic voice he had ever heard (and he had heard many). It formed syllables like the chimes of a low-pitched celeste.
So surprising was this that it was only after several moments that the technician heard, in retrospect, the syllables that had been formed by those heavenly tones. They were, “Da, da, da, goo.” The robot still stood tall and straight but its right hand crept upward and a finger went into its mouth.