You: "Well, no, it's not that Jim has done anything. It's that someone else has done something for Bill that was so wonderful, that he has been promoted over Jim's head and has instantly become Bill's new best friend."

Robot: "But who has done this?"

You: "The man who ran away with Bill's wife, of course."

Robot (after a thoughtful pause): "But that can't be so. Bill must have felt profound affection for his wife and a great sadness over her loss. Is that not how human males feel about their wives, and how they would react to their loss?"

You: "In theory, yes. However, it turns out that Bill strongly disliked his wife and was glad someone had run off with her."

Robot (after another thoughtful pause): "But you did not say that was so."

You: "I know. That's what makes it funny. I led you in one direction and then suddenly let you know that was the wrong direction."

Robot: "Is it funny to mislead a person?"

You (giving up): "Well, let's get on with building this house."

In fact, some jokes actually depend on the illogical responses of human beings. Consider this one:

The inveterate horseplayer paused before taking his place at the betting windows, and offered up a fervent prayer to his Maker.

"Blessed lord," he murmured with mountain-moving sincerity, "I know you don't approve of my gambling, but just this once, Lord, just this once, please let me break even. I need the money so badly."

If you were so foolish as to tell this joke to a robot, he would immediately say, "But to break even means that he would leave the races with precisely the amount of money he had when he entered. Isn't that so?"

"Yes, that's so."

"Then if he needs the money so badly, all he need do is not bet at all, and it would be just as though he had broken even."

"Yes, but he has this unreasoning need to gamble."

"You mean even if he loses."

"Yes."

"But that makes no sense."

"But the point of the joke is that the gambler doesn't understand this."

"You mean it's funny if a person lacks any sense of logic and is possessed of not even the simplest understanding?"

And what can you do but turn back to building the house again?

But tell me, is this so different from dealing with the ordinary humorless human being? I once told my father this joke:

Mrs. Jones, the landlady, woke up in the middle of the night because there were strange noises outside her door. She looked out, and there was Robinson, one of her boarders, forcing a frightened horse up the stairs.

She shrieked, "What are you doing, Mr. Robinson?"

He said, "Putting the horse in the bathroom."

"For goodness sake, why?"

"Well, old Higginbotham is such a wise guy. Whatever I tell him, he answers, 'I know. I know,' in such a superior way. Well, in the morning, he'll go to the bathroom and he'll come out yelling, 'There's a horse in the bathroom.' And I'll yawn and say, 'I know, I know."'

And what was my father's response? He said, "Isaac, Isaac. You're a city boy, so you don't understand. You can't push a horse up the stairs if he doesn't want to go."

Personally, I thought that was funnier than the joke.

Anyway, I don't see why we should particularly want a robot to have a sense of humor, but the point is that the robot himself might want to have one-and how do we give it to him?

Chapter 1. Can You Feel Anything When I Do This?

"Mandelbrot, what does it feel like to be a robot?"

"Forgive me, Master Derec, but that question is meaningless. While it is certainly true that robots can be said to experience sensations vaguely analogous to specified human emotions in some respects, we lack feelings in the accepted sense of the word."

"Sorry, old buddy, but I can't help getting the hunch that you're just equivocating with me."

"That would be impossible. The very foundations of positronic programming insist that robots invariably state the facts explicitly."

"Come, come, don't you concede it's possible that the differences between human and robotic perception may be, by and large, semantic? You agree, don't you, that many human emotions are simply the by-products of chemical reactions that ultimately affect the mind, influencing moods and perceptions. You must admit, humans are nothing if not at the mercy of their bodies. "

"That much has been proven, at least to the satisfaction of respected authorities. "

"Then, by analogy, your own sensations are merely byproducts of smoothly running circuitry and engine joints. A spaceship may feel the same way when, its various parts all working at peak efficiency, it breaks into hyperspace. The only difference between you and it being, I suppose, that you have a mind to perceive it."

Mandelbrot paused, his integrals preoccupied with sorting Derec's perspectives on these matters into several categories in his memory circuits. "I have never quite analyzed the problem that way before, Master Derec. But it seems that in many respects the comparison between human and robot, robot and spaceship must be exceedingly apt."

"Let's look at it this way, Mandelbrot. As a human, I am a carbon-based life-form, the superior result of eons of evolution of inferior biological life-forms. I know what it feels like because I have a mind to perceive the gulf between man and other species of animal life. And with careful, selective comparison, I can imagine-however minimally-what a lower life-form might experience as it makes its way through the day. Furthermore, I can communicate to others what I think it feels like."

"My logic circuits can accept this.”

"Okay then, through analogy or metaphor or through a story I can explain to others what a worm, or a rat, or a cat, or even a dinosaur must feel as they hunt meat, go to sleep, sniff flowers, or whatever."

"I have never seen one of these creatures and certainly wouldn't presume to comprehend what it must be like to be one."

"Ah! But you would know-through proper analogy-what it must be like to be a spaceship."

"Possibly, but I have not been provided with the necessary programming to retrieve the information. Furthermore, I cannot see how such knowledge could possibly help me fulfill the behavioral standards implicit in the Three Laws."

"But you have been programmed to retrieve such information, and your body often reacts accordingly, and sometimes adversely, with regards to your perceptions."'

"You are speaking theoretically?”

“Yes."

"Are you formally presenting me with a problem?"

"Yes."

"Naturally I shall do my best to please you, Master Derec, but my curiosity and logic integrals are only equipped to deal with certain kinds of problems. The one you appear to be presenting may be too subjective for my programmed potentials. "

"Isn't all logic abstract, and hence somewhat subjective, at least in approach? You must agree that, through mutually agreed upon paths of logic, you can use the certain knowledge of two irrefutable facts to learn a third, equally irrefutable fact."

“Of course."

"Then can't you use such logic to reason how it might feel to be a spaceship, or any other piece of sufficiently advanced machinery?"

"Since you phrase it in that manner, of course, but I fail to comprehend what benefit such an endeavor may bring me-or you."

Derec shrugged. It was night in Robot City. He and Mandelbrot had been out walking. He had felt the need to stretch his muscles after a long day spent studying some of the problems complicating his escape from this isolated planet. But at the moment they were sitting atop a rectangular tower and staring at the stars. "Oh, I don't know if it would be of any benefit, except perhaps to satisfy my curiosity. It just seems to me that you must have some idea of what it is like to be a robot, even if you don't have the means to express it."