That assumes, however, that the robot knows everything about ships and can tell that the order is a dangerous one. Suppose, however, that the robot is not an expert on ships, but is experienced only in, let us say, automobile manufacture. He happens to be on board ship and is given an order by some landlubber and he doesn’t know whether the order is safe or not.

It seems to me that he ought to respond, “Sir, since you have no knowledge as to the proper handling of ships, it would not be safe for me to obey any order you may give me involving such handling.”

Because of that, I have often wondered if the Second Law ought to read, “A robot must obey orders given it by qualified human beings…”

But then I would have to imagine that robots are equipped with definitions of what would make humans “qualified” under different situations and with different orders. In fact, what if a landlubber robot on board ship is given orders by someone concerning whose qualifications the robot is totally ignorant.

Must he answer, “Sir, I do not know whether you are a qualified human being with respect to this order. If you can satisfy me that you are qualified to give me an order of this sort, I will obey it.”

Then, too, what if the robot is faced by a child of ten, indisputably human as far as the First Law is concerned. Must the robot obey without question the orders of such a child, or the orders of a moron, or the orders of a man lost in the quagmire of emotion and beside himself?

The problem of when to obey and when not to obey is so complicated and devilishly uncertain that I have rarely subjected my robots to these equivocal situations.

And that brings me to the matter of aliens.

The physiological difference between aliens and ourselves matters to us; but then tiny physiological or even cultural differences between one human being and another also matter. To Smith and Campbell, ancestry obviously mattered; to others skin color matters, or gender or eye shape or religion or language or, for goodness sake, even hairstyle.

It seems to me that to decent human beings, none of these superficialities ought to matter. The Declaration of Independence states that “All men are created equal.” Campbell, of course, argued with me many times that all men are manifestly not equal, and I steadily argued that they were all equal before the law. If a law was passed that stealing was illegal, then no man could steal. One couldn’t say, “Well, if you went to Harvard and were a seventh-generation American you can steal up to one hundred thousand dollars; if you’re an immigrant from the British Isles, you can steal up to one hundred dollars; but if you’re of Polish birth, you can’t steal at all.” Even Campbell would admit that much (except that his technique was to change the subject).

And, of course, when we say that “All men are created equal” we are using “men” in the generic sense, including both sexes and all ages, subject to the qualification that a person must be mentally equipped to understand the difference between right and wrong.

In any case, it seems to me that if we broaden our perspective to consider non-human intelligent beings, then we must dismiss, as irrelevant, physiological and biochemical differences and ask only what the status of intelligence might be.

In short, a robot must apply the Laws of Robotics to any intelligent biological being, whether human or not.

Naturally, this is bound to create difficulties. It is one thing to design robots to deal with a specific non-human intelligence, and specialize in it, so to speak. It is quite another to have a robot encounter an intelligent species whom it has never met before.

After all, different species of living things may be intelligent to different extents, or in different directions, or subject to different modifications. We can easily imagine two intelligences with two utterly different systems of morals or two utterly different systems of senses.

Must a robot who is faced with a strange intelligence evaluate it only in terms of the intelligence for which he is programmed? (To put it in simpler terms, what if a robot, carefully trained to understand and speak French, encounters someone who can only understand and speak Farsi?)

Or suppose a robot must deal with individuals of two widely different species, each manifestly intelligent. Even if he understands both sets of languages, must he be forced to decide which of the two is the more intelligent before he can decide what to do in the face of conflicting orders, or which set of moral imperatives is the worthier?

Someday, this may be something I will have to take up in a story but, if so, it will give me a lot of trouble. Meanwhile, the whole point of the Robot City volumes is that young writers have the opportunity to take up the problems I have so far ducked. I’m delighted when they do. It gives them excellent practice and may teach me a few things, too.

Prologue

A Synopsis Of Robot City, Books 1-6

He woke up…somewhere.

He didn’t know where he was or how he had managed to get there. He didn’t remember anything of his past.

Not even his name.

He was in some small capsule without windows. He could not even see where he was going.

His awakening had stirred a computer into life, and through its positronic personality he found that he was in a Massey lifepod. A badge on his clothing identified him as Derec; the name seemed to fit as well as anything. The positronic intelligence built into the lifepod could help him very little; it had no information to aid him at all, not even the name of the ship from which it had been ejected.

The lifepod had landed on an asteroid that Derec quickly found was inhabited by a colony of robots. He seemed to be the only human there. The robots were as little help to him as the lifepod. Strangely silent about their task, they ignored him for the most part. They were obviously looking for something buried in the rock of the asteroid; it seemed to be the only explanation. While he tried to decipher just what it was they were looking for and why, a raider ship appeared.

While the robot colony prepared to self-destruct, Derec made a desperate attempt to escape from the asteroid and contact the raider.

As he was doing so, the raider’s bombardment uncovered a shiny silver object, perhaps five centimeters by fifteen centimeters. He would later learn that it was called a “Key to Perihelion.” A pursuing robot revealed that this was the object for which the robots were so obsessively searching.