“Harley,” he said, “please get me Univac.”

“Surely, sir,” said Harley.

In front of him a shimmer came, that shimmer no amount of work and research ever could get rid of, and the face was there. No body, as would have been the case if it had been a human, but just a face hanging in the air. It could as easily have been a human face, with body, Harrison reminded himself, a human simulation of the mighty system that was, in fact, the city, but the system itself had not gone along with that. “Let us be honest,” it had said and it still was honest—not a human, but a system. And in accord with that it was not a human face that stared across the desk at him, but a strangely mechanistic face, the sort of face that an artist, full of artistic cynicism, might have conjured up to represent the system.

“Mr. Harrison,” said Univac, “how good to see you once again.”

“It is good to see you, too,” said Harrison. “You recall, perhaps, that I spoke with you some time ago about a project I was working on.”

“Yes, of course,” said Univac. “Immortality. How is it getting on?”

“It can be done,” said Harrison. “A human mind can be imprinted. I am sure of that.”

“What does the computation say?”

“It says we can imprint. With no loss. No aberration. A human mind can be transferred intact.”

“And be effective?”

“Entirely effective. There may be, eventually, some emotional loss. We can’t be sure.”

“Mr. Harrison, if that should happen, how important would it be?”

“Immensely important from the human viewpoint, perhaps. Although it might make the mind the more efficient, we are not, of course, entirely sure it would come about.”

“You, of course, have done exhaustive simulation?”

“Yes,” said Harrison, “exhaustive. That’s what bothers me. It works out. There would be a period of social adjustment, certainly. At first, perhaps not all the people would wish the transfer. There might always be some who would shrink from it, although, as time went, there would be fewer of them. Perhaps the time would come when it would be accepted as a normal course of human life, a normal event in the life of any man. It might take some time for the public to accept the actual presence of robotic humans—not robots, but humans in robotic form—but that, in time, I am sure, would work itself out. Humanity would gain by it. We would be the richer by each human mind that could be saved from death. Our brainpower would increase, with no great additional drain on our natural resources.”

“What is your problem, then?” asked Univac.

“A nagging doubt,” said Harrison. “One that hangs in there and will not go away. Based on certain objections that have no real logic in them. They can be explained away, but they stay. It is, I suppose, a matter of human intuition, if not human judgment. I hate to go against human intuition.”

“So would I,” said the face that was Univac.

“What do we do, in such a case? Wait another century, with men dying all the time, to make up our minds?”

“Some controlled experiment, perhaps.”

“But we couldn’t do that. Without it leaking out. Can you imagine what might happen if such a thing leaked out? There’d be sheeted hell to pay. The public almost immediately would divide into two hostile groups and the pressure from each group would be unimaginable. It would be an intensely emotional thing, you see …”

“Yes, I know,” said Univac. “I have something else in mind. You have heard, of course, although it is not yet public knowledge, that in another year or so we plan to send out several interstellar probes.”

“Of course. I am a good friend of Anderson. We have talked about it.”

“It strikes me,” said Univac, “that it might be preferable to send out humans rather than mere instruments. There’d be instruments, of course, but also that other factor you mentioned—human judgment.”

“A controlled experiment,” said Harrison. “Yes, of course it would be that. And if planets should be found, roboticized human minds could go out onto them, no matter what the conditions were. They’d not have the physical limitations …”

“Perhaps,” said Univac, “we could send several of them on some of the probes, so that we could study the interaction between several imprinted brains. And on at least one probe, a single imprint, to see how one mind alone could react under …”

“It’s a vicious experiment,” said Harrison.

“Most experiments involving humans are vicious. But it would be a matter of free choice. It would be carefully explained to potential volunteers. To a man on the verge of death, it might be preferable.”

“Yes, it might be.”

“Then we’d know,” said Univac. “We’d know if it would work. The trips would run to a number of years. But we wouldn’t have to wait that long. If it appeared to be working, we could engineer a leak about what had been done, then sit back and wait for the reaction. I am willing to wager that in a short time we’d be faced with a wide demand that this business of immortality be made available, immediately, to everyone.”

“And if the reaction were the opposite?”

“Then we’d deny the rumor. We’d say it never happened.”

“Some day the probes would be coming home,” Harrison pointed out. “What about our denial then?”

“By that time,” said Univac, “it would be—how do you humans say it—a new ball game.”

“May I say something, sir?”

“Why, of course, Mr. Harrison. What made you think that you should ask?”

“It is simply this,” said Harrison. “You have shown yourself to be as low-down and sneaky as any human ever was. I would not have thought it of you.”

Univac chuckled at him, a ghastly chuckle. “One thing you forget,” he said. “Humans made me.”

“But that’s not good enough,” Harrison told him, sharply. “Human is not good enough. We had hoped for something better. We made you, certainly—we built you through the years. We based a culture on you, not, perhaps, because we wanted to, but because we were forced to do so. Perhaps you were no more than the least objectionable alternative, but you were all we had. We had hoped we had acted wisely and perhaps we did. But where we had no alternative before, we have none now. We are stuck with you and you, if you have a personality, an identity, a sense of I, as I think you have, likewise are stuck with us.”

“I have identity,” said Univac.

“Then, for the love of God,” said Harrison, “stop being so damn human.”

“Mr. Harrison,” asked Univac, “what would you have me be? It was you who created me and …”

“We created religion, too,” said Harrison. “And what did it ever do for us—the kind that we created? Not one man’s concept of God, whatever it might be, but the concept of religion as created by our culture. For years we slaughtered one another in religion’s name …”

“You created me and used me,” said Univac, “for your human purposes.”

“And you resent this?”

“No, I do not resent it. I am glad of it and, awkward as it may be for me to say it, rather proud of it. But since we’re being truthful, let’s be truthful all the way.”

“O.K., then,” said Harrison, “we created you and used you. We had allowed the profit motive to run away with us. We sold people things they didn’t need and we built into these things imperfections so that people bought these things not once, but many times. And we changed the styles and we preached the gospel that one could not be out-of-date without, at the same time, being socially unacceptable. We improved our products and we hammered home the fact that the old models or old styles should be junked for the sake of those improvements, most of which were questionable improvements. And in order to turn out all these things for which we had created a psychological demand, we poisoned our air and water and used up our natural resources and there came a time when we had to call a halt, not to pollution so much as to the economic system that caused pollution, to that factor of our society that was eating up our coal and oil and gas.”