“Hold on there—” Auberson protested. “You’re misquoting me — I do care about this project. It’s all I care about—”
“You don’t seem to be able to handle it though—”
“You don’t even understand what we’re trying to do! How can you—”
“Auberson! Elzer!” Dome’s voice cut through their words. “Cut it out — both of you! This is a business meeting.”
Slightly chastened, but in no way cooled, Auberson continued. “Psychology, Mr. Elzer, is not as cut-and-dried a subject as bookkeeping.” He glanced at Dome. The big man made no sign. Interpreting that as permission to continue, Auberson reseated himself and said, “Robot psychology is still an infant science. We don’t know what we’re doing—” He stopped himself. That was definitely not the way to phrase it. “Let me put it another way. We don’t know if what we’re doing is the right thing to do. HARLIE’s psychology is not the same as human psychology.”
“I thought you said HARLIE was human — and that he duplicates every function of the human brain.”
“He is and he does — but how many human beings do you know who are immobile, who never sleep, who have twenty-five sensory inputs, who have eidetic memories, who have no concept of taste or smell or any other organic chemical reactions? How many human beings do you know who have no sense of touch? And no sex life? In other words, Mr. Elzer, HARLIE may originally have had a human psychology, but his environment has forced certain modifications upon it. And on top of that, HARLIE has a most volatile personality.”
“Volatile?” The little man was confused. “You mean he gets angry?”
“Angry? No, not angry. He can get impatient though — especially with human beings. There’s reason to believe that HARLIE has both an ego and an id — a conscious and a subconscious. His superego, I believe, takes the form of his external programming. My commands, if you will. We haven’t found any other inhibitions. If this is true, it’s only his superego that we have any control over. His ego cooperates because it wants to, and his id, assuming he has one, does like any human subconscious — whatever it damn well pleases. We have to know what that is before we can stop his periods of non-rationality.”
“This is all very interesting,” said Elzer in a tone that suggested it wasn’t. “But would you get to the point? What is HARLIE’s purpose?”
“Purpose?” Auberson paused. “His purpose? It’s very funny you should ask that. The whole reason for this stoppage is that HARLIE asked me what your purpose is. Excuse me, our purpose. HARLIE wants to know what our purpose is.”
“That’s for theologians to discuss,” Dome said drily. “If you want, I’m sure Miss Stimson here can arrange for a minister to come in and speak to the machine.” A few of the Board members smiled; Miss Stimson did not. “What we want to know is HARLIE’s purpose. Having built him, you should have some idea.”
“I thought I’d made it clear. HARLIE was built to duplicate the functions of the human brain. Electronically.”
“Yes, we know that. But why?”
“Why?” Auberson stared at the man. “Why?” Why did Hillary climb Everest? “Because it had to be done. HARLIE will help us learn more about how the human brain works. There’s still a lot we don’t know yet, especially in the area of psychology. We hope to learn how much of the human personality is the programming and how much is the hardware.”
“I beg your pardon,” interrupted Elzer. “I don’t understand.”
“I didn’t think you would,” Auberson said drily. “We’re curious as to which of the functions of the brain are natural and which are artificial — how many of the human actions are determined from within and how many are reactions to what is coming in from without.”
“Instinct versus environment?”
“You could call it that,” Auberson sighed. “It wouldn’t be correct, but you could call it that.”
“And for what reason are we doing this?”
“I thought I just told you—”
“I mean, for what financial reason? What economic applications will this program have?”
“Huh? It’s too early to think of that. This is still pure research—”
“Ah ha — so you admit it!”
Auberson was annoyed. “I admit nothing.”
Elzer ignored him. “Dome,” he was saying, “this just proves it. He doesn’t care about the project — he doesn’t care about the company. He’s only interested in research, and we can’t afford this kind of costly project. Not without return we can’t.” He raised his voice to be heard above Auberson’s protests. “If Mr. Auberson and his friends had wanted to build artificial brains, they should have applied for a grant. I move we discontinue the project.”
Auberson was on his feet. “Mr. Chairman! Mr. Chairman!”
“You’re out of order, Aubie. Now sit down. You’ll get your chance.”
“Dammit, this is a railroad job! This little—”
“Aubie, sit down!” Dome was glaring at the angry psychologist. “There’s a motion on the floor. I assume it’s a formal proposal?” He looked at Elzer.
Elzer nodded.
“Discussion?” Almost immediately Auberson’s hand was up. “Aubie?”
“On what grounds? I want to know what grounds he has for discontinuing the project.”
Elzer was calm. “Well, for one thing, HARLIE has already cost us—”
“If you’ll check your figures, you’ll find that the whole HARLIE project is well within the projected overage. In fact, because we budgeted for that overage, we are well within acceptable limits.”
“He’s got you there, Carl,” said Dome.
“If you had let me finish my sentence, I would have shown you that it has cost us far too much already for a project that is incapable of showing results.”
“Results?” Auberson asked. “Results? We were getting results even before HARLIE was completed. Who do you think designed the secondary and tertiary stages? HARLIE did.”
“So what?” Elzer was unimpressed. “He’s not working right, is he?”
“That’s just it — HARLIE is working perfectly.”
“Huh? Then what about these periods of non-rationality? Why is he shut down?”
“Because,” Auberson said slowly. I have to get this right. “Because we weren’t prepared for him to be so perfectly human. If perfect is the word.”
The other Board members were alert with interest now. Even Miss Stimson had paused in her note-taking.
“We had designed him to be human, we had built him to be human, we had even programmed him to think like a human — then we turned him on and expected him to act like a machine. Well, surprise. He didn’t.”
Elzer asked, “The nature of the trouble then…?”
“Human error, if you will.” Auberson let it drop.
In the silence that followed, Auberson fancied he could hear Elzer’s cash-register brain totaling up the man-hours that had been lost since they had started arguing. “Human error?” he repeated. “Yours or HARLIE’s? Or both — each compounding each? I suppose you’re going to blame his periods of non-rationality on human error as well.”
“Why not? How else would you characterize our approach to them?”
“ ‘Human error’ is an over-polite euphemism for what I would call it.”
Auberson ignored that. “We’d thought his non-rationality was a physical problem, or perhaps a programming error. We were wrong. He was neither physically nor mentally ill. He was — I almost hate to say it — emotionally upset.”
Elzer snorted. Loudly.
“His periods of non-rationality are triggered by something that’s bothering him. We don’t know what that is, but we can find out.”
Elzer was skeptical. He nudged the man next to him and said, “Anthropomorphism. Auberson’s projecting his own problems onto the machine.”
“Elzer, you’re a fool. Look, if you had to go down to that computer room right now and talk to HARLIE, how would you treat him?”