Elzer nodded. “I remember the situation. It was settled, wasn’t it?”
“Right. HARLIE’s solution. He began by requesting an efficiency study with specific attention to how much time was spent in actual production and how much on setting up, breaking down, and so on. He found that it was necessary to prepare the equipment for production four times a day: in the morning, after the coffee break, after the lunch break, and after the second work break. That’s at least ten, usually fifteen, minutes per set-up. Same thing for shutting down. That was costing them two hours of production time per day, or ten hours per work week. They were spending too much time getting ready and cleaning up, and not enough time actually working. HARLIE suggested giving everybody Fridays off. Add an hour and a half to each of the other four work days and boost wages enough to compensate for the loss of those two ‘so-called’ working hours. Timeton found they could produce as much in four nine-and-a-half-hour days as they could in five eight-hour days. What they’d done was to trim away those two wasted hours of cleanup and preparation time and spread the remaining work hours across the rest of the week. They increased their ratio of production time by doing so.”
“Hm,” said Elzer. “How’d the union take it?”
“Oh, they were startled at first, but they agreed to give it a try. After a few weeks they were as enthusiastic about the plan as anyone. After all, it gave the men more time with their families. Timeton was pleased because it allowed them to cut costs without cutting production. In fact, production actually went up. Like I said, it was an unorthodox solution — but it worked. And that’s what counts. The nice thing about it was that the plan was good for both sides.”
Elzer nodded vaguely. He didn’t need to have any more explained to him. He glanced about again. His eyes lit on a figure at a console. “What’s that?” he pointed.
Auberson looked. Elzer was referring to a thirteen-year-old girl; she was sitting in the corner, thoroughly engrossed in her “conversation” with HARLIE. “Oh,” said Auberson. “She’s another one of our non-essential, but fully operational programs.”
“Huh?”
“Project Pedagogue.”
“Computer teaching?”
“Sort of. It’s just an experiment, so far, but we find HARLIE is a better teacher than some of the so-called ‘teaching machines.’ They’re just one step up from rote-learning. The average teaching program uses reward stimuli to reinforce retention. That’s good, but it’s still rote-learning. What we’re trying here is to teach understanding. HARLIE can answer the question ‘WHY?’ He can explain things in terms the student can understand, and he’s infinitely patient. A routine teaching program can’t break out of its pre-set pattern. It has no flexibility — that’s why they’ve never been a serious threat to human educators.”
“And HARLIE will be?” Elzer’s eyes were glittering at the thought. Imagine — selling computers to the nation’s richest schools to replace their teaching staff.
Auberson shook his head. “Uh uh. There’s an element of — humanity involved in teaching. We don’t want to entirely lose the human experience, the empathic involvement in learning. The student needs the human teacher for his psychological development and well-being. A teacher is an important role-model. No, we’re thinking of HARLIE more as a tool for individual tutoring, for the student’s private study — you might call him a super-homework-helper.”
Elzer frowned. He didn’t like that. It didn’t seem marketable enough. Still, if the concept worked… He’d have to explore the thought later. Now he turned to Auberson. “If I wanted to talk to HARLIE, how would I go about it?”
Auberson pointed at a console. “Sit down and type.”
“That’s all?”
“That’s all.”
“I’d have thought you could have worked out something with a microphone and a speaker.”
“Well, yes, we could have. But it was decided to use typers instead for two reasons. First, the readout gives the user a hardcopy he can refer back to at any time — either during the conversation or in later study. And it guarantees that HARLIE won’t re-edit his tapes to make a prettier version of his personal history. The knowledge that we have a permanent record in our files is enough to stop him. Also, tapes of voices need to be transcribed, and they’re unmanageable for handling equations and certain other types of data. The second reason is a bit more subtle: By not giving HARLIE the ability to listen in on conversations, we can talk about him behind his back. It makes it easier to control his inputs and keep out unauthorized ones. We don’t have to worry about him accidentally overhearing something that might adversely influence his reactions to a program or experiment. Suppose he overheard us talking about shutting him down if he didn’t give such-and-such response to a certain test program. We’d automatically be guaranteeing that response even if it weren’t honest. Or we might be forcing him into a totally irrational response. You might say we’re trying to prevent a ‘HAL 9000.’ ”
Elzer didn’t smile at the reference to the misprogrammed computer in Stanley Kubrick’s 2001: A SPACE ODYSSEY. It was already as mythic a figure in the modern pantheon of Gods and Demons as Dr. Frankenstein’s monster had been forty years earlier.
Auberson looked at the man. “Would you like to talk to HARLIE?”
Elzer nodded. “That’s one of the things I came down here for. I want to see for myself.”
Auberson led him to a console. He thumbed the typer on and pecked out, HARLIE.
The machine clattered politely, GOOD MORNING, MR. AUBERSON.
HARLIE, THERE’S SOMEBODY HERE WHO WANTS TO MEET YOU. HIS NAME IS CARL ELZER. HE’S A MEMBER OF THE BOARD OF DIRECTORS. YOU’RE TO ANSWER ALL OF HIS QUESTIONS.
OF COURSE, said HARLIE.
Auberson stood up, offered the chair to Elzer. He was a wizened little gnome of a man, and he peered through thick-lensed glasses. He could not help but seem suspicious. Gingerly he sat down and pulled the chair forward. He eyed the typewriter keyboard with visible discomfort. At last, he typed, GOOD MORNING.
HARLIE replied immediately. The silver typing element — an “infuriated golf ball” — whirred rapidly across the page. GOOD MORNING, MR. ELZER. Its speed startled the man.
SO YOU’RE HARLIE, he typed. There was no reply; none was needed. Elzer frowned and added, TELL ME, HARLIE, WHAT ARE YOU GOOD FOR?
I AM GOOD FOR PSYCHOTICS, SCHIZOPHRENICS, PARANOIDS, NEUROTICS, AND THE MILDLY INSANE.
Elzer jerked his hands away from the keyboard. “What does he mean by that?”
“Ask him,” suggested Auberson.
WHAT DO YOU MEAN BY THAT?
I MEANT, said HARLIE, THAT I AM GOOD FOR HELPING THESE TYPES OF PEOPLE.
Watching over Elzer’s shoulder, Auberson explained, “That’s another one of our programs he’s referring to. The patients call it ‘Operation Headshrink.’ ”
HOW DO YOU HELP THESE PEOPLE? Elzer asked.
I CAN FUNCTION AS A RATIONAL ROLE-MODEL FOR THEM. I CAN BE A COUNSELOR. I CAN AID IN SELF-ANALYSIS AND HELP TO GUIDE THEM TO AN AWARENESS OF THEIR PROBLEMS.
YOU HAVEN’T ANSWERED MY ORIGINAL QUESTION, THOUGH. I ASKED, “WHAT ARE YOU GOOD FOR?” NOT “WHO?”
IN THIS CONTEXT, said HARLIE, THE DIFFERENCE IS MEANINGLESS.
NOT TO ME, replied Elzer. ANSWER MY ORIGINAL QUESTION. WHAT ARE YOU GOOD FOR?
THINKING, said HARLIE. I AM GOOD FOR THINKING.
WHAT KIND OF THINKING?
WHAT KIND DO YOU NEED? asked HARLIE.
Elzer stared at that for a second, then attacked the keys again. WHAT KIND HAVE YOU GOT?
I HAVE WHAT YOU NEED.
I NEED NO-NONSENSE TYPE THINKING. PROFIT-ORIENTED THINKING.
THAT IS NOT WHAT YOU NEED, said HARLIE. THAT IS WHAT YOU WANT.