It was one of the odder episodes in the history of computer science, yet also one of the more telling. Over the course of a few months in 1964 and 1965, Joseph Weizenbaum, a forty-one-year-old computer scientist at the Massachusetts Institute of Technology, wrote a software application for parsing written language, which he programmed to run on the university's new time-sharing system. A student, sitting at one of the system's terminals, would type a sentence into the computer, and Weizenbaum's program, following a set of simple rules about English grammar, would identify a salient word or phrase in the sentence and analyze the syntactical context in which it was used. The program would then, following another set of rules, transform the sentence into a new sentence that had the appearance of being a response to the original. The computer-generated sentence would appear almost instantly on the student's terminal, giving the illusion of a conversation.
In a January 1966 paper introducing his program, Weizenbaum provided an example of how it worked. If a person typed the sentence "I am very unhappy these days," the computer would need only know that the phrase "I am" typically comes before a description of the speaker's current situation or state of mind. The computer could then recast the sentence into the reply "How long have you been very unhappy these days?" The program worked, Weizenbaum explained, by first applying "a kind of template to the original sentence, one part of which matched the two words 'I am' and the remainder [of which] isolated the words 'very unhappy these days.'" It then used an algorithmic "reassembly kit," tailored to the template, that included a rule specifying that "any sentence of the form 'I am BLAH'" should be "transformed to 'How long have you been BLAH,' independently of the meaning of BLAH."
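The template-and-reassembly mechanism Weizenbaum describes can be sketched in a few lines of modern code. The rule table and function names below are illustrative, not Weizenbaum's actual implementation (ELIZA was written in his SLIP list-processing language, not Python):

```python
import re

# Illustrative template/reassembly rules in the spirit of Weizenbaum's
# example; the patterns here are hypothetical stand-ins for his rule set.
RULES = [
    # "I am BLAH" -> "How long have you been BLAH?"
    # regardless of the meaning of BLAH
    (re.compile(r"\bI am (.+?)[.!?]?$", re.IGNORECASE),
     "How long have you been {0}?"),
    # "my X" -> prompt for more about X
    (re.compile(r"\bmy (\w+)", re.IGNORECASE),
     "Tell me more about your {0}."),
]

# Content-free prompt used when no template matches
FALLBACK = "Please go on."

def respond(sentence: str) -> str:
    """Apply the first matching template and its reassembly rule."""
    for pattern, reassembly in RULES:
        match = pattern.search(sentence)
        if match:
            return reassembly.format(*match.groups())
    return FALLBACK
```

Calling `respond("I am very unhappy these days.")` yields "How long have you been very unhappy these days?" — the program never needs to know what "unhappy" means, only where the matched fragment sits in the template.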
Weizenbaum's application was a product of its time. During the 1950s and '60s, the enthusiasm for computers, software programming, and artificial intelligence gave rise not only to the idea that the human brain is a type of computer but to the sense that human language is the output of one of the algorithms running inside that computer. As David Golumbia explains in The Cultural Logic of Computation, a new breed of "computational linguists," led by Weizenbaum's MIT colleague Noam Chomsky, posited that the form of the "natural language" that people speak and write reflects "the operation of the computer inside the human mind that performs all linguistic operations." In a 1958 article in the journal Information and Control, Chomsky had written that "one possible method for describing a grammar is in terms of a program for a universal Turing machine." What made the computationalist theory so compelling was that it came wrapped in a seductive "penumbra of technological newness," writes Golumbia. It offered a "mechanical clarity," replacing language's human "messiness" with "a clean internal computer." By reverse-engineering the way people talk, you could discover language's underlying code, which you could then replicate as software.
Weizenbaum named his program ELIZA, after Eliza Doolittle, the cockney flower girl who, in George Bernard Shaw's Pygmalion, learns to speak proper English under the tutelage of the haughty phonetics professor Henry Higgins. To make the simulated conversations a little more interesting, Weizenbaum also gave his artificial interlocutor a persona—that of a Rogerian psychotherapist.[55] The choice, as he explained in his paper, was a pragmatic one. Using a technique developed by the psychologist Carl Rogers in the 1940s, Rogerian therapists pretended, in their conversations with patients, to have no understanding of the world. For the most part, they simply parroted their patients' statements back to them in the form of banal, open-ended questions or comments. Knowing that the naivete was a pose, the patients were free to attribute to their therapists "all sorts of background knowledge, insights and reasoning ability." The Rogerian persona, Weizenbaum recognized, had "a crucial psychological utility" for ELIZA: it suggested that the program's vacuity masked some sort of real intelligence.
Weizenbaum provided a transcript of part of an exchange between ELIZA and a young woman who had been an early tester of the program (the machine's responses are in italic type):
Men are all alike.
In what way?
They're always bugging us about something or other.
Can you think of a specific example?
Well, my boyfriend made me come here.
Your boyfriend made you come here.
He says I'm depressed much of the time.
I am sorry to hear you are depressed.
It's true. I am unhappy.
Do you think coming here will help you to be not unhappy?
I need some help, that much seems certain.
What would it mean to you if you got some help?
Perhaps I could learn to get along with my mother.
Tell me more about your family.
My mother takes care of me.
Who else in your family takes care of you?
My father.
Your father.
You are like my father in some ways.
What resemblance do you see?
You are not very aggressive but I think you don't want me to notice that.
What makes you think I am not very aggressive?
You don't argue with me.
Why do you think I don't argue with you?
You are afraid of me.
Does it please you to believe I am afraid of you?
My father is afraid of everybody.
In commenting on his creation, which he acknowledged to be a modest and even slightly silly contribution to the fledgling field of natural-language processing, Weizenbaum observed how easy it is for computer programmers to make machines "behave in wondrous ways, often sufficient to dazzle even the most experienced observer." But as soon as a program's "inner workings are explained in language sufficiently plain to induce understanding," he continued, "its magic crumbles away; it stands revealed as a mere collection of procedures, each quite comprehensible. The observer says to himself 'I could have written that.'" The program goes "from the shelf marked 'intelligent' to that reserved for curios."
But Weizenbaum, like Henry Higgins, was soon to have his equilibrium disturbed. ELIZA quickly found fame on the MIT campus, becoming a mainstay of lectures and presentations about computing and time-sharing. It was among the first software programs able to demonstrate the power and speed of computers in a way that laymen could easily grasp. You didn't need a background in mathematics, much less computer science, to chat with ELIZA. Copies of the program proliferated at other schools as well. Then the press took notice, and ELIZA became, as Weizenbaum later put it, "a national plaything." While he was surprised by the public's interest in his program, what shocked him was how quickly and deeply people using the software "became emotionally involved with the computer," talking to it as if it were an actual person. They "would, after conversing with it for a time, insist, in spite of my explanations, that the machine really understood them." Even his secretary, who had watched him write the code for ELIZA "and surely knew it to be merely a computer program," was seduced. After a few moments using the software at a terminal in Weizenbaum's office, she asked the professor to leave the room because she was embarrassed by the intimacy of the conversation. "What I had not realized," said Weizenbaum, "is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people."