Can a computer hold a conversation? You can certainly communicate with one in a variety of ways: when you type in the amount of money you want from an automatic cash dispenser, when you move the joystick to produce an effect in a computer game, even sometimes when you simply dial a telephone number, you are communicating information to a computer and hoping that it will produce an appropriate response. In these cases the response will be selected from a relatively small range of possibilities. To use a computer to the full, you can give it instructions in a language such as BASIC, PASCAL or POP-11, expressing in general terms what you want it to do (if you choose to read the chapter appendices of this book, you will begin to learn one of these languages, POP-11); but computer languages are not a natural form of communication for people.
What about English? Computers are, after all, general-purpose information-processing machines, and they would presumably be all the more useful if they could respond to normal human language. There are a number of answers to this question, some of which will be discussed in this chapter. To begin with, here is an example of a real conversation, in English, with a computer (or, to be precise, with the combination of a computer and a program). The computer starts the conversation, and its contributions are printed in upper case.
GOOD DAY WHAT IS YOUR PROBLEM?
I haven't much of a problem really
DO YOU THINK YOU ARE UNIQUE?
I think I am unique in some ways
SOMETIMES YOU THINK YOU AREN'T UNIQUE IN SOME WAYS
That's true as well, I'm sure
ARE YOU SURE?
Yes, I am certain
IS THIS THE FIRST TIME YOU'VE BEEN CERTAIN?
No, I have felt certain of things before now
DOES ANYONE ELSE HAVE THAT PROBLEM?
It's hardly a problem, is it?
FIRST TELL ME WHY YOU HAVE COME HERE
To talk to a computer
TO TALK TO A COMPUTER
Yes, to talk to a computer
SUPPOSE I WERE NOT A COMPUTER
Then I wouldn't be talking to you
EARLIER YOU SAID YOU HAVEN'T MUCH OF A PROBLEM REALLY
That's true, apart from money like everyone
WHAT DO YOU THINK ABOUT MONETARIST POLICIES?
They don't seem to make much money for me
HAVE YOU TALKED TO THE PRIME MINISTER ABOUT THAT?
No of course not
WOULD EVERYONE FIND THAT OBVIOUS?
Anyone who wasn't a computer
DO MACHINES WORRY YOU?
Whether machines worry you or not, you might find that conversation slightly at odds with the image of a computer as a machine for calculating payroll slips or spacecraft trajectories. This computer seems to be displaying some unexpected qualities: its conversation is rather aimless and chatty, though it has ready answers on a number of subjects; it can refer to itself and to its interlocutor; it appears to show human peculiarities such as impatience and facetiousness; and it seems to be somewhat on the defensive too. The program in question can carry on a conversation like the one above for a considerable length of time, and can even occasionally give the impression of having an insight into human affairs.
So, let us continue with this sample of conversation (though few of us would wish to consult a computer that manages to be both condescending and evasive), as the human partner becomes increasingly irritated with the program's aimlessness.
No, computers worry me very little
MOST COMPUTERS ARE AS STUPID AS THEIR PROGRAMMERS
Why do you say that?
IS THAT QUESTION RHETORICAL?
No, I really want to know
BEWARE OF ADDICTIONS
Why should I?
WHAT MAKES YOU THINK I KNOW THE ANSWER?
Tell me about yourself
WE WERE DISCUSSING YOU NOT ME
But I want to talk about you
BEWARE OF ADDICTIONS
You have said that twice now
I HAVE SAID THAT TWICE NOW
Can't you be a bit more interesting than that?
I CAN'T HELP IF YOU ASK TOO MANY QUESTIONS
OK, but I think you are not being very helpful at all
WOULD YOU TRUST A MACHINE TO HELP?
No
YOU ARE BEING SOMEWHAT SHORT WITH ME
Yes
ARE YOU PREPARED TO ELABORATE?
I don't want to elaborate
DO YOU EVER SUFFER WITHDRAWAL SYMPTOMS?
Your comments get sillier and sillier
DOES ANYONE ELSE HAVE THAT PROBLEM?
You asked that before when you couldn't think of anything better to say
DID YOU REALLY EXPECT ME TO UNDERSTAND THAT?
You understand nothing, I think
I UNDERSTAND NOTHING, YOU THINK
You understand nothing
PLEASE EXPLAIN SO THAT A STUPID COMPUTER CAN FOLLOW YOU
If I could explain to you I could explain anything to anyone
IF YOU COULD EXPLAIN TO I YOU COULD EXPLAIN ANYTHING TO ANYONE
You are a very simple-minded program
SUPPOSE I WERE NOT A VERY SIMPLE-MINDED PROGRAM
We'll come to that later
COME TO THAT LATER
Stop repeating me and say something new!
ARE YOU USING YOURSELF AS A SCAPE-GOAT?
It wouldn't pay me to
WHAT DO YOU THINK ABOUT MONETARIST POLICIES?
You already asked, you want to be more original
CAN YOU DO WITHOUT?
Without what?
IS THAT QUESTION IMPORTANT TO YOU?
The program is loosely based on a well-known one, developed by Joseph Weizenbaum in the mid-1960s, called ELIZA. ELIZA altered people's view of computers, since it appeared that computer science had breached the defenses of human language. In one version, ELIZA took the role of a nondirective therapist -- that is to say, a psychiatrist whose main aim is to help patients determine the direction of the therapy themselves, and to find their own solutions.
There are anecdotes concerning people who, for brief periods, treated ELIZA as if it were a sympathetic human. Weizenbaum's own secretary, for instance, asked him to leave the room while she confided in the program. And when Weizenbaum suggested that he might set ELIZA to record all its interactions, he was accused of spying on people's most intimate thoughts. Another case concerns a vice-president in a computer company who arrived one morning to find a version of ELIZA running on the computer, but was under the impression that the teletype (that is, the keyboard and printer) was linked directly to one in a colleague's home. The conversation that ensued was a masterpiece of misunderstanding. (It is reproduced in Boden, 1986.) The vice-president took a good deal of persuading that he had really been talking to a computer program.
So does this mean that ELIZA would succeed in Turing's `Imitation Game'? The answer is that it would not. If you had to distinguish between ELIZA and a human on the basis of their typed replies, you would soon find that ELIZA's lack of initiative, lack of knowledge, and lack of common sense gave it away. For some reason, though, many people are willing to attribute `human' characteristics to a computer program even though there is only the flimsiest evidence that it might possess them. Perhaps this is not surprising; our bias is demonstrated by an advertising slogan for cars: ``Minis have feelings too.'' Such advertising may be harmless, but in the case of computers it is dangerous to believe that a program has feelings or other human attributes, since we inevitably use programs in ways that influence the decisions we take in our lives. In this chapter, we will show how a program like ELIZA works, and we hope this will inoculate you against what Weizenbaum, in his book Computer Power and Human Reason (1984), called ``powerful delusional thinking in quite normal people.''
In the same book, Weizenbaum comments that he was surprised that many people saw ELIZA as demonstrating a general solution to the problem of programming a computer to understand natural human language. His own conclusion was that language is only understood within a contextual framework. In other words, we only understand what someone says to us because of what we have in common in our understanding of the world. ELIZA only makes an impression because knowledge about the world is built into its responses; but its representation of knowledge is, as we shall see, too shallow and inflexible to make ELIZA a program that could begin to be called intelligent. The remainder of this chapter will be concerned with the way our version of ELIZA operates, and in particular with what it lacks to achieve real communication in English. Since the program is not quite the same as Weizenbaum's, we shall call it Eliza.
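To give a flavour of what lies behind such a conversation, here is a minimal sketch of the pattern-matching idea on which ELIZA-style programs rest. It is written in Python rather than in POP-11 (the language introduced in the chapter appendices), and its patterns, reply templates and word-swapping table are invented purely for illustration; they are not Weizenbaum's rules, nor the ones used by the Eliza described in the rest of this chapter.

import random
import re

# Swap first-person words for second-person ones (and vice versa), so that
# a fragment such as "I am sure" can be echoed back as "you are sure".
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "i", "your": "my",
}

# Each rule pairs a pattern with one or more canned reply templates;
# "{0}" is filled in with the reflected text captured by the pattern.
RULES = [
    (re.compile(r"i am (.*)", re.I),
     ["Do you think you are {0}?", "How long have you been {0}?"]),
    (re.compile(r"i think (.*)", re.I),
     ["Do you really think {0}?"]),
    (re.compile(r"(.*)computer(.*)", re.I),
     ["Do machines worry you?"]),
    (re.compile(r"(.*)", re.I),                      # catch-all rule
     ["Please go on.", "Does anyone else have that problem?"]),
]

def reflect(fragment):
    # Replace each word by its opposite-person counterpart, if it has one.
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(sentence):
    # Try the rules in order and answer with a template from the first match.
    stripped = sentence.strip().rstrip(".!?")
    for pattern, templates in RULES:
        match = pattern.match(stripped)
        if match:
            return random.choice(templates).format(reflect(match.group(1)))

print(respond("I am certain"))          # e.g. Do you think you are certain?
print(respond("I think I am unique"))   # Do you really think you are unique?

Even a toy version like this shows where the apparent insight comes from: every reply is a canned template triggered by a surface pattern in the user's last sentence, together with a crude word-for-word swap of `I' and `you'. That same swap, applied blindly, is what produces lapses such as `EXPLAIN TO I' in the conversation above.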