
Consciousness and the Puzzle of Other Minds

Consider again the case of the pinched arm. The child whose arm has been pinched will have a familiar enough mental state. We can use the example to construct a venerable old philosophical puzzle -- the problem of `Other Minds' (see the editor's introduction to Chappell, 1962). We are unable to observe the child's pain directly, although we can observe all the outward behavioural manifestations and, with the right equipment, various related physiological changes. So we cannot be absolutely certain that the child really does have that inner, subjective feeling of pain -- as certain, that is, as we can be of the existence of our own pain. This is what is meant by saying that the experience of pain is `private', whereas the outward manifestations are `public'.

We have a similar problem, it would appear, with all mental states of people other than ourselves. Only my own mental states, it seems, are directly observable by me. But then how can I possibly be sure that there are any other minds besides my own? I cannot simply infer it from the fact that the outward signs are the same in my own case and in that of others, for this begs the very question at issue -- namely, whether I have any right to take it for granted, as I do, that other human beings are like me in having both the outward manifestations and the inner subjective states. Maybe, on the contrary, I am a very special human being (perhaps unique, or perhaps one of a small minority), and most other people do not have any such inner stream of conscious experiences at all.

There are many ways of dealing with this puzzle, and we shall not go into them here. (See Chappell, 1962, and Shaffer, 1968, for representative discussions.) Let us limit ourselves to one underlying difficulty. Suppose we change the initial example upon which the puzzle is constructed. Imagine this time, not a child whose arm is being pinched, but someone sitting across the table playing chess with you. Once again we can suppose that there are various externally observable processes -- physical movements of various sorts, and no doubt certain characteristic physiological processes. Again, too, there would seem to be various internal processes. But here it is not at all clear that the sorts of `inner' processes of thinking attributable to a person playing chess are so very different from the kinds of `inner' processes attributable to a computer running a good chess-playing program. Of course there are various characteristic sensations -- a certain quivering feeling in the pit of one's stomach, for instance -- which may occur whenever you play chess. But these are not an inseparable part of the cognitive process of chess playing. If you lacked such experiences or sensations, but still played a good game of chess, we would still allow that you were genuinely playing the game, and indeed that you were playing it `with your mind'. In the case of `chess'-type mental states, all that seems essential is that the person has various sorts of internal representations and carries out various cognitive processes, and these appear far closer than sensations like pain to the sorts of processes we attribute to a computer chess program. So the puzzle about Other Minds is much harder to launch from examples like chess playing than from examples like arm pinching.

Suppose I consider my chess opponent and wonder whether there really are chess-like thoughts in her mind underlying the external appearances. If she is playing chess in the normal way (that is, assuming she is not under post-hypnotic suggestion, using a hidden auto-cue, etc.), could such a possibility be coherently entertained? She may lack the usual characteristic (but inessential) sensations, but can she fail to be performing the sorts of cognitive operations which are necessary ingredients of normal chess play? Can she fail, that is, to be doing things like considering alternative possible moves, reviewing different strategies, making inferences about her opponent's likely moves, and so on? (A sketch of how a program carries out such operations follows below.) All this renders more plausible the suggestion made earlier that there are two rather different classes of mental processes or mental states: experiential processes (like pains) and cognitive processes (thoughts about chess, etc.); and that perhaps computer systems are at least capable of having genuine mental states of the latter kind, if not the former.
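To make the comparison concrete, here is a minimal sketch, not from the original text and purely illustrative, of the kind of `inner' processing a game-playing program performs: enumerating alternative possible moves, evaluating the resulting positions, and anticipating the opponent's best replies. A full chess engine would be far too long to show, so a toy game (single-heap Nim, in which players alternately remove one to three counters and whoever takes the last counter wins) stands in for chess; the minimax procedure shown is the standard technique on which classical chess programs are built.

    # Single-heap Nim stands in for chess: players alternately remove 1, 2
    # or 3 counters, and whoever takes the last counter wins the game.

    def legal_moves(heap):
        """The alternative possible moves open to the player to move."""
        return [n for n in (1, 2, 3) if n <= heap]

    def minimax(heap, my_turn):
        """Value of the position for me: +1 if I can force a win, -1 if not."""
        if heap == 0:
            # The previous player took the last counter and has won, so
            # if it is now "my turn" then I have lost.
            return -1 if my_turn else +1
        values = [minimax(heap - n, not my_turn) for n in legal_moves(heap)]
        # On my turn I consider every alternative and take the best for me;
        # on her turn I infer that she will choose what is worst for me.
        return max(values) if my_turn else min(values)

    def best_move(heap):
        """Choose the move whose resulting position is rated highest."""
        return max(legal_moves(heap), key=lambda n: minimax(heap - n, False))

    print(best_move(5))   # prints 1: taking one counter leaves the opponent
                          # facing 4 counters, a losing position for her

The point of the sketch is only that `considering alternative possible moves' and `making inferences about the opponent's likely moves' have straightforward computational analogues, which is all the argument above requires; nothing in it settles whether such a program has experiences.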

I said `perhaps'. All we are doing here is removing one obstacle to admitting computers with minds. There is an alternative kind of response, of course, which is to say that computers could, after all, have conscious mental experiences. It is not clear whether this has been seriously claimed by anyone. (For an interesting discussion of this issue see Dennett, 1978a.) But the claim has certainly been made that the notion of conscious experience is so difficult to pin down that, in the end, the issue of whether or not computers could have conscious experiences simply becomes a practical, or ethical, issue, hinging on how we decide to treat them. What has been argued is that if it became a widespread practice to treat certain kinds of extremely sophisticated machines as conscious beings, because of the richness and depth of their behaviour, it would then not make any sense to ask the further question ``But are they really conscious?'' This, the argument goes, would be like asking whether my hand is occupying the very same point in space now as it was five minutes ago, while refusing to accept any answer relativized to a given inertial frame, even though relativity theory tells us that only such frame-relative answers make sense (Sloman, 1986).



Cogsweb Project: luisgh@cogs.susx.ac.uk