
It is well-nigh impossible to disentangle such a system’s response to the outside world from its own self-involved response, for the tiniest external perturbation will trigger a myriad tiny interconnected events, and a cascade will ensue. If you think of this as the system’s “perception” of input, then clearly its own state is also “perceived” in a similar way. Self-perception cannot be disentangled from perception.

The existence of a higher-level way of looking at such a system is not a foregone conclusion; that is, there is no guarantee that we could decode the chime state into a consistent set of English sentences expressing the beliefs of the system, including, for instance, the set of rules of chess (as well as how to play a good game of chess!). However, when systems like this have evolved by means of natural selection, there will be a reason that some survived while most others failed: a meaningful internal organization allowing the system to take advantage of its environment and to control it, at least partially.

In the wind chime, the hypothetical conscious ant colony, and the brain, that organization is stratified. The levels in the wind chime corresponded to the different levels of branches dangling from other branches, with the spatial disposition of the highest branches representing the most compact and abstract summary of the global qualities of the chime state, and the disposition of the many thousands (or millions?) of quivering individual tinklers giving a totally unsummarized, unintuitive, but concrete and local description of the chime state. In the ant colony, there were ants, teams, signals at various levels, and finally the caste distribution or “colony state”—again the most incisive yet abstract view of the colony. As Achilles marveled, it is so abstract that the ants themselves are never mentioned! In the brain, we just do not know how to find the high-level structures that would provide a readout in English of the beliefs stored in the brain. Or rather, we do—we just ask the brain’s owner to tell us what he or she believes! But we have no way of physically determining where or how beliefs are coded.[16]

In our three systems, various semiautonomous subsystems exist, each of which represents a concept, and various input stimuli can awaken certain concepts, or symbols. Note that in this view there is no “inner eye” that watches all the activity and “feels” the system; instead the system’s state itself represents the feelings. The legendary “little person” who would play that role would have to have yet a smaller “inner eye,” after all, and that would lead to further little people and ever-tinier “inner eyes”—in short, to infinite regress of the worst and silliest kind. In this kind of system, contrariwise, the self-awareness comes from the system’s intricately intertwined responses to both external and internal stimuli. This kind of pattern illustrates a general thesis: “Mind is a pattern perceived by a mind.” This is perhaps circular, but it is neither vicious nor paradoxical.

The closest one could come to having a “little person” or an “inner eye” that perceives the brain’s activity would be in the self-symbol—a complex subsystem that is a model of the full system. But the self-symbol does not perceive by having its own repertoire of smaller symbols (including its own self-symbol—an obvious invitation to infinite regress). Rather, the self-symbol’s joint activation with ordinary (nonreflexive) symbols constitutes the system’s perception. Perception resides at the level of the full system, not at the level of the self-symbol. If you want to say that the self-symbol perceives something, it is only in the sense that a male moth perceives a female moth, or in the sense that your brain perceives your heart rate—at a level of microscopic intercellular chemical messages.

The last point to be made here is that the brain needs this multileveled structure because its mechanisms must be extraordinarily flexible in order to cope with an unpredictable, dynamic world. Rigid programs will go extinct rapidly. A strategy exclusively for hunting dinosaurs will be no good when it comes to hunting woolly mammoths, and much less good when it comes to tending domestic animals or commuting to work on the subway. An intelligent system must be able to reconfigure itself—to sit back, assess the situation, and regroup—in rather deep ways; such flexibility requires only the most abstract kinds of mechanisms to remain unchanged. A many-layered system can have programs tailored to very specific needs (e.g., programs for chess playing, woolly-mammoth hunting, and so on) at its most superficial level, and progressively more abstract programs at deeper layers, thus getting the best of both worlds. Examples of the deeper type of program would be ones for recognizing patterns; for evaluating conflicting pieces of evidence; for deciding which, among rival subsystems clamoring for attention, should get higher priority; for deciding how to label the currently perceived situation for possible retrieval on future occasions that may be similar; for deciding whether two concepts really are or are not analogous; and so on.

Further description of this kind of system would carry us deep into the philosophical and technical territory of cognitive science, and we will not go that far. Instead, we refer readers to the “Further Readings” section for discussions of the strategies of knowledge representation in humans and in programs. In particular, Aaron Sloman’s book The Computer Revolution in Philosophy goes into great detail on these issues.

D.R.H.

12

Arnold Zuboff

The Story of a Brain

I

Once upon a time, a kind young man who enjoyed many friends and great wealth learned that a horrible rot was overtaking all of his body but his nervous system. He loved life; he loved having experiences. Therefore he was intensely interested when scientist friends of amazing abilities proposed the following:

“We shall take the brain from your poor rotting body and keep it healthy in a special nutrient bath. We shall have it connected to a machine that is capable of inducing in it any pattern at all of neural firings and is thereby capable of bringing about for you any sort of total experience that it is possible for the activity of your nervous system to cause or to be.”

The reason for this last disjunction of the verbs to cause and to be was that, although all these scientists were convinced of a general theory that they called “the neural theory of experience,” they disagreed on the specific formulation of this theory. They all knew of countless instances in which it was just obvious that the state of the brain, the pattern of its activity, somehow had made for a man’s experiencing this rather than that. It seemed reasonable to them all that ultimately what decisively controlled any particular experience of a man (whether it existed and what it was like) was the state of his nervous system, and more specifically that of those areas of the brain that careful research had discovered to be involved in the various aspects of consciousness. This conviction was what had prompted their proposal to their young friend. That they disagreed about whether an experience simply consisted in or else was caused by neural activity was irrelevant to their belief that as long as their friend’s brain was alive and functioning under their control, they could keep him having his beloved experience indefinitely, just as though he were walking about and getting himself into the various situations that would in a more natural way have stimulated each of those patterns of neural firings that they would bring about artificially. If he were actually to have gazed through a hole in a snow-covered frozen pond, for instance, the physical reality there would have caused him to experience what Thoreau described: “the quiet parlor of the fishes, pervaded by a softened light as through a window of ground glass, with its bright sanded floor the same as in summer.” The brain lying in its bath, stripped of its body and far from the pond, if it were made to behave precisely as it naturally would under such pond-hole circumstances, would have for the young man that very same experience.


16. See selection 25, “An Epistemological Nightmare,” for a story featuring a machine that can outdo a person at “brain reading.”