Von Neumann starts his discussion by articulating the similarities and differences between the computer and the human brain. Given when he wrote this manuscript, it is remarkably accurate. He noted that the output of neurons was digital: an axon either fired or it didn't. This was far from obvious at the time, since the output could have been an analog signal. The processing in the dendrites leading into a neuron and in the soma (the neuron's cell body), however, was analog, and he described its calculation as a weighted sum of inputs compared against a threshold. This model of how neurons work led to the field of connectionism, which built systems based on this neuron model in both hardware and software. (As I described in the previous chapter, the first such connectionist system was created by Frank Rosenblatt as a software program on an IBM 704 computer at Cornell in 1957, immediately after von Neumann's draft lectures became available.) We now have more sophisticated models of how neurons combine inputs, but the essential idea of analog processing of dendrite inputs using neurotransmitter concentrations has remained valid.
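The weighted-sum-with-threshold model von Neumann described can be stated in a few lines of code. This is only an illustrative sketch: the particular weights, inputs, and threshold below are made up for the example, not drawn from any biological data.

```python
# A minimal sketch of the weighted-sum-with-threshold neuron model.
# The analog step is the weighted sum; the digital step is the
# all-or-nothing firing decision. All values here are illustrative.

def threshold_neuron(inputs, weights, threshold):
    """Return 1 (fire) if the weighted sum of inputs reaches the threshold."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Two excitatory inputs together exceed the threshold and the neuron fires;
# an inhibitory input (negative weight) suppresses firing.
print(threshold_neuron([1, 1, 0], [0.6, 0.6, -1.0], 1.0))  # 1
print(threshold_neuron([1, 0, 1], [0.6, 0.6, -1.0], 1.0))  # 0
```

The negative weight plays the role of an inhibitory connection, a feature already present in early connectionist models of this kind.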
Von Neumann applied the concept of the universality of computation to conclude that even though the architecture and building blocks appear to be radically different between brain and computer, we can nonetheless conclude that a von Neumann machine can simulate the processing in a brain. The converse does not hold, however, because the brain is not a von Neumann machine and does not have a stored program as such (although we can simulate a very simple Turing machine in our heads). Its algorithms, or methods, are implicit in its structure. Von Neumann correctly concludes that neurons can learn patterns from their inputs, which we have now established are encoded in part in the strengths of synaptic connections. What was not known in von Neumann's time is that learning also takes place through the creation and destruction of connections between neurons.
Von Neumann presciently notes that the speed of neural processing is extremely slow, on the order of a hundred calculations per second, but that the brain compensates for this through massive parallel processing, another unobvious and key insight. Von Neumann argued that each one of the brain's 10^10 neurons (a tally that itself was reasonably accurate; estimates today are between 10^10 and 10^11) was processing at the same time. In fact, each of the connections (with an average of about 10^3 to 10^4 connections per neuron) is computing simultaneously.
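Multiplying these figures out shows how massive parallelism compensates for slow components. This is only back-of-envelope arithmetic using the order-of-magnitude values quoted above:

```python
# Rough estimate of the brain's aggregate throughput if every connection
# computes simultaneously. All figures are order-of-magnitude values
# from the discussion above, not precise measurements.

neurons = 1e10            # ~10^10 neurons (low end of modern estimates)
ops_per_second = 1e2      # ~100 calculations per neuron per second
connections = (1e3, 1e4)  # ~10^3 to 10^4 connections per neuron

low = neurons * ops_per_second * connections[0]   # 10^15 connection-ops/sec
high = neurons * ops_per_second * connections[1]  # 10^16 connection-ops/sec
print(f"{low:.0e} to {high:.0e} operations per second")
```

The high end of this range, 10^16 operations per second, is the same conservative figure cited later for the speed needed to functionally simulate the brain.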
Von Neumann’s estimates and his descriptions of neural processing are remarkable, given the primitive state of neuroscience at the time. One aspect of his work that I do disagree with, however, is his assessment of the brain’s memory capacity. He assumes that the brain remembers every input for its entire life. Von Neumann assumes an average life span of 60 years, or about 2 × 10^9 seconds. With about 14 inputs to each neuron per second (which is actually low by at least three orders of magnitude) and with 10^10 neurons, he arrives at an estimate of about 10^20 bits for the brain’s memory capacity. The reality, as I have noted earlier, is that we remember only a very small fraction of our thoughts and experiences, and even these memories are not stored as bit patterns at a low level (such as a video image), but rather as sequences of higher-level patterns.
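Von Neumann's estimate is a straightforward multiplication, which can be checked directly. The sketch below simply reproduces the arithmetic from the figures stated above:

```python
# Reproducing von Neumann's memory-capacity estimate: one bit per input,
# per neuron, per second, over a 60-year life span.

seconds = 60 * 365 * 24 * 3600  # ~60 years ~= 1.9e9 seconds
inputs_per_neuron = 14          # inputs per neuron per second
neurons = 1e10                  # 10^10 neurons

bits = seconds * inputs_per_neuron * neurons
print(f"{bits:.1e} bits")       # on the order of 10^20 bits
```

The product comes out to roughly 2.6 × 10^20 bits, which von Neumann rounds to about 10^20.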
As von Neumann describes each mechanism in the brain, he shows how a modern computer could accomplish the same thing, despite their apparent differences. The brain’s analog mechanisms can be simulated through digital ones because digital computation can emulate analog values to any desired degree of precision (and the precision of analog information in the brain is quite low). The brain’s massive parallelism can be simulated as well, given the significant speed advantage of computers in serial computation (an advantage that has vastly expanded over time). In addition, computers themselves can use parallel processing, by running many von Neumann machines in parallel, which is exactly how supercomputers work today.
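The claim that digital computation can emulate analog values to any desired precision is easy to demonstrate: each additional bit roughly halves the quantization error. A minimal sketch, with an arbitrary example value:

```python
# Sketch: approximating an analog value digitally. With 2**bits discrete
# levels, the worst-case quantization error is 1 / 2**(bits + 1), so the
# error shrinks by about half for each added bit. The value 0.3 is arbitrary.

def quantize(x, bits):
    """Round x in [0, 1) to the nearest of 2**bits evenly spaced levels."""
    levels = 2 ** bits
    return round(x * levels) / levels

x = 0.3
for bits in (4, 8, 16):
    err = abs(quantize(x, bits) - x)
    print(f"{bits:2d} bits: error {err:.2e}")
```

Given the low precision of analog information in the brain, even a modest number of bits would suffice for the emulation von Neumann describes.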
Von Neumann concludes that the brain’s methods cannot involve lengthy sequential algorithms, given how quickly humans are able to make decisions despite the very slow computational speed of neurons. When a third baseman fields a ball and decides to throw to first rather than to second base, he makes this decision in a fraction of a second, which is only enough time for each neuron to go through a handful of cycles. Von Neumann concludes correctly that the brain’s remarkable powers come from all of its roughly 100 billion neurons being able to process information simultaneously. As I have noted, the visual cortex makes sophisticated visual judgments in only three or four neural cycles.
There is considerable plasticity in the brain, which enables us to learn. But there is far greater plasticity in a computer, which can completely restructure its methods by changing its software. Thus, in that respect, a computer will be able to emulate the brain, but the converse is not the case.
When von Neumann compared the capacity of the brain’s massively parallel organization to the (few) computers of his time, it was clear that the brain had far greater memory and speed. By now the first supercomputer to achieve specifications matching some of the more conservative estimates of the speed required to functionally simulate the human brain (about 10^16 operations per second) has been built.5 (I estimate that this level of computation will cost $1,000 by the early 2020s.) With regard to memory we are even closer. Even though it was remarkably early in the history of the computer when his manuscript was written, von Neumann nonetheless had confidence that both the hardware and software of human intelligence would ultimately fall into place, which was his motivation for having prepared these lectures.
Von Neumann was deeply aware of the increasing pace of progress and its profound implications for humanity’s future. A year after his death in 1957, fellow mathematician Stan Ulam quoted him as having said in the early 1950s that “the ever accelerating progress of technology and changes in the mode of human life give the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.” This is the first known use of the word “singularity” in the context of human technological history.
Von Neumann’s fundamental insight was that there is an essential equivalence between a computer and the brain. Note that the emotional intelligence of a biological human is part of its intelligence. If von Neumann’s insight is correct, and if one accepts my own leap of faith that a nonbiological entity that convincingly re-creates the intelligence (emotional and otherwise) of a biological human is conscious (see the next chapter), then one would have to conclude that there is an essential equivalence between a computer—with the right software—and a (conscious) mind. So is von Neumann correct?
Most computers today are entirely digital, whereas the human brain combines digital and analog methods. But analog methods are easily and routinely re-created by digital ones to any desired level of accuracy. American computer scientist Carver Mead (born in 1934) has shown that we can directly emulate the brain’s analog methods in silicon, which he has demonstrated with what he calls “neuromorphic” chips.6 Mead has demonstrated how this approach can be thousands of times more efficient than digitally emulating analog methods. As we codify the massively repeated neocortical algorithm, it will make sense to use Mead’s approach. The IBM Cognitive Computing Group, led by Dharmendra Modha, has introduced chips that emulate neurons and their connections, including the ability to form new connections.7 Called “SyNAPSE,” one of the chips provides a direct simulation of 256 neurons with about a quarter million synaptic connections. The goal of the project is to create a simulated neocortex with 10 billion neurons and 100 trillion connections—close to a human brain—that uses only one kilowatt of power.