They conferred in dismay on what they had found. “We’d be crippling ourselves!” Tropile cried.
“I don’t know about that,” said Mercedes van Dellen unexpectedly. “The poor child never had a chance. His mother must have left him. Did you ever hear such longing and unhappiness?”
Tropile, like all Lucifers, Captain Flandrays, Byrons, Duquesnes and other sardonic folk, was convinced that there was no sorrow like unto his sorrow; he thought surely that Mercedes van Dellen might have known as much. He fell sulkily silent.
Kim Seong cackled that it was all one to her; the difference between an idiot and the wisest man who ever was did not make any difference to anything as far as she could see. She enjoyed their consternation.
Django Tembo decided for them. They invited Willy to join them not with words and syllogisms but by opening like a flower and letting him be the butterfly. His shy animal soul flowed into theirs and they were richer by it. He had been an animal, with an animal’s powers of joy and sorrow, undiluted by apprehension for the one or philosophical consolations for the other.
Spyros Gulbenkian later said wistfully: “Perhaps being young was not so bad after all.” This was during one of the brief spells in which they disengaged and were themselves apart again. Such spells became briefer and more infrequent.
Chiefly the Snowflake floated in its tank and thought in its own mode, in eight-part counterpoint rather than in human melodic lines. Sometimes it thought in chord progressions, battering at problems and questions until they yielded.
Ceaselessly it did its work for the Pyramids; no second went past without the sixteen hands’ clicking manipulation of their switches. Ceaselessly it did its own work of analysis and planning. The difference was that for the Pyramids it did its work with eight times Rashevsky’s Number of switchings; for itself it worked with Rashevsky’s Number of switchings to the eighth power.
In human transcript, the Snowflake began by exhausting all its memories and arranging them for ready access—the ancient dream accomplished at last. Did an off-color rice grain in three-year-old Kim Seong’s Korean bowl fit into a problem? It was there. Must Corso Navarone remember the serial number of a bicycle that whizzed past him in Milan one Friday when he was twelve? He remembered. If a persuasive shrug of Spyros Gulbenkian thirty years ago in Paris was of use, they had it when it was needed.
The Snowflake decided: “I am unfulfilled. Sex does not matter, for immortality is possible to me. Love does not matter, for I have more than love. What matters is increasing my store of sense-data, and taking readings off scales.”
But when this was done the Snowflake was not satisfied. There was something about its memories, taken as a whole, that totaled more than the sum of each of them taken individually. There was a collective memory of some kind.
For some reason, it seemed rather urgent.
So the Snowflake put its collective mind to work, and after a time was able to remember what that collective memory was about.
Oh, yes, humanity.
The human race was in trouble.
At first the Snowflake (trying one thing, then another) thought that bringing human beings to the binary planet would, at least, rescue those particular individuals from their trouble. The Snowflake did that for a while, since it was so easy, relatively speaking, to redefine “ripeness” for the Pyramid on Mount Everest. (The Mount Everest Pyramid, of course, did not question its directives. The Pyramids collected ripe Components when available, according to the ancient maxim of “Take a chicken when you can.” It never occurred to any Pyramid that its Components might stockpile Components.)
Then the Snowflake discovered there were other possibilities.
Among the million million systems that filled the binary planet were a great number whose functions were construction, maintenance and repair. All of these were semi-intelligent and semi-autonomous, which is to say they were directed by controlling mechanisms—by other Components.
What would happen, the Snowflake wondered, if they were to try to awaken some of these others?
When they tried it, it was very nearly a disaster. They chose to begin with a tunnel-digging system that had not been used for some forty thousand years. Its Components were not at all human. What made it really bad was that they were soft-bodied slugs no longer than a thumb; on their home world they had clung to the underside of great jungle leaves, and their worst enemy was a primate-looking sort of arboreal mollusc which hunted them out and ate them. When they realized what sort of creature’s mind was touching theirs they went mad with fear. The drilling machine corkscrewed and curlicued shafts through a dozen major power centers. Battalions of repair systems raced into action to deal with the damage. There was a very high probability that a Pyramid itself might before long come sailing by, to see what was wrong.
Fortunately, Tropile’s Snowflake still had firm control over Component selection and replacement. The Snowflake sadly dumped the mad Components into the recycling hoppers, and took thought for the future.
It was not a good idea to awaken Components at random.
Therefore it would be best merely to issue commands.
There was a way of doing that without detection. It relied on the redundancy of Pyramid-system commands. As a fail-safe measure, every click of their controls was repeated.
Thereafter the Snowflake began to malfunction, as far as the Pyramids were concerned, on a level below detectability. Man’s idiot servant the thermostat is not, except in laboratories, set to perform with knife-edge precision; there is always some tolerance. In an automobile of the Car Age the radiator thermostat was doing well if it opened and closed within a range of ten degrees. Home oil-burner thermostats were more precise, operating to plus or minus a degree, but what is a degree? It is ten thousand ten-thousandths of a degree, a million millionths of a degree. There is always room for improvement, so much room that no engineer bothers beyond the area significant to him.
The Snowflake was allowed one false transmission per thousand-odd clicks; the process in which it was engaged would not suffer from such a tolerance. There is no perfection. There is no sense in doing the work that ensures one thousand clicks out of one thousand are dead-sure accurate—except when your thermostat is part Wolf.
In the Pyramids’ work upon which the Snowflake was engaged, it was now assigned to send messages to automatic machinery throughout the binary planet; the project was building propulsion units from scratch, procurement, logistics and all. It started by scouring the planet for surplus material; it was continuously scrounging. A zinc torus in live storage would be scrutinized; it would be determined that it was last used during the Magellanic Raid as a weapons component, that in this section of the Galaxy there were no life-forms susceptible to that type of weapon (it produced a sort of marbled fog whose sight was death to the Color Sculptors of the Magellanic Cloud). The torus floated then, at the right time, to the right place to be alloyed into material for the emitter of an ion gun which would be sub-assembled later, and still later go into the master assembly, and still later take its proper place for maximum push when the binary next corrected its course towards more Components.
One click per thousand was false. This was a low tolerance, and sensibly set that way: the Snowflake’s job was of such versatility that errors could not possibly be cumulative. It switched from this task to that continuously. Had the false clicks been random, they would have caused the zinc torus to wobble on its way to smelting, or recognizably wrong information about its function to be supplied, applicable to dielectrics instead of conductors, say, which would have given the Snowflake pause and made it ask again.