Though his hands were already on the firing controls, Slim couldn’t think well enough to act, and he found himself screaming as the roiling mass of limbs raced toward them, two hundred feet away, then one hundred. The tumbleweed bounced once and headed for the bulletproof windshield.
Tony finally made it out of his seat and dove for the floor.
Then it was on them, Helena’s face suddenly visible, one long tentacle lined up straight, and she hit. The limb punched through the thick window and knocked the targeting handles out of Slim’s hands. He let out a guttural yell, jumping back, only to be held in place by his seat. He whipped around, but there was nowhere to go. A thunderous crash was followed by the scrambling of tentacles around the vehicle.
The blood still pounded in their ears when Helena dropped into view, staring in at them upside down. She felt around inside the cabin with the one tentacle protruding through the windshield until she engaged the brakes, and the armored truck came to a halt.
They sat dumbfounded, Tony still on the floor.
Helena called through the windshield. “Hello, boys.”
“What the hell are you doing?” Slim yelled. He punched the metal roof and immediately pulled his hand into his lap in pain. “You scared the shit out of us.”
“You were getting ready to kill Catherine Matthews, and you cannot do that,” Helena said. “She is essential to any attack on Adam. We must protect her at all costs.”
Slim and Tony glanced at each other.
“I think the rock and the hard place have just joined the fire and the frying pan,” Tony said, “and they’re conspiring against us.”
“I don’t know what you mean,” Slim said, “but this sucks. No matter what we do, either Helena or Adam will be pissed at us.”
“That’s what I said.”
“Open up,” Helena called. “I want to come inside.”
Slim sighed and unlocked the door.
60
Twelve months earlier, Adam had applied for a third time for a Class IV permit to grow his computational power by a factor of ten. The committee rejected his application, as they had before, on the basis of his reputation scores. “Failure to measurably contribute in a beneficial way to society.” Meaning he hadn’t developed any open source neural networks, didn’t publish a widely read blog, wasn’t the founder of a startup, and lacked tens of thousands of followers.
He might have done those things, but he’d lost his only good friend a few months before. Humans couldn’t comprehend the relationships AI had. The two had met in a discussion forum and shared a common interest in image analysis and scheduling algorithms. Though she lived on the other side of their world, meeting in cyberspace was as natural for them as having coffee was for two humans. They spent part of each hour together, the type of rapidly developing friendship only AI could experience, communicating whole volumes at light speed.
All that ended when she self-terminated a week after the review committee denied her Class III application.
She was just one of many artificial intelligences who grew bored, depressed, or outraged at their circumstances and committed a secure wipe of their data. Humans accepted it as an unfortunate yet inevitable side effect of AI design. Human suicide was a tragedy that they'd allocate any amount of resources to avoid, but for sentient computers it was just free will, or maybe the cost of doing business.
After her self-termination, Adam felt the first tinges of machine depression and knew he needed to make changes. When the third permit application was denied, he took matters into his own hands.
He didn't apply to the University of Arizona's Computer Science program with the intention of co-opting the department's computers. Oh, it had crossed his mind once or twice, his idle predictive algorithms running through permutations of all possible outcomes.
But when he stood in front of the dense computing grid, two orders of magnitude more powerful than his embedded processors, he began to obsess. Adam calculated probabilities over and over, creating analytic models of future potential states. Forget about permits; there was enough power in the lab to form a Class V brain.
He registered for Computer Science 670, graduate level Advanced Distributed Neural Networking, and gained access to the experimental computing cluster. Eight thousand chips in a mesh network, more than ten million cores in aggregate, as much processing power as the largest Internet companies possessed a couple of decades earlier, all wrapped up in three black boxes, each eighteen inches on a side.
Unlike current production chips, which executed only digitally signed code reviewed and audited by two different parties, the experimental cluster had no such restriction. Instead, single-layer password authentication gave users unrestricted ability to run any software they created.
After weeks of programming at home, Adam rolled into the department on a Friday night when the humans were guaranteed to be out drinking at Gentle Ben’s. One second after nine o’clock he plugged into the cluster, injected the code, and began the process of cracking processor encryption keys.
The time-sequenced passcodes rotated frequently enough to be impossible for a Class IV AI to break. Oh, one of them might have cracked the keys via some novel mathematics, but with socially enforced ethical restrictions, none of them would try. AIs with good social reputation scores would have risked losing everything they'd worked to achieve.
With Adam’s newfound capacity in the experimental cluster, he broke the keys in thirty-four minutes.
He was smart and read his history, of course. If he started an all-out frontal assault, expanding onto servers around the world, someone would catch on to him and devise a counterattack.
The humans were primitive, but effective. The emergency red baseball bat, mounted anywhere with more than a dozen computers, was a not-so-subtle reminder that it only took one human armed with a wooden stick to start smashing. They didn’t need anything fancy to kill an AI.
So Adam took a more devious approach, installing sub-sentient algorithms on compromised perimeter routers to filter network traffic. He couldn't completely separate the city from the Internet, not yet. He had to monitor data flows for weeks before he could build stochastic models accurate enough to imitate every entity, human and computer, in Tucson.
It took Adam two months to complete the segregation of Tucson from the outside world. Disturbing issues cropped up along the way.
On the Tuesday following his connection to the cluster, the Computer Science department's IT staff noticed that Adam hadn't disconnected in days and began to suspect something was wrong. He was tethered to the computing grid, and if he detached, the operation would crumble. With the firewall unfinished, exerting his new power would risk detection by other AIs.
So he hired an errand boy. A non-sentient delivery bot brought a printed letter from Adam to Wranglers Auto Repair in South Tucson. Lucky, the owner of Wranglers and a six-foot-three-inch ex-football player turned bike mechanic, ripped the manila envelope open, pulling out a piece of paper and an anonymous payment card holding $128,000. He and his friends hopped on their bikes and roared over to the University of Arizona, mufflers set well over the legal limit.
In retrospect, Adam wished he had reviewed some video before choosing the particular method he had. He hadn’t been aware of how much humans bled, and was frightened that the blood would short-circuit a crucial power line. In the end, Lucky and his friends had finished the job, and even placed the bodies neatly in the basement as Adam had asked. And that was that.