“No, we don’t,” Bhavin said, his thoughts momentarily lost in the storm.
There was a moment of dead time. Bhavin was so close to the screen that his nose was almost touching it. Then he said, “Something happened, and now I’m concerned. I’m very concerned.”
“Sir?”
“That’s why I didn’t meet with you until now. I didn’t want to bring you in before I was certain, because this is something more than a side project for a philanthropist billionaire. This requires another level of trust, and another level of loyalty. It also demands a much higher level of commitment.”
Axel didn’t know what could be a higher level of commitment than eliminating nuclear insurgents, but he didn’t question Bhavin. The last thing he wanted to do was interrupt his stream of consciousness now that he was explaining himself.
Bhavin turned back to face Axel. His earth-brown eyes were ebullient, with gold flecks visible in the irises. “Most of what you said I agree with. Nuclear threats are worthy targets, but nuclear programs require fissile material, sophisticated missile or bomb technology, and a lot of security resources. In other words, they require a big footprint, and because of that they’re fairly easy to spot, and not difficult to dismantle, as you’ve shown. And if one or two bombs go off, a few million people die, and that’s that.”
Axel couldn’t help raising his eyebrows at Bhavin’s dismissal of a “few million people” as unimportant for the second time in the meeting.
Bhavin continued, “Climate change is worsening, sure. Measures are being taken and yes, they’re not enough, but we could potentially turn things around with the right carbon-scrubbing technology. It will probably cause a fair amount of suffering and dislocation.” Bhavin shrugged. “And yet, even in the worst-case scenario, it will not be thoroughly apocalyptic.
“Then there is synthetic biology. Here there is indeed a great potential for deadly viral or bacterial agents that could go global, but there are limitations on the science, and the timeline to produce an effective pathogen is quite long. Once it is released we can always implement quarantine measures, or simply nuke the area.” Bhavin paused again and said, “So although the downside risk is greater, I think we would recover from any calamity there as well.”
Axel’s eyes remained wide open. Again Bhavin was making flippant references to massive losses of life.
“I’m concerned about something worse. Exponentially worse.”
Bhavin slid a phone across the desk to Axel.
Axel picked it up. It wasn’t a brand he had seen before. He could see it had the typical slew of apps and navigation buttons.
“What am I looking at?” Axel asked.
“Wog is on there. It’s a new phone operating system. There’s also a software avatar that’s stacked on top of it.”
“Forgive me, sir, I’m not sure where this is going. Are you concerned about this software in some way?”
Bhavin sighed and then put his head in his hands. When he raised his head back up, his communion seemed to give him new energy. His words were slow and staid, and his eyes were more focused and unrelenting. “I know you spoke with Rawlings. I’m glad you did. You know about the open source Wog software, correct?”
“Yes, sir. Bytomic launched it recently. Rawlings said it was something you needed to compete with, which is why Fortient bought Catalytic.”
Bhavin frowned. “That’s not entirely true. It’s not Rawlings’s fault, though. He’s doing his job, what I told him to do, because you weren’t in the know yet. You see, Wog is essentially an open-source form of artificial intelligence with a broad suite of deep learning capabilities. It’s able to learn a variety of different domains and analytical methods using neural and evolutionary networks. Bytomic has managed to crack some long-standing development challenges around planning, abstraction and reasoning needed to successfully develop artificial general intelligence, a more powerful and broader form of AI, something no one else had been able to do. It’s very effective. So effective, in fact, that we think it will be unstoppable.”
“I’m sorry, sir. Unstoppable?”
“It will have the capacity to be superintelligent, much smarter than a human being. If you program it as a weapon of mass destruction, it will be more successful at destroying its target than we could imagine. Even if you were to give it a benign objective it could be just as dangerous. You see, it will self-improve. It will replicate itself across networks. It will instrumentally leverage resources from anywhere it can, and it will do everything in its power to complete its objective, regardless of morality as we define it. We will be the biggest obstacle to meeting its objective, because we might try to shut it down or compete for the resources it needs. So… if it’s superintelligent, it will logically try to…” Bhavin made a cutting motion with his hand across his throat.
“Kill us,” Axel completed for him. As Bhavin was talking, Axel could feel something like a balloon deflating in his chest. It was the kind of feeling you had when you were the victim of a bad joke.
“Exactly,” Bhavin confirmed. “Now there are still one or two technical hurdles that need to be overcome, but we believe adept developers can figure these out in as little as a few months. Once these hurdles are overcome, almost anyone could access the open-source Wog code and work to unleash a superintelligent machine entity, one that could ultimately threaten the entire human race.”
When Axel took this job, he always knew there was a chance Bhavin would turn out to be some kind of crazy egomaniac. This strange turn might be leading to that unfortunate conclusion. “So am I to… seek out rogue implementations of this AI, this Wog, that you think might become superintelligent?” Axel had trouble saying it because it sounded so fantastical.
“Yes, and figure out how to remove the open-source version as soon as possible.”
Axel puzzled over Bhavin’s response. “You want me to remove all instances of Wog from Bytomic servers, so they can’t release it again? In other words, you want me to disable the main product line of your biggest competitor?”
Bhavin nodded slowly, his eyes unwavering. It would be highly criminal, not to mention virtually impossible. Was Bhavin so caught up in this bizarre fantasy that he didn’t see the obvious conflict of interest?
“But haven’t many people already downloaded it?” Axel asked, hoping the obvious logical flaw would convince Bhavin of the folly of the endeavor.
“Of course they have. That’s why as soon as it’s erased at Bytomic, we need to track down and erase every single copy that has ever been released, as well as all derivatives. We will give you cyber-tracking support, which may enable some remote destruction. I want every copy removed from circulation until we can figure out how to properly mitigate the risk of Wog, or anything like it.”
Axel’s hands were cradled in his lap, but they were gripping each other tightly. He forced himself to look at the situation objectively. There had been the occasional media blitz about AI safety risks, but those focused on job dislocations from automation or self-driving cars. Most of the eggheads citing AI safety risks were so lambasted by the media and big corporations that they were generally perceived as crackpots. All the major software companies had ethics boards, and that seemed to be enough for governments and public-safety officials.
In fact, it had been months since Axel had heard anyone even speak out about the risk of AI. Why would they? There were so many applications of AI that were useful across the automotive industry, high tech, health care, and energy. Here it looked as though Bhavin wanted to take steps to stop that progress.