She did not want him to be defined by the number on that piece of paper her mother kept hidden at the bottom of the box in the attic.

“They counted wrong, Dad,” Kyra said. “They missed one death.”

§

Kyra walked down the hall dejectedly. She had just finished her last interview of the day, with a hot Silicon Valley startup. She had been nervous and distracted and had flubbed the brainteaser. It had been a long day, and she hadn't gotten much sleep the night before.

She was almost at the elevator when she noticed an interview schedule posted on the door of the adjacent suite for a company named AWS Systems. It hadn't been completely filled; a few of the slots at the bottom were blank, which generally meant an undesirable company.

She took a closer look at the recruiting poster. They did something related to robotics. There were some shots of office buildings on a landscaped, modern campus. Bullet points listed competitive salary and benefits. Not flashy, but it seemed attractive enough. Why weren’t people interested?

Then she saw it: “Candidates need to pass screening for security clearance.” That would knock out many of her classmates who weren’t U.S. citizens. And it likely meant government contracts. Defense, probably. She shuddered. Her family had had enough of war.

She was about to walk away when her eyes fell on the last bullet point on the poster: “Relieve the effects of PTSD on our heroes.”

She wrote her name on one of the blank lines and sat down on the bench outside the door to wait.

§

“You have impressive credentials,” the man said, “the best I’ve seen all day, actually. I already know we’ll want to talk to you some more. Do you have any questions?”

This was what Kyra had been waiting for all along. “You’re building robotic systems to replace human-controlled drones, aren’t you? For the war.”

The recruiter smiled. “You think we’re Cyberdyne Systems?”

Kyra didn’t laugh. “My father was a drone operator.”

The man became serious. “I can’t reveal any classified information. So we have to speak only in hypotheticals. Hypothetically, there may be advantages to using autonomous robotic systems over human-operated machines. Robots.”

“Like what? It can’t be about safety. The drone operators are perfectly safe back here. You think machines will fight better?”

“No, we’re not interested in making ruthless killer robots. But we shouldn’t make people do the jobs that should be done by machines.”

Kyra’s heart beat faster. “Tell me more.”

“There are many reasons why a machine makes a better soldier than a human. A human operator has to make decisions based on very limited information: just what he can see from a video feed, sometimes alongside intelligence reports. Deciding whether to shoot when all you have to go on is the view from a shaking camera and confusing, contradictory intel is not the kind of thinking humans excel at. There’s too much room for error. An operator might hesitate too long and endanger an innocent, or he might be too quick on the trigger and violate the rules of engagement. Decisions by different operators would be based on hunches and emotions and at odds with each other. It’s inconsistent and inefficient. Machines can do better.”

Worst of all, Kyra thought, a human can be broken by the experience of having to decide.

“If we take these decisions away from people, make it so that individuals are out of the decision-making loop, the result should be less collateral damage and a more humane, more civilized form of warfare.”

But all Kyra could think was: No one would have to do what my father did.

§

The process of getting security clearance took a while. Kyra’s mother was surprised when Kyra called to tell her that government investigators might come to talk to her, and Kyra wasn’t sure how to explain why she had taken this job when there were much better offers from other places. So she just said, “This company helps veterans and soldiers.”

Her mother said, carefully, “Your father would be proud of you.”

Meanwhile, they assigned her to the civilian applications division, which made robots for factories and hospitals. Kyra worked hard and followed all the rules. She didn’t want to mess up before she got to do what she really wanted. She was good at her job, and she hoped they noticed.

Then one morning Dr. Stober, the head roboticist, called her to join him in a conference room.

Kyra’s heart was in her throat as she walked over. Was she going to be let go? Had they decided that she couldn’t be trusted because of what had happened to her father? That she might be emotionally unstable? She had always liked Dr. Stober, who seemed like a good mentor, but she had never worked with him closely.

“Welcome to the team,” said a smiling Dr. Stober. Besides Kyra, there were five other programmers in the room. “Your security clearance arrived this morning, and I knew I wanted you on this team right away. This is probably the most interesting project at the company right now.”

The other programmers smiled and clapped. Kyra grinned at each of them in turn as she shook their outstretched hands. They all had reputations as the stars in the company.

“You’re going to be working on the AW–1 Guardians, one of our classified projects.”

One of the other programmers, a young man named Alex, cut in: “These aren’t like the field transport mules and remote surveillance craft we already make. The Guardians are unmanned, autonomous flying vehicles about the size of a small truck, armed with machine guns and missiles.”

Kyra noticed that Alex was really excited by the weapons systems.

“I thought we made those kinds already,” Kyra said.

“Not exactly,” Dr. Stober said. “Our other combat systems are meant for surgical strikes in remote places or are prototypes for frontline combat, where basically anything that moves can be shot. But these are designed for peacekeeping in densely populated urban areas, especially places where there are lots of Westerners or friendly locals to protect. Right now we still have to rely on human operators.”

Alex said in a deadpan voice, “It would be a lot easier if we didn’t have to worry about collateral damage.”

Dr. Stober noticed that Kyra didn’t laugh and gestured for Alex to stop. “Sarcasm aside, as long as we’re occupying their country, there will be locals who think they can get some advantage from working with us and locals who wish we’d go away. I doubt that dynamic has changed in five thousand years. We have to protect those who want to work with us from those who don’t, or else the whole thing falls apart. And we can’t expect the Westerners doing reconstruction over there to stay holed up in walled compounds all the time. They have to mingle.”

“It’s not always easy to tell who’s a hostile,” Kyra said.

“That’s the heart of the issue. Most of the time, the population is ambivalent. They’ll help us if they think it’s safe to do so, and they’ll help the militants if they think that’s the more convenient choice.”

“I’ve always said that if they choose to help the militants blend in, I don’t see why we need to be that careful. They made a decision,” Alex said.

“I suppose some interpretations of the rules of engagement would agree with you. But we’re telling the world that we’re fighting a new kind of war, a clean war, one where we hold ourselves to a higher standard. How people see the way we conduct ourselves is just as important nowadays.”

“How do we do that?” Kyra asked, before Alex could further derail the conversation.

“The key piece of software we have to produce needs to replicate what the remote operators do now, only better. The government has supplied us with thousands of hours of footage from the drone operations during the last decade or so. Some of them got the bad guys, and some of them got the wrong people. We’ll need to watch the videos and distill the decision-making process of the operators into a formal procedure for identifying and targeting militants embedded in urban conditions, eliminate the errors, and make the procedure repeatable and applicable to new situations. Then we’ll improve it by tapping into the kind of big data that individual operators can’t integrate and make use of.”