
The code will embody the minds of my father and others like him so that no one would have to do what they did, endure what they endured.

“Piece of cake,” said Alex. And the room laughed, except for Kyra and Dr. Stober.

§

Kyra threw herself into her work, a module they called the ethical governor, which was responsible for minimizing collateral damage when the robots fired upon suspects. She was working on a conscience for killing machines.

She came in on the weekends and stayed late, sometimes sleeping in the office. She didn’t view it as a difficult sacrifice to make. She couldn’t talk about what she was working on with the few friends she had, and she didn’t really want to spend more time outside the office with people like Alex.

She watched videos of drone strikes over and over. She wondered if any were missions her father had flown. She understood the confusion, the odd combination of power and powerlessness experienced when watching a man one is about to kill through a camera, the pressure to decide.

The hardest part was translating this understanding into code. Computers require precision, and the need to articulate vague hunches had a way of forcing one to confront the ugliness that could remain hidden in the ambiguity of the human mind.

To enable the robots to minimize collateral damage, Kyra had to assign a value to each life that might be endangered in a crowded urban area. One of the most effective ways of doing this—at least in simulations—also turned out to be the most obvious: profiling. The algorithm needed to translate racial characteristics and hints about language and dress into a number that held the power of life and death. She felt paralyzed by the weight of her task.

“Everything all right?” Dr. Stober asked.

Kyra looked up from her keyboard. The office lights were off; it was dark outside. She was practically the last person left in the building.

“You’ve been working a lot.”

“There’s a lot to do.”

“I’ve reviewed your check-in history. You seem to be stuck on the part where you need the facial recognition software to give you a probability on ethnic identity.”

Kyra gazed at Dr. Stober’s silhouette in the door to her office, backlit by the hall lights. “There’s no API for that.”

“I know, but you’re resisting the need to roll your own.”

“It seems… wrong.”

Dr. Stober came in and sat down in the chair on the other side of her desk. “I learned something interesting recently. During World War II, the U.S. Army trained dogs for warfare. They would act as sentries, guards, or maybe even as shock troops in an island invasion.”

Kyra looked at him, waiting.

“The dogs had to be trained to tell allies apart from enemies. So they used Japanese-American volunteers to teach the dogs to profile, to attack those with certain kinds of faces. I’ve always wondered how those volunteers felt. It was repugnant, and yet it was also necessary.”

“They didn’t use German-American or Italian-American volunteers, did they?”

“No, not that I’m aware of. I’m telling you this not to dismiss the problematic nature of your work, but to show you that the problem you’re trying to solve isn’t entirely new. The point of war is to prefer the lives of one group over the lives of another group. And short of being able to read everyone’s minds, you must go with shortcuts and snap heuristics to distinguish those who must die from those who must be saved.”

Kyra thought about this. She could not exempt herself from Dr. Stober’s logic. After all, she had lamented her father’s death for years, but she had never shed a tear for the thousands he had killed, no matter how many might have been innocent. His life was more valuable to her than all of them added together. His suffering meant more. It was why she was here.

“Our machines can do a better job than people. Attributes like appearance and language and facial expressions are but one aspect of the input. Your algorithm can integrate the footage from city-wide surveillance by thousands of other cameras, the metadata of phone calls and social visits, individualized suspicions built upon data too massive for any one person to handle. Once the programming is done, the robots will make their decisions consistently, without bias, always supported by the evidence.”

Kyra nodded. Fighting with robots meant that no one had to feel responsible for killing.

§

Kyra’s algorithm had to be specified exactly and submitted to the government for approval. Sometimes the proposals came back marked with questions and changes.

She imagined some general (advised, perhaps, by a few military lawyers) looking through her pseudocode line by line.

A target’s attributes would be evaluated and assigned numbers. Is the target a man? Increase his suspect score by thirty points. Is the target a child? Decrease his suspect score by twenty-five points. Does the target’s face match any of the suspected insurgents with at least a fifty-percent probability? Increase his suspect score by five hundred points.

And then there was the value to be assigned to the possible collateral damage around the target. Those who could be identified as Americans or had a reasonable probability of being Americans had the highest value. Then came native militia forces and groups who were allied with U.S. forces and the local elites. Those who looked poor and desperate were given the lowest values. The algorithm had to formalize anticipated fallout from media coverage and politics.
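
Rendered as running code, the scheme described in the last two paragraphs might look something like the sketch below. The suspect-score increments are the ones stated in the text; all of the names, and the specific collateral-value numbers, are invented placeholders for illustration, not anything from the story’s actual specification.

```python
# Illustrative sketch only. The suspect-score increments come from the
# text above; the names and the collateral-value numbers are hypothetical.

from dataclasses import dataclass


@dataclass
class Target:
    is_male: bool
    is_child: bool
    insurgent_match_prob: float  # facial-recognition match probability, 0.0 to 1.0


def suspect_score(target: Target) -> int:
    """Score a target using the point values stated in the text."""
    score = 0
    if target.is_male:
        score += 30    # a man: +30 points
    if target.is_child:
        score -= 25    # a child: -25 points
    if target.insurgent_match_prob >= 0.5:
        score += 500   # face matches a suspected insurgent at >= 50%: +500 points
    return score


# The ordering (Americans valued highest, allied militia and local elites
# next, the poor and desperate lowest) is from the text; the numbers are made up.
COLLATERAL_VALUE = {
    "american": 100,
    "allied_militia": 40,
    "local_elite": 40,
    "poor_civilian": 1,
}


def collateral_cost(bystanders: list[str]) -> int:
    """Sum the assigned value of every life endangered near the target."""
    return sum(COLLATERAL_VALUE[category] for category in bystanders)
```

A strike decision would presumably weigh `suspect_score` against `collateral_cost`, which is exactly the trade-off the approval process haggles over, line by line.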

Kyra was getting used to the process. After the specifications had gone back and forth a few times, her task didn’t seem so difficult.

§

Kyra looked at the number on the check. It was large.

“It’s a small token of the company’s appreciation for your efforts,” said Dr. Stober. “I know how hard you’ve been working. We got the official word on the trial period from the government today. They’re very pleased. Collateral damage has been reduced by more than eighty percent since they started using the Guardians, with zero erroneous targets identified.”

Kyra nodded. She didn’t know if the eighty percent was based on the number of lives lost or the total amount of points assigned to the lives. She wasn’t sure she wanted to think too hard about it. The decisions had already been made.

“We should have a team celebration after work.”

And so, for the first time in months, Kyra went out with the rest of the team. They had a nice meal, some good drinks, sang karaoke. And Kyra laughed and enjoyed hearing Alex’s stories about his exploits in war games.

§

“Am I being punished?” Kyra asked.

“No, no, of course not,” Dr. Stober said, avoiding her gaze. “It’s just administrative leave until… the investigation concludes. Payroll will still make biweekly deposits and your health insurance will continue, of course. I don’t want you to think you’re being scapegoated. It’s just that you did most of the work on the ethical governor. The Senate Armed Services Committee is really pushing on our methodology, and I’ve been told that the first round of subpoenas is coming down next week. You won’t be called up, but we’ll likely have to name you.”