Among themselves, the executives whispered that it was impossible to make a thorough assessment within such a short period of time and that whatever report Stamos delivered would surely flag superficial problems and give the new head of security some easy wins at the start of his tenure. Everyone’s life would be easier if Stamos assumed the posture of boundless optimism that pervaded Facebook’s top ranks. The company had never been doing better, with ads recently expanded on Instagram and a new milestone of a billion users logging on to the platform every day.1 All they had to do was sit back and let the machine continue to hum.

Instead, Stamos had come armed with a presentation that detailed problems across Facebook’s core products, workforce, and company structure. The organization was devoting too much of its security efforts to protecting its website, while its apps, including Instagram and WhatsApp, were being largely ignored, he told the group. Facebook had not made headway on its promises to encrypt user data at its data centers—unlike Yahoo, Stamos’s previous employer, which had moved quickly to start securing the information in the two years since National Security Agency whistleblower Edward Snowden revealed that the government was likely spying on user data as it sat unprotected within the Silicon Valley companies.2 Facebook’s security responsibilities were scattered across the company, and according to the report Stamos presented, the company was “not technically or culturally prepared to play against” its current level of adversary.

Worst of all, Stamos told them, was that despite firing dozens of employees over the previous eighteen months for abusing their access, Facebook was doing nothing to solve or prevent what was clearly a systemic problem. In a chart, Stamos showed that nearly every month, engineers had exploited the tools designed to give them easy access to data for building new products, using that access to violate the privacy of Facebook users and infiltrate their lives. If the public knew about these transgressions, they would be outraged: for over a decade, thousands of Facebook’s engineers had been freely accessing users’ private data. The cases Stamos highlighted were only the ones the company knew about. Hundreds more may have slipped under the radar, he warned.

Zuckerberg was clearly taken aback by the figures Stamos presented, and upset that the issue had not been brought to his attention sooner. “Everybody in engineering management knew there were incidents where employees had inappropriately managed data. Nobody had pulled it into one place, and they were surprised at the volume of engineers who had abused data,” Stamos recalled.

Why hadn’t anyone thought to reassess the system that gave engineers access to user data? Zuckerberg asked. No one in the room pointed out that it was a system that he himself had designed and implemented. Over the years, his employees had suggested alternative ways of structuring data retention, to no avail. “At various times in Facebook’s history there were paths we could have taken, decisions we could have made, which would have limited, or even cut back on, the user data we were collecting,” said one longtime employee, who joined Facebook in 2008 and worked across various teams within the company. “But that was antithetical to Mark’s DNA. Even before we took those options to him, we knew it wasn’t a path he would choose.”

Facebook’s executives, including those in charge of the engineering ranks, like Jay Parikh and Pedro Canahuati, touted access as a selling point to new recruits on their engineering teams. Facebook was the world’s biggest testing lab, with a quarter of the planet’s population as its test subjects. The managers framed this access as part of Facebook’s radical transparency and trust in its engineering ranks. Did a user enjoy the balloons on the prompt to wish her brother a happy birthday, or did an emoji of a birthday cake get a higher response rate? Instead of going through a lengthy and bureaucratic process to find out what was working, engineers could simply open up the hood and see for themselves, in real time. But Canahuati warned engineers that access to that data was a privilege. “We had no tolerance for the abuse, which is why the company had always fired every single person found to be improperly accessing data,” he said.

Stamos told Zuckerberg and the other executives that it was not enough to fire employees after the fact. It was Facebook’s responsibility, he argued, to ensure that such privacy violations never happened to begin with. He asked permission to change Facebook’s current system to revoke private data access from the majority of engineers. If someone needed information on a private individual, they would have to make a formal request through the proper channels. Under the system then in place, 16,744 Facebook employees had access to users’ private data. Stamos wanted to bring that number down to fewer than 5,000. For the most sensitive information, like GPS location and password, he wanted to limit access to under 100 people. “While everyone knew there was a large amount of data accessible to engineers, nobody had thought about how much the company had grown and how many people now had access to that data,” Stamos explained. “People were not paying attention.”

Parikh, Facebook’s head of engineering, asked why the company had to upend its entire system. Surely, safeguards could be put in place that limited how much information an engineer accessed, or that sounded alarms when engineers appeared to be looking up certain types of data. The changes being suggested would severely slow down the work of many of the product teams.

Canahuati, director of product engineering, agreed. He told Stamos that requiring engineers to submit a written request every time they wanted access to data was untenable. “It would have dramatically slowed work across the company, even work on other safety and security efforts,” Canahuati pointed out.

Changing the system was a top priority, Zuckerberg said. He asked Stamos and Canahuati to come up with a solution and to update the group on their progress within a year. But for the engineering teams, this would create serious upheaval. Many of the executives in the room grumbled privately that Stamos had just persuaded their boss to commit to a major structural overhaul by presenting a worst-case scenario.

One executive was noticeably absent from the September 2015 meeting. Only four months had passed since the death of Sheryl Sandberg’s husband. Security was Sandberg’s responsibility, and Stamos technically fell under her purview. But she had never suggested, nor been consulted about, the sweeping changes he was proposing.

Stamos prevailed that day, but he made several powerful enemies.

Late in the evening on December 8, 2015, Joel Kaplan was in the business center of a hotel in New Delhi when he received an urgent phone call from MPK. A colleague informed him that he was needed for an emergency meeting.

Hours earlier, Donald J. Trump’s campaign had posted on Facebook a video of a speech the candidate had made in Mount Pleasant, South Carolina. In it, Trump promised to take a dramatically harder line against terrorists, and then he linked terrorism to immigration. President Obama, he said, had treated illegal immigrants better than wounded warriors. Trump would be different, the presidential candidate assured the crowd. “Donald J. Trump is calling for a total and complete shutdown of Muslims entering the United States until our country’s representatives can figure out what the hell is going on,” he announced.3 The audience exploded with cheers.