
“It’s about the business model,” one government official said in an interview. Sandberg’s behavioral advertising prototype treated human data as financial instruments bartered in markets like corn or pork belly futures. Her handiwork was “a contagion,” the official added, echoing the words of academic and activist Shoshana Zuboff, who a year earlier had described Sandberg as playing “the role of Typhoid Mary, bringing surveillance capitalism from Google to Facebook, when she signed on as Mark Zuckerberg’s number two.”3

With scant competition to force the leaders to consider the wellbeing of their customers, there was “a proliferation of misinformation and violent or otherwise objectionable content on Facebook’s properties,” the attorneys general alleged in their complaint. Even when faced with major impropriety such as Russia’s disinformation campaign and the data privacy scandal involving Cambridge Analytica, users didn’t leave the site because there were few alternatives, the regulators maintained. As James succinctly put it, “Instead of competing on the merits, Facebook used its power to suppress competition so it could take advantage of users and make billions by converting personal data into a cash cow.”

When the FTC and the states filed their landmark lawsuits against Facebook, we were nearing completion of our own investigation of the company, one based on fifteen years of reporting, which has afforded us a singular look at Facebook from the inside. Several versions of the Facebook story have been told in books and film. But despite being household names, Zuckerberg and Sandberg remain enigmas to the public, and for good reason. They are fiercely protective of the images they’ve cultivated—he, the technology visionary and philanthropist; she, business icon and feminist—and have surrounded the inner workings of “MPK,” the shorthand employees use to describe the headquarters’ campus in Menlo Park, with a moat of loyalists and a culture of secrecy.

Many people regard Facebook as a company that lost its way: the classic Frankenstein story of a monster that broke free of its creator. We take a different point of view. From the moment Zuckerberg and Sandberg met at a Christmas party in December 2007, we believe, they sensed the potential to transform the company into the global power it is today.4 Through their partnership, they methodically built a business model that is unstoppable in its growth—with $85.9 billion in revenue in 2020 and a market value of $800 billion—and entirely deliberate in its design.5

We have chosen to focus on a five-year period, from one U.S. election to another, during which both the company’s failure to protect its users and its vulnerabilities as a powerful global platform were exposed. All the issues that laid the groundwork for what Facebook is today came to a head within this time frame.

It would be easy to dismiss the story of Facebook as that of an algorithm gone wrong. The truth is far more complex.

Chapter 1

Don’t Poke the Bear

It was late at night, hours after his colleagues at Menlo Park had left the office, when the Facebook engineer felt pulled back to his laptop. He had enjoyed a few beers, which was part of the reason, he thought, that his resolve was crumbling. He knew that with just a few taps at his keyboard, he could access the Facebook profile of a woman he had gone on a date with a few days earlier. The date had gone well, in his opinion, but she had stopped answering his messages twenty-four hours after they parted ways. All he wanted to do was peek at her Facebook page to satisfy his curiosity, to see if maybe she had gotten sick, gone on vacation, or lost her dog—anything that would explain why she was not interested in a second date.

By 10 p.m., he had made his decision. He logged on to his laptop and, using his access to Facebook’s stream of data on all its users, searched for his date. He knew enough details—first and last name, place of birth, and university—that finding her took only a few minutes. Facebook’s internal systems had a rich repository of information, including years of private conversations with friends over Facebook Messenger, events attended, photographs uploaded (including those she had deleted), and posts she had commented or clicked on. He saw the categories in which Facebook had placed her for advertisers: the company had decided that she was in her thirties, was politically left of center, and led an active lifestyle. She had a wide range of interests, from a love of dogs to holidays in Southeast Asia. And through the Facebook app that she had installed on her phone, he saw her real-time location. It was more information than the engineer could possibly have gotten over the course of a dozen dinners. Now, almost a week after their first date, he had access to it all.

Facebook’s managers stressed to their employees that anyone discovered taking advantage of their access to data for personal ends, such as looking up a friend’s account or that of a family member, would be immediately fired. But the managers also knew there were no safeguards in place. The system had been designed to be open, transparent, and accessible to all employees. It was part of Zuckerberg’s founding ethos to cut away the red tape that slowed down engineers and prevented them from producing fast, independent work. The policy had been put in place when Facebook had fewer than one hundred employees. Yet, years later, with thousands of engineers across the company, nobody had revisited it. There was nothing but the goodwill of the employees themselves to stop them from abusing their access to users’ private information.

Between January 2014 and August 2015, the engineer who looked up his onetime date was just one of fifty-two Facebook employees fired for exploiting their access to user data. The vast majority of those who abused their privileges were men who looked up the Facebook profiles of women they were interested in. Most of the employees who took advantage of their access did little more than look up users’ information. But a few took it much further. One engineer used the data to confront a woman who had traveled with him on a European vacation; the two had gotten into a fight during the trip, and the engineer tracked her to her new hotel after she left the room they had been sharing. Another engineer accessed a woman’s Facebook page before they had even gone on a first date. He saw that she regularly visited Dolores Park, in San Francisco, and he found her there one day, enjoying the sun with her friends.

The fired engineers had used work laptops to look up specific accounts, and this unusual activity had triggered Facebook’s systems and alerted the engineers’ managers to their transgressions. Those employees were the ones who were found out after the fact. It was unknown how many others had gone undetected.

The problem was brought to Mark Zuckerberg’s attention for the first time in September 2015, three months after the arrival of Alex Stamos, Facebook’s new chief security officer. Gathered in the CEO’s conference room, “the Aquarium,” Zuckerberg’s top executives had braced themselves for potentially bad news: Stamos had a reputation for blunt speech and high standards. One of the first objectives he had set out when he was hired that summer was a comprehensive evaluation of Facebook’s current state of security. It would be the first such assessment ever completed by an outsider.