Chapter 2

TECHNOLOGY AND PUBLIC SAFETY: “I’d Rather Be a Loser Than a Liar”

The public depends on law enforcement to keep it safe. But you can’t catch criminals or terrorists if you can’t find them—and this requires effective access to information. In the twenty-first century, that information often resides in the data centers of the world’s largest tech companies.

As the tech sector tries to do its part to keep the public safe and protect people’s privacy, we’ve found ourselves perched atop a razor’s edge. It’s a delicate balance that we must maintain while we respond to a fluid and fast-changing world.

Events requiring our response arise suddenly and without warning. It’s a reality I first grappled with in 2002. On January 23 of that year, Wall Street Journal reporter Daniel Pearl was abducted in Karachi, Pakistan.1 His kidnappers moved between internet cafés using our Hotmail email service to communicate their ransom demands, kicking off a desperate manhunt by the Pakistani police. In exchange for Pearl, the kidnappers demanded the release of terrorist suspects in Pakistan and the halt of a planned shipment of F-16 fighters from the United States. It was clear the Pakistani government would not agree to the ransom demands. The only way to save Pearl was to find him.

The Pakistani authorities worked quickly behind the scenes with the FBI in the United States, who came to us. Congress had created an emergency exception to the Electronic Communications Privacy Act so that the government could act immediately and tech companies could move quickly when there was an “emergency involving the danger of death or serious physical injury.”2 Pearl’s life was clearly in jeopardy.

John Frank came to me and explained the situation. I gave the green light to work with the local police and the FBI. Our goal was to monitor the Hotmail account being used by the kidnappers and use the IP address in their newly created emails to locate the internet cafés halfway around the world where they were sitting. Our teams worked closely with the FBI and the local authorities in Pakistan for a week, trailing the kidnappers as they bounced from hotspot to hotspot accessing the internet.
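The mechanics behind this kind of trace are simple in principle: every server that handles an email stamps a Received header onto the message, and the earliest of those headers records the IP address of the machine that originally connected. What follows is a minimal illustrative sketch in Python; the message, addresses, and server names are entirely hypothetical, and the real headers and tooling involved were of course Microsoft's own.

```python
import re
from email import message_from_string

# A hypothetical raw email. Each relay that handles a message prepends a
# Received header recording, among other things, the connecting client's IP.
raw = """Received: from unknown (HELO cafe-pc) (202.125.140.67)
  by mail.example.com with SMTP; 30 Jan 2002 14:02:11 -0000
From: sender@example.com
Subject: demands

message body
"""

msg = message_from_string(raw)

# The last Received header in the list is the earliest hop -- the one
# stamped when the sender's machine first connected to a mail server.
earliest_hop = msg.get_all("Received")[-1]

# Pull the first IPv4 address out of that header: the public IP of the
# internet cafe, which investigators could then map to a physical location.
ip = re.search(r"\b(\d{1,3}(?:\.\d{1,3}){3})\b", earliest_hop).group(1)
print(ip)
```

From an address like this, investigators would turn to the local internet service provider to identify which café was behind it at that moment.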

We came close but not close enough. The kidnappers killed Pearl before being caught themselves. We were devastated. His brutal death underscored the enormous stakes and responsibility that had been cast upon us, something we seldom spoke about publicly.

The incident was an early indicator of what was to come. Today, cyberspace is no longer some peripheral dimension. It increasingly has become the place where people organize themselves and define what happens in the real world.

The tragedy involving Daniel Pearl also underscores the importance of exercising judgment in terms of privacy. In important ways, there is a balance between privacy and safety that benefits from privacy groups that push in one direction and law enforcement agencies that push in another. But like the judges who decide these disputes, tech companies have become a place where these issues come to a head. We need to understand and think hard about both sides of this equation.

One big challenge is how to do this well. Our ability to turn on a dime in response to search warrants was honed through trial and error, beginning with the birth of email and electronic documents in the 1980s.

In 1986, President Ronald Reagan signed the Electronic Communications Privacy Act, affectionately known by today’s privacy lawyers as ECPA. At the time, no one knew whether the Fourth Amendment would protect something like electronic mail, but Republicans and Democrats alike wanted to create this type of statutory protection.

As sometimes happens in Washington, DC, in 1986 Congress acted with good intentions but in a way that was far from simple. Part of ECPA was the Stored Communications Act, which created what was basically a new form of search warrant. With probable cause, the government could go to a judge, secure a search warrant for your email, and serve the warrant not on you but on the tech company where your email and electronic documents were stored.3 The company was then obligated to pull the email and turn it over. In certain circumstances, the law in effect turned tech companies into agents of the government.

This also created a new dynamic. If the government served a search warrant at your home or office, someone was likely to be there and know what was happening. They couldn’t stop it, but they were aware. If they thought their rights were violated, they could follow in John Wilkes’s footsteps and go to court.

But Congress adopted a more complicated approach when it came to notifying people and businesses that the government was obtaining their emails and documents from tech companies. It created a statute that gave the government the authority to seek a gag order that would compel a tech company to keep the warrant secret. This statute gave the government five different bases on which to demand secrecy. On the surface, these bases did not look unreasonable. For example, if disclosure would lead to destruction of evidence or intimidation of a witness, or would otherwise jeopardize an investigation, a judge could issue a warrant together with what is called a nondisclosure order.4 A tech company might receive both orders together, the first requiring it to turn over electronic data files and the second requiring it to keep the demand secret.

When email was still a rarity, these new warrants and gag orders were few and far between. But once the internet exploded and data center campuses emerged with hundreds of thousands of computers, life became far more complex. Today, twenty-five full-time employees—compliance experts, lawyers, engineers, and security professionals—make up our Law Enforcement and National Security team. They work with broad support provided by numerous law firms around the world, and they’re known across Microsoft as the LENS team. Their mission is straightforward: to review and respond globally to law enforcement requests under the laws of different countries and in accordance with our contractual obligations to our customers. This is no small task. The LENS team operates from seven locations in six countries on three continents. During a typical year, they address more than fifty thousand warrants and subpoenas from more than seventy-five countries.5 Only 3 percent of these demands are for content. In most cases, authorities are looking for IP addresses, contact lists, and user registration data.

When Microsoft receives a warrant, it typically comes through email. A compliance manager reviews the demand to ensure that it’s valid and signed by a judge, that the authorities have probable cause, and that the agency has jurisdiction over the information. If everything checks out, the compliance manager will pull the requested evidence from our data center. The data is reviewed for a second time to make certain that we are only producing exactly what’s specified in the warrant, and it’s then sent to the requesting authority. As one LENS employee explained to me, “It sounds simple, but it takes a lot of time to do a good job. You need to review the warrant itself, review the account information associated with it, pull the information, and then review it again to be certain that what you’re providing is appropriate.”

When a compliance manager concludes that a warrant is too broad or the request exceeds an agency’s jurisdiction, the case is escalated to an attorney. Sometimes we ask for warrants to be narrowed. Other times we deem the warrant unlawful and refuse to comply.

One member of the LENS team is on call 24-7, meaning for a week at a time he or she will sleep next to a phone in case there is an emergency or terrorist incident somewhere in the world that requires immediate action. During weeks when the world is turned upside down, members of the LENS team take turns being on call, so each person gets enough sleep to be alert on the job.

In 2013, as Edward Snowden shared the NSA’s secrets and public debate over this mass of data started to explode, a new lawyer joined Microsoft to lead the team. Her name is Amy Hogan-Burney. Armed with a keen intellect and sharp sense of humor, she quickly won over the team. Amy had spent the prior three years as an attorney in the National Security division at the FBI’s headquarters. It equipped her well for the work at Microsoft, even if there would be days when she was on the opposite side of an issue from her former colleagues in Washington, DC.

Amy quickly adapted to her new role. She sat just downstairs from my office and I found myself walking down to her corridor more and more frequently. Her office was next to that of Nate Jones, who had joined Microsoft earlier in the year after wrapping up more than a decade serving in the US government, including time with the Senate Judiciary Committee, the Department of Justice, and finally on President Obama’s National Security Council working on counterterrorism.

Amy managed the work of the LENS team while Nate managed our overall compliance strategy, our relations with other tech companies, and negotiations with international governments. As the world had evolved, they and the entire LENS team had to strike a delicate balance. They needed to work with law enforcement agencies around the world, but they were also on the front line defending the privacy rights enshrined in the Fourth Amendment and other countries’ laws. As they worked with the multiple privacy experts we already had on board, I was glad their offices were close to mine.

Nate and Amy quickly became something of a tag team, so much so that others on the team began referring to them as “Namy.” Across Microsoft, people relied on Nate and Amy to work together quickly to think through our approach on the most sensitive issues. Our compliance managers would glance at a hot issue that arrived in their inbox, talk with each other, and decide that it needed to go to Namy right away.

Our Namy team was in the hot seat for protecting the world’s filing cabinet—a seat that often got hotter in sudden and dramatic ways.

As office workers across France prepared to break for lunch on Wednesday, January 7, 2015, two brothers entered the Paris headquarters of the satirical magazine Charlie Hebdo and viciously murdered twelve people.6 The two men were affiliated with Al-Qaeda, and they had been offended, as had many other Muslims, by the publication’s profane cartoons of the prophet Muhammad.7 But unlike many others, these brothers had taken matters into their own hands.

The tragedy was all over the news. We saw the horrific events unfold from Redmond with the rest of the world. As I refilled my coffee mug in the break room, a group of us watched the television as French police searched for the two brothers, who had managed to escape. Soon soldiers from the French army were involved in a nationwide manhunt, and another Al-Qaeda member launched a separate deadly terrorist attack in a French supermarket.8 I recognized the streets and neighborhoods involved; I’d spent my first three years as a Microsoft employee working at our European headquarters in Paris.

Other than checking on our employees in the area, who were all safe, the story seemed important for the world but unrelated to my own job. That was no longer the case when the sun rose the next day in Redmond. France’s national police quickly determined that the two terrorists had Microsoft email accounts, and they asked the FBI for help. At 5:42 a.m. in Redmond, the FBI in New York responded to the emergency and sought from us the killers’ email and account records, including the IP addresses that can show the location of a computer or phone when a user logs in. A team at Microsoft reviewed the emergency request and provided the information to the FBI within forty-five minutes. A day later, the national French manhunt led authorities to the two terrorists, who were killed in a shootout with police.

The events in Paris shook France and the world. The Sunday following the attack, more than two million people marched in the French capital’s streets to mourn the journalists and stand in solidarity, demonstrating support for freedom of the press.9

Unfortunately, it was not the last tragedy inflicted upon Paris in 2015. On a Friday evening in November, as Parisians were winding down from the workweek, terrorists struck again in coordinated attacks across the city. They opened fire with automatic rifles at a concert inside a theater, outside a stadium, and in restaurants and cafés. The scenes were horrifying. The terrorists killed 130 people and injured more than five hundred others. It was the deadliest attack Paris had experienced since World War II. And while seven of the attackers were killed, two others managed to escape.10

French President François Hollande immediately declared a state of emergency across the country. The Islamic State of Iraq and Syria—ISIS—claimed responsibility, and it soon became apparent that some of the attackers had come from Belgium. A new manhunt ensued, this time spanning two nations.

Working with European authorities, the FBI again quickly served warrants and subpoenas on tech companies for the email and other accounts belonging to the suspects. We had learned from the Charlie Hebdo tragedy that we needed to be ready to spring into action when terrorists struck. This time, the authorities in France and Belgium served fourteen orders on us. The team reviewed them, determined they were lawful, and provided the information requested, in each case turning around information in less than fifteen minutes.

The two tragedies in Paris were events that grabbed the world’s attention. But the days they occurred were far from the only days that demanded our own. When email was in its infancy, governments rarely turned to us. But once fifty thousand search warrants and government orders started arriving from more than seventy countries each year, we needed to operationalize our work on a global scale.

Satya Nadella helped define our path forward. He’d run Microsoft’s cloud business before becoming the company’s CEO in early 2014. More than anyone else, he understood the cloud. He also brought another valuable sensibility to this complex issue. He had grown up in India as the son of a senior civil servant; his father was revered as the leader of the academy that trained a generation of the country’s top administrators in the decades following the nation’s independence. This background gave Satya an intuitive feel for how governments worked. I was struck by the similarity to Bill Gates, who had grown up as the son of one of Seattle’s most prominent and respected lawyers. Bill and Satya were both quintessential engineers, but Bill could think like a lawyer and Satya could think like someone in government. For me, the opportunity to talk through tough issues with both was invaluable.

As we grappled with the full range of surveillance issues, Satya suggested in late 2014 that we needed to develop a principled approach. “We need to know how to make the hard calls, and we need our customers to know how we’re doing it,” he said. “And we need a set of principles to guide this work.”

We had applied a similar approach to hard issues over the preceding decade, including the publication of “Windows Principles: Twelve Tenets to Promote Competition” to address our antitrust issues. I had unveiled those principles at the National Press Club in Washington, DC, in 2006.11 Jon Leibowitz, then a member of the Federal Trade Commission who had pushed us on the subject amid our high-profile antitrust cases, attended the speech and came up to me afterward. “If you had come out with these a decade ago,” he said, “I don’t think the government would have sued you.”

While Satya’s assignment seemed straightforward, it wasn’t. We needed principles that could apply across our entire business from our operating systems to the Xbox. These principles had to be simple and memorable—not twenty paragraphs filled with legal and technical jargon. Coming up with something that is shorter and simpler is always more difficult.12

While the issue was complex, the starting point was not. We were always clear in our minds that the information people stored in our data centers didn’t belong to us. People still owned their emails, photos, documents, and instant messages. We were stewards of other people’s possessions, not the owners of this data ourselves. And as good stewards, we needed to use this data in ways that served its owners, rather than thinking about only ourselves.

Building from this starting point, we assembled a team that developed what would become four principles that we would call our “cloud commitments”: privacy, security, compliance, and transparency. I loved pointing out to the company’s marketing leaders that the lawyers had found a way to take a complicated topic and reduce it to four words. Not surprisingly, they were quick to point out that this was a first.

Still, developing clear principles and putting them to work were two separate challenges. The team built out each principle with details and training. The real test would come when new circumstances raised tough questions and required us to decide how far we would go to stand up for the commitments we had created.

One of the toughest calls soon came regarding our commitment to transparency. We recognized that transparency was a linchpin for everything else. If people didn’t understand what we were doing, they could never trust us on anything else.

Our business customers, in particular, wanted to know when we received a warrant or subpoena seeking their emails or other data. We believed there was seldom a good reason for the government to serve legal orders on us rather than our enterprise customers. Unlike individual criminals or terrorist suspects, a reputable company or business was far less likely to run for the border or act unlawfully to thwart an investigation. And if the government was concerned about the potential erasure of data, we could act under a limited “freeze order” to make a copy of a customer’s data while the government hammered out the legal issues with the customer before obtaining access to it.

In 2013, we stated publicly that we would notify our business and government customers if we received legal orders for their data.13 If a gag order prohibited us from telling them, we’d challenge the order in court. We’d also direct government agencies to go straight to our customers for information or data about one of their employees—just as they did before these customers moved to the cloud. And we’d go to court to make it stick.

We faced our first test when the FBI served us with a national security letter seeking data that belonged to an enterprise customer. The letter barred us from telling the customer that the FBI wanted its data. We studied the letter and could see no reasonable basis for the FBI to prohibit us from notifying the customer, let alone demand the data from us rather than obtain it directly from the customer. We refused, filed a lawsuit, and went to federal court in Seattle, where the judge was sympathetic to our argument. The FBI got the message and withdrew the letter.

Over the next year, our lawyers made good progress in pushing the Justice Department to go directly to enterprise customers for data. But in January 2016, an assistant US attorney in another district disagreed and served a search warrant under seal on us demanding data that belonged to a business customer. He coupled the warrant with an open-ended gag order that would stay in place forever. We objected.

Typically, once we had explained our position, the government would back off. This time, the federal attorney persisted and forced us to go to court.

I was traveling in Europe and woke early to an email from David Howard, who was responsible for our litigation work and several other areas. David had joined us five years earlier and was a successful former federal prosecutor and law firm partner. He brought a calm demeanor and good judgment to every tough problem. His leadership had been instrumental in what became year after year a pattern of winning 90 percent of our lawsuits. As I said only half-jokingly to our board of directors, “I’ve learned from David that good litigation results are not actually hard to achieve. You just have to fight the cases you deserve to win and settle the cases you deserve to lose.” The key was having someone like David who could discern the difference.

In this instance, David wasn’t optimistic about our chances. The judge was not sympathetic and was threatening to hold us in contempt of court. David wrote that the litigation team wanted to turn over the customer’s data to avoid a fine.

On a conference call later that day, I told the team that I didn’t want to surrender. We had made a promise to our customers to fight these types of orders, and that included going to court and taking on tough battles.

One of the litigators said that this was clearly a fight we would lose, and it could be an expensive defeat. “I’d rather be a loser than a liar,” I said. “A promise is a promise.” I felt that the cost of breaking it was greater than any amount of money, even if the outcome remained under seal and was kept secret.

I told the litigation team that if they fought the case, lost the battle, and kept the fine under $20 million, I would consider it a moral victory. We all knew there was no way we’d receive a fine for more than a fraction of that amount. It was my way of telling the litigators—who thankfully wanted to win every case—that there was no way they could lose this one as far as I was concerned.

The Microsoft team worked around the clock and through the weekend with our outside lawyers. We lost the case, but we avoided the contempt fine entirely and preserved the ability to be transparent with our customers and state generally that we had now lost one of these cases. And most important, we had lived up to our promise.

We worried that we’d continue to be tested in this way on a case-by-case basis. We needed to go on the offensive. “We aren’t going to win these cases if we let the government pick every fight,” David said. “These types of gag orders are supposed to be the exception, not the norm. But the government is making them routine. We need the courts to rule on this broad practice.”

He developed a brilliant play. We decided to pursue what is called a declaratory judgment, which would clarify our rights. We argued that the government was exceeding its constitutional powers by routinely issuing gag orders under the Electronic Communications Privacy Act. We combed through available warrant records from the prior year and a half and found that more than half of government data demands for individuals were bound by gag orders, with half of these written to ensure they were kept secret forever.

We returned to federal court in Seattle to sue our own government. We argued that the excessive use of gag orders violated our First Amendment right to tell our customers that the government was seizing their emails. We also maintained that these gag orders violated our customers’ Fourth Amendment right to be protected from unlawful search and seizure, because people had no way of knowing what was happening and weren’t able to stand up for their legal rights.

The case squarely raised whether people’s rights would be protected in the cloud. We were optimistic, bolstered by the trend we saw unfolding in the Supreme Court.

In 2012, Supreme Court justices declared in a 5–4 decision that the Fourth Amendment required that the police get a search warrant before putting a GPS locator on a suspect’s car.14 While the other justices found that the “physical intrusion” of attaching a device to someone’s car required a search warrant, Justice Sonia Sotomayor recognized that in the twenty-first century, law enforcement didn’t necessarily need to physically intrude to track someone’s location. GPS-enabled smartphones, which create remote records of someone’s location, were starting to spread. They revealed all sorts of personal information that the government could mine for years. As Sotomayor put it, unless this type of surveillance was safeguarded under the Fourth Amendment, it could “alter the relationship between citizen and government in a way that is inimical to democratic society.”15

Justice Sotomayor captured something else that we thought was fundamental. For decades the Supreme Court had said the Fourth Amendment failed to protect information that was widely shared, on the theory that people no longer had a “reasonable expectation of privacy.” Now, however, Sotomayor noted, privacy meant the ability to share information but determine who can see this information and how it will be used. She was the first justice to articulate this shift, and the big question was whether the other justices would embrace it.

Two years later, an answer began to emerge. In the summer of 2014, Chief Justice John Roberts wrote an opinion for a unanimous Supreme Court.16 The justices decided that the police needed a warrant to search someone’s cell phone, even if the person was under arrest for committing a crime. As Roberts put it, “Modern cell phones are not just another technological convenience. With all they contain and all they may reveal, they hold for many Americans the privacies of life.”

While the Fourth Amendment was adopted to protect people in their homes, Roberts explained that modern phones “typically expose to the government far more than the most exhaustive search of a house: A phone not only contains in digital form many sensitive records previously found in the home; it also contains a broad array of private information never found in a home in any form.”17 Hence the Fourth Amendment applied.

We cheered when we read what Roberts wrote next. For the first time, the Supreme Court in effect addressed the files stored in our data centers, like the one in Quincy. “The data a user views on many modern cell phones may not in fact be stored on the device itself,” he wrote. “The same type of data may be stored locally on the device for one user and in the cloud for another.”18 For the first time, the Supreme Court recognized that a search of a phone reached far beyond what was in a person’s physical possession. In effect, new technology had created new grounds for strong privacy protection in the cloud itself.

While these words didn’t speak directly to our protest of broad gag orders in Seattle, they provided some helpful tailwinds for our broader privacy cause. Now we needed to ride them.

We put David’s plan into action by filing a lawsuit on April 14, 2016.19 It was assigned to Judge James Robart, who had been a leading light in Seattle’s legal community before becoming a federal judge in 2004. We had appeared before him previously, including in a big patent trial. He was tough, but smart and fair. He kept our litigators on their toes, and from my vantage point, that was a good place for them to be.

As we filed our lawsuit, we shared our data from the preceding eighteen months, which showed that we had received more than twenty-five hundred gag orders applicable to individuals, effectively silencing us from speaking to customers about the legal process seeking their personal information.20 Notably and even surprisingly, 68 percent of the total contained no fixed end date at all. This meant that we effectively were prohibited forever from telling our customers that the government had obtained their data.

We recognized that we needed to couple our concerns about the DOJ’s current practice with a blueprint for a better approach. We called for greater transparency and what we termed digital neutrality, or the recognition that people’s information should be protected regardless of where and how it was stored. This should be balanced, we said, with a principle around necessity, so that gag orders could be issued but could be adapted to what’s necessary for an investigation, and no more.

The government hit back with a motion to dismiss our lawsuit before it even got started. It argued that we had no right to inform customers under the First Amendment and no basis to stand up for customers’ rights under the Fourth Amendment. We soon concluded that our ability to survive this motion would likely provide the critical turning point. If we survived, we would get access to government data on the broad use of secrecy orders, and this likely would give us the remaining facts we needed to drive our arguments across the finish line.

We decided that we needed to build a broad coalition of supporters. We spent the summer on a recruiting campaign. By Labor Day, more than eighty supporters had joined the case by filing amicus, or friend of the court, briefs. The group represented every part of the tech sector, the business community, the press, and even respected former officials from the Justice Department and the FBI.21

The lawyers and the public filed into Judge Robart’s courtroom on January 23, 2017. It had been a year and two days since our decision to fight a gag order under seal rather than surrender. Now we had the opportunity for a public hearing on the government’s motion, with former DOJ officials supporting us in the front row.

Two weeks later, Robart ruled that our case could proceed.22 While he accepted the government’s argument that we couldn’t defend our customers’ Fourth Amendment rights, he agreed that we had a basis to move forward with our First Amendment claim. We had lived to fight another day.

The Justice Department took notice and began to take our claims more seriously. We sat down and, after a number of discussions, the DOJ released a new policy that set clear limits on when prosecutors could seek gag orders. The department coupled this with new guidance directing prosecutors to seek data from enterprises themselves, rather than from cloud providers, when executing warrants for enterprise data. We were satisfied, saying publicly that we thought the new approach would help ensure that secrecy orders are used only when necessary and for defined periods of time.23 Both sides agreed to bring the gag order litigation to an end.

The outcome underscored the delicate balance between privacy and safety. Lawsuits typically are blunt instruments. By themselves, they can only determine if current processes are lawful. They can’t craft a new proposal that addresses how technology should be governed. That requires real conversation and sometimes negotiation and even new legislation. In this case, the lawsuit had done what was needed, bringing everyone to the table to talk about the future. But getting everyone to sit down together on other issues remained an ongoing challenge, one that would become even more difficult and important.