> CHAPTER 12

> MUDGE AND DILDOG

PEITER ZATKO, KNOWN even to close friends as Mudge, was not the most engaged executive at @stake, even though he was the lead creator of the pioneering hacker consulting group. The most famous member of the Cult of the Dead Cow was elsewhere much of the time, fighting his own demons and, after 9/11, America’s demons as well. What he saw made him very afraid. Mudge knew as much as anyone about the basic failings of tech security and about their root causes. The internet’s inventors had built it on trust, and it escaped into wide use while still in a test version, before Vint Cerf and his team could come up with reliable security. It still ran that way.

All software has bugs, some of which can be exploited. Layering software on software makes it less secure. The software vendors had all escaped legal liability for poor craftsmanship and had little incentive to devote significant resources to making their products safer. (This hard line on liability only began to fray in 2018, in extreme cases such as deaths blamed on automated vehicles’ programming.) Regulation ranged from nonexistent in most commercial markets to negligible in industries such as financial services, health care, and power distribution. All of which meant everything was unsafe and would only get less safe as the economy grew more dependent on technology.

This was classic market failure, compounded by political failure. One could debate the largest causes of the political failure, but they included the capture of the regulators by industries that did not want to be regulated, the dominant pursuit of short-term gains by short-term business executives, and the failure to distinguish when private companies should be responsible for their own defense and when the federal government needed to step in. That last was nontrivial, since the same techniques could be employed by criminal hackers, whom companies would generally be expected to fend off themselves, and by nation-state spies, who would generally be considered a Homeland Security or FBI responsibility, with backup from the Department of Defense. Even if those lines were clear, what do you do about criminals who work for spies, or spies who moonlight as criminals? Congress’s inaction loomed large. But without blood in the streets, Mudge held little hope of that changing anytime soon.

In 2003, as largely Russian organized crime groups took the leading role in spreading computer viruses for spam and extortion, Mudge saw that the big picture was about to look a lot worse. He figured the best way to help was to go to the place that had the best understanding of the problem, the most power to deal with it, and the greatest responsibility: the federal intelligence agencies. Given his sketchy associations and general antiestablishment attitude, it would have been draining to apply directly at the CIA or NSA. But Mudge could at least start where he was a known quantity, and where he had geographical and employment buffers from the people wearing braids and stars on their uniforms.

A year after Mudge’s top government sponsor, Richard Clarke, resigned from the Bush White House, Mudge rejoined BBN Technologies. Starting in 2004, he worked at BBN on research and development for US intelligence agencies, and he trained people who would become the core of the NSA’s elite hacking unit, Tailored Access Operations. Over the next six years, he worked on a lot of things he can’t talk about. “I think domestic lives have been saved as my ideas went operational,” Mudge said. He told me that lives in the Middle East were also saved because his tools were used instead of bombs.

In 2010, the new head of the Defense Advanced Research Projects Agency asked Mudge to come in-house and lead the agency’s cybersecurity efforts. Mudge had thought about DARPA before, but he hadn’t been enthusiastic about the agency’s prior leadership. The new boss, Regina Dugan, he liked. And DARPA, founded in 1958 in response to the Soviet Union’s stunning Sputnik satellite, had the coolest mission in government: “the creation and prevention of strategic surprise.” Like many positions inside DARPA, the post was for a fixed three-year term, during which he would award grants for offensive and defensive breakthroughs in security. But the opportunity was incredible. This agency had steered the creation of the ARPANET, which became the modern internet. “I obviously wanted to make sure the things I depend on, that my family and friends depend on, are secure,” Mudge said. “I also owe a lot to my country. A lot of countries would not have allowed me to influence the intelligence community and the Department of Defense, hopefully in ways that have them make less stupid mistakes.”

Mudge’s personal slogan had long been “Make a dent in the universe.” Now he called in a dozen of the smartest hackers he knew to help figure out how. He told them to be ready to discuss where the security industry was failing, what they as researchers were angriest about, and what DARPA could do to help. They convened in a bland Arlington, Virginia, building that housed the massive intelligence contractor Booz Allen Hamilton, the company that would employ Edward Snowden. Mudge’s call brought out “a bunch of misfits,” said Dug Song, who was among them. The group included @stake veterans Dave Aitel, now running zero-day seller Immunity Inc., and Dino Dai Zovi, a former federal labs researcher and chief scientist at government zero-day supplier Endgame. Also there was sometime intelligence contractor H. D. Moore, who had created Metasploit, a penetration-testing tool that used vulnerabilities as soon as they were disclosed, often within a day. Ninja Strike Force stalwart and intelligence contractor Val Smith came too.

Mudge convened the meeting by telling them that his DARPA slot had given the entire hacking community, at long last, “a seat at the table.” Now, he said, “let’s not waste this opportunity.” As they brainstormed priorities, Song asked about something different: a change in process. DARPA funded the big guys—defense contractors, other major corporations, and some university departments. Those operations knew how to navigate the paperwork, come up with slick pitches, and leverage their previous work. This left out talented small teams and individuals who had great insights from being hands-on hackers and no idea where to go from there. The son of a liquor-store owner, Song had used a small-business grant to start Arbor Networks. He said DARPA should go small as well, and Smith agreed.

Mudge had spent enough time around government to realize they were right, and he convinced Dugan. “The process itself was an impediment,” Dugan said. Not long after, Mudge announced the Cyber Fast Track, the first program at DARPA aimed at giving small amounts of money to small teams instead of large amounts to large ones. Mudge funded nearly two hundred proposals, all of which let the researchers keep their own intellectual property. Among the recipients were Moxie Marlinspike, whose invention Signal would come years later, and Charlie Miller, who studied flaws in near-field communication as those protocols were being embedded in more smartphones.

At Def Con in 2011, Miller was presenting a near-field talk and bumped into Mudge, who was also speaking. Miller told Mudge some of the things he was interested in and asked if DARPA would buy him a car he could hack. “Submit and find out,” Mudge said, so Miller did. He got the car and hacked away. Building on that work later, Miller hacked a moving Jeep being driven by a Wired reporter, prompting a mass recall and drawing global attention to the safety issues of computerized vehicles. The initial equipment and the money were one thing. But DARPA’s backing became even more important when a car company, upset at Miller’s revelations, threatened to sue. Mudge warned the company that if it did, the Pentagon would join the suit on Miller’s side, with a significant number of well-trained lawyers.

“Those grants also provided a certain amount of legitimacy to the research that really helped when people were having objections,” Miller said. “There are lots of research projects you see around now that would have never existed without those CFT grants, including the car hacking we did.” Everyone at the Pentagon wanted to get the papers explaining the research. But before they could get the briefing books, they had to sit through a demonstration by the hackers themselves, so they really understood them. In the years that followed, other areas at the Pentagon began mimicking the fast track Mudge developed.

Mudge did much more than streamline the way the federal government acquired good ideas. He also tackled a fundamental problem with the way the government, and everyone else, evaluated security. For decades, no one had come up with a reasonable way to estimate the worth of security products, which draw attention mainly when they fail. Likewise, DARPA couldn’t figure out a logical basis for determining what to fund. “We are not going to approve a single new project until we do the deep strategic work,” Dugan said. She insisted that Mudge and his boss, long-serving DARPA software chief Dan Kaufman, find a new way of looking at the issue.

Mudge and Kaufman came up with what they called the Cyber Analytic Framework. The major concept: as complexity increases, the defender’s job gets harder more rapidly than the attacker’s does. To illustrate the problem, Mudge used the common language of Washington, a slide deck. The most eye-popping chart showed that the average advanced defense software had bloated over the past decade to contain 10 million lines of code. The average number of lines in malicious software, meanwhile, had held steady at 125.

Since every thousand lines of code has one to five bugs in it, that meant big security products were making the situation worse. DARPA needed to seek simple and elegant approaches instead. “It was a clear articulation of trend lines,” Dugan said. Mudge began asking defensive grant applicants whether their approaches were tactical or strategic, how their project would increase or decrease the overall attack surface being defended, and how they would beat it themselves.
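The arithmetic behind that chart is stark enough to sketch directly. This back-of-the-envelope calculation uses only the figures above, one to five bugs per thousand lines of code, a 10-million-line defensive suite, and 125-line malware; the function name is invented for illustration:

```python
# Back-of-the-envelope sketch of the asymmetry in Mudge's slide deck,
# using the chapter's own figures.
BUGS_PER_KLOC = (1, 5)  # low and high estimates of bugs per thousand lines


def expected_bugs(lines_of_code):
    """Return the (low, high) range of latent bugs for a given code size."""
    kloc = lines_of_code / 1000
    return tuple(rate * kloc for rate in BUGS_PER_KLOC)


defense_suite = 10_000_000  # average advanced defense software, per the chart
malware = 125               # average malicious program

print(expected_bugs(defense_suite))  # (10000.0, 50000.0)
print(expected_bugs(malware))        # (0.125, 0.625)
```

By its own logic, a big security product carries tens of thousands of latent bugs while the malware it defends against carries less than one, which is why the Framework pushed DARPA toward simple and elegant approaches.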

The Framework approach became the basis for DoD spending beyond DARPA, and it got DARPA some money that otherwise would have gone to Cyber Command, one of several things Mudge worked on that annoyed Cyber Command head and NSA director Keith Alexander. Mudge didn’t mind that at all. Alexander had presided over a massive expansion of global and US surveillance, as well as a culture that produced several whistle-blowers and leakers while allowing employees to be hacked.

Mudge loved betting on promising ideas, but he also considered it his duty to strangle bad ones in the crib. While still an outside contractor, he decried a product that automated some “active defense,” the industry term for measures that range from blocking suspicious connections to disabling the computers used by an attacker. Though hacking back tempts targets that feel powerless when relying on the government for protection, most intelligence professionals think it is a bad idea that would lead to chaos and perhaps an unintended war. Automating that “is a terrible idea, because then an outsider can make you do things,” Mudge said.

Mudge also expended considerable energy arguing against demands for back doors in encryption. Intelligence and military officials said that back doors worked well in their offices—that access was logged and controlled and that abuse was rare. But those were closed systems, where the people in charge could completely govern the environment. Out in the regular world, configurations get looser and privileged access leaks.

Mudge didn’t stop telling the truth just because he was at the seat of great power. It probably helped that his position would end after just three years, so officials expected less sucking up. Mudge briefed the Joint Chiefs and the secretary of defense, helping them understand when one of the armed forces or a contractor was claiming an improbable capability in a turf or budget fight. “The Joint Chiefs and the Pentagon would call me in because I didn’t have a horse running, and I was able to explain to them ground truth,” he said.

Mudge remained iconoclastic. Amid widespread outcry over the constant breaches of American defense vendors by other nations, Mudge observed shortly after leaving DARPA that contractors had a perverse incentive to allow their weapons systems to be stolen. Once that happened, Mudge mused at Black Hat, they could ask the Pentagon to pay for a new and improved version of their system that was not yet in enemy hands. “Game theory is a bitch,” he said.

Yet Mudge managed to play the inside game well. DARPA always sent off its creations to new homes within the Pentagon or intelligence establishment where they would best develop. With Alexander and others predisposed to dislike much of what Mudge had handled, he sometimes engaged in subterfuge, handing off to a midlevel operative who could remove evidence of a project’s heritage. At one briefing with the deputy secretary of defense, Alexander explained that he had five “silver bullets” that he could deploy in cyberoperations. “Three of those are mine,” Mudge thought with satisfaction.

Mudge got the Pentagon to stop seeing hackers as the natural enemy. In fact, Mudge showed that people who grew up knowing exactly where the line was were habitually more careful about not crossing it than people constantly protected by their uniforms, bureaucracy, and lawyers. During one discussion at a large agency that was witnessed by Kaufman, an employee asked Mudge if the agency could just hack into a system in order to get information Mudge was deducing. “Absolutely, you could do that,” Mudge told him. “But just suggesting that is illegal, and it’s wrong.” Even within DARPA, Mudge provided a moral compass.

In a fortuitous bit of timing, Mudge’s scheduled exit from the government came in April 2013, two months before Snowden’s disclosures turned the NSA and US intelligence into global punching bags. On his way out, Mudge accepted the secretary of defense’s highest award for civilian service. The citation said that Mudge’s fast-track grants had produced more than one hundred new capabilities, that his new method for detecting cyberespionage had been placed into operation by intelligence agencies, and that he had improved the Defense Department’s ability to conduct online attacks.

Mudge followed Dugan to Google, where he worked on secret projects. The best known put a secure operating system on a memory card; the software would function properly even if the overall computer were compromised. Its features included an unchangeable logging system. The software would have been among the best possible defenses to the mass surveillance revealed by Snowden. Google did not release a finished version before Mudge left for a new venture: a nonprofit to examine code from binaries, the machine-readable instructions that programs give to computers, and score them based on standard safety features.

Mudge and his wife Sarah’s Cyber Independent Testing Lab functioned like the labs at Consumer Reports, scanning for the digital equivalent of automatic brakes and seat belts, all without needing access to the source code. With money from DARPA, the Ford Foundation, and others, CITL showed that on a Mac’s then-current operating system, hackers would have a harder time attacking Google’s Chrome browser than Safari or Firefox. Mudge aimed to make a more detailed version of such scores into something like the mandatory nutritional labels on food, telling buyers enough for them to make informed choices that reflect their priorities.

Grappling with kidney cancer that brought back his post-traumatic stress disorder, Mudge saw the project through its first year, then handed day-to-day control to Sarah, a fellow veteran of federal contractor BBN. Mudge took a day job as head of security at internet payment processor Stripe, which helped pay the bills at the nonprofit. (A September 2018 investment round would value that company at $20 billion.) In his spare time, Mudge served as cybersecurity advisor to Senator Mark Warner, cochair of the Senate Cybersecurity Caucus. “Mudge has been extremely helpful in refining our understanding of software security, which informed our work on improving the security of internet-of-things devices, to take just one example,” Warner said, referring to new classes of internet-connected gadgets such as security cameras and thermostats. Warner also served as the top Democrat on the Senate Intelligence Committee, making him the lead Democrat in the congressional investigations of Russia’s hacking to help Trump win the 2016 election. It would be logical to think that Mudge’s expertise aided Warner there as well, though neither man would discuss it with me. (Mudge had earlier advised the Democratic Party in 2016 to tighten its security, he tweeted in 2018, but most of his advice was ignored.)

The other great technical mind from cDc’s golden era, Christien “Dildog” Rioux, wound up doing something technologically similar to the work of Mudge’s lab: deeply analyzing the safety of programs without access to the source code. But he went a very different route, starting with rejecting an opportunity to work for the government and ending up doing something much bigger.

While with @stake, Christien spent a lot of time poring over binaries. Source code, which appears as it is written by the programmers, is a hundred times easier for the human eye to comprehend. But it can also hide a host of ills. Looking at ones and zeroes, though, is mind-numbing. So Christien wrote as many tools as he could to process the binaries and tell him what they were saying to the computers. That saved a lot of time while still allowing him to conduct what the industry calls static analysis of the code. As Symantec sucked @stake deeper into itself and made it harder to distinguish from the rest of the giant company, Christien decided to create a start-up to fund his quest for something of a holy grail—a program that would decompile all the binaries back into human-readable instructions for analysis.
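A toy stand-in for the idea, vastly simpler than real binary decompilation: Python’s standard-library disassembler can recover readable instructions from compiled code without ever consulting the source. The statement and variable names here are invented for the example:

```python
# Static analysis in miniature: inspect compiled instructions, not source.
import dis

# Compile a tiny statement, then pretend the source is gone and all we
# hold is the compiled code object, as Christien held only binaries.
code = compile("total = price * qty", "<captured-blob>", "exec")

# Walk the compiled instructions and report what each one tells the
# interpreter to do (load names, multiply, store the result).
for instr in dis.get_instructions(code):
    print(instr.opname, instr.argval)
```

Reading those opcodes is the mind-numbing part; Christien’s tools did the equivalent work of turning streams of machine instructions back into something a human analyst could reason about.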

From 2006, Christien served as chief scientist of the new company, called Veracode. He tapped Chris Wysopal, his colleague from the L0pht and @stake, as cofounder and chief technology officer. The business plan called for them to serve software customers instead of the makers, like Microsoft and Oracle, where there were incentives to scrimp on security. Once the master program worked well, Christien reasoned, the buyers could convince their suppliers to let Veracode do a safety analysis on the binaries. If they passed with flying colors, then the suppliers would cite Veracode’s approval as a badge of honor and recommend that prospective customers have Veracode do a new check on the most recent software version.

In theory, it was brilliant. In practice, it was a lot of work. “It was a five-year business plan that executed really well in ten years,” Christien said. One early round of funding came from In-Q-Tel, the Silicon Valley venture firm set up to serve the needs of the US intelligence agencies and led by former @stake CEO Chris Darby. Darby believed Christien would make code much more secure, and he thought Christien should deploy it inside US weapons systems, making sure that the code controlling missiles and the like could withstand most attacks by hackers.

Darby arranged for Christien to visit an intelligence installation deep underground and give a demonstration of what Veracode could do. A senior officer of clandestine operations said hello, adding, “I’m a big fan of you guys from the L0pht.” Christien thanked him. “What a nice guy,” he thought. “He probably kills people.” On a specially prepared laptop, Christien analyzed a blob of binary code that had been given him, perhaps a spying tool crafted by the agency. He let the program run during a lunch break and came back just as it spat out the results, describing what many pieces of the code did. Among other things, it detected a custom modification of a standard encryption algorithm. The polite killer was blown away. But the logistics of a major deal were daunting. Veracode could provide its program, but it could not be around to maintain it.

Darby wanted Christien to focus on optimizing the code for such deals anyway. But Christien figured that his main customers would end up being the federal government and a few close allies. “This would not be very lucrative for me, and it would have me working five hundred feet underground and never seeing the light of day,” Christien thought. He didn’t even want to go through the hassle of getting a security clearance. More importantly, “I want to have a bigger impact on the world, and I don’t see it happening in the bowels of government.”

Once Veracode decided to stay focused on the commercial world and Christien’s team cobbled together a prototype of their master decompiler, he and Wysopal started calling old friends who were now inside the big software companies. That included Brad Arkin, an @stake veteran who by 2008 was a senior director for security at Adobe Systems, perhaps the vendor most criticized in all of Silicon Valley for omnipresent software flaws. “Everyone knows your Flash player is full of bugs,” Christien told Arkin, promising to find all the problems. “We can do a scan in a month.” Arkin agreed. But the code base was a mess on a scale Christien had never seen before. In addition to regular programming foibles, Adobe had incorporated obscure encoding systems so that it could display material recorded in all kinds of formats and show it on many different devices. It kept choking the decompiler. After a month went by, Christien declared that he would not shave until he was able to complete the Flash scan. That kept the pressure on him. But it still took an entire, brutal year, and his face itched like hell. “I hate Adobe,” Christien said.

Pulling through it made Veracode’s product much better. The company added big software customers, and by working through military contractors like Boeing, it could also serve the NSA and CIA. Veracode convinced software buyers to demand that their vendors allow Veracode to audit the binaries, which were stored on extremely secure computers. The first time through the wringer, most of the software providers hated it. But instead of blowing the whistle on those suppliers immediately for major weaknesses, Veracode would give them a couple of chances to improve, along with pointers about where and how to do that.

Like those of many software and service companies, Veracode’s sales went up and down, with extra volatility around the end of the quarters because of commission incentives. After the company straightened that out, and with sales approaching $120 million a year, Veracode weighed going public. The alternative was selling itself to a company with deeper pockets that could bring Veracode to more customers. The latter ended up being the better deal, and Veracode sold itself in 2017 to CA Technologies, formerly known as Computer Associates, for $614 million. It was sold and resold in the following year, the last time for $950 million. Once installed in his new corporate home, Christien could spend more time on a side project called Hailstone, which allowed developers to test their code for security flaws as they wrote it. While Veracode’s service typically cost $10,000 a year, developers could try Hailstone free. He quit Veracode entirely in March 2019.

The largest proportion of Cult of the Dead Cow members wound up working at tech companies with people who didn’t know their history. That included Luke Benfey, Paul Leonard, Matt Kelly, Misha Kubecka, and Kemal Akman. The previously outed Josh Buchbinder still works in security in San Francisco. John Lester is in Montreal: he worked for the maker of Second Life for years, then focused on electronic tools for interactive medicine and education. Dan MacMillan turned toward business, becoming a sales and consulting executive at big software companies. Glenn Kurtzrock, who had always wanted to put bad guys in jail, served as an assistant district attorney on Long Island for seventeen years before starting a private practice in 2017. Carrie Campbell is a freelance researcher near Seattle. Cofounder Bill Brown teaches documentary film. Cofounder Brandon Brewer, once known as Sid Vicious, is as straight as it gets: senior vice president of real estate–services firm Republic Title, based in Fort Worth.

Sam Anthony went to work as a programmer in a Harvard University lab, then started graduate school there, working on biological models for computation. He earned a PhD in 2018. Along the way, he cofounded a self-driving car technology company, Perceptive Automata. Autonomous vehicles “are super good at knowing where the road is, how fast the car is going, whether something’s a tree or a person,” Sam explained. “They’re miserably bad at solving the psychology problem of guessing what’s in a human’s head. The techniques we developed while I was doing my PhD are perfect for situations where you want machine learning to do something where humans are incredible.” Sam’s company took video clips of pedestrians, showed them to humans, and asked such questions as whether the subjects were acting like they wanted to cross the road. He was using machine-learning techniques to teach computers how to understand people. By the Consumer Electronics Show in January 2019, Perceptive Automata could boast of investments by Toyota, Honda, and Hyundai.

Kevin Wheeler kept working in music for years. In addition to producing bands, Kevin pretended he had three different record labels. He would send recordings and press releases to music publications, trying to get them to write about the bands. If any of them bit, he figured he could always pay for someone to put out a real record. That gambit didn’t pay off. In 1999, he and a cDc friend moved to New York to make it big in music, but it was a tough slog. They did two off-Broadway soundtracks before Kevin’s partner met a woman and moved to Taipei with her in 2001. Then the September 11 attacks wiped out the office where Kevin worked, and he got laid off. The biggest problem was his partner leaving, because Kevin always did better as part of a team. “I’m not an Oxblood,” he said. “I was the front man. I’ll help promote it, spin it, make it funny. I’m at my most productive when I have a partner pushing me, like Gibe in the beginning.”

One other music partnership did produce a minor hit. While in high school, a budding writer named Hugh Gallagher, frustrated with unrealistic expectations for college entrance essays, wrote a wonderful spoof with cDc-style self-mocking grandiosity. It concluded: “I breed prizewinning clams. I have won bullfights in San Juan, cliff-diving competitions in Sri Lanka, and spelling bees at the Kremlin. I have played Hamlet, I have performed open-heart surgery, and I have spoken with Elvis. But I have not yet gone to college.” Besides getting Gallagher into New York University, it won him a national contest, modest fame, and writing assignments for Rolling Stone. Later, he created the character of a Belgian rapper, Von Von Von, and gave a televised performance of a song at Harlem’s Apollo Theater with music Kevin wrote and produced. Gallagher gave him a shout-out by name from the stage in a video seen more than a million times on YouTube. After flirting with a career playing poker, Kevin turned to trading currency from his apartment. He reverted to being the shy person he really was when not writing under a handle or promoting a cause or a colleague with onstage antics.

By 2018, the Cult of the Dead Cow had likewise faded into the background. The Def Con and HOPE performances had been over for years, and many hackers in their twenties had not heard of it, unless they picked it up from researching the past or heard about the group from older friends and mentors. Almost everyone had heard about hacktivism, even much of the general public, but they usually associated it with Anonymous or other cDc successors. As with any great teacher of other teachers, the most obvious legacy of the group was the actions of those who were inspired by cDc and the next generation that they reached. That included a large swath of nonprofit activists and researchers and some of the top security minds in government and industry. At Google, security team founder Heather Adkins had grown up on Internet Relay Chat channels with cDc and taken its disclosure lessons to heart. They had set the foundation, she said, for efforts like Project Zero, Google’s wide-ranging team for finding bugs in any software and setting a three-month calendar for public revelation. In four years, the group found 1,400 vulnerabilities and drove the share of flaws fixed within its ninety-day deadline from 25 percent to 98 percent. “Disclosure is more mature now, but the stakes are a lot higher,” Adkins said. “Companies have a responsibility to protect users. How do you show them the way? Historically, tech companies had only one goal, and that’s to make money. Not until someone comes in and disrupts that does it have an impact.”

Those individual cDc members who were still accomplishing great things on their own, fulfilling some of the group mission, included Oxblood, Mudge, Christien, and one more person: Psychedelic Warlord. Like the others, he saw that as technology became more central to life, the critical thinking that had grown up with it likewise needed a bigger platform.