CHAPTER SEVENTEEN

Ring of Fiber

[… How can I keep silent?] How can I stay quiet?

[My friend, whom I loved, has turned] to clay,

my friend Enkidu, whom I loved, has [turned to clay,]

[Shall I not be like] him and also lie down,

[never] to rise again, through all eternity?

TABLET X, EPIC OF GILGAMESH

In early 2009, General David Petraeus signed Joint Urgent Operational Need 336, a request to rapidly deploy the Battlefield Airborne Communications Node (BACN). BACN was a system that filled an important gap in serving soldiers operating at the very edge.1 It was a need specific to the mountains of Afghanistan and Pakistan, where peaks and valleys inhibited normal communications and created a vaporous and unacceptable nonnetworked space.2 BACN would provide a link to everyone who found themselves out of range of the Data Machine. In addition, it would be the military’s own Internet to receive, bridge, and distribute data from satellite, radio, and data networks—a universal relay and intelligence disseminator for standard and nonstandard platforms alike.

To understand BACN, we have to take just a quick journey back to the beginning, to the parts that make up the unmanned machine. Every drone consists of four distinct elements: the platform itself (whether aircraft, ground, or waterborne robot), the payload (what I call the black box—that is, the sensor or weapon), the control station (where the flight is directed from, whether it is on the ground or not), and the communications network that is required to control the platform and receive its product. External to the drone world are the processors (analysts or computers) who scrutinize the product and then the users (political decision-makers, commanders, special operatives, soldiers) who take action, the manned element of the unmanned system, who are hardly trivial.
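For readers who think in code, a minimal sketch of this decomposition may help. The Python below is purely illustrative; the class and field names are invented for this book and correspond to no actual military schema.

```python
# Illustrative only: a hypothetical model of the four elements of an unmanned
# system described above, plus the external processors and users. Names and
# fields are invented for clarity; they reflect no actual military schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Platform:
    kind: str           # "aircraft", "ground", or "waterborne"
    name: str           # e.g., "Global Hawk"

@dataclass
class Payload:
    black_box: str      # the sensor or weapon carried by the platform

@dataclass
class ControlStation:
    location: str       # where the flight is directed from

@dataclass
class CommsNetwork:
    links: List[str]    # satellite, radio, and data links in use

@dataclass
class UnmannedSystem:
    platform: Platform
    payload: Payload
    control: ControlStation
    network: CommsNetwork
    # External to the drone itself: the manned element of the unmanned system.
    processors: List[str] = field(default_factory=list)  # analysts or computers
    users: List[str] = field(default_factory=list)       # commanders, soldiers, etc.
```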

If all of this were a tactical system—that is, simply serving one user—then the entire system could be relatively self-contained. But think of drones instead as a set of computing appliances (smartphones, tablets, laptops, desktops, etc.), all overlapping: some are indeed used offline and are personal, but the majority are connected to some network and then to the Data Machine, which demands constant data flowing through it like blood flowing through a living body. In the olden days, the military erected its own terrestrial and then space-based communications networks, and it still has many such networks today. But today, most military communications demand access to a network to operate. Where the networks are robust or can be supplemented by military-only systems, communicating is manageable. But where the network is lacking, or when the number of appliances connecting to it surpasses capacity, something different has to be created. And just as, by the mid-2000s, no operation would be undertaken without some drone flying overhead, now no one can be out of network range.

BACN, like other black box systems, really has no simple definition or description, no birth date, and no single identity. On October 24, 2003, its manufacturer, the defense giant Northrop Grumman, conducted a first set of communications between a Global Hawk drone and a manned airborne command post in the skies above California.3 Its system, called the Advanced Information Architecture, was a ROVER-like setup that allowed the drone to send imagery directly to the command aircraft but then also connected to everyone within range of the network to share what it was seeing. It facilitated not just faster and more personal provision of intelligence but also the automatic layering of different types of intelligence. By creating its own IP-based airborne network in the sky, BACN avoided the expensive and bandwidth-intensive transfer of imagery to processors far away. In short, BACN was a self-contained intelligence agency extending and speeding up the process, whatever it was.

Northrop got its first BACN contract in 2005, and it flew an experimental relay in February 2006. The black box was fitted onto a NASA-owned manned WB-57, the same plane used to map all of Afghanistan with hyperspectral precision. Flying over Southern California at 60,000 feet, BACN created a “forward-edge tactical server.” Marines on the ground tapped into real-time imagery and intercepts from collectors near and far, pulling down common situational awareness displays and current intelligence, and gaining access to network-management services, including the ability to send e-mails and make cellular calls over a military-only network completely divorced from the commercial Internet.4 In December 2008, a BACN-converted manned Bombardier business jet deployed to Afghanistan to serve as the quick-reaction capability to test the system operationally. Flying over a special operations or CIA mission otherwise taking place in a network dead zone, the airplane could provide improvised connectivity for hours at a time. It could pull down what users at the edge needed directly from whatever was flying that possessed the right data. If a soldier queried the node and it did not have the requested data, its server would go out and poll other servers in the network to obtain it. BACN emerged not only to serve the soldiers at the edge and the new culture of constant contact but also because it was far cheaper and more flexible than leasing commercial communications. The Data Machine could now be extended anywhere, regardless of local capacity and without resort to commercial leases.
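A rough sketch of that query-and-poll behavior follows, assuming a simple in-memory store and a list of peer nodes. The names and data are invented; this is in no way BACN's actual software, only an illustration of the relay pattern described above.

```python
# Hypothetical sketch of the relay behavior described above: an airborne node
# answers queries from its local store and, when it lacks the requested data,
# polls peer servers on the network to obtain it. Purely illustrative.
from typing import Dict, List, Optional

class RelayNode:
    def __init__(self, name: str, store: Dict[str, bytes], peers: List["RelayNode"]):
        self.name = name
        self.store = store      # data already held aboard this node
        self.peers = peers      # other servers reachable on the network

    def query(self, key: str) -> Optional[bytes]:
        """Return the requested data, polling peers if it is not held locally."""
        if key in self.store:
            return self.store[key]
        for peer in self.peers:                 # go out and poll other servers
            data = peer.store.get(key)
            if data is not None:
                self.store[key] = data          # cache it for the next requester
                return data
        return None                             # nothing on the network has it

# Usage: a soldier's query reaches the airborne node, which fetches from a peer.
collector = RelayNode("collector", {"grid-42-imagery": b"..."}, [])
bacn = RelayNode("bacn", {}, [collector])
assert bacn.query("grid-42-imagery") == b"..."
```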

In June 2009, Northrop Grumman received a quarter of a billion dollars for three additional BACN Bombardiers and two unmanned Global Hawks newly outfitted with BACN capabilities.5 By November 2010, two Global Hawk BACNs, each with 300-mile-radius coverage when airborne, were providing about 50 percent of the requested 24/7 network support for the edge. The drones could stay up for days at a time. By the end of 2012, more than 3,000 missions had been flown.6 In late 2012, a year after the last of 300,000-plus American soldiers and contractors left Iraq, the Pentagon formally christened the now multibillion-dollar converted Global Hawk the EQ-4.7 No one noticed the party. In fact, in the eleven years of its existence from concept to deployment, from 2003 until this writing in late 2014, no mainstream media outlet has ever reported on the existence of this now-multibillion-dollar tool for waging war anywhere and anytime.

Perhaps one of the reasons for the media’s silence is that this system—described by its manufacturer as platform agnostic, radio agnostic, and untethered—is virtually impossible to describe. Global Express is the closest one comes to a military nickname that sticks, but in a stroke of geographic indifference to mountainous Afghanistan, the overall system has been tagged Desert Express. So it is not a weapon, not a sensor, and though Global Hawk is host, it is not really a drone in the way most people think about drones. As simply as can be defined, it is an alternate and exclusive military Internet in the sky, essential to shore up a weak spot in the Data Machine but really a secret agent of the vision of precision without location, loitering transformed into perpetual war-making.

From the first night of Afghanistan bombing in October 2001, when everyone boggled over the all-seeing eye for the first time, decision-makers at the CIA, the Pentagon, the White House, and command centers near and far were glued to their own DNN, the drone news network, everyone fully in thrall. Video was of course the simplest explanation, spawning epithets of Predator porn and “CAOC crack,”8 but what really appealed to a television-watching and image-obsessed generation was persistence. General Jumper called it the buzzword of the decade in 2003.9 Arguably the most important strategy document that the Pentagon prepares, the Quadrennial Defense Review, in 2006 argued that future capabilities needed to favor “systems that have far greater range and persistence; larger and more flexible payloads for surveillance or strike; and the ability to penetrate and sustain operations in denied areas.”10 BACN is the facilitator of anywhere and anytime. Now all that was needed was all the time.

The concept of persistence requires yet another family of black box sensors. Predator is up there like no other, but it provided far less than the persistence that was envisioned, at least beyond extremely narrow individual targeting that came from looking through a soda straw. It’s an “immediate-time kind of reporting,” one air force officer said, “of viewing exactly what’s going on with whatever your selected target is—whether that’s a house, a building, a vehicle moving down the road, whatever that is—you are able to then sit there and watch that. It’s very small. So I just see one vehicle or two or three vehicles at the most, but my field of view just isn’t that big on the ground.”11 Not only did the Predator camera show a limited perspective, but the raw imagery from the moving platform proved not so easy to interpret, the thirty- to forty-five-degree angle constantly changing as the drone moved. Scientists went to work on better processing, developing software and hardware that would provide georeferences (what we today call metadata) and even a converted top-down perspective that matched a scene-based correlation, virtually all of the advances being borrowed from graphics processors used in gaming applications. The other two avenues of attack were increasing the breadth of the perspective (wide area) and providing higher resolution, thus allowing greater exploitation of each imaged scene by the naked eye.
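The converted top-down perspective mentioned above is, in essence, a perspective warp of the oblique camera view onto the ground plane. Here is a minimal sketch of the idea, assuming OpenCV and four invented ground reference points; a real system would derive those correspondences from platform position, sensor angle, and terrain data rather than hard-code them.

```python
# Illustrative sketch of warping an oblique drone frame to a top-down view
# using a planar homography. The four point correspondences are invented;
# real systems derive them from platform position, sensor angle, and terrain.
import numpy as np
import cv2

def to_top_down(frame: np.ndarray) -> np.ndarray:
    # Pixel locations of four ground landmarks as seen in the oblique frame...
    src = np.float32([[120, 400], [520, 390], [600, 470], [60, 480]])
    # ...and where those same landmarks sit in a map-aligned, top-down grid.
    dst = np.float32([[0, 0], [500, 0], [500, 500], [0, 500]])
    H = cv2.getPerspectiveTransform(src, dst)      # 3x3 homography matrix
    return cv2.warpPerspective(frame, H, (500, 500))

oblique = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a video frame
top_down = to_top_down(oblique)                    # scene-aligned overhead view
```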

Sonoma was the first experiment in widening the perspective, developed starting in 2003 by Lawrence Livermore National Laboratory in California. Using a novel mosaic-like sensor design that could view a wide area at high definition, the first prototype carried a 22-megapixel sensor (six times Predator’s resolution), the second a 66-megapixel sensor, and the third a 176-megapixel sensor, each capable of imaging a larger and larger area in a single frame. Where the normal sensor on Predator could image the area of a city block, Sonoma 2 could cover an area the size of downtown Washington, DC, and Sonoma 3 could see the entire metropolitan area. Such wide-area high-definition imaging exposes every corner. In one of the initial Sonoma experiments, an IED scenario was created—Red Team Intent—that assumed that any car that slowed down to five miles per hour for more than 100 feet was suspicious. Software was written that highlighted the path of all vehicles matching this signature. Then, once the pattern was triggered, an analyst could rewind the video and discover where a suspicious vehicle came from. And Sonoma could track 8,000 simultaneous moving objects.
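A toy version of that signature rule looks something like the sketch below, assuming each vehicle track is a time-stamped list of positions in feet. The thresholds come from the scenario described above; everything else (the track format, the units, the test data) is invented for illustration and is not Sonoma's actual software.

```python
# Toy implementation of the Red Team Intent signature described above: flag any
# vehicle track that stays at or below 5 mph for more than 100 feet of travel.
# Track format and units are assumptions made for illustration.
import math

FPS_PER_MPH = 5280.0 / 3600.0        # feet per second in one mile per hour
SLOW_MPH = 5.0
SLOW_DISTANCE_FT = 100.0

def is_suspicious(track):
    """track: list of (t_seconds, x_feet, y_feet) samples in time order."""
    slow_run = 0.0                                   # contiguous slow travel, in feet
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        speed_mph = dist / (t1 - t0) / FPS_PER_MPH
        if speed_mph <= SLOW_MPH:
            slow_run += dist
            if slow_run > SLOW_DISTANCE_FT:
                return True                          # highlight this vehicle's path
        else:
            slow_run = 0.0                           # reset once the car speeds up
    return False

# A car creeping along at roughly 3.4 mph for 150 feet trips the signature.
creeping = [(t, t * 5.0, 0.0) for t in range(31)]    # 5 feet per second, 30 seconds
print(is_suspicious(creeping))                       # True
```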

It was true persistence, but in order for surveillance to be useful, an analyst must be able to see the data in real time. As the Livermore laboratory explained, “all data processing for one frame must be completed before the next frame is captured.” With data being collected at two frames per second, Sonoma’s data exceeded the bandwidth of available communications by a factor of 100 to 10,000. So scientists applied various techniques, including data compression, to show only movement (or anomalies), while the georeferenced static background was only episodically transmitted to match what the sensor was seeing.12
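One hedged sketch of that approach: difference each new frame against a reference background, transmit only the pixels that changed, and resend the full georeferenced background only occasionally. The threshold, frame size, and cadence below are assumptions chosen for illustration, not Sonoma's actual parameters.

```python
# Illustrative change-detection downlink: transmit only pixels that move
# relative to a reference background, and resend the full background only
# episodically. Threshold, frame size, and cadence are assumptions.
import numpy as np

THRESHOLD = 25            # minimum brightness change to count as "movement"
BACKGROUND_EVERY = 120    # frames between full background retransmissions

def frames_to_send(frames, background):
    for i, frame in enumerate(frames):
        if i % BACKGROUND_EVERY == 0:
            yield ("background", frame)              # occasional full frame
            background = frame.astype(np.int16)
            continue
        diff = np.abs(frame.astype(np.int16) - background)
        moving = diff > THRESHOLD                    # mask of changed pixels
        ys, xs = np.nonzero(moving)
        # Sparse update: row, column, and new value for each moving pixel only.
        yield ("delta", np.column_stack([ys, xs, frame[moving]]))

# Two identical frames after the background: the delta messages are nearly empty.
frame = np.zeros((1080, 1920), dtype=np.uint8)
sent = list(frames_to_send([frame, frame, frame], frame.astype(np.int16)))
```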

Sonoma turned into the Mohawk Stare experiment for the army and then into Constant Hawk, and in 2006, a prototype Constant Hawk wide-area persistent surveillance (WAPS) system was quietly deployed in Iraq, owned and operated by contractors.13 Constant Hawk could record and archive sensor data, allowing incidents such as roadside IED blasts to be played back and reviewed. Once an event occurred, the data was downloaded, and analysts attempted to backtrack from the incident, tracing bomb-makers and insurgents who might have deployed the IED and, if possible, following them backward even to their points of origin. This method of working backward to pick up clues is called forensic analysis. This was warfare completely turned on its head. Constant Hawk was an immediate hit. But the experimental black box was integrated on a manned airplane and not a drone, giving it limited time in the air. And it still produced enormous amounts of data, much more than could be moved very far, and in formats useful only for demonstration.14

Then, as these things go, the Los Alamos National Laboratory in New Mexico produced Angel Fire for the marines—smaller and more user-friendly—and other wide-area and persistent programs came knocking.

More black boxes meant more data. And the introduction of wide-area surveillance, and particularly high definition,15 exponentially increased the amount of information available. Collection outpaced the ability to move the information, store it, or process it.16 As a result, the Pentagon admitted in 2009 that it was drowning in data. It was now looking at hundreds of terabytes of new data coming in every day. That’s over 800 laptops with the typical 128-gigabyte solid-state drive, and more than the total of all the terabytes collected by the Library of Congress Web teams.17 “We’re going to find ourselves in the not too distant future swimming in sensors and drowning in data,” said Lieutenant General David Deptula, head of air force intelligence, in January 2010. And within a couple of years, Reapers would be carrying their own wide-area black boxes that would be able to track up to twelve different targets simultaneously, delivering 84 million pixels twice a second. “The iteration after that will jump to 30 and there are plans to eventually reach 65. That’s an increase from 39 possible video feeds [from Predators and Reapers] to more than 3,000 with a 50 cap force,” Deptula said.18 Data pipes were filled and storage was approaching saturation levels.19
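The arithmetic behind the drowning is simple enough to check on the back of an envelope. The sketch below assumes roughly one byte per pixel and no compression, assumptions chosen only to show the order of magnitude, not to reproduce any official calculation.

```python
# Back-of-envelope check of the figures above. One byte per pixel and zero
# compression are assumptions made purely to illustrate the order of magnitude.
LAPTOP_SSD_GB = 128
laptops = 800
print(laptops * LAPTOP_SSD_GB / 1000)        # ~102 TB: on the order of the
                                             # "hundreds of terabytes" arriving daily

pixels_per_frame = 84_000_000                # 84 million pixels...
frames_per_second = 2                        # ...delivered twice a second
bytes_per_day = pixels_per_frame * frames_per_second * 86_400
print(bytes_per_day / 1e12)                  # ~14.5 TB per day from a single sensor,
                                             # before any of it is stored or moved
```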

The next month, BAE Systems announced successful flight tests of its ARGUS-IS, a 1.8-billion-pixel camera with a resolution of six inches that can see a minimum of sixty-five “Predator-like” video windows across more than 100 square kilometers.20 And ARGUS would transmit at five times the frame rate of Constant Hawk, ten times a second.21 One minute of high-definition video of a city block already demanded one gigabyte; an 800-megapixel image of a small city—the resolution required to extract intelligence information at specific locations—demanded half a terabyte per minute; ARGUS-IS, operating at 1,800 megapixels, could image a large city, demanding half a petabyte per minute of bandwidth if all of the data were transmitted.22

BACN was pursued because everyone saw saturation coming, because there was a demand for far more bandwidth and data.

Part of the problem is the haystack itself. When 9/11 came, there were about 450 million Internet users and close to one billion mobile connections in use around the globe, sending about 10 billion electronic messages daily, 10 percent of them text messages. By 2014, the planet was closing in on three billion Internet users and the number of mobile connections was estimated at 7.5 billion, with only about 5 percent of them in the United States. Internet use was no longer dominated by people sitting at computers; in most parts of the world, particularly in places like Afghanistan and Iraq, the vast majority of Internet access, including everything from communications to banking, was achieved using smartphones. By 2014, the number of electronic messages sent daily topped 500 billion. In the decade and a half after 9/11, the numbers multiplied many times over, with each development—digital DVDs replacing analog videotape, digital radio and television, high-definition, social media, and people living online—exerting greater and greater demands for bandwidth and presenting an infinite universe of data to be collected.

Everyone, including the custodians and residents of the Data Machine, is now drowning in information. The number of all kinds of manned and unmanned collection platforms tripled in the two years after 9/11 and continued to grow after the Iraq war started, increasing by over 200 percent from the end of the Bush administration until 2012.23 Just in terms of combat flight hours, drones increased from a total of around 22,000 in 2001 to over 550,000 in 2011.24 The demands for intelligence became so great, and the capacity to collect information proliferated so broadly, that by 2013, there were three times as many platforms in Afghanistan as there had been at the height of operations in Iraq, even though the fighting force on the ground was only one-fifth the size of the force that had deployed to Iraq.25

Those in the know describe just the amount of visual data collected every day as five seasons’ worth of every professional football game played—thousands upon thousands of hours. The data moves around the globe multiple times, first for “actionable” purposes, which means in support of an immediate high-value mission. The data then moves to be processed for second-phase and multi-INT exploitation. It then moves to contribute to geospatial products. It then moves to park itself somewhere on the network. And it then moves whenever someone pulses the system, secret Googles that go under names like Stone Ghost, Gemini, and Hercules. On a daily basis, the Data Machine produces hundreds of thousands of reports, many of which require no human intervention whatsoever.

All of this data is now constantly on and fully dynamic, moving from desktops to handheld ROVERs and ginormous video walls in fusion centers, occupying chat, e-mail, and Web services for processors and users all along the way. It is a wholesale change in culture that has quietly taken hold in the military and intelligence communities, one where information—data—came to dominate, where it was seen as key to soldier safety and discriminate warfare. Yet despite the coming end of the big wars in Iraq and Afghanistan, despite the directive to stop buying platforms, and despite the saturation that was affecting movement and storage, no one could seem to find a limit, a point when or where information ended. Years later, when Edward Snowden brought to light the NSA’s infinite collection of signals, the broader impact (and appetite) of the Data Machine was lost in discussions of the legality and privacy of eavesdropping and cyberdata interception. The way the Data Machine itself works also imposes enormous demands of its own, not just the post-9/11 cult of connect the dots and the kill chain perfected, but also the human factors—user friendliness and interactivity that make the machine workable for a generation of digital natives, seamless production values that now mask the drivel of most of the content.