He is fair in manhood, dignified in bearing,
graced with charm in his whole person.
He has a strength more mighty than yours,
unsleeping he is by day and by night.
TABLET I, EPIC OF GILGAMESH
In an obscure office building tucked behind the Fair Oaks Mall in Fairfax, Virginia, and at highly secure data centers in a half dozen locations from Maryland to California, is the national signatures pool, a massive electronic library that catalogs hundreds of thousands of signatures, the digital marks of our entire world.1 The signatures database, which has been meticulously collected for decades, catalogs the distinguishing features of everything, civilian and military, foreign and domestic, from weapons to vehicles to fabrics to vegetation to individual people.
Everything gives off a spectral signature—a house, a car, a knife—observable in countless regions of the electromagnetic spectrum. As one official of this secretive world says, a signature “is a distinctive basic characteristic or set of characteristics that consistently re-occurs and uniquely identifies a piece of equipment, activity, individual, or event.”2 Because all material reflects, absorbs, or emits photons as a consequence of its molecular makeup, a high-resolution deconstruction of the intensity of that reflected and emitted energy can form a rendering unique to any given material.3
The collection of signatures goes back to the days when the Soviet enemy was behind an iron curtain and the intelligence wizards needed to come up with innovative and even elliptical methods to acquire information. The earliest days of atomic fission spawned a special type of sleuthery, with aircraft and satellites sniffing out rare isotopic concentrations to discover the existence of nuclear tests and then even to characterize the makeup and capabilities of hidden nuclear weapons. These techniques of scientific detection and technical intelligence took on the name measurement and signature intelligence (or MASINT).4 MASINT collection and analysis never had the allure of human intelligence, the wonder of imagery, or the capacious plenty of signals intelligence, even if it did provide the possibility of seeing into the beyond. Instead, it served as a kind of technical back end of the nuclear age, with every enemy weapon given added character by its radioactive return or other chemical signature; every target characterized not just by location and size but also by its physical composition. Finally, friendly weapons systems were made “signature dependent,” that is, sent off through progressions of sensing enemy signatures and making arcane calculations to precisely find, locate, attack, and assess—each act building on the last.
Detection augmented normal seeing and hearing: infrared detection of the plume of a missile, acoustic detection of the sound emitted by a submarine, electro-optical detection of laser light, materials sampling to detect the presence of chemical or biological agents. At the height of the Cold War, the emerging “INTs” that coalesced into the whole of MASINT drew in highly qualified scientists with a wide array of specialties.
Scientists are needed because MASINT differs from “normal” intelligence in that what is seen is inferred from physical characteristics—it is not just what something looks like to the naked eye. Where photographs rely on the literal extraction of information by a human, MASINT deals with nonliteral exploitation.5
Any sensor, its capability determined by its size, weight, and sophistication, measures reflected energy as a function of spectral and spatial resolution, observation time, and frequency of observation, processing the resulting data to highlight different spectra against a static background.6 Every sensor collects energy that bounces off an object. And in the electronic era, every sensor converts its returns into digits, a series of picture elements (or pixels), which are themselves just zeros and ones.7
Unless a target is visibly observed by the human eye, something has to translate what a nonliteral sensor detects into what we “see” when we think of seeing. When normal people think of radar, they imagine pulses of energy sent out and a simulation of a physical shape formed in the reflective returns: it’s an airplane in the sky, a tank on the ground, etc. And that’s indeed how it all started. What is physical and can be seen, even at long distances, is what is reflected. But fast-forward to the modern day: What if the object you are trying to “see” is tiny, or nonreflective, or moving? And what if you can’t send out a beam of energy to pulse it because that would make you vulnerable to observation and attack yourself? There are a gazillion permutations and steps in the underlying physics, but that’s basically how nonliteral detection emerged as a supplement to the visual and the physically reflective.
Now to see it: the reflected energy travels in wavelengths and must be received by a sensor that can translate those waves into something understandable to humans. What is visible to the human eye is a narrow slice of the electromagnetic spectrum: three bands of red, green, and blue light (known as RGB). An infrared sensor measures wavelengths adjacent to the visible bands in the spectrum.8 A multispectral sensor can monitor reflected energy in ten spectral bands of visible and infrared light. Hyperspectral imagers, the most complex and with the broadest view, monitor spectral bands numbering up to 200 or more. This includes reflected energy in the ultraviolet (UV), visible, near-infrared (NIR), and short-wave infrared (SWIR) portions of the electromagnetic spectrum, as well as the emitted energy in the mid-wave infrared (MWIR) and long-wave infrared (LWIR) portions of the infrared spectrum.9
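To make those regions concrete, here is a minimal illustrative sketch in Python. The wavelength boundaries are the commonly cited approximate values, not figures drawn from this chapter’s sources, and the helper names are invented for the example.

```python
# Illustrative only: approximate wavelength ranges (in micrometers) for
# the spectral regions named above, and a helper that classifies a wavelength.
SPECTRAL_REGIONS = {
    "UV":   (0.01, 0.40),   # ultraviolet
    "VIS":  (0.40, 0.70),   # visible: the blue, green, red bands the eye sees
    "NIR":  (0.70, 1.40),   # near-infrared
    "SWIR": (1.40, 3.00),   # short-wave infrared (reflected energy)
    "MWIR": (3.00, 8.00),   # mid-wave infrared (emitted energy)
    "LWIR": (8.00, 15.0),   # long-wave infrared (emitted energy)
}

def region_of(wavelength_um: float) -> str:
    """Return the named spectral region containing a wavelength, if any."""
    for name, (lo, hi) in SPECTRAL_REGIONS.items():
        if lo <= wavelength_um < hi:
            return name
    return "outside tabulated regions"

# A hyperspectral imager samples hundreds of narrow, contiguous bands:
bands = [0.4 + i * 0.01 for i in range(200)]  # 200 bands, 0.4 to 2.4 um
print(region_of(bands[0]), region_of(bands[-1]))  # VIS SWIR
```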
Multispectral imaging (MSI) has been used in the civil world for decades to observe everything from general land cover to detailed species identification. In the 1960s, scientists confirmed that reflectance measurements by multispectral airborne and space sensors permitted the identification of the mineral makeup of rocks, soils, and vegetation. In weather forecasting, MSI is used to detect cloud droplets, ice particles, dust, ash, and smoke, each of which can then be associated with specific frequencies. MSI can also monitor wavelengths over broad areas to characterize terrain and man-made features, a technique in widespread use by the military in mapmaking.
Hyperspectral imaging (HSI), on the other hand, collects energy from a wider section of the electromagnetic spectrum and from many narrower bands simultaneously, from infrared across the visible to ultraviolet. Because a hyperspectral sensor can sample spectral signals reflected and emitted from the same area, it can even separate atmospheric signals from ground signals, essentially “seeing” through clouds.10 It wasn’t until 1989 that the first hyperspectral imager was flown,11 and today, HSI is the most complex form of MASINT.12
To fully understand the world of signature-derived intelligence, it is useful to think of the domain name system (DNS) that orders the Internet. When a Web address (a URL) is typed into a browser or clicked on as a hyperlink, an international library of numeric Internet Protocol (IP) addresses is instantly searched, translating the public and easily remembered domain name into the IP address—the site’s signature, if you will—of the Web server that hosts it. And because the DNS is its own network, if one server is unable to translate a particular domain name, it asks another at a higher echelon, and so on, until the correct IP address is returned.
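For readers who want the analogy pinned down, a minimal sketch using Python’s standard library shows the translation the passage describes; the domain here is just a placeholder.

```python
# Translate a human-readable domain name into the numeric IP address
# that actually locates the server, using the standard library resolver.
import socket

domain = "example.com"                     # the memorable public name
ip_address = socket.gethostbyname(domain)  # recursive DNS resolution
print(f"{domain} -> {ip_address}")
# If the local resolver cannot answer, it asks a server at a higher
# echelon, and so on, until an authoritative answer comes back; this
# escalating lookup is what the signature analogy leans on.
```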
A hyperspectral signature can be thought of in the same way. Since every object reflects and absorbs light in different ways, the amount and type of radiation reflected directly relate to an object’s surface chemical and physical characteristics, illumination factors, and atmospheric properties.13 In intelligence terms, the ultimate goal in imaging is to produce a complete reflectance spectrum for each pixel, an achievement that can only come from hyperspectral imaging.14
Hyperspectral imaging, then, can simply be described as a type of remote sensing that uses the powerful information contained in the full-spectrum signature of an object (that is, its total reflective makeup). Not surprisingly, the most important feature of hyperspectral imaging for military and intelligence purposes, in addition to its complexity, is that only a few highly expensive black boxes, on a small population of highly classified platforms, are able to practically collect the data and translate it into images.15 HSI is a multistep process involving an enormous amount of imaging and computing, but in the end, say, in the case of IEDs, spectral signatures related to bombs and the techniques used to hide them are collected and validated against ground truth to populate data sets of objects of interest. What makes the data instantly available and militarily relevant is the database of spectral signatures that underlies the whole process.
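The chapter does not disclose how the classified black boxes do their matching, but a standard open-literature technique gives the flavor: the spectral angle mapper, which treats each pixel’s spectrum as a vector and scores entries in a signature library by the angle between vectors. The mini-library, band count, and threshold below are hypothetical.

```python
import numpy as np

def spectral_angle(measured: np.ndarray, reference: np.ndarray) -> float:
    """Angle (radians) between two spectra treated as vectors.

    A smaller angle means a closer spectral match; the measure is
    insensitive to overall brightness, which varies with illumination."""
    cos = np.dot(measured, reference) / (
        np.linalg.norm(measured) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical mini-library: one reflectance value per spectral band.
library = {
    "desert_soil":   np.array([0.30, 0.35, 0.40, 0.45, 0.48]),
    "green_paint":   np.array([0.05, 0.25, 0.08, 0.30, 0.32]),
    "canvas_fabric": np.array([0.12, 0.15, 0.18, 0.40, 0.42]),
}

def best_match(pixel, library, threshold=0.10):
    """Return the closest library signature, or None if nothing is near."""
    name, angle = min(
        ((n, spectral_angle(pixel, ref)) for n, ref in library.items()),
        key=lambda item: item[1])
    return name if angle < threshold else None

pixel = np.array([0.06, 0.26, 0.09, 0.29, 0.33])  # a measured spectrum
print(best_match(pixel, library))  # -> green_paint
```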
While a multispectral sensor might indicate the presence of an object such as a vehicle, a hyperspectral sensor can also detect whether it’s metal or plastic, what kind of metal it’s made from, the color and type of paint it has, and the amount of moisture it contains. A multispectral image might differentiate between desert and farmland, separating features in the near-infrared region because healthy vegetation, thanks to its chlorophyll, reflects near-infrared energy to a far greater extent than any other feature. A hyperspectral image of the same farmland can differentiate a barley crop from potatoes, detect stressed vegetation, and even determine soil composition.
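The vegetation contrast described here is the basis of a standard, unclassified measure, the normalized difference vegetation index (NDVI); a minimal sketch with synthetic band values follows.

```python
import numpy as np

# Two bands from a synthetic multispectral scene: red and near-infrared.
# Healthy vegetation absorbs red light (chlorophyll) and strongly reflects
# NIR, so the normalized difference separates plants from bare ground.
red = np.array([[0.08, 0.30],    # left pixels: crop, right pixels: desert
                [0.07, 0.28]])
nir = np.array([[0.55, 0.35],
                [0.60, 0.33]])

ndvi = (nir - red) / (nir + red)   # ranges from -1 to +1
print(np.round(ndvi, 2))
# Values near +0.7 flag vegetation; values near +0.1 flag bare soil.
# Hyperspectral data goes further: dozens of narrow bands across the same
# region can separate one crop species, or one stress level, from another.
```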
The Pentagon first initiated research on a hyperspectral sensor that would be able to return near-real-time data in 1991, initially to fill a need experienced in Cold War Europe and later in Bosnia and Kosovo: to see what was hidden in shadows and under trees.16 The Hycorder black box was flown in October 1994 and June 1995, the first of an unmanned hyperspectral generation that would begin to open the way for the fighting man to see in a completely new manner. The imaging radio-spectrometer was fitted on board a navy Pioneer drone that flew over the White Sands Missile Range in New Mexico and Yuma, Arizona. Reflective targets with known signatures were precisely placed on the ground, and the spectral information was downlinked to a visualization and analysis system that processed the continuously running video using the finest computer of the day, a Pentium Pro PC. Desert Radiance, as the experiments were called, proved the feasibility of detecting a tactical target by its unique spectral signature.17 Desert Radiance was followed by Forest Radiance, Island Radiance, and Littoral Radiance, each planned collection operation a proving ground for the calibration of aerial sensors and processors and the building of a larger and larger signature library.
As part of the Hyperspectral MASINT Support to Military Operations (HYMSMO) umbrella program started in the late 1990s, different hyperspectral sensors were flown to explore tactical detection and classification of potential military targets. Hyperspectral imagers were placed on manned aircraft and in space,18 each attempting to increase spatial resolution and signal-to-noise ratios to militarily useful levels. In each case, a series of runs was flown in which tanks and other military vehicles were precisely placed on a targeted terrain, or fabric and painted target panels were used to simulate camouflage. Ground truth measurements were also taken simultaneously, from towers and other airborne platforms, to compare the reflectance of the surface to the energy recorded by the imaging sensor. And in 1997, blind testing was introduced, that is, using hyperspectral imaging to find hidden objects. Overall detection success rates were nowhere near the level needed for combat.19 And HSI continued to be conceived in Cold War terms, detecting the evidence of weapons of mass destruction manufacture or deployment through the presence of plumes or runoff, or in strictly conventional military terms, as countercamouflage—detecting objects that were intentionally hidden from sight.
The WARHORSE black box flew on board Predator a year before 9/11.20 It’s another acronym, of course, for Wide Area Reconnaissance Hyperspectral Overhead Real-Time Surveillance Experiment. It is a hyperspectral sensor that images from approximately 10,000 feet, with a collection process that entails21 a massive amount of data, far beyond anything seen with Global Hawk imagery or synthetic aperture radars or even operational multispectral sensors.
Just one frame of a hyperspectral imager is on the scale of 20 gigabytes or more. Before WARHORSE, this huge amount of data was stored on digital tapes, which were then mailed to the appropriate organizations for processing, with intelligence returning days or weeks or even months after the image was taken.22 With WARHORSE, the data were collected, calibrated, corrected, and presented so that when a spectral signature in the stored database harmonized with something being processed (or in other words, when the IP address of a sought-after website was matched), a camera on board Predator simultaneously took a still image (or “chip”) of the same scene, the image itself being modified to create false color variations so that it was visible to the human eye. In this way, WARHORSE could provide tip-offs or cueing of other sensors. However, the hyperspectral data was still so complex that it needed to be processed elsewhere for further exploitation.
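The false-color step can be sketched simply: pick three bands from the hyperspectral cube and map them onto the red, green, and blue channels of an ordinary image. The cube dimensions and band choices below are hypothetical stand-ins, not WARHORSE specifics.

```python
import numpy as np

def false_color(cube: np.ndarray, band_r: int, band_g: int, band_b: int):
    """Map three chosen bands of a hyperspectral cube (rows x cols x bands)
    onto the red, green, and blue channels of an ordinary image."""
    rgb = np.stack([cube[:, :, band_r],
                    cube[:, :, band_g],
                    cube[:, :, band_b]], axis=-1).astype(float)
    # Stretch each channel to the 0-1 range so the invisible becomes visible.
    rgb -= rgb.min(axis=(0, 1))
    rgb /= np.maximum(rgb.max(axis=(0, 1)), 1e-9)
    return rgb

# Hypothetical 64x64 scene with 200 bands; in a real system the chosen
# bands would be the ones in which a matched signature stands out.
cube = np.random.rand(64, 64, 200)
chip = false_color(cube, band_r=150, band_g=80, band_b=20)  # SWIR/NIR/VIS
print(chip.shape)  # (64, 64, 3): an image a human can look at
```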
Enter the Signatures Support Program. A program that had always been dominated by strategic nuclear and “national” collection shifted to conventional and even unconventional war in the late 1990s. The decades-old collection of signatures started to look at dynamic phenomena, that is, the signatures of real-time events and activities immediately relevant to the fighting man and woman.23 Hyperspectral imaging, if it could be made practical and cost effective, would allow a way to see through clouds and under trees, to detect what was underground or underwater, to find what the enemy was trying to hide, and even to rescue a friendly downed pilot hiding behind enemy lines (by detecting a previously applied reflective “tag”).24 But the only way that hyperspectral imaging could be turned into anything beyond a science project was to rely on onboard processing of the enormous amount of data generated to extract only the bits needed to identify prospective targets. WARHORSE was the first step.
When war in Afghanistan began in 2001, every form of intelligence collection in this remote and unknown land was employed, including experimental hyperspectral sensors. First to enlist in Afghanistan was NASA’s satellite-based Hyperion sensor, which was used to assess bomb damage by comparing before-and-after scenes of difficult places that had been struck, such as tunnels and caves.25 Hyperspectral imaging was also able to detect concentrations of carbon dioxide in cave-riddled areas and thereby possibly signal the presence of humans.26 When the first sensors were applied for tactical detection, in Afghanistan and then in Iraq, they were also shown to be able to find buried IEDs by detecting the presence of disturbed dirt or by using “change detection” techniques to go back and see anomalies of military significance. Common types of IEDs were also directly detected through signature matching, particularly as the IED library grew.27
Hyperspectral products were not quite in the hands of the war-fighter because of security classifications and scarcity, and a real challenge to overcome was bandwidth, given how much data had to move through the networks. But the Pentagon, sufficiently optimistic about the prospect of real-time imaging, in 2002 approved the HyCAS, or Hyperspectral Collection and Analysis System, technology demonstration, a five-year program that would assess the feasibility of spectral data as a source of regular tactical intelligence, while also figuring out ways of incorporating HSI sensors into the day-to-day workings of the Data Machine.28 As Sue Payton, the Pentagon’s head of advanced systems, said upon unveiling HyCAS, the United States now had hyperspectral sensors on aircraft and even in space. HyCAS included sensors on Global Hawk, on Predator, and on manned navy P-3 aircraft.
In 2007, the US Geological Survey conducted HALO Falcon, a sweeping hyperspectral survey of Afghanistan that collected data from an altitude of 50,000 feet.29 The public announcement was that the mission was designed to assess Afghanistan’s natural resources, such as coal, water, and minerals, and that no less than President Karzai had requested the mission.30 The true purpose of the mission was to build a complete snapshot signature of the country in order to form a baseline that intelligence collection of the future could rely upon. In other words, the purpose was to create a library of the entire country’s broad signature.
Experimentation in the United States continued as collection accelerated in Afghanistan and Iraq. The signatures experts processing the volumes of newly arriving data purchased and fabricated the materials that made up such things as military vehicles, camouflage, fabrics, and paints, in order to conduct spectral characterization and add to the library. At black box laboratories, work accelerated not just on new means of collecting and processing hyperspectral data, but also on increasing signal-to-noise ratios (and thereby reducing false alarm rates), on improving spectral and radiometric stability and image quality at high altitudes, and on improvements in computational capabilities and communication that would make it possible to overlay hyperspectral data with imagery or eavesdropping. MASINT was becoming the new everything, and new standards were created for all kinds of multi- and hyperspectral collection.31 Partly driven by war, partly driven by the promise—any promise—of support for the troops in the counter-IED battle, partly just reflecting the incredible pace of technological change in the information field, and partly prompted by the unappeasable ambitions of the Data Machine, a new vibrancy pulsated through the signatures world. The main air force signatures data center in Tennessee filled to capacity, and “automated scene detection” was slowly developed to ease the processing burden.
It didn’t take long, but it also didn’t happen overnight. Within two years, the processing time of HSI-collected data declined from eight hours to less than a minute.32 Sensors became so small, and processing so advanced, that even ground-based hyperspectral sensors were introduced that could be used by reconnaissance troops. The troops could use the new set of eyes to identify very small targets at a distance of a mile or to pick out spectral signatures flagged by programmed anomaly-detection routines. The ground prototypes incorporated automated software that allowed for data to be automatically processed “without a human in the loop.”33 “You don’t have to be a PhD optical scientist,” one company official said. “You just push a button, algorithms are processed and you see the target on the screen.”34
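The kind of push-button anomaly detection described here has a well-known unclassified counterpart, the Reed-Xiaoli (RX) detector, which flags any pixel whose spectrum sits far from the scene’s background statistics. The scene below is synthetic, and the implementation is only a sketch of the basic idea.

```python
import numpy as np

def rx_scores(cube: np.ndarray) -> np.ndarray:
    """Reed-Xiaoli anomaly detector: score each pixel's spectrum by its
    Mahalanobis distance from scene-wide background statistics.
    High scores mark spectra that do not belong to the background."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands)
    mean = pixels.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(pixels, rowvar=False))
    diff = pixels - mean
    scores = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
    return scores.reshape(rows, cols)

# Synthetic scene: smooth background with one spectrally odd pixel.
rng = np.random.default_rng(0)
cube = rng.normal(0.3, 0.02, size=(32, 32, 50))
cube[10, 20, :] += 0.2                      # the planted "anomaly"
scores = rx_scores(cube)
print(np.unravel_index(scores.argmax(), scores.shape))  # -> (10, 20)
```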
The military mission of finding people through spectral imaging did not start with terrorism or al Qaeda, but was part of a sacred task at the core of all organized and honorable fighting. Antiseptically labeled “personnel recovery,” it is the very emotional task—the promise—not to ignore a fallen comrade on the battlefield and never to leave anyone behind. The unique capacity of hyperspectral images to detect, locate, and identify materials associated with a downed pilot or a captured soldier made long-range search and rescue an early articulation of mission need, a capability made all the more practical with the development of specially formulated material (called taggants) with exact spectral features, material so small that it could be worn or carried by a pilot and yet also detected in real time by a hyperspectral sensor.35 As part of the HyCAS program in 2003, taggants that the human eye could not readily detect were tested and successfully identified in airborne surveillance.36
From this use of hyperspectral imaging and identifying taggants came the next step: “noncooperative identification.” It too was initially applied to creating capabilities to identify and track friendly forces and to avoid friendly fire. The military called it combat identification37 until the Iraq war in 2003 introduced ubiquitous blue-force tracking, which entailed automatic satellite collection of the locations of select vehicles by pulsing the special tags they mounted. The noncooperative part comes in the ability of systems to interrogate without human action or knowledge. Complex coalition operations, working behind enemy lines, demanded black box devices that enabled war-fighters to identify friendly, enemy, and neutral forces for “shoot/don’t shoot” instant decisions.
As counter-IED and counterinsurgency doctrines took over in Afghanistan and Iraq, noncooperative identification was looked to as another intelligence application of hyperspectral imaging, both in signature development and to directly enhance targeted killing and the counter-IED “attack the network” strategies. Another INT emerged, biometrics-enabled intelligence, which is defined as the intelligence information “associated with and/or derived from biometrics data that matches a specific person or unknown identity to a place, activity, device, component, or weapon that supports terrorist/insurgent network and related pattern analysis; facilitates high-value individual targeting, reveals movement patterns, and confirms claimed identity.”38 As the head of the Pentagon’s biometrics agency said: “The department has unique military requirements to collect biometrics from unknown individuals in all tactical environments, to transmit and store that collected data and to fuse intelligence, law enforcement, and administrative databases to provide the contextual data that will enable timely identification of unknown individuals on the battlefield.”39
As the HyCAS experiments reached their conclusion in 2008, and as the next generation of hyperspectral sensors was being prepared for deployment even as the Iraq war was coming to an end, the signature support specialists began putting more and more effort into what is called remote biometric feature extraction, or soft biometrics. This is noncooperative identification taken to the extreme: the biometrics not of fingerprints but of gait, body markings, vein structure, heartbeat, and even odor—all things that might be detected and identified at a distance, all things detectable by hyperspectral means.40
In 2006, the first hyperspectral camera experiments were conducted to detect human skin spectra. The methods are only hinted at in secret documents: nonobtrusive biometrics, multimodal biometrics fusion, biometrics-at-a-distance, iris-at-a-distance, stand-off/remote facial recognition and matching, remote biometric feature extraction, spectral facial recognition, Cognitive Counter-IED Integrated Signature System.41 As one biometrics briefing asked about processing intelligence from a terrorist attack site: How do you classify an anonymous individual? By something he’s wearing? By something he’s carrying? By who he is associated with? By where he’s been?42
By “where he’s been.”
That puts the entire Data Machine, not just drones, at the service of the new assassins. To find individuals and aid in the targeting, a whole new field of intelligence exploitation emerged at about the same time. It was referred to as Advanced Geospatial Intelligence: the quantitative analysis of data combining multiple types of sensors, but also all types of information technically derived from the processing, exploitation, and nonliteral analysis of that data.43 One intelligence industry executive calls Advanced Geospatial Intelligence the “power of place.”44
One could say in the end that it all goes back to killing the target, the mobile target, the fleeting target, the difficult target, approximating the capacity of the human brain while at the same time brushing aside all of the essential and crucial decisions in order to just push the button when the data lines up. Perfecting targeting to an individual level entails using all means necessary. And sometimes backtracking becomes the only way forward. It is a shift in the temporal promise—using where he’s been to predict where he’ll be—that requires connecting the dots at such hyperspeed and in so many dimensions as to replace the retrospective with the prospective, the estimative with the actually prophetic.