By 1973, when the Joint Program Office was established, the basic requirements were understood: three-dimensional positioning, velocity as well as position, passive receivers, and global coverage. But other parameters were unresolved. The Naval Research Lab followed the NTS-1 satellite with NTS-2, launched in June 1977 (see figure 11). Its deployment resolved these issues. Whereas the NTS-1 used a commercial rubidium clock modified for space use, the follow-on satellite used cesium clocks built specifically for the space environment. These were shown to drift less than 20 nanoseconds per day, equivalent to less than one second in more than 100,000 years.1 (Rubidium-based clocks were subsequently improved to the point where current-generation GPS satellites all use rubidium, not cesium.) In addition to the cesium frequency standards, the satellite carried additional equipment, including radiation sensors and both pseudo-random and side-tone ranging circuits. NTS-2 marked the transition from experimental devices to production positioning satellites. Experimental transmissions from NTS-2, conducted in the summer of 1977, were GPS’s equivalent of Samuel Morse’s 1844 message, “What hath God wrought?” for the electric telegraph (see figure 12).
Figure 11 NTS-2 team at the Naval Research Laboratory, ca. 1977. Standing (left to right): Dr. Bruce Faraday, Richard Statler, Guy Burke, Roger Easton. Seated (left to right): Al Bartholomew, Bill Huston, Red Woosley, Ron Beard, Woody Ewen, Pete Wilhelm. (Photo: Naval Research Laboratory)
Figure 12 This equipment was the first military GPS five-channel receiver, built by the company now known as Rockwell Collins. It was developed under one of several programs launched to study the feasibility and operational utility of GPS. The receiver weighed more than 120 kilograms (270 pounds) and was mounted on an Air Force flight-test equipment pallet. Transmissions from NTS-2 to this receiver, in the summer of 1977, may be considered as marking the birth of GPS. (Photo: Rockwell Collins)
Historians of technology have argued that technological systems evolve through a period of experimentation, dead ends, and false starts until a configuration is agreed upon and becomes widely accepted. During that initial period, social as well as technical issues are an important part of the debate. Historians use the term “social construction” to emphasize that the systems eventually accepted emerge not solely because they are technically the best, but because they satisfy the needs of the various participants in the technology’s design and use. Eventually, the debates die out. The system becomes a “black box”: it works, and users need not be concerned with its inner workings, nor do they need to revisit the technical, political, or social debates that led to its final configuration (see figure 13).2
Figure 13 The GPS constellation: six orbital planes, 55° inclination, four satellites in each plane. The current constellation also includes at least one spare in each orbital plane, for a total of 30 satellites in orbit.
The deployment of NTS-2 “closed the black box” of GPS: it demonstrated the practicality of the Naval Research Laboratory’s basic design, with its space-qualified atomic clocks, combined with the Air Force 621B program’s pseudo-random method of coding.
Four decades after the launch of NTS-2, the Global Positioning System—in tandem with its European and Asian counterparts—is an indispensable part of global commerce, military affairs, and culture. The systems have evolved, but their basic architecture, formed in the early and mid-1970s, has persisted (see figure 14).
Figure 14 How GPS works.
Advocates of the social construction of technology have argued that changing social, political, or technical conditions may force a reexamination of initial agreements, leading to a reopening of the black box. Donald MacKenzie showed how this happened to inertial missile guidance as the United States transitioned from the Minuteman to the MX intercontinental ballistic missile. The classic case study of social construction is Trevor Pinch’s and Wiebe Bijker’s study of the history of the bicycle, which was offered in a myriad of designs until stabilizing in the form of the “safety”: two wheels of the same size, pneumatic tires, a diamond frame, and a chain drive to the rear wheel.4 That configuration had been standard for decades, and to the authors that design closed off the debates. But in the three decades since the publication of that study, advances in materials and a changing social environment have reopened the black box on bicycles. The safety has given way to a new diversity of designs: different-size wheels, folding bikes, recumbents, “tadpole” trikes, “dockless” bike-share bicycles, different drive trains, different frame geometries, etc.5 We shall see that similar events, decades after the design was established, have reopened the black box of GPS.
David Packard served as deputy secretary of defense for only three years, from 1969 to 1971. Nevertheless, he left a mark, although he was frustrated in trying to overcome the inertia he found in the Pentagon and Congress. The Joint Program Office led by Col. Parkinson managed to establish a plan for a unified satellite positioning system, but obtaining funding from the various branches of the military, not to mention civilian agencies, was difficult. The system’s ability to give precise altitude and velocity information, both lacking in Transit, made it especially useful to Air Force and Navy aviators, but the Defense Department was reluctant to commit funds to deploy the full constellation of 24 satellites.
Also among those skeptical of satellite-based positioning was the civilian Federal Aviation Administration (FAA). The FAA was concerned about the cost of receivers, especially for general aviation aircraft. The Joint Program Office was counting on civilian markets to help support the system, but in the late 1970s the FAA was more concerned with upgrading existing VOR (very high frequency omnidirectional range) transmitters and other equipment. By the 1960s, the early vacuum-tube VOR units were being replaced by solid-state equipment that rotated the signal electronically, at lower cost and with higher reliability. At the same time, new microwave-based navigation aids were being implemented to assist in blind landings. The microwave landing system, used to safely land the Space Shuttle orbiters in an unpowered return to Earth, allowed the pilot of an aircraft to follow a glide slope to a safe touchdown, even in zero-visibility conditions. GPS, as it was configured at the time, could not do that. The geometry of fixing a position with satellites in medium-earth orbit meant that accuracy along the vertical axis (altitude) would seldom be as good as the horizontal readings of latitude and longitude. (The technical term is “geometric dilution of precision.”) For an aircraft landing in bad weather, a five-meter error in altitude was unacceptable. The FAA was also concerned that, whereas VOR gave a pilot navigation data instantly as soon as the set was turned on, GPS receivers took a few minutes to acquire satellites and display a position from a cold start—the so-called “time-to-first-fix.” A third objection had to do with the long-range plan for GPS: that it would replace not only VOR but also the microwave landing system, LORAN, Omega, and other classic aids. The FAA reasoned that aircraft flying international routes would be using the older systems when flying to foreign airports. Would those countries adopt GPS, which was under US control and operated by the US Air Force? If not, aircraft would have to install both GPS and legacy devices.6
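The vertical-accuracy problem mentioned above can be illustrated with a small numerical sketch of dilution of precision. Given unit vectors from the receiver toward each visible satellite, the standard least-squares formulation yields a matrix whose diagonal terms show how the satellite geometry magnifies ranging errors in each direction; because all the satellites a receiver can see are above the horizon, the vertical component is almost always the weakest. The Python sketch below uses a hypothetical set of satellite elevations and azimuths chosen only for illustration.

    # Minimal sketch of geometric dilution of precision, using a hypothetical
    # set of satellite directions in a local east-north-up frame. Satellites
    # spread around the sky but all above the horizon give better horizontal
    # than vertical geometry, which is why GPS altitude is usually the least
    # accurate coordinate.
    import numpy as np

    def dilution_of_precision(elevations_deg, azimuths_deg):
        """Return (HDOP, VDOP, GDOP) for unit line-of-sight vectors."""
        el = np.radians(elevations_deg)
        az = np.radians(azimuths_deg)
        # Unit vectors from receiver toward each satellite (east, north, up),
        # plus a column of ones for the receiver clock-bias unknown.
        G = np.column_stack([np.cos(el) * np.sin(az),
                             np.cos(el) * np.cos(az),
                             np.sin(el),
                             np.ones_like(el)])
        Q = np.linalg.inv(G.T @ G)          # cofactor matrix: how geometry scales ranging error
        hdop = np.sqrt(Q[0, 0] + Q[1, 1])   # horizontal
        vdop = np.sqrt(Q[2, 2])             # vertical
        gdop = np.sqrt(np.trace(Q))         # overall, including the clock term
        return hdop, vdop, gdop

    # Four satellites spread in azimuth, all at least 30 degrees above the horizon:
    print(dilution_of_precision([60, 30, 30, 30], [0, 0, 120, 240]))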
The program continued to face opposition. In 1979, the Office of the Secretary of Defense cut $500 million from the program’s budget for FY81 through FY86.7 In 1983, the Joint Program Office responded with a contingency plan to make the system useful with only 18 satellites in orbit, using only the civilian L1 frequency and the short C/A code.8 An 18-satellite constellation had been considered almost from the start of planning; now a system with fewer satellites and a simpler code was put to the test. The tests were conducted that year with the configuration of nine satellites then in orbit. The restrictions were expected to limit positioning to no better than 100-meter accuracy. But in the tests, Col. Parkinson discovered that “much to everyone’s surprise … the unit performed almost as good as its more sophisticated counterparts, demonstrating accuracies in the 20–30 m range.”9
The results of those tests were the first indication that GPS’s impact on the world would be far greater than that of its predecessors, greater even than that of the Harrison chronometer of the eighteenth century. This “surprise” turned out to have enormous implications for satellite navigation in general, for the US military’s sponsorship of GPS, and for the social transformation effected by these systems in the twenty-first century. The surprising accuracy helps explain in part why the European Union and other nations have chosen to develop their own systems, even given the free worldwide availability of GPS. The architects of the system had always planned for civilian and commercial customers; what they had not foreseen was how much other nations, whose interests were not aligned with those of the United States, might exploit the better-than-expected accuracy. Again, in Parkinson’s words,
the Department of Defense was faced with a dilemma since the C/A code on which this equipment operated was to be generally available to anyone in the world who had access to the technology required to build a suitable receiver.
That led to a study by the Department of Defense along with the US National Security Council to consider the implications of foreign nations having such access. The result was to “open the black box,” as advocates of the social construction of technology might describe it. The Joint Program Office chose to degrade the accuracy of the publicly available C/A code, to further restrict those who would have access to the more accurate P code, and to encrypt the P code, transforming it into the P(Y) code. An additional overlay on the C/A code, called selective availability (SA), was installed on the satellites. It deliberately degraded the civilian signal to give an accuracy no better than about 100 meters. SA biased the signals from the satellites to a varying degree, and the variation was different for each satellite. Since civilian users would not have access to the L2 signal, their receivers could compensate for ionospheric delays only through a mathematical model of ionospheric propagation built into the receiver. The surprising accuracy of this restricted system came from the better-than-expected performance of that model, good performance from the onboard clocks, and sophisticated techniques for processing the signals.
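The value of the restricted second frequency can be made concrete with a short sketch. Ionospheric delay on a ranging signal scales roughly with the inverse square of the carrier frequency, so a receiver that can measure the same range on both L1 and L2 can solve for the delay and remove it; a single-frequency civilian receiver must instead rely on a model. The following minimal Python illustration uses the published L1 and L2 carrier frequencies, but the range and delay numbers are invented for the example.

    # Minimal sketch of why access to the second (L2) frequency matters.
    # Ionospheric delay on a pseudorange scales roughly as 1/f^2, so measuring
    # the same range at two frequencies lets the receiver cancel the delay.
    F_L1 = 1575.42e6  # Hz, L1 carrier
    F_L2 = 1227.60e6  # Hz, L2 carrier

    def ionosphere_free_range(p1, p2):
        """Combine L1 and L2 pseudoranges (meters) to cancel the 1/f^2 delay."""
        g = (F_L1 ** 2) / (F_L1 ** 2 - F_L2 ** 2)
        return g * p1 - (g - 1.0) * p2

    # Suppose the true range is 20,000,000 m and the ionosphere adds 5.0 m of
    # delay at L1 (and therefore 5.0 * (F_L1/F_L2)^2 m at the lower L2 frequency):
    true_range = 20_000_000.0
    iono_l1 = 5.0
    p1 = true_range + iono_l1
    p2 = true_range + iono_l1 * (F_L1 / F_L2) ** 2
    print(ionosphere_free_range(p1, p2))   # recovers ~20,000,000 m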
In discussions of GPS and other systems, the notion of accuracy comes up, but its definition is not always agreed upon. Early descriptions of GPS used a common measure known as circular error probability, a term derived from ballistics. By that definition, a circular error probability of 100 meters meant that bombs aimed at a target would fall within 100 meters of the target 50% of the time. Some descriptions of GPS extended this to three dimensions, for a spherical error probability, defined the same way. Extending a ballistics model to a positioning system, however, is a poor way to measure the system’s performance. Circular error probability gave way to a better method, called root mean square (RMS), a term familiar to electrical engineers and statisticians. It measures the typical deviation from an intended point. By this measure, an RMS accuracy of 100 meters gives the user a position that is within 100 meters of its true position about 65% of the time.10 A variation of that metric doubles the radius of the circle, inside of which the receiver will be located roughly 95% of the time. The measurement is roughly, but not exactly, equivalent to one and two standard deviations from the peak of a Gaussian curve. Typical consumer devices sold to hikers and recreational boaters display accuracy at double RMS accuracy (2DRMS), or 95%. Receivers sold in the 2000s claimed a 2DRMS accuracy of 100 meters with SA turned on, 15 meters with it turned off, and 3–5 meters with the Wide Area Augmentation Service (WAAS) activated (discussed in chapter 5).11 A woman standing on the 50-yard line of a football stadium and holding one of those consumer-grade receivers would know she was somewhere in the stadium with SA turned on, on mid-field with SA turned off, and on the 50-yard line with WAAS active.
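These measures are easiest to see with simulated data. The Python sketch below draws a large number of hypothetical horizontal position errors from a simple circular Gaussian model, an assumption made purely for illustration, and computes the CEP, RMS (often written DRMS), and 2DRMS radii. With a circular error model the 2DRMS circle actually captures closer to 98% of the fixes, which is one reason 2DRMS is only roughly a 95% figure.

    # Minimal sketch of the accuracy measures discussed above, applied to
    # simulated horizontal position errors (meters). The circular Gaussian
    # error model is an assumption for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)
    sigma = 10.0                              # assumed per-axis error, meters
    east = rng.normal(0.0, sigma, 100_000)
    north = rng.normal(0.0, sigma, 100_000)
    radial = np.hypot(east, north)            # distance from the true position

    cep = np.median(radial)                   # circular error probable: 50% inside
    drms = np.sqrt(np.mean(radial ** 2))      # RMS radial error: ~63-65% inside
    two_drms = 2.0 * drms                     # 2DRMS: roughly 95-98% inside

    for name, radius in [("CEP", cep), ("DRMS", drms), ("2DRMS", two_drms)]:
        inside = np.mean(radial <= radius)
        print(f"{name:5s} = {radius:5.1f} m   ({inside:.0%} of fixes inside)")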
The designers of GPS took pride in the “surprise” that the C/A code was so accurate. They also felt that, with SA in place, the restrictions on the codes would still allow a robust civil market for ships, trucking, and commercial aircraft to develop without endangering national security.12
The NTS-2 satellite was now designated the first of the operational GPS satellites. Launches of further first-generation, or Block I, satellites began in February 1978, when the first of 11 such satellites was launched. Launches proceeded rapidly through 1978, then slowed, with the last Block I satellite launched in October 1984. Concurrent with the launches, tests were conducted at the Army’s Yuma Proving Ground in Arizona using transmitters located on the ground, whose signals mimicked what would in the future be transmitted from satellites. Army trucks, jeeps, and helicopters had receivers installed. Soldiers carried bulky receiving equipment on their backs. Eventually these backpacks would shrink in size and weight as microelectronics technology progressed. The use of “pseudo-satellites”—devices on the ground transmitting GPS signals from surveyed locations—would evolve into methods of augmenting the accuracy of GPS signals from space. With GPS ground transmitters established at surveyed positions, nearby receivers could receive signals that had fewer of the inaccuracies that signals from satellites had. With so-called differential GPS supplementing the satellite signals, accuracy increased to a few meters even for civilian sets. The development of differential GPS was a major factor in persuading the FAA to adopt GPS for aircraft navigation.
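The core idea of differential GPS can be sketched in a few lines. A reference receiver at a surveyed location knows where it is; whatever error it observes in its own GPS fix is largely shared by nearby receivers, because the dominant error sources (satellite clock and orbit errors, ionospheric delay, and in that era selective availability) are common to the area. The reference station broadcasts its observed error, and nearby “rover” receivers subtract it. The Python sketch below is a simplification: it differences positions rather than the pseudoranges that operational differential GPS corrects, and all of the numbers are invented.

    # Minimal sketch of the differential GPS idea. A reference receiver at a
    # surveyed location compares the position it computes from the satellites
    # with the position it already knows, and broadcasts the difference;
    # nearby receivers see nearly the same errors and subtract them out.

    # Surveyed position of the reference station (east, north), meters in a
    # local grid:
    REF_TRUE = (1000.0, 2000.0)

    def correction(ref_measured):
        """Error observed at the reference station: measured minus true."""
        return (ref_measured[0] - REF_TRUE[0], ref_measured[1] - REF_TRUE[1])

    def apply_correction(rover_measured, corr):
        """Subtract the reference station's observed error from the rover's fix."""
        return (rover_measured[0] - corr[0], rover_measured[1] - corr[1])

    # The common-mode error this epoch (unknown to the users) is +42 m east,
    # -17 m north. Both receivers see roughly the same bias:
    ref_fix = (1042.0, 1983.0)
    rover_fix = (5041.0, 7984.5)               # rover is really near (4999, 8001.5)
    corr = correction(ref_fix)
    print(apply_correction(rover_fix, corr))   # -> (4999.0, 8001.5)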
The choice of an Army proving ground was wise. The Army was not as concerned with long-range navigation as were the other services, but its need for soldiers to know their own position and that of an enemy goes back to the beginnings of warfare. Eventually the Army would be the largest customer of GPS equipment among the military services. From the perspective of classical navigational techniques, it may have seemed absurd that the US Army would be interested in a navigation system derived from the need to navigate over the open ocean. An army traditionally found its way with maps, relating to local physical features like rivers, towns, and mountain ranges. Army maps had little need for distances measured from the equator or Greenwich, England. They relied not on latitude and longitude but rather on a metric grid, which allowed direct conversion from the map to directions and distances on the ground. The preferred map projection was the Mercator, with a major change: army maps were centered not on the equator but rather on a local meridian of longitude. The Mercator projection was long preferred by sailors in the mid-latitudes, but it distorts features at high latitudes—making Greenland, for example, appear as large as Africa. Aircraft navigators avoided Mercator maps because of those distortions, since they flew great circle routes near the North Pole. The Army’s transverse Mercator projections referenced a line of longitude chosen to minimize distortions over a local area. That allowed a direct metric grid to be established over the map to facilitate its use.

Traditionally, battles were named after nearby towns or physical features: Bull Run, Yorktown, Gettysburg. In the 1991 Persian Gulf War, the principal tank battle, fought on February 26, 1991, between coalition forces and Iraqis took place in a relatively featureless landscape. It was given the name of its Universal Transverse Mercator (UTM) coordinates: the Battle of 73 Easting.13 The coalition forces relied heavily on GPS units supplied by the Defense Department and by commercial vendors. Users of these units were able to select the UTM system of coordinates. Modern recreational GPS receivers allow a user to make a similar selection of UTM or latitude-longitude, whether she is on a boat or hiking on land.

GPS turn-by-turn navigation systems are standard equipment in most new automobiles, rendering the traditional road maps supplied by oil companies extinct. Drivers do not need to know the car’s latitude or longitude, nor its UTM coordinates. Commercial mapmakers make and sell detailed maps of US states, as do state highway departments, although many drivers now rarely consult these maps, if at all. Beginning around 2009, the US Geological Survey stopped production of its classic 7.5' topographic maps and made a transition to digital files, which a user may either consult directly on a computer or print out on a color printer. These files are machine-produced and incorporate many overlays not practical in paper versions.14 However, one does not read them for pleasure.15 The transition to digital files was not a result of GPS but more a response to advances in digital geographic databases. The US Geological Survey no longer deals in maps but rather in geographic information systems, or GIS. The hand-crafted 7.5' paper maps represented a rare, complete merger of art and science; their demise parallels the changing notions of sense of place in the world brought on by satellite positioning services.
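The Universal Transverse Mercator grid mentioned above divides the world into sixty zones, each six degrees of longitude wide, each with its own transverse Mercator projection centered on the zone’s central meridian. A minimal Python sketch of that bookkeeping follows; the longitude used is only a rough value for the area of the 1991 battle, not a historical coordinate, and a full conversion to metric eastings and northings would require a projection library.

    # Minimal sketch of how a UTM grid reference relates to longitude: find the
    # zone number and the central meridian on which that zone's transverse
    # Mercator projection is centered.

    def utm_zone(longitude_deg):
        """UTM zone number (1-60) for a longitude in degrees east."""
        return int((longitude_deg + 180.0) // 6.0) + 1

    def central_meridian(zone):
        """Longitude of the zone's central meridian, degrees east."""
        return -183.0 + 6.0 * zone

    lon = 47.5                            # roughly southern Iraq / Kuwait
    zone = utm_zone(lon)
    print(zone, central_meridian(zone))   # -> 38 45.0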
The Army still relies on maps, which it integrates tightly with GPS receivers carried by soldiers and on its equipment. Soldiers know how to read a map and use a magnetic compass.
Typical maps convey a lot of information besides the obvious representation of physical features. One critical piece of information is the direction of north as indicated by a magnetic compass, and its declination from true north, symbolized on the map by the North Star. The declination can be quite large in northern latitudes, and it increases as one approaches the location of the geomagnetic North Pole. The other is the “datum”: the base from which the coordinates are derived. The datum has been progressively refined as satellites and other instruments are employed to measure the shape and gravitational field of Earth. Topographic maps produced by the US Geological Survey refer to the 1927 North American Datum; later editions refer to a revision established in 1984. The changes reflect increasing knowledge, much of it supplied by satellites, of the shape and gravitational field of the earth—the “geoid,” defined as the shape the earth would take if it were covered by water. GPS satellites follow Newton’s and Kepler’s laws as they orbit around the center of mass of the Earth. The satellites’ positional data are thus divorced from traditional surveyors’ measurements, which were based on astronomical observations with respect to the local vertical. The Earth spins like a top, with a slight wobble, so the North and South Poles gradually trace out an area about the extent of a baseball infield. Continents drift slowly, but at rates that modern techniques, including satellites, can measure. As GPS is further refined in its accuracy, these once-minor differences among reference points become significant. Pacific islands have been mapped in detail, yet some have been found to lie several kilometers from their charted positions. A GPS receiver held at the prime meridian in Greenwich, longitude 0°, will not read zero.16 The reasons are complex, but they stem from the fact that a plumb bob at Greenwich follows the local direction of gravity, which does not point exactly toward the Earth’s center of mass; the zero meridian was established from astronomical observations referenced to that local vertical, whereas GPS coordinates are referenced to the center of mass. GPS receivers have the onboard computational power to resolve these anomalies.
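The practical effect of choosing a datum can be seen with a short sketch using the pyproj library (an assumption; any geodesy library with datum transformations would serve). It converts a coordinate read off a map referenced to the 1927 North American Datum (NAD27) into WGS84, the datum GPS itself uses; the example point is the approximate location of the Washington Monument, and in the continental United States the shift is typically on the order of tens of meters.

    # Minimal sketch of a datum shift, assuming the pyproj library is installed.
    # EPSG:4267 is NAD27; EPSG:4326 is WGS84.
    from pyproj import Transformer

    nad27_to_wgs84 = Transformer.from_crs("EPSG:4267", "EPSG:4326", always_xy=True)

    lon_nad27, lat_nad27 = -77.0353, 38.8895   # point as read off a NAD27 map
    lon_wgs84, lat_wgs84 = nad27_to_wgs84.transform(lon_nad27, lat_nad27)

    print(f"NAD27: {lat_nad27:.6f}, {lon_nad27:.6f}")
    print(f"WGS84: {lat_wgs84:.6f}, {lon_wgs84:.6f}")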
The United States has gone to great lengths to produce accurate maps of foreign territories, using a variety of techniques, including reconnaissance satellites. For an army using precision-guided munitions, accurately locating a target is critical. As with the choice of coordinate system, users of some GPS receivers can select the 1927 datum, the 1984 datum, or others. The magnetic compass has lost its value in the age of satellite navigation, going the way of the rotary telephone and manual typewriter. Most smartphones have a magnetic compass installed, but few owners of smartphones know it is there, or how to use it. The Army Map Service, now part of the National Geospatial-Intelligence Agency, remains one of the largest producers of maps worldwide, generating accurate maps of many parts of the world. The kind of detail taken for granted by Americans using USGS topographic maps is hard to find for many parts of the world, such as the Middle East.
As the program shifted into production mode, the Los Angeles aerospace firm Rockwell International became the prime contractor for the satellites and their onboard clocks. It also received contracts for military receivers. Rockwell was known as the builder of the B-1 bomber and Space Shuttle orbiter; its Collins Radio and Autonetics divisions had extensive electronics and guidance systems expertise. The electronics firm Magnavox received major contracts to develop receiving equipment. Magnavox was known for its consumer products, including one of the first home video games; it also had a staff of qualified aerospace engineers at its military division in Torrance, California.
The deployment of the constellation proceeded through the 1980s, but at an irregular pace. The first set of satellites was launched on Atlas-F boosters, which enabled the Air Force to establish an initial constellation at low cost. The second-generation Block II satellites, which had additional capabilities and were much heavier (900 kg versus 450 kg for Block I), were initially to be launched by the Space Shuttle, but the loss of the Challenger in January 1986 ended shuttle involvement and led to a delay; they were launched instead on the Delta II vehicle.
The added weight of later-generation satellites was partly the result of a convergence that helped prevent the total cancellation of the program. Beginning in 1980 with the launch of the sixth Navstar satellite, a 132-kg sensor system was included in the configuration, whose purpose was to detect nuclear explosions carried out by the Soviet Union or other nations. The sensors were similar to those installed on the Vela Hotel satellites, which had been developed in part to monitor compliance with the 1963 Limited Nuclear Test Ban Treaty between the United States and USSR. Having support from what is now the Department of Energy—separate from the Department of Defense—helped secure the future of GPS and offset the added weight and complexity that this Nuclear Detection System, or NUDETS, required.17
In July 1983, while the tests of the limited configuration were underway, Rockwell gave a dramatic demonstration of the potential commercial value of GPS. It equipped a Rockwell Sabreliner business jet with a GPS receiver and flew it from Cedar Rapids, Iowa, the home of Rockwell’s Collins Radio division, to Le Bourget Airport in Paris—the airport where Charles Lindbergh landed his Spirit of St. Louis in 1927. Relying on satellite positioning alone, the Sabreliner taxied to within 7.5 meters of a pre-surveyed stopping point. At the time, only five satellites were working well enough to rely upon: there had been one Atlas launch failure, and one of the satellites was not delivering reliable data. The success of the flight nevertheless went a long way toward dispelling any lingering doubts about the accuracy of the planned system.
On September 1, 1983, a Korean Air Lines Boeing 747, en route from Anchorage, Alaska, to Seoul, strayed over Soviet territory and was shot down by a Soviet interceptor. All 269 passengers and crew onboard perished. The tragedy led to tension between the United States and Soviet Union that was nearly as high as during the Cuban Missile Crisis of 1962. The cause and details of the incident were not known for years. The Soviets recovered the aircraft’s “black box” (in this context, flight data recorder), but that fact, and the recorder’s contents, were known only to a small number of Soviet personnel. The causes and circumstances surrounding this event have been discussed and debated at length; what follows is a brief discussion of the methods of navigation employed by the commercial jet, and how the response by the US government affected the future of global satellite navigation.
Korean Air Lines Flight 007 originated in New York and had stopped to refuel in Anchorage, Alaska. It left Anchorage at 4:00 AM local time on August 31 (Anchorage lies east of the International Date Line, which accounts for the difference from the September 1 date of the shootdown). Air traffic controllers directed it to fly along an established corridor across the northern Pacific, passing over a series of waypoints. That corridor avoided Soviet territory, especially the Kamchatka Peninsula, where several Soviet military installations were located. However, by the time the aircraft passed the last waypoint on Alaskan soil, at the village of Bethel, it was already several miles off course. As the flight progressed, its deviation from the established corridor increased.18
Although many years would pass before a full investigation could be completed, analysts now believe that the crew relied on a magnetic compass to set the aircraft’s heading. Because magnetic compasses are inaccurate in northern latitudes, the crew was supposed to switch to an inertial navigation system—the Delco Carousel described above. As the aircraft approached Japan, the crew could then receive position data from ground controllers to correct for any drift in the inertial system. Ground controllers would then direct the flight to Seoul. The Carousel inertial navigation system was reliable and accurate, with a projected drift small enough to place the aircraft within range of ground controllers once it reached Japan. It employed a redundant design of three independent sets of gyroscopes and accelerometers, so that the failure of one of them would not cause the entire system to fail. It did not require any compass reading, celestial sightings, or radio contact with ground controllers. And unlike the magnetic device, it could automatically correct for any deviation from the flight path caused by crosswinds.
Investigators believe the crew switched the autopilot from “heading” (i.e., magnetic) to “INS” (inertial navigation system), but the handover never happened. The reason was that the inertial system was programmed not to take control of the autopilot if the aircraft was more than 7.5 nautical miles from the planned course (aircraft navigators use nautical miles as a measure of distance). There was a small indicator in the cockpit that would have informed the crew that the handover did not take place, but apparently the crew did not notice it. The crew might also have been alerted by other factors during the flight, including a difficulty in communicating via VHF radio with a Korean airliner following it (recall that VHF radios are restricted to line-of-sight range). But when they contacted controllers in Japan, they erroneously reported their position as being on the correct course. The aircraft eventually drifted more than 160 km off course, flying over the Kamchatka Peninsula, then over the Soviet territory of Sakhalin Island. Soviet jets were sent up to intercept the aircraft, and for reasons still unclear, the Soviets did not recognize that the intrusion was by a civilian, not a military, aircraft. An air-to-air missile fired by one of the interceptors destroyed the plane.
On September 5, 1983, President Ronald Reagan addressed the nation in a televised speech. He denounced the Soviets in strong terms, and he was especially critical of their denial that one of their pilots downed the aircraft. He noted:
Commercial aircraft from the Soviet Union and Cuba on a number of occasions have overflown sensitive United States military facilities. They weren’t shot down. We and other civilized countries believe in the tradition of offering help to mariners and pilots who are lost or in distress on the sea or in the air.19
In the absence of reliable information about the incident, some of the responses from the aviation community focused on the inertial navigation system, assuming that its drift was a cause; the interface with the crew was more likely the problem. An unsigned article in Aviation Week and Space Technology, published in October 1983, suggested that the Navy’s Omega long-range navigation system be employed as a backup to the inertial system. It noted that commercially available Omega systems cost “in the $50,000 range.”20 According to the article, Omega either already had achieved or was close to achieving global coverage. Next to that note was a report of a letter from Senator Charles H. Percy (R-Ill.) to President Reagan, asking him to accelerate the deployment of Navstar/GPS satellites to achieve coverage over the great circle route across the North Pacific. GPS was still under development in 1983; it was projected to have two-dimensional positioning capability by 1987 and full capability by 1988—quite a few years after the downing of the Korean airliner. The satellites then in orbit were devoted primarily to supporting the tests being conducted at the Yuma Proving Ground. The remainder of the article listed a number of other reasons why Aviation Week thought Senator Percy’s suggestion was premature.21
On September 16, 1983, the White House Press Secretary issued a statement in further response to the tragedy and stated, in part:
World opinion is united in its determination that this awful tragedy must not be repeated. As a contribution to the achievement of this objective, the President has determined that the United States is prepared to make available to civilian aircraft the facilities of its Global Positioning System when it becomes operational in 1988. This system will provide civilian airliners three-dimensional positional information.22
At face value, the statement seems reasonable. The system was intended to serve civil aviation and commercial shipping from the start. What the White House perhaps did not recognize was that the unanticipated accuracy of the civil signals, revealed by the testing being conducted at the time, upset the planned civil-military balance. Providing for commercial access was a good way to help sell the system in its planning stages. Now that GPS had begun to show its capabilities, however, was it not prudent to reexamine that commitment?
The President’s and other responses to the tragedy implied that there was a need to provide a backup to the inertial navigation systems then in use. But there were alternative means to verify that the Korean airliner was off course, which the crew did not employ. The authors of the Aviation Week articles had a fuller understanding of both the potential value of the Omega low-frequency system and the long-term deployment schedule of Navstar/GPS. Testifying before Congress on September 19, Loren E. DeGroot of Rockwell Collins stated that although the satellites then in orbit were performing well, there was as yet no agreement on the types of receivers commercial customers could use. Implied in his testimony was the unresolved issue of how much GPS capability should be given to civilian customers, including to other nations, given the better-than-expected accuracy revealed by the tests on the limited system.23
In 1992, after the fall of the Soviet Union, Russian President Boris Yeltsin released details about the downing of the aircraft. Those details suggest that the airliner might have strayed off course even if it had a GPS system onboard. In that context, and given the availability of other, more mature systems like Omega, President Reagan’s statement that GPS would be made available to civilians free of charge was premature. It did bring GPS into public consciousness, and it helped dispel some of the lingering skepticism among the branches of the US military, the FAA, and others. It also had the unfortunate consequence of implying that the inertial system used on the airliner was deficient. Drift in inertial systems was a known issue, and suppliers and users of inertial systems were well trained in how to deal with drift. Meanwhile, as GPS has become embedded into so much of modern life, the issue of its fragility has come to the fore. Among the responses to that issue is a renewed interest in inertial navigation, with its unsurpassed ability to resist jamming or other interference.
Whether the President’s statement was premature or not, it did assure potential manufacturers of civilian receivers that they could market their products without fear of the system being denied to them for an unspecified reason in the future. The future of GPS seemed certain, although funding for the full 24-satellite constellation, including spares in each orbital plane, was not restored until 1988.