Chapter Two

Digging Canals, Curing Cancer, and Flying to Jupiter

There is a piece of a World War I German battleship on the Moon. How did it get there, and why? The Atomic Energy Commission once bought a new roof for the Notre Dame Cathedral in Paris. What were they thinking?

Radiation. Just the naked word, radiation, is enough to make us uncomfortable. The word carries a sinister, dangerous under-meaning, even to someone who has worked closely with it. I have stared, by way of a mirror, at the center of a nuclear reactor core at full power. I have gazed down into a swimming pool holding 700,000 curies of cobalt-60, I have dangled a californium-252 source at the end of a 10-foot pole, and still “radiation” gets to me.168 Our chimes are still vibrating, slightly, from being banged roughly during the periods of The Fantasy and The Puzzle, the newsreel images of Chernobyl and Hiroshima/Nagasaki still burned in our brains. Radiation remains secretly harmful. You can just be standing there, feeling nothing unusual, while being killed by it, never mind being actively hit with it via a meltdown or a bomb.

Radiation has always been associated with nuclear power, and rightly so. The practical mode of nuclear power, fission, releases a tremendous blast of radiation in a wide spectrum, and extraordinary precautions must be exercised to prevent it from sterilizing the vicinity of the reaction. However, to put it in perspective, a major component of The Paradox of nuclear power is that far more people die each year of radiation-induced disease from standing out in the Sun than have ever died from the application of nuclear power. There is, after all, radiation all around us. There has always been radiation penetrating our bodies, 24 hours a day.

I drew pay at the Georgia Tech Research Institute for about 25 years, with my office in the Electronics Research Building, or the ERB. It was a serviceable building, Spartan by current standards, three stories, with an over-built steel frame, filled in with concrete blocks. It was completed in 1966, and it was unique in its attention to interconnecting cable-trays among laboratory rooms and a very solid electrical grounding scheme. The place was wired like a battleship. Even in the sixties, architects were enamored with the idea of a building that needed no heating system, so the structure was built facing south, with enormous, solar-collecting picture windows on the front. During the winter, when the Sun was low in the sky, the light would come blasting through those windows, and the heat would be trapped behind the glass. Large, concrete brows over the windows would then shade them from the Sun in the summer, when it rode high in the sky.

I am sure the design looked good on paper, but in application it was thoroughly idiotic. There was no way to move the heat that was building up in the front offices to the back offices, which were facing north. The back offices were freezing cold, while in the front offices the glue would melt in our book bindings. I was in a front office. I recall this extended experience every time an architect tries to sell me on a passive, solar-powered heating, ventilation, and air-conditioning scheme. No thank you. I prefer noisy air blown forcibly through a thermostat-equipped furnace.

The 1960s were a spasm of expansion for Georgia Tech, as the campus crawled ever northward, eating up old residential neighborhoods and constantly gaining ground. In the next lot, just north of the ERB, they had built the Frank H. Neely Nuclear Research Center in 1963. It contained a faithful copy of the 5-megawatt CP-5 heavy-water reactor at Argonne National Lab, hand-built and looking as if no expense was spared. The aluminum reactor tank was set in a concrete cylinder, in the middle of a gleaming white, cylindrical, steel, reactor containment structure. The containment was attached via a personnel air-lock to a two-story laboratory and administration building. Entering the lobby through the front door, a visitor would find the south wall of the room made of glass, and behind it was the hot-cell facility in splendid view. Always busy, the hot cell contained telemanipulator arms handling radioactive samples and objects behind three-foot-thick glass windows.169 The lobby was appointed like a Rat-Pack bachelor's pad, with lounge furniture and a combination of dark hardwood and polished marble from floor to ceiling. The latest issue of Nuclear News and a facility brochure were always casually arranged on the low coffee table.

Since this was a nuclear research reactor, radiation monitoring was taken quite seriously. Not only was the inside of the air-sealed reactor containment building monitored continuously at several key locations, but the entire facility was wired for radiation detection, with a staff of trained health physicists constantly vigilant. One sunny day in 1965, all hell broke loose. Radiation alarms started going off, indicating that there was an unintended critical mass of fuel somewhere inside. That was impossible. All of the uranium in the facility was accounted for. The health physics team grabbed their portable counters, switched them on, clicked down-scale, and started scanning the floor to locate a source. The electronics maintenance team checked the alarm settings and started testing the equipment, beginning with the power supplies. The Geiger-counter-equipped health physicists followed the radiation out the front door and up the hill, south, to the adjacent building, which was under construction. Their alarms were being rattled, not by a mass of uranium accidentally dumped on the floor of a nuclear laboratory, but by a newly delivered load of construction materials for the seemingly benign Electronics Research Building.

The ERB was made of an enormous pile of concrete blocks, and the concrete blocks, naturally, had been supplied by the lowest bidder. The lowest bidder, in this case, had been a phosphate mine in Florida. Florida phosphate is strip-mined, and the digging leaves a lot of waste material that must be stored or gotten rid of. It made perfect sense to bake the tailings, form them into concrete blocks, and sell them for cheap in Atlanta. It at least got rid of the stuff. But, there's a catch. Phosphate mine tailings are unusually rich in uranium. Just about every dirt, rock, soil, sand, or dust sample in the world contains at least a small amount of uranium. It is the most universally distributed material on Earth, albeit in diluted quantities, but phosphate mines are so uranium-heavy that the tailings are now considered to be a strategic resource. If the United States were cut off from all external sources of uranium, the pre-mined phosphate tailings could be quickly processed into usable uranium with the help of modern technology not available in the 1940s. The ERB basically amounted to a big pile of uranium ore, and it was gradually decaying away into lead, shooting off a variety of rays, particles, and radioactive "daughter products."170

Sitting in my office in the ERB day by day I was soaking up more radiation than I would if I were sitting atop the reactor in the adjacent building. The main difference was that in the reactor building my exposure would be constantly monitored, integrated, and pondered. In the ERB, we just sat there dumbly, breathing deeply of the radon gas seeping out of the floors while being cross-bombarded with gammas out of the walls. The unusual nature of our building was never officially mentioned to the occupants, for fear of a stampede, but it was the talk of health physics meetings nationwide. I gained some amusement by parking a scintillation counter in a corner of my office and watching the rate meter climb off-scale to the right. Professionally, we call it “pegging the needle.” Occasionally a colleague would see my radiation instrument going wild, and ask, “This isn’t going to…harm me, is it?”

“Probably not,” I would shrug. “By the time the cancer kicks in you’ll probably have heart disease.” My sarcasm was not often appreciated.

The golden age of electronics research finally slowed to a crawl by 2007, as it gave way to nanotechnology and biomedical research. The reactor had already been defanged for the 1996 Atlanta Olympic Games, which were centered at Georgia Tech. There had been some bizarre concern that international athletes would storm the building and steal the highly enriched uranium fuel, so it had been quietly removed and shipped away in the middle of the night. Just to make sure there would be no trouble, the word "nuclear" was removed from the title on the side of the building. The fuel was never returned, and the reactor would never again go critical. Both the ERB and the Georgia Tech Research Reactor were torn down for the new Nanotechnology Research Center, which, ironically, is the largest laboratory building on campus.

What is interesting is the way the two buildings were demolished. The reactor was covered with an air-tight plastic tent and was ever so carefully broken down, piece by piece, over the course of months of extremely expensive work. It cost millions of dollars to tear it down. The reactor containment structure had for all its life been kept spotlessly clean inside. Every surface was wiped, cleaned, polished, and scrubbed to prevent the slightest buildup of anything radioactive, and any hint of contamination would be noted on the constantly snooping instruments and treated vigorously. You could literally eat off the floor in that building. There was probably no public health hazard from striking that place, even without all the plastic sheeting.

The ERB, on the other hand, was blown down by a bulldozer and a wrecking ball one work-day afternoon as the students and faculty strolled by, idly glancing toward it as bigger things occupied their minds. Three guys with garden hoses tried to keep down the choking dust as it wafted out of the wreckage, coating everything and everybody, deep down into their lungs, with pulverized, medium-grade uranium ore. A plume of gray concrete dust drifted slowly over the campus, raining down what we used to call “fallout.”

See what I mean, when I call it The Paradox? Much effort was put into protecting the people on the Georgia Tech campus from radiation, but all the effort may have been directed into the wrong coordinates. The people were actually protected from something much more important than the inhalation of uranium. They were protected from the perception of radiation contamination. It would be hard to prove that anyone on the campus the day the ERB came down will ever die of lung cancer because of the dust, and I highly doubt that anyone will, but the perception of radiation exposure due to a decommissioned research reactor, if allowed to propagate uncontrolled, could bring down the trillion-dollar nuclear power industry. Tearing down a nuclear facility in the City of Atlanta in full public view in a plastic bubble showed good faith, even if it had the reality component of Disney World. The public is hyper-sensitive to the issue of industrial radiation contamination, and the psychology of it is very powerful. The general feeling took decades to fully develop, and possible excesses in the Age of Wild Experimentation did not help to dampen the growth of radiation anxiety.

This interesting age lasted about ten years, from approximately 1954 until 1963, when the Limited Test Ban Treaty was ratified and the sheer joy of blowing up things in the desert by atomic means was suddenly curtailed. It had a numbing effect on nuclear exuberance, similar to suddenly imposing liability insurance on hot-rodding, although there was a protracted spin-down period lasting until the end of the sixties. Not everything done in the period was just for the fun of it, and great progress toward a nuclear power economy was made, even though it was a sideshow to the billions of dollars poured into the nuclear weapons programs and literally plowed into the soil in Nevada and Colorado.

The United States Navy was justifiably pleased with the exemplary performance of Admiral Rickover's nuclear submarine, and plans for an entirely nuclear-powered Navy were already underway. There had actually been an early run at a nuclear-powered aircraft carrier. The thought of building a carrier large enough to float bombers loaded with atomic weapons had been on naval minds as early as 1947. The ship would be huge, it would be an energy hog, and it would have no smokestacks. It was a perfect candidate for this new, theoretical power source, and plans were drawn. The newly formed United States Air Force, finding the concept of the Navy flying strategic bombers terrifying, protested vigorously, and the project was cancelled by President Truman five days after the keel had been laid. Still, the Navy persisted at the drawing board, and by 1954 they had upgraded Rickover's 50-megawatt submarine reactor into a slightly larger, 60-megawatt aircraft carrier power source. The inherent safety factors of the design, the long cycle between refueling, and the projected reliability of the sub reactor design were all very attractive. It just needed to be a bit larger.

Rickover cast a long shadow in the Atomic Energy Commission, as well as in the Navy, and with his dual identity he was able to expedite a sharing of technology. Using the shelved aircraft carrier design, a civilian nuclear power plant was proposed. It would be a stationary, full-scale power plant, contributing to the electrical grid and demonstrating to the public and to the world that nuclear power was safe, economical, and desirable. Instead of turning screws at the end of an aircraft carrier, the turbine would turn a dynamo. The project was proposed to the Duquesne Light Company. It was the highly visible cornerstone of President Dwight D. Eisenhower’s Atoms for Peace concept.

Ground was broken on September 6, 1954, at a cleared spot on the Ohio River in Beaver County, Pennsylvania, about 25 miles from Pittsburgh.171 It was named the Shippingport Atomic Power Station. It took 32 months and a measly $75.5 million to build.172 The reactor was first started at 4:30 A.M. on December 12, 1957, and after 21 days of thorough shakedown it was brought to full power. Eleven days later, it was pushed to 68 megawatts, 8 megawatts over the designed power, just to see what it would do. The plant operators, who were used to cranky, inertia-bound coal plants, were delighted by the easy response of the controls. In all respects, the plant performed beyond expectations. On May 26, 1958, it was switched into the grid, and the United States was officially running on a component of nuclear power. It was not the world’s first nuclear power station, but it was the first that had no military mission. It was not built with the intent to produce plutonium for bombs. Its mission was to convince the public that nuclear power was clean, safe, and that it was economically attractive.

The plant was certainly clean. There was no endless stream of rail-cars moving coal onto the site and taking away cinders, no black dust covering everything in sight, and no dark brown smoke boiling out of a stack. The naval PWR was a perfectly logical choice for demonstrating safety. Just as in the Nautilus reactor, Shippingport had a negative temperature coefficient of reactivity, meaning that if the water inlet temperature to the core increased, the power level dropped. If the water temperature dropped, then the power increased. The reactor was therefore very easy to control, and it had no tendency to take off in an unintended power excursion or threaten to melt, as did certain other designs.
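The feedback described above can be caricatured in a few lines. This is a minimal sketch, not Shippingport physics: the coefficient, reference temperature, and time step are illustrative assumptions, chosen only to show the sign of the response.

```python
# Toy model of a negative temperature coefficient of reactivity: hotter inlet
# water inserts negative reactivity and power sags; cooler water does the
# opposite. ALPHA and T_REF are assumed, illustrative values.
ALPHA = -0.001   # fractional power change per degree C above reference (assumed)
T_REF = 280.0    # reference coolant inlet temperature, degrees C (assumed)

def next_power(power_mw, t_inlet_c):
    """One time step: power follows the sign of the temperature-driven reactivity."""
    reactivity = ALPHA * (t_inlet_c - T_REF)
    return power_mw * (1.0 + reactivity)

power = 60.0
for t in (280, 285, 285, 275, 275):   # a warm transient, then a cool one
    power = next_power(power, t)
    print(f"inlet {t} C -> {power:.2f} MW")
# Power drifts down while the inlet runs hot and recovers when it runs cool,
# with no control rod motion -- the self-regulating behavior described above.
```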

As for economy, this first civilian power reactor gave a somewhat distorted, overly subsidized view of the cost of nuclear power. The bulk of the engineering cost was borne by the Navy's nuclear aircraft carrier program, shared with the AEC. Technically, a civilian entity could not own any uranium, so the fuel was loaned to the power company by the AEC, the sole owner of fissile materials. The power from Shippingport was practically free, just as Lewis Strauss had predicted back in '54, but only because the government was footing the bills. The reactor core was 165 pounds of expensive, highly enriched submarine fuel, surrounded by a neutron reflector of 14 tons of natural uranium in a pure zirconium matrix. Exotic hafnium controls were used, the same as those installed on the Nautilus. The first core lasted seven years and produced about 2 billion kilowatt-hours of electricity. The second core, installed in 1965, lasted nine years and generated another 3.5 billion kilowatt-hours.
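As a back-of-envelope check on those first-core numbers, here is a sketch assuming the 60-megawatt design rating quoted earlier and uninterrupted calendar time; the actual outage history would change the result.

```python
# 2 billion kWh over seven calendar years, against a 60 MWe design rating.
hours = 7 * 365.25 * 24                    # seven calendar years of hours
avg_mwe = 2.0e9 / hours / 1000.0           # average output in megawatts
print(f"{avg_mwe:.1f} MWe average")        # -> about 32.6 MWe
print(f"{avg_mwe / 60.0:.0%} of rating")   # -> roughly 54 percent
```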

The first equipment failure at Shippingport was a mechanical failure in the turbine on February 4, 1974. While the plant was shut down for repairs, it was decided to further experiment with an exotic fuel load, and in 1977, under the newly formed Department of Energy, Shippingport became the world's first light-water breeder reactor. It was an interesting new core design, intended to produce more fuel than it used by promoting the capture of excess neutrons in a blanket of thorium-232. Transmutation converted the activated thorium into fissile uranium-233, making it possible to run the reactor forever without digging anything further out of the ground or enriching natural uranium. A nice perk.

Shippingport was finally shut down for the last time in 1983, and was decommissioned in 1988. The site on the bank of the Ohio River was cleansed of all trace of industrial activity, as if it had never been there. It cost $26 million more to tear down the building and fill in the hole than it had cost to dig the foundation and build the plant in 1957. The well-behaved little reactor in Pennsylvania had made its point about the value of nuclear power.

In parallel to the Shippingport demonstration, the Age of Wild Experimentation took off with nervous energy and a seemingly bottomless budget. Most of the money went into designing and testing an enormous range of new nuclear weapons.

In the early fifties, the governmental anxiety over not knowing what the Soviets were up to started to settle down. Soviet weapons were big and dirty, and it got to the point where any well-equipped radiation counting lab in the United States could tell that they had set one off, as the atomic blast debris from the other side of the world contaminated the upper reaches of the atmosphere. It would travel around the globe on air currents and come down in the rain weeks later. Radioactive rain would hit the roof, run down the gutter, and collect in puddles outside the lab. Scoop up a beaker of rainwater, count it overnight, and you could tell that, yep, somebody just set off another one. Do a multi-channel analysis of the radiation energies, and you could tell whether it was a British or a Russian test by the isotopes present in the contaminated water. The Brits tended to push their plutonium production too hard, and their debris had an unusual percentage of plutonium-240.

The ability to identify individual shots did not last long. In 1955 there were 20 above-ground nuclear tests. In 1956 there were 30, and in 1957 there were 50. In 1958 there were 105 tests, or one every three and a half days. In 1962, there were 140 nuclear weapons set off in the atmosphere in a single year. Most of them were ours. Sometimes multiple bombs were set off in one day. The signature radiation profiles of the tests ran together in the rain like water-colors melting into a murky brown, as the radioactivity of atmospheric dust climbed to dangerous levels.173 You could no longer tell one bomb from another. It was just one, ever-growing cacophony of mixed radiation. The most dangerous place to stand at a nuclear research lab became a mud puddle in the parking lot. Children were advised not to eat snow.

The bomb labs at Los Alamos and Lawrence-Livermore were turning out a nuclear weapon design for all occasions. There were gravity bombs, submarine pen penetration bombs, atomic artillery shells, ship-to-ship, and air-to-air rocket-propelled devices. There were anti-aircraft weapons, cruise missile warheads, and anti-ballistic missile nuclear explosives. There was even one particularly nasty little device, the XM-388 or “Davy Crockett,” that could be fired from the back seat of a jeep using a recoilless rifle. The W-45/MADM was developed specifically to take out hydroelectric dams in West Germany, and it was the “man-portable nuke” that was outlawed by international agreement in 1984. Every new design had to be tested for proper ignition and relative effectiveness, and there was a lot of experimentation in the desert to find the effects of nuclear weapons on anything that might be on the ground. Everything the engineers could think of was hit with an atomic explosion. Frame houses, concrete block structures, sheet-metal sheds, propane tanks, telephone poles, fire-trucks, trees, machine tools, rail cars, airplanes, school buses, battleships, and high-voltage transmission towers were placed in harm’s way and subjected to blast damage on the flat desert. Much was learned, but to what end?

The United Kingdom and France had very modest new weapon development programs in comparison to the United States, and even the Soviet Union kept the tests down to a tasteful minimum. Only in 1961 did the Soviets set off more bombs than the United States. What they lacked in numbers of tests they made up for with size. On October 30, 1961, in the Novaya Zemlya archipelago, they set off the world’s largest man-made explosion. It was designed to yield an explosion of 100 megatons, but out of prudence the engineers cut it back to 50 megatons, or twice the size of anything the United States ever designed. That is ten times the force of all the explosives used in World War II, including the two atomic bombs dropped on Japan. They named it RDS-220, or simply “The Tsar.” It made a mushroom cloud 40 miles high, and it broke windows in Finland. At a hefty 27 tons it was too big to ever carry over enemy territory in a bomber. It was never deployed as a weapon, but the point was made.

Strange as it now seems, not all nuclear explosive developments were part of a weapons program. Both superpowers in the Cold War worked independently on industrial uses for atomic bombs. The Soviets called it “Nuclear Explosions for the National Economy.” The U.S. called it officially “Operation Plowshare.” Some call it “The Plan to Nuke Panama.”

In the late 1950s the Panama Canal, linking the Atlantic and Pacific Oceans with a machine-dug trench through the Republic of Panama, was considered an engineering antique. In 1914 the American-built canal was a marvel, but ship-building techniques had out-paced it, and its quaint lock system was simply too narrow for modern aircraft carriers, oil tankers, or even luxury liners. All ships through the canal had to be lifted 85 feet to clear the terrain, and then be lowered down to sea level on the other side of the isthmus. To modify the old system just to make the locks wider would be nearly impossible. A clear improvement would be to cut a new sea-level canal through Panama, or even through the thicker part of the isthmus in Nicaragua. It would not have the mechanical complications and narrow funnel of the present situation. It would be a foolproof solution to inter-oceanic travel, but digging such a canal was not possible with existing excavation techniques. The bedrock was extremely tough, and mountains as high as 1,000 feet would have to be knocked down. There was, however, the atomic bomb.

If anything could move a mountain in a single swat, it was a good-sized nuclear weapon. All you had to do was plant some devices just below the surface, in a straight line, across Panama, and set them off one at a time. It would be like digging a trench with dynamite, multiplied a million times. The bomb engineers jumped on the concept eagerly, and Operation Plowshare was born. It would eventually burn up $770 million and detonate 31 test bombs, but, mercifully, the plans for nuclear landscaping would be put back on the shelf in 1973.174

Plans were drawn up, probably without consulting the Republic of Panama, to blow the new cut through the Darién Province, southeast of the existing canal. It was an unspoiled rainforest with cloud-covered mountains and a scant population. Only 40,000 people would have to be moved. Side effects, such as air blast, ground shock, plumes of radioactive dust thrown into the atmosphere, and possible seismic complications, were all considered. From an engineering standpoint, there was only one blank spot. In all the hundreds of nuclear tests all over the world, no one had ever actually dug a hole with an atomic bomb. To remedy that omission, a test of a buried, 104-kiloton bomb was scheduled at Area 10 of Yucca Flat at the Nevada Test Site for July 6, 1962. It was named Sedan in test series Storax.

A shaft 635 feet deep was drilled into the desert alluvium, the device was lowered in, and at the backward count of ten it was detonated. A perfectly round dome of earth lifted immediately off the ground, reaching 290 feet high. It seemed to hang there for three seconds, and then it burst open, shooting 12 million tons of earth into the sky. It left a perfect crater, 320 feet deep and 1,280 feet across. It looked as if a big meteor had just hit the ground.175 A cloud of radioactive dust five miles wide slowly slid across the desert floor. The immediate radiation level in the crater was 500 Roentgens per hour, which even in 1962 was considered extremely dangerous.

Conclusions of the test were mixed. It certainly made a neat, perfectly predictable hole in the ground, but it exposed more than 13 million people to radiation fallout. Of all the nuclear tests conducted in the continental United States, none exposed more people to more radiation. Two plumes had developed, with one headed northeast and one going due east, dropping radioactive dust all along the way. The east-moving cloud seemed to drop most of the load in three counties in Iowa. Measurements counted 17 radioactive isotopes contaminating the Midwest, including 880,000 curies of thyroid cancer-producing iodine-131 alone. The New Panama Canal excavation would require 24 such shots, and so the plan was quietly abandoned.

After 80 days, or 10 half-lives, the iodine-131 radioactivity was undetectably slight. An identifiable swath of thyroid cancer and genetically altered newborns never materialized along the path of the most severe radiation fallout, and this negative finding was interpreted in at least three ways. Proponents of open-air nuclear weapons testing touted it as proof that a fear of ingested, low-level radioisotopes was overblown. Opponents saw it as confirmation that the Government was hiding relevant data concerning trouble with fallout, and despairing statisticians saw it as an indication that people were excessively mobile and would not stay in the same location for 20 years and allow a decent, long-term accumulation of death causes. Even though the Sedan Test had nothing to do directly with nuclear power generation, the buzz of controversy further eroded the public confidence in the wisdom of nuclear technology expansion. It did not matter how many people did not die from it; what mattered was the over-hanging, invisible danger of it.
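The decay arithmetic behind that first sentence is easy to verify, assuming the roughly 8-day half-life of iodine-131:

```python
# Each half-life cuts the remaining activity in half.
def remaining_fraction(elapsed_days, half_life_days):
    return 0.5 ** (elapsed_days / half_life_days)

frac = remaining_fraction(80, 8.02)   # iodine-131 half-life is about 8.02 days
print(frac)                           # -> ~0.001, one part in a thousand
print(880_000 * frac)                 # -> under 900 curies left of the original 880,000
```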

Not to be outdone, the Soviets made their Peaceful Atom project three times the size of the United States effort, eventually detonating 115 nuclear bombs in non-military testing. It was given the creative title Program-7, and its goals were lofty. The Soviets wished to find natural gas, make underground gas storage tanks, put out very large oil-field fires, crush ores in open-pit mines, and make coal mining really fast. On top of those applications, they actually started digging a canal.

The Pechora-Kama Canal was an old, impossible project dating back to 1933 that found new life in 1961, under the premiership of Nikita Khrushchev and the unbridled joy of the Age of Wild Experimentation. It was proposed to link up the basin of the Pechora River in Russia with the basin of the Kama, a tributary of the River Volga. The plan was expanded to include a "northern river reversal," and it was named the Taiga Project. On March 23, 1971, after ten years of studies and general delays, three 15-kiloton bombs were in place near the village of Vasyukovo in Perm Oblast. The simultaneous blasts ripped a crater 2,000 feet long. Radioactive contamination was somewhat higher than anticipated, and when it was concluded that several hundred nuclear bombs would be necessary to complete the canal, enthusiasm waned. At last count, one person was living in the village of Vasyukovo, and the Taiga crater has been renamed the "Atomic Lake." The single inhabitant is in sound physical health, and the local economy is stable, consisting of collecting pieces of metal left over from the nuclear excavation for sale as recycled materials.

Plans for civil engineering projects involving nuclear blasting were still progressing in the United States. The new state of Alaska obviously needed an artificial harbor dug out of the shoreline at Cape Thompson on the Chukchi Sea, on the North Slope. Practical uses for such a harbor were difficult to identify, but that would not stop a good Plowshare operation. It was named Project Chariot, and the plan was championed vigorously by the distinguished physicist and head of the Lawrence-Livermore Laboratory, Edward Teller. It was a very straightforward excavation, and it would require only five bombs. Two big ones would carve out a rectangular lagoon, and three smaller explosions would clear a channel to the sea. Political leaders, newspaper editors, university presidents, and even churches were all in favor of it. Eskimos were not. In the tiny Inuit village of Point Hope opposition arose, and it was taken up by the Sierra Club.176 Facing increasing unease, the AEC put the project on hold in 1962. Although it was never officially cancelled, the Project Chariot artificial harbor is unlikely to be dug. Just to study the effects of possible radiation contamination on the permafrosted terrain of Alaska, material from a 1962 A-bomb test in Nevada was transported to the Chariot site, buried, and promptly forgotten. The buried radioactivity was rediscovered in 1992 in a search of archival documents, and, at the indignant insistence of the Eskimos in Point Hope, it was cleaned up by the Army Corps of Engineers. Although the two feet of earth covering the fallout was sufficient to render it safe, the mere existence of such an experimental plan was disturbing.

Plowshare continued on. Project Gnome, conducted on December 10, 1961, in a salt bed 24 miles southeast of Carlsbad, New Mexico, was a test of one of the most abnormal engineering concepts in history. The device was buried 1,184 feet down, and its Plowshare purpose was to test a power-production idea.177 Setting off a nuclear bomb in an underground stream of water would produce steam, which could be used to spin a turbine and produce electricity. It made as much sense as powering a steam locomotive by opening a door on the boiler and tossing in a lit stick of dynamite every now and then, but the experiment went ahead nonetheless. The maintenance problems of a power system that has nuclear bombs going off in it boggle the mind. Radioactive steam escaped from the test hole as the press looked on, to the embarrassment of the project, and the scheduled "Coach" follow-up shot was cancelled.

The last Plowshare detonation was on May 17, 1973, deep under Fawn Creek, Colorado, 50 miles north of Grand Junction. Three simultaneous explosions at 5,267, 6,152, and 6,611 feet were intended to stimulate the flow of natural gas from a formation of small, disconnected gas deposits. It worked, to some degree, but, unfortunately, the gas was unacceptably radioactive. Gas consumers in California might find radioactivity streaming out of their stove burners somewhat offensive. Also, with $82 million invested in nuclear gas stimulation, it was an elementary calculation to show that after 25 years of gas production only 15 percent of the investment could be recovered. With those final conclusions, Plowshare died.

Over a period of twenty years, nuclear explosives were tested for reasons ranging from demonstrational diplomacy, suggesting the need for an armistice to North Korea, to cross-connecting underground aquifers, with wonderfully positive results probably due to all the lessons learned from the failed Plowshare attempts. All the hundreds of tests seemed like good ideas at the time, but the public's perception of all nuclear activities gradually became twisted and distorted. The press and casual observers were invited to many tests, as a crude attempt at public relations, and glowing articles describing the works of the Atomic Energy Commission were planted in slick magazines. At the same time it became impossible to hide the increasing radioactive fallout load. There was no place to escape from it. And yet, as it was being confirmed that an atomic bomb could flatten a railroad trestle, nuclear reactions were being used to destroy cancer cells.

When Wilhelm Röntgen found x-rays in 1895, the medical community rushed to seize the discovery and exploit it for all it was worth. Word traveled fast, and within a remarkable eight weeks of Röntgen's announcement of his penetrating rays, diagnostic x-rays became a line item on patients' bills across Europe and America. More remarkable is that the therapeutic potential of x-rays was demonstrated even before the first broken bone was imaged.

Emil Grubbe, a medical student in Chicago, was typical of researchers of the time, in that his first inclination upon powering up a newly built x-ray machine was to stick his hand in it. Fortunately, the space between the glowing, humming tube and the desktop was too narrow for his head. Soon after an extended exposure, Emil noticed the skin peeling off the back of his hand. It hurt. How he made the connection between this effect and cancer symptom reduction is still not clear, but he talked one of his professors into letting him aim the tube at a cancer patient named Rose Lee. She suffered from locally advanced breast cancer, a condition for which doctors could offer no hope and knew only profound frustration. Anything was worth trying.

Lee improved immediately. The cancer shrank and seemed to go away. Radiotherapy was born. Simultaneous discoveries erupted all over the world, and an entire industry came into being. The mechanism by which radiation could kill cancer cells was unknown, and time would reveal that it could just as well cause cancer, but studies of post-treatment outcomes indicated that the therapy could work, and that was all that mattered. X-ray machines tended to be sloppy at aiming radiation, and while it was easy to kill cancer cells, the art was in not killing the patient. Radiologists learned to set their machines to the right ray-intensity for cancer treatment by sticking an arm in the beam. If it caused a pink reaction, like sunburn, called the "erythema dose," then the machine was set up correctly. In retrospect, it is no surprise that many of the early radiologists eventually died of leukemia as a result of exposure.

More precision at administering a radiation dose was needed. Marie and Pierre Curie in Paris discovered radium and polonium in 1898, and by 1910 Marie had isolated pure radium metal. Again, the medical community pounced on it. Radium was a solid-state source of radiation, requiring no clumsy, electrically driven equipment. All it took to destroy a local cancer was a tiny speck of it, easily implanted on the tip of a needle, and the radiation produced by it, alpha particles, was of extremely short range. If thrust into a tumor, the radium would kill only the rapidly propagating cells immediately surrounding it, as radiation is most damaging to cells with high turnover rates, like cancer cells, and nothing beyond a microscopic range was harmed. This technique, known as brachytherapy, remains in wide use to this day, particularly for cervical and uterine cancers.

Radium was not inexpensive. It was a rare element, always present in uranium ore but only in tiny percentages. Extraction was complicated and labor intensive. In 1919, when Marie Curie founded the Radium Institute for research into uses for radioactive materials, her work was funded largely by the sale of radium, polonium, and radon sources, primarily for medical uses. In the first half of the twentieth century serious research, long-shot experimentation, and outright quackery ran unchecked in the industrial world. If radium could cure cancer, it was felt that it could cure anything, and radiation was aimed at everything from tooth decay to dandruff. Many people died needlessly under radiation treatment or in the radium industry, and the excesses of the era did nothing to improve the reputation of nuclear topics later in the 20th century.

Radioactive radium is still among the most dangerous substances that exist in the Earth's crust. It has a half-life of 1,600 years, so the danger does not go away quickly. It was used in such small bits that it was easy to lose a sample, and the short range of its radiation product made it hard to detect, so a cancer-inducing speck of radium could be easily ingested or inhaled without any immediate indication of hazard. A radium-application industry could go out of business and the old factory building, reeking of spilled and splattered radium, would be a death-trap, difficult to disarm. An old hospital could be torn down, and the wreckers would unknowingly spread a localized cache of old radium needles far and wide. Any industrial use of radium is now forbidden under the United States Code of Federal Regulations in Title 21. You cannot even have a radium-dial watch anymore.

However, even if radium is too dangerous to be in human hands, brachytherapy is still an excellent way to treat certain, otherwise unreachable cancers, and the general use of radiopharmaceuticals paradoxically exploded in the Age of Wild Experimentation. It was found very soon after the first nuclear reactor was started up in Chicago in 1942 that excess neutrons can be used to artificially produce radioactive isotopes, on demand and with specific, predictable characteristics. Using reactors as powerful neutron generators, it was no longer necessary to use whatever radioisotope nature would provide, such as radium-226. Special isotopes, designed for radiation type, energy, and half-life, could be made at will, with a more precise focus, and the need for radium vanished.

Today, there are 27 artificially produced radioactive isotopes commonly used in medicine, for both diagnostic imaging and disease treatment. Isotopes range from calcium-47, used to measure bone metabolism, to yttrium-90, used for brachytherapeutic treatment of prostate cancer. There are 31 different radiopharmaceuticals based on technetium-99m alone.178 It is used for imaging and functional studies of the brain, myocardium, thyroid, lungs, liver, gallbladder, kidneys, skeleton, blood, and cancerous tumors. Its half-life is only 6 hours, so its potency disappears quickly and it becomes harmless to the patient and anyone handling or disposing of the substance. Technetium-99m comes from the beta decay of molybdenum-99, which is produced by the nuclear fission of uranium in a reactor and extracted from the fission products. Over 20 million people per year are given a dose of technetium-99m.

Without nuclear reactors, the radiopharmaceutical industry and all its uses in modern medicine would not exist today. There are only four reactors in the world producing these isotopes, and none are in the United States, where the fear of nuclear work is unusually acute. All the medical and industrial radioisotopes, used daily in impressive quantities in the United States, are made in one reactor, the National Research Universal reactor, or the NRU, at the Chalk River Laboratories in Ontario, Canada. If and when they shut down the NRU, radio-pharmacology in the U.S. will stop.179

One last example of work done in the Age of Wild Experimentation is the extraordinary project named Orion. Before there was a space program, before there was a NASA or a race to the moon, there were detailed spacecraft designs, propulsion tests, and mission plans for practical manned exploration of the solar system and beyond. The work was paid for by ARPA, the Advanced Research Projects Agency, and the United States Air Force. The project was classified secret, and there remain aspects of it to which public access is denied.

Very early in the considerations of manned space flight it was known that chemical rockets had limited utility. By burning a fuel with a stored oxidizer in a rocket engine it was clear that we could achieve escape velocity from the Earth, and it would be a practical way to put satellites into orbit. Small probes, weighing hundreds of pounds, could even be launched to the planets, but the travel times would be years, and there was no way to lock up a human being in a can small enough to be boosted with a rocket and expect life to be preserved for the years it would take to make it to Mars. A manned moon shot was barely possible, assuming that almost all of the millions of pounds of rocket could be shed along the way, dropping it behind like empty beer cans. The theories quickly ran into the ultimate limits of chemical energy storage. The amount of energy per pound of fuel is simply insufficient for manned work beyond the Moon.

In 1947 Stanislaw Ulam, the brilliant Polish mathematician who had worked on the Manhattan Project, was at the Los Alamos Lab twiddling with some numbers. It was the awkward time between the triumph of the atomic bomb and the push for the hydrogen bomb, and his mind wandered off into proposals for space exploration. There was a lot of excitement over scaling up the captured German V-2 rockets, powered by alcohol and liquid oxygen, for satellite injection, but Ulam saw the V-2 technology as a dead end. What spaceflight needed was a nuclear-powered engine, just as Rickover had developed for submarines. A nuclear fuel would be at least a million times more powerful than anything that could be achieved with chemicals. He ran some calculations, and the numbers looked good. With chemical rockets, it might be possible to send a small package to Mars, but it would take years. With nuclear power, a spacecraft weighing several hundred tons could make it to Mars in weeks. A flight to Jupiter, or even to the moons of Saturn, would not be out of the question.

The concept was taken up by Ulam’s fellow mathematician and theoretical physicist, Freeman Dyson. Dyson was born in Berkshire, England, but he came to the United States in 1947 on a Cornell University fellowship, and he hit the ground running. He took Ulam’s concept and turned it into a federally funded project, code named Helios. Ulam’s conclusions were that the impulse force from the explosion of a nuclear bomb could be translated into a unidirectional vector, and that it could propel an object forward in any medium, be it atmosphere or empty space. Acceleration to arbitrary velocity could be accomplished by exploding multiple bombs, in series. For the production of thrust or specific impulse, Ulam had proved on paper that nothing was more efficient than detonating nuclear bombs.

Dyson proceeded to give the idea a form. He proposed a hollow, spherical spacecraft, 130 feet in diameter, with a thrust nozzle at the back end. A number of small nuclear bombs, with 0.1 kilotons of explosive yield apiece, are stored as fuel, and a cabin for the crew is at the end of the sphere opposite the nozzle. A trolley picks up a bomb, carries it to the center of the sphere, and explodes it. The expanding plasma from the nuclear explosion exits through the nozzle, and propels the craft forward. Engineering problems, such as keeping the ship from blowing to pieces, brought the project down, but it was not as crazy as it sounds, and it at least got a ball rolling.

On July 18, 1955, the General Dynamics Corporation, a large and diverse defense contractor, created the General Atomic division, and a building complex was erected in San Diego, California. Its mission was research and development to harness the power of nuclear technology for the United States, and it was the spiritual center of the Age of Wild Experimentation. In 1956 atomic bomb physicist Ted Taylor joined the group at General Atomic.180 He recruited Freeman Dyson away from Princeton’s Institute for Advanced Study, and together they threw great energy into a new nuclear spacecraft propulsion project, named Orion, in 1958. It progressed far beyond Helios, with proof-of-concept tests, and the project yielded designs on the edge of what could be built at the time.

The Orion spacecraft design does not depend on an explosion in a closed chamber. Instead, a small nuclear device is shot out the back of the craft through a tube, using compressed air. The bomb is timed to go off 200 feet behind the vehicle, putting its back end just out of reach of the fireball from the explosion. A thick steel pusher plate catches the blast and accelerates the ship forward. Each explosion adds 30 miles per hour to the forward speed. If the detonations are timed right, with a bomb going off every 3 seconds, after 5 minutes the spacecraft is going 3,000 miles per hour. With that kind of speed, long-distance travel with human cargo is possible. There is no problem with radiation impulses from the bomb detonations, as there is plenty of shielding mass between the bomb and the crew cabins.
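The arithmetic in that paragraph is worth writing out, using only the figures quoted above (30 miles per hour per bomb, one bomb every three seconds):

```python
# Pulse-propulsion bookkeeping from the text.
DELTA_V_MPH = 30    # speed gained per bomb
PERIOD_S = 3        # seconds between bombs

def speed_after_minutes(minutes):
    pulses = (minutes * 60) // PERIOD_S
    return pulses * DELTA_V_MPH

print(speed_after_minutes(5))   # -> 3000 mph, as the text says
# That is 600 mph gained per minute, a steady pull of roughly 0.45 g --
# firm, but nothing a seated crew could not tolerate.
```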

Unlike chemical-rocket space capsules, the Orion is not small and cramped. The tiniest Orion craft designed, the Satellite version, is a single stage; it takes off from a standing start on the ground, and it weighs 300 tons. There is plenty of room inside to spread out and live, more than ample space for supplies, and enough mass to protect the inhabitants from solar flare radiation and meteoroids. The Midrange Orion, suitable for a quick trip to Mars, is 2,000 tons, and the Super Orion weighs a breathtaking 8 million tons. It is the size of a city. The Super can, in theory, reach a speed of 10% of the speed of light. Given the living space inside, it is suitable for interstellar travel. A trip to the closest star, Alpha Centauri, would take 44 years.
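That interstellar figure is straightforward division, ignoring the time spent accelerating and braking:

```python
# Alpha Centauri is about 4.37 light-years away; cruise at 10% of light speed.
print(4.37 / 0.10)   # -> 43.7 years, the "44 years" quoted above
```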

There were implementation problems, each of which was addressed by the engineering staff at General Atomic. People involved recalled the work as the best years of their lives, and it was a time when there was no obstacle that could not be rolled over with enough science. First was the problem of the physical shock from each impulse, which was enough to spill your coffee on the control console and knock pictures off the wall. To smooth out the shocks from the blast hitting the pusher plate, a bank of two-stage shock absorbers was put between the plate and the crew's quarters, with each stage tuned differently so there would be no chance of resonance reaction.

The second problem was erosion of the pusher plate surface. Repeated atomic blasts hitting the steel plate, although kept at a distance so that they would not melt anything, would still ablate the surface. The first nuclear weapon test of the pusher concept was the "Pascal B" shot in Operation Plumbbob, on August 27, 1957, at which a massive, 1-ton steel capping plate was accelerated to six times escape velocity by a low-yield explosion.181 A solution to the plate erosion problem was found, quite by accident, in a later nuclear test. A pusher plate was recovered, to be examined for ablation, and it was indeed eroded away, except for areas covered with oily fingerprints. It was a surprise, but simply covering a steel plate with grease keeps a nearby nuclear detonation from tearing at the surface. The Orion design includes a central grease-gun for replenishing the protective coating on the pusher.

By 1963 the design was maturing and the Air Force was getting excited, but there was one last problem that no amount of engineering could resolve: fallout to Earth. The Limited Test Ban Treaty of 1963, signed by the U.S., the U.S.S.R., and the U.K., made atmospheric nuclear explosions illegal. It was a reaction by both sides of the Cold War to the alarming rise in fallout radiation around the world. There was no hostile intent in the Orion liftoffs, but one launch would have the same effect as a 40-megaton, above-ground nuclear test. Although it would be extremely expensive to do so, an Orion craft could be boosted to orbit in pieces by conventional, chemical rockets and then blasted to escape velocity by nuclear means. The effect, however, would be the same. Radioactive dust from the bomb explosions would enter the top of the atmosphere and rain down on Earth. It would be the same as lifting off from the designated launch point in Jackass Flats, Nevada, on multiple nuclear bombs.

The Orion Project was officially closed down in 1964. It was a magnificent piece of engineering, and it is an echo of the wider concept of an inevitable nuclear power economy that can still come into play. A refined Orion may turn out to be mankind’s means of manned solar-system exploration. For all the dozens of exotic reaction engine designs that have crossed drawing boards over the last 50 years, Orion is still the most efficient, possibly with the greatest chance of success. It awaits peaceful times, and a time when bomb-fallout can be engineered to a minimum impact on the environment.

As for the German battleship part on the Moon, the story begins on June 21, 1919. Germany had just tasted defeat in World War I. The British had the entire German Navy fleet interned in Gutter Sound at Scapa Flow, Scotland. Rear Admiral Ludwig von Reuter, in charge of the ships, anticipated unfavorable news from the ongoing peace negotiations at Versailles, France, and ordered everything scuttled. Fifty-one ships sank peacefully to the bottom of the sound. The British overseers, at a loss for what to do when the ships started sinking, opened fire wildly and managed to kill nine Germans, the last casualties of the war. There, under the cold, oxygen-starved water of Scotland, sat the German Navy.

Fast-forward to the Age of Wild Experimentation. The sudden load of radioactive dust in the Earth's atmosphere affected many things, particularly steel production. To make steel, a great deal of air must be blown through the process, and anything in the dust sticks in the product. Any steel made after World War II was noticeably radioactive.

Making the situation worse, the Atomic Energy Commission launched a campaign to deflect attention off the nuclear weapons work and onto the positive aspects of radioactivity. For this purpose, the agency promoted all manner of industrial and household applications for nuclear byproducts, including food preservation, disease treatments and diagnoses, and self-illuminated exit signs. One of the most successful applications of radioactivity was in the steel industry. Fire bricks in steel furnaces were made with radioactive cobalt-60 tracers built in. This made it possible to monitor the erosion of the firebricks from a remote measurement and know when to shut the furnace down and replace the bricks, before there was a blowout. All you had to do was monitor the radioactivity from the furnace. As the bricks wore away, the radioactivity from the cobalt-60 would decrease. Unfortunately, this cobalt-60 from the dissolving bricks wound up in the steel.

Contaminated steel got mixed with uncontaminated steel, and before long all steel products built after the war were hopelessly contaminated with gamma-emitting isotopes. Every construction project, from erecting a building to mounting components for an instrument in a steel case, involved an increased amount of background radiation. This condition played havoc with sensitive measurements, particularly in nuclear physics. Steel products were divided into two groups: there was post-atomic steel and there was pre-atomic steel. Pre-atomic steel started to become scarce, as post-atomic steel started contaminating everything. A favorite source of contamination-free steel at nuclear labs became World War I naval guns.
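The firebrick tracer scheme described above can be sketched in a few lines, with purely illustrative numbers: the count rate falls both because bricks erode away and because cobalt-60 itself decays, with a half-life of about 5.27 years, so correcting for decay isolates the erosion signal.

```python
CO60_HALF_LIFE_YEARS = 5.27

def brick_fraction_left(count_rate, initial_rate, years_in_service):
    """Estimate remaining firebrick from tracer counts, corrected for Co-60 decay."""
    decay = 0.5 ** (years_in_service / CO60_HALF_LIFE_YEARS)
    return count_rate / (initial_rate * decay)

# e.g. two years in service, counts down to 60% of the day-one rate:
print(brick_fraction_left(600.0, 1000.0, 2.0))   # -> ~0.78 of the bricks remain
```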

Fast-forward to 1967. NASA launched the Surveyor 5 mission to the Moon, where it achieved a successful soft landing. Surveyor 5, in addition to further proving the concept of a rocket-assisted touchdown, carried an alpha-scattering surface analyzer, using curium-242 alpha particle sources to find elements in the lunar dirt.182 On board were two sensitive radiation detectors to make the analysis. Parts of the spacecraft had to be made of steel, for its strength properties, and post-atomic material was out of the question. If they were sending the experiment to the Moon, NASA did not want background contamination coming out of the spacecraft frame. They needed pure, pre-atomic steel.

The British were glad to step up to the problem. They owned the contents of Scapa Flow, including the water-logged former German Navy. Parts were routinely cut off the sunken ships, which were in amazingly good shape after so long in the water, and were reworked whenever good, pre-atomic steel was needed.183

The Atomic Energy Commission, perhaps coming to grips with the fact that they were partly responsible for the annoying background radiation in steel, decided to make it better by fabricating bricks of some special, "nuclear grade" lead. Lead is a dense, heavy metal, and it makes an excellent radiation shielding material for the more sophisticated nuclear experiments requiring a complete exclusion of cosmic rays and environmental radiation. Unfortunately, freshly mined and chemically pure lead contains a radioactive contaminant, lead-210. All other natural lead is stable and radiation-free, but a small amount of lead-210 can play havoc with an otherwise well-designed experiment. Lead-210 has a 22.3-year half-life, and it sits almost at the end of the decay chain of natural uranium in the Earth. Find some lead that is 446 years old, and you have lead that has reduced its radiation emission by a factor of one million, as almost all the lead-210 has decayed away.
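That factor of one million is just twenty half-lives, which is easy to confirm:

```python
print(446 / 22.3)   # -> 20.0 half-lives of lead-210
print(2 ** 20)      # -> 1,048,576: the "factor of one million"
```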

There was a known source of old lead. The roof and gutters of the Notre Dame Cathedral on the Île de la Cité in Paris were sealed using lead, and it had been there a long time. The roof was completed in about 1250, and although the building had been extensively restored around 1850, it was still the largest volume of verifiably old lead available. The Atomic Energy Commission offered to replace the lead in the cathedral roof, and they would also remove all the old lead, which was obviously an environmental hazard.

Voilà.