CHAPTER ONE
BUILDING THE BOMB
        Albert Einstein signed the letter. Years later he would regret it, calling it the one mistake he had made in his life. But in August 1939, Adolf Hitler’s armies already occupied Czechoslovakia and Austria and his fascist thugs were arresting Jews and political opponents throughout the Third Reich. Signing the letter seemed vital. His friends and fellow physicists, Leo Szilard and Eugene Wigner, had drafted the note he would now send to President Franklin D. Roosevelt.
The scientists had seen their excitement over the recent breakthrough discoveries of the deepest secrets of the atom turn to fear as they realized what unleashing atomic energies could mean. Now the danger could not be denied. The Nazis might be working on a super-weapon; they had to be stopped.
In his famous letter, Einstein warned Roosevelt that in the immediate future, based on new work by Szilard and the Italian physicist Enrico Fermi, “it may become possible to set up a nuclear chain reaction in a large mass of uranium, by which vast amounts of power and large quantities of new radiumlike elements would be generated.” This “new phenomenon,” he said, could lead to the construction of “extremely powerful bombs of a new type.” Just one of these bombs, “carried by boat and exploded in a port, might very well destroy the whole port together with some of the surrounding territory.” The Nazis might already be working on such a bomb. “Germany has actually stopped the sale of uranium from Czechoslovakian mines, which she has taken over,” Einstein reported.1 He urged Roosevelt to speed up American experimental work by providing government funds and coordinating the work of physicists investigating chain reactions.
Roosevelt responded, but tentatively. He formed an Advisory Committee on Uranium to oversee preliminary research on nuclear fission. By the spring of 1940, the committee had allocated only $6,000 to purchase graphite bricks, a critical component of experiments Fermi and Szilard were running at Columbia University. In 1941, however, engineer Vannevar Bush, the president of the Carnegie Institution of Washington and the president’s informal science advisor, convinced Roosevelt to move faster. British Prime Minister Winston Churchill also weighed in, sending the president new, critical studies by scientists in England.
The most important was a memorandum from two German refugee scientists living in England, Otto Frisch and Rudolf Peierls. Drawing on their early experiments and calculations, they detailed how vast the potential destructive power of atomic energy could be, and what that power implied for warfare. Their memo to the British government estimated that the energy liberated from just 5 kilograms of uranium would yield an explosion equal to several thousand tons of dynamite.
 
This energy is liberated in a small volume, in which it will, for an instant, produce a temperature comparable to that in the interior of the sun. The blast from such an explosion would destroy life in a wide area. The size of this area is difficult to estimate, but it will probably cover the center of a big city.
In addition, some part of the energy set free by the bomb goes to produce radioactive substances, and these will emit very powerful and dangerous radiations. The effect of these radiations is greatest immediately after the explosion, but it decays only gradually and even for days after the explosion any person entering the affected area will be killed.
Some of this radioactivity will be carried along with the wind and will spread the contamination; several miles downwind this may kill people.2
 
The scientists concluded:
 
If one works on the assumption that Germany is, or will be, in the possession of this weapon, it must be realized that no shelters are available that would be effective and that could be used on a large scale. The most effective reply would be a counter-threat with a similar bomb. Therefore it seems to us important to start production as soon and as rapidly as possible.3
 
They did not, at the time, consider actually using the bomb, as “the bomb could probably not be used without killing large numbers of civilians, and this may make it unsuitable as a weapon for use by this country.”4 Rather, they thought it necessary to have a bomb to deter German use. This was exactly the reasoning of Einstein, Szilard, and others.
Soon after the Frisch-Peierls memo circulated at the highest levels of the British government, a special committee on uranium, confusingly named the MAUD committee for a British nurse who had worked with the family of Danish physicist Niels Bohr, began assessing the two scientists’ conclusions.5 The MAUD report on “Use of Uranium for a Bomb” would have an immediate impact on the thinking of both Churchill and Franklin Roosevelt in the summer and fall of 1941. It concluded that a “uranium bomb” could be available in time to help the war effort: “the material for the first bomb could be ready by the end of 1943.”6 Upon meeting with Vannevar Bush and learning of the MAUD committee’s dramatic conclusions on October 9, 1941, Roosevelt authorized the first atomic bomb project.
Bush, then head of the newly formed National Defense Research Committee, asked Harvard President James Conant to direct a special panel of the National Academy of Sciences to review all atomic energy studies and experiments. Though Bush’s committee recommended the “urgent development” of the bomb, the December 1941 attack on Pearl Harbor gave other conventional military concerns greater precedence. It was not until a year later that work began in earnest.
The Manhattan Project, formally the “Manhattan Engineer District,” was created in August 1942 within the Army Corps of Engineers. The laboratory research now became a military pursuit, in part to mask its massive budget. Brigadier General Leslie Groves assumed leadership of the project in September 1942 and immediately accelerated work on all fronts. Historian Robert Norris says of Groves, “Of all the participants in the Manhattan Project, he and he alone was indispensable.”7
Groves was the perfect man to direct the massive effort needed to create the raw materials of the bomb, having just finished supervising the construction of the largest office building in the world, the new Pentagon. He needed to find a partner who could mobilize the scientific talent already engaged in extensive nuclear research at laboratories in California, Illinois, and New York. At the University of California at Berkeley, Groves met physicist J. Robert Oppenheimer for the first time and heard his plea for a laboratory purely devoted to work on the bomb itself.8 Groves thought Oppenheimer “a genius, a real genius,” and soon convinced him to head the scientific effort.9 Together they chose a remote southwestern mesa as the perfect site for the greatest concentration of applied nuclear brainpower the world had ever seen.
 
 
AN ATOMIC PRIMER
 
When the young scientists recruited for the Manhattan Project moved into the stark buildings of Los Alamos, New Mexico, surrounded by barbed wire, they understood that they would be working on a top-secret project that could win the war. Most knew that they were there to build the world’s first atomic bomb, but didn’t know much more beyond that. To bring everyone up to speed, physicist Robert Serber gave five lectures in early April 1943 on the scientific and engineering challenges ahead. His lecture notes, mimeographed and given to all subsequent arrivals, became known as The Los Alamos Primer. Today, it still serves as a valuable guide to the essentials of an atomic bomb.
Serber got right to the point: “The object of the Project is to produce a practical military weapon in the form of a bomb in which the energy is released by a fast neutron chain reaction in one or more of the materials known to show nuclear fission.”10
The discovery of fission was new, but the idea of the atom goes back to the early Greek thinkers. In about 400 BCE, Democritus reasoned that if you continuously divided matter, you would eventually get down to the smallest, undividable particle, which he called an atom, meaning “uncuttable.” By the beginning of the twentieth century, scientists realized the atom had an internal structure. In 1911 Ernest Rutherford discovered that atoms had a central core, or nucleus, composed of positively charged protons, surrounded by the negatively charged electrons J. J. Thomson had detected in 1897. In 1932 James Chadwick discovered that there were particles equal in weight to the proton in the nucleus, but without an electrical charge. He dubbed them neutrons. This led to the atomic model that we are familiar with today, of an atom as a miniature planetary system, with a nucleus of hard, round balls of protons and neutrons with smaller electron balls orbiting around. (See the first diagram on the page facing page 1.)
Familiar, but not quite right. Danish physicist Niels Bohr, among his many other contributions, found that a large nucleus behaved more like a water droplet. His insight led to a breakthrough discovery in 1939. German scientists Otto Hahn and Fritz Strassmann, working with physicist Lise Meitner, had been bombarding uranium, the heaviest element found in nature, with neutrons and observing the new elements that seemed to form. Uranium has an atomic number of 92, meaning it has 92 protons in its nucleus. The scientists thought that the neutrons were being absorbed by the uranium atoms, producing new, man-made elements, but chemical analysis indicated that this was not the case. When Meitner and physicist Otto Frisch applied Bohr’s water droplet model to these experimental results, they realized that under certain conditions the nucleus would stretch and could split in two, like a living cell. Frisch named the process after its biological equivalent: fission. (See the second diagram.)
Three events happen during fission. The least important, it turns out, is that the uranium atom splits into two smaller atoms (usually krypton and barium). Scientists had finally realized the dream of ancient alchemists—the ability to transform one element into another. But it is the other two events that made the discovery really interesting. The two newly created atoms weigh almost exactly what the uranium atom weighed. That “almost” is important. Some of the weight loss is attributable to neutrons flying out of the atom. These are now available for splitting other, nearby uranium nuclei. For every one neutron that splits a uranium nucleus, two more, on average, are generated. Splitting one nucleus can, under the right conditions, lead to the splitting of two additional nuclei, then four, then eight, on up. This is the chain reaction that can start from a single neutron.
The third event is the real payoff. Each fission converts a small amount of the mass of the atom into energy. The first scientists to discover fission applied Einstein’s famous formula, E = mc², and quickly realized that even this small amount of matter m multiplied by the speed of light squared c² equals a very large amount of energy E.11
Energy at atomic levels is measured in electron volts. Normal chemical reactions involve the forming or breaking of bonds between the electrons of individual atoms, each releasing energies of a few electron volts. Explosives, such as dynamite, release this energy very quickly, but each atom yields only a small amount of energy. Splitting a single uranium nucleus, however, results in an energy release of almost 200 million electron volts. Splitting all 2,580,000,000,000,000,000,000,000 (2.58 trillion trillion) uranium atoms in just one kilogram of uranium would yield an explosive force equal to ten thousand tons of dynamite. This was the frightening calculation behind the Frisch-Peierls memo and Einstein’s letter to Roosevelt. One small bomb could equal the destructive force of even the largest bomber raid.
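The arithmetic behind that calculation is simple enough to check for yourself. The sketch below, in Python and purely illustrative, counts the atoms in a kilogram of U-235 and converts 200 million electron volts per fission into tons of TNT; the constants are standard physical values (Avogadro’s number, the joule value of an electron volt and of a ton of TNT), not figures from the Manhattan Project records. Depending on the exact energy-per-fission assumed, the answer lands in the tens of kilotons, the order of magnitude behind the Frisch-Peierls memo.

```python
# Back-of-envelope check of the fission energy in one kilogram of uranium.
AVOGADRO = 6.022e23          # atoms per mole
MOLAR_MASS_U235 = 235.0      # grams per mole
EV_PER_FISSION = 200e6       # ~200 million electron volts per split nucleus
JOULES_PER_EV = 1.602e-19
JOULES_PER_TON_TNT = 4.184e9

atoms = 1000.0 / MOLAR_MASS_U235 * AVOGADRO        # ~2.6e24 atoms in 1 kg
energy_joules = atoms * EV_PER_FISSION * JOULES_PER_EV
tons_tnt = energy_joules / JOULES_PER_TON_TNT      # on the order of 1e4 tons

print(f"{atoms:.2e} atoms -> {tons_tnt:,.0f} tons of TNT equivalent")
```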
 
 
THE RIGHT STUFF
 
Understanding these calculations was the easy part. There wasn’t any great “secret” to atomic energy (and there isn’t now). Physicists at the time in the United States, Great Britain, Russia, Germany, Italy, and Japan all quickly grasped the significance of nuclear fission. The hard part, and this is still true today, is producing the materials that can sustain this chain reaction. Some concluded that the material could not be made, or at least not made in time to affect the course of the war. Others disagreed—among them the influential authors of the MAUD committee report. The crucial difference in the United States was not superior scientific expertise but the industrial capability to make the right materials. By the end of the war, Groves had used this capability to build the manufacturing equivalent of the American automobile industry: an entirely new industry focused on creating just one product.12
To understand the challenge the United States faced then, and which other nations who want nuclear weapons face today, we have to delve a little deeper into atomic structures. Ordinary uranium cannot be used to make a bomb. Uranium, like many other elements, exists in several alternative forms, called isotopes. Each isotope has the same number of protons (and so maintains the same electric charge) but varies in the number of neutrons (and thus, in weight). Most of the atoms in natural uranium are the isotope U-238, meaning that they each have 92 protons and 146 neutrons for a total atomic weight of 238. When an atom of U-238 absorbs a neutron, it can undergo fission, but this happens only about one-quarter of the time. Thus, it cannot sustain the fast chain reaction needed to release enormous amounts of energy. But one of every 140 atoms in natural uranium (about 0.7 percent) is of another uranium isotope, U-235. Each U-235 nucleus has 92 protons but only 143 neutrons. This isotope will fission almost every time a neutron hits it. The challenge for scientists is to separate enough of this one part of fissile uranium from the 139 parts of non-fissile uranium to produce an amount that can sustain a chain reaction. This quantity is called a critical mass. The process of separating U-235 is called enrichment.
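The scale of that challenge is easy to illustrate with a lower-bound sketch. Even a perfect separator, one that loses no U-235 at all, must process about 140 kilograms of natural uranium for every kilogram of U-235 it extracts. Taking the roughly 64 kilograms of U-235 that went into the Hiroshima bomb (a figure the chapter cites later), the minimum feed works out to around nine metric tons of natural uranium; real processes, which leave some U-235 behind, need considerably more.

```python
# Illustrative lower bound: a lossless separator still needs ~140 kg of
# natural uranium for every 1 kg of U-235, since only ~0.7% of the atoms
# are the fissile isotope.
U235_FRACTION = 0.007      # about 1 atom in 140 of natural uranium
bomb_u235_kg = 64.0        # roughly the U-235 in the Hiroshima bomb

feed_kg = bomb_u235_kg / U235_FRACTION
print(f"At least {feed_kg:,.0f} kg of natural uranium required")
```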
Almost all of the $2 billion spent on the Manhattan Project (about $23 billion in 2006 dollars) went toward building the vast industrial facilities needed to enrich uranium. The Army Corps of Engineers built huge buildings at Oak Ridge, Tennessee, to pursue two different enrichment methods. The first was gaseous diffusion. This process converts the uranium into gas, then uses the slightly different rates at which one isotope diffuses across a porous barrier to separate out the U-235. The diffusion is so slight that it requires thousands of repetitions—and hundreds of diffusion tanks. Each leg of the U-shaped diffusion plant at Oak Ridge was a half-mile long.
The other system was electromagnetic separation. Again, the uranium is converted into a gas. It is then moved through a magnetic field in a curved, vacuum tank. The heavier isotope tends to fly to the outside of the curve, allowing the lighter U-235 to be siphoned off from the inside curve. Again, this process must be repeated thousands of times to produce even small quantities of uranium rich in U-235. Most of the uranium for the bomb dropped on Hiroshima was produced in this way.
Both of these processes are forms of uranium enrichment and are still in use today. By far the most common and most economical method of enriching uranium, however, is to use large gas centrifuges. (See the third diagram on the page facing the opening of chapter 1.) This method (considered but rejected in the Manhattan Project) pipes uranium gas into large vacuum tanks; rotors then spin it at supersonic speeds. The heavier isotope tends to fly to the outside wall of the tank, allowing the lighter U-235 to be siphoned off from the inside. As with all other methods, thousands of cycles are needed to enrich the uranium. Uranium enriched to 3–5 percent U-235 is used to make fuel rods for modern nuclear power reactors. The same facilities can also enrich uranium to the 70–90 percent levels of U-235 needed for weapons. (This inherent “dual-use” capability is one of the key problems in controlling the spread of nuclear weapons and is explored further in chapters 2, 4, and 6.)
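The dual-use point can be made quantitative with the standard two-stream material balance used in enrichment engineering (a textbook relation, not anything specific to one facility). Conserving both total mass and U-235 mass across an enrichment plant gives the feed required per unit of product; the 0.3 percent tails assay and the specific product assays below are assumed for illustration.

```python
# Standard enrichment material balance. With feed F, product P, tails T and
# assays x_f, x_p, x_t, conservation of mass (F = P + T) and of U-235
# (F*x_f = P*x_p + T*x_t) gives F/P = (x_p - x_t) / (x_f - x_t).
def feed_per_product(x_feed: float, x_product: float, x_tails: float) -> float:
    """Kilograms of uranium feed needed per kilogram of enriched product."""
    return (x_product - x_tails) / (x_feed - x_tails)

natural = 0.007    # natural uranium, ~0.7% U-235
tails = 0.003      # a typical tails assay, assumed for illustration

reactor = feed_per_product(natural, 0.04, tails)   # 4% reactor fuel
weapons = feed_per_product(natural, 0.90, tails)   # 90% weapons-grade

print(f"~{reactor:.1f} kg of feed per kg of 4% reactor fuel")
print(f"~{weapons:.0f} kg of feed per kg of 90% weapons material")
```

The same formula, run with different product assays, is what makes the dual-use problem concrete: the plant that turns natural uranium into reactor fuel differs from one making weapons material only in how long the gas keeps cycling.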
There is a second element that can sustain a fast chain reaction: plutonium. This element is not found in nature and was still brand-new at the time of the Manhattan Project. In 1940, scientists at Berkeley discovered that after absorbing an additional neutron, some of the U-238 atoms transformed into a new element with 93 protons and an atomic weight of 239. (The transformation process is called beta-decay, where a neutron in the nucleus changes to a proton and emits an electron.) Uranium was named after the planet Uranus. Since this new element was “beyond” uranium, they named it neptunium after the next planet in the solar system, Neptune. Neptunium is not a stable element. Some of it decays rapidly into a new element with 94 protons. Berkeley scientists Glenn Seaborg and Emilio Segrè succeeded in separating this element in 1941, calling it plutonium, after the next planet in line, Pluto.
Plutonium-239 is fissile. In fact, it takes less plutonium to sustain a chain reaction than uranium. The Manhattan Project thus undertook two paths to the bomb, both of which are still the only methods pursued today. Complementing the uranium enrichment plants at Oak Ridge, the Project built a small reactor at the site and used it to produce the first few grams of plutonium in 1944. The world’s first three large-scale nuclear reactors were constructed that year in just five months in Hanford, Washington. There, rods of uranium were bombarded with slow neutrons, changing some of the uranium into plutonium. This process occurs in every nuclear reactor, but some reactors, such as the ones at Hanford, can be designed to maximize this conversion process.
The reactor rods must then be chemically processed to separate the newly produced plutonium from the remaining uranium and other highly radioactive elements generated in the fission process. This reprocessing typically involves a series of baths in nitric acid and other solvents and must be done behind lead shielding with heavy machinery. The first of the Hanford reactors went operational in September 1944 and produced the first irradiated slugs (reactor rods that had been bombarded with neutrons) on Christmas Day of that year. After cooling and reprocessing, the first Hanford plutonium arrived in Los Alamos on February 2, 1945. The lab had gotten its first 200 grams of U-235 from Oak Ridge a year earlier and it now seemed that enough fissile material could be manufactured for at least one bomb by August 1945.
The Manhattan Project engineers and scientists had conquered the hardest part of the process—producing the material. But that does not mean that making the rest of the bomb is easy.
 
 
BOMB DESIGN
 
The two basic designs for atomic bombs developed at Los Alamos are still used today, though with refinements that increase their explosive yield and shrink their size.
In his introductory lectures, Robert Serber explained the basic problem that all bomb designers have to solve. Once the chain reaction begins, it takes about 80 generations of neutrons to fission a whole kilogram of material. This takes place in about 0.8 microseconds, less than a millionth of a second. “While this is going on,” Serber said, “the energy release is making the material very hot, developing great pressure and hence tending to cause an explosion.”13
This is a bit of an understatement. The quickly generated heat rises to about 10 billion degrees Celsius. At this temperature the uranium is no longer a metal but has been converted into a gas under tremendous pressure. The gas expands at great velocity, pushing the atoms further apart, increasing the time necessary for neutron collisions, and allowing more neutrons to escape without hitting any atoms. The material would thus blow apart before the weapon could achieve full explosive yield. When this happens in a poorly designed weapon it is called a “fizzle.” There is still an explosion, just smaller than designed and predicted.
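Serber’s figures can be reproduced from the chapter’s own simplified doubling picture. If each generation of neutrons doubles the last, the number of generations needed to consume the roughly 2.58 trillion trillion atoms in a kilogram is just the base-2 logarithm of that count, and dividing 0.8 microseconds by it gives the implied time per generation. This sketch is an arithmetic check, not a bomb-physics model.

```python
import math

# Reproduce Serber's "~80 generations in ~0.8 microseconds", assuming each
# generation simply doubles the number of neutrons.
atoms_per_kg = 2.58e24                        # U-235 atoms in one kilogram
generations = math.log2(atoms_per_kg)         # doublings needed: ~81
time_per_generation = 0.8e-6 / generations    # seconds: ~10 nanoseconds

print(f"~{generations:.0f} doubling generations")
print(f"~{time_per_generation * 1e9:.0f} nanoseconds per generation")
```

The answer, about 81 doublings at roughly 10 nanoseconds each, shows why the whole reaction is over before the expanding material can move more than a few centimeters.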
Led by Robert Oppenheimer, the scientific teams developed two methods for achieving the desired mass and explosive yield. The first is the gun assembly technique, which rapidly brings together two subcritical masses to form the critical mass necessary to sustain a full chain reaction. The second is the implosion technique, which rapidly compresses a single subcritical mass into the critical density.
The gun design is the least complex. It basically involves placing a subcritical amount of U-235 at or around one end of a gun barrel and shooting a plug of U-235 into the assembly. To avoid a fizzle, the plug has to travel at a speed faster than that of the nuclear chain reaction, which works out to about 1,000 feet per second.14 The material is also surrounded by a “tamper” of uranium that helps reflect escaping neutrons back into the bomb core, thus reducing the amount of material needed to achieve a critical mass.
The nuclear weapon that the United States dropped on Hiroshima, Japan, on August 6, 1945, was a gun-type weapon. Called “Little Boy,” the gun barrel inside weighed about 1,000 pounds and was six feet long. The science was so well understood, even at that time, that it was used without being explosively tested beforehand. Today, this is almost certainly the design that a terrorist group would try to duplicate if they could acquire enough highly enriched uranium. The Hiroshima bomb used 64 kilograms of U-235.15 Today, a similar bomb could be constructed with approximately 25 kilograms, in an assembled sphere about the size of a small melon.
Gun-design weapons can use only uranium as a fissile material. The chain reaction in plutonium proceeds more rapidly than the plug can be accelerated, thus causing the device to explode prematurely. But plutonium can be used in another design that uniformly compresses the material to achieve critical mass (as can uranium). This is a more complex design but allows for a smaller device, such as those used in today’s modern missile warheads. The implosion design was used in the first nuclear explosion, the Trinity test at Alamogordo, New Mexico, on July 16, 1945, and in the “Fat Man” nuclear bomb dropped on Nagasaki, Japan, on August 9, 1945.
The implosion method of assembly involves a sphere of bomb material surrounded by a tamper layer and then a layer of carefully shaped plastic explosive charges. With exquisite microsecond timing, the explosives detonate, forming a uniform shock wave that compresses the material down to critical mass. A neutron emitter at the center of the device (usually a thin wafer of polonium that is squeezed together with a sheet of beryllium) starts the chain reaction. The Trinity test used about 6 kilograms of plutonium,16 but modern implosion devices use approximately 5 kilograms of plutonium or less—a sphere about the size of a plum.17
By spring 1945 the Los Alamos scientists were frantically rushing to assemble what they called the “gadget” for the world’s first atomic test. Although they had spent years in calculation, the staggering 20-kiloton magnitude of the Trinity explosion surpassed expectations. Secretary of War Henry Stimson received word of the successful test while accompanying President Truman at the Potsdam Conference. At the close of the conference, Truman made a deliberately veiled comment to Stalin, alluding to a new U.S. weapon. The Soviet premier responded with an equally cryptic nod and “Thank you.”18
Back in the U.S. the wheels were in motion, and the first atomic bomb, “Little Boy,” was on a ship headed to Tinian, an island in the Marianas within bombing range of Japan. In the months leading up to Trinity, top government officials had selected targets and formed a policy of use. The eight-member Interim Committee, responsible for A-bomb policy and chaired by Stimson, concluded that “we could not give the Japanese any warning; that we could not concentrate on a civilian area; but that we should seek to make a profound psychological impression on as many of the inhabitants as possible . . . [and] that the most desirable target would be a vital war plant employing a large number of workers and closely surrounded by workers’ houses.”19 On August 6, 1945, Little Boy exploded with a force of 15 kilotons over the first city on the target list, Hiroshima.
 
 
DROPPING THE BOMB
 
To this day, the decision to drop the bomb on Japan remains controversial and historians continue to dispute the bomb’s role in ending the Pacific war. The traditional view argues that Truman faced a hellish choice: use the bomb or subject U.S. soldiers to a costly land invasion. Officials at the time did not believe that Japan was on the verge of unconditional surrender, and the planned land invasion of the home islands would have resulted in extremely high casualties on both sides. The months preceding the atomic bombings had witnessed some of the most horrific battles of the war in the Pacific, with thousands of U.S. troops dying in island assaults. Historians Thomas B. Allen and Norman Polmar write:
 
Had the invasions occurred, they would have been the most savage battles of the war. Thousands of young U.S. military men and perhaps millions of Japanese soldiers and civilians would have died. Terror weapons could have scarred the land and made the end of the war an Armageddon even worse than the devastation caused by two atomic bombs.20
 
Immediately after the bombing of Hiroshima and Nagasaki, there was significant moral backlash, expressed most poignantly in the writings of John Hersey, whose gripping story of six Hiroshima residents on the day of the bombing shocked readers of the New Yorker in 1946. But the debate was not over whether the bombing was truly necessary to end the war. It was not until the mid-1960s that an alternate interpretation sparked a historiographical dispute.21 In 1965, Gar Alperovitz argued in his book Atomic Diplomacy that the bomb was dropped primarily for political rather than military reasons. In the summer of 1945, he says, Japan was on the verge of surrender. Truman and his senior advisors knew this but used the atomic bomb to intimidate the Soviet Union and thus gain advantage in the postwar situation.22 Some proponents of this perspective have disagreed with Alperovitz on the primacy of the Soviet factor in A-bomb decision making, but have supported his conclusion that the bomb was seen by policy makers as a weapon with diplomatic leverage.23
A middle-ground historical interpretation, convincingly argued by Barton Bernstein, suggests that ending the Pacific war was indeed Truman’s primary reason for dropping the bomb, but that policy makers saw the potential to impress the Soviets, and to end the war before Moscow could join an allied invasion, as a “bonus.”24 This view is buttressed by compelling evidence that most senior officials did not see a big difference between killing civilians with fire bombs and killing them with atomic bombs. The war had brutalized everyone. The strategy of intentionally attacking civilian targets, considered beyond the pale at the beginning of the war, had become commonplace in both the European and Asian theaters. Hiroshima and Nagasaki, in this context, were the continuation of decisions reached years earlier. It was only after the bombings that the public and the political leaders began to comprehend the great danger the Manhattan Project had unleashed and began to draw a distinction between conventional weapons and nuclear weapons.