I was born in 1951, the year that atmospheric tests of nuclear weapons began at the Nevada Proving Ground, later renamed the Nevada Test Site, 65 miles northwest of Las Vegas. The Atomic Energy Commission made no public announcements about the tests, even though they were powerful enough to light up the night sky over Los Angeles and San Francisco, and tourists came to Las Vegas and other nearby towns to witness the mushroom clouds.194 The November explosion of the so-called “Dog” bomb was the first time American soldiers conducted field exercises in conjunction with an atomic test. The army used them as guinea pigs, making them witness the blast from about six miles away and then move in closer for “defensive” maneuvers.
Thanks to the Nevada Test Site, and other locations around the world where the Americans, Soviets, British, French, and Chinese exploded atomic bombs into the atmosphere, we all carry radioactive residues in our bodies.195 According to a 2001 US government report, if you’ve lived in the contiguous United States any time since 1951, all organs and tissues of your body have received some radiation exposure from nuclear fallout because of its lasting effects on the environment. Your degree of exposure isn’t just a question of proximity to test sites, but also of which direction the wind happened to blow the day of the test or where it happened to rain. Maps of radiation exposure show higher densities around Nevada and neighboring states, but hotspots are sprinkled throughout the country. The government report estimates that excess cancer deaths from fallout exposure are likely to be the highest in persons born in 1951 because on average they received higher doses of radiation than people born either earlier or later.196
From that perspective, 1951 is not a terribly auspicious birth year, but I was lucky in other ways—lucky to be born, in fact. In 1942, at the age of 19, my father was one of the first students to drop out of Princeton University to join the war effort. He became a Marine dive-bomber pilot, flying more than 80 missions in the Gilbert and Marshall Islands in the Pacific. His plane was hit a number of times, but he made it home alive. Shortly after he returned to the US in late 1945, he proposed to my mother, who was in graduate school at Iowa State University. They settled back in Princeton, where my father completed university on the GI Bill and then became a history teacher at a local prep school. Sometimes my parents spotted Einstein sailing on Princeton's Lake Carnegie. My mother bore three healthy daughters, of whom I'm the youngest.
A year after my birth, we moved to Wilmington, Delaware, where my father got a better-paying teaching job. Three years later, my mother developed malignant thyroid cancer. While it’s impossible to pinpoint the cause, she was doubly exposed to nuclear contamination. Thyroid cancer is one of the most common cancers associated with fallout exposure, since radioiodines such as Iodine-131 concentrate in the thyroid gland. A 1997 National Cancer Institute study estimated that exposure to Iodine-131 from the Nevada tests alone led to between 11,300 and 212,000 additional thyroid cancers in the US.197
Risks of exposure were higher for those who worked directly with radioactive materials in the nuclear weapons complex. While my mother was studying agricultural economics at Iowa State, her best friends there, including my godmother, were doing secret research for the Manhattan Project at the university’s Ames Laboratory. Beginning in 1942, the lab developed new methods for producing high-purity uranium. In the process, it generated radioactive dusts at extremely high levels. There was little in the way of personal protection, engineering controls, or radiation monitoring to protect workers, though my godmother’s job was to test the scientists’ urine to gauge their exposure. Over 60 years later, the Department of Energy finally established a Former Worker Medical Screening Program for the Ames Laboratory, with possible compensation for 22 types of radiation-induced cancers. Thyroid cancer is on the list.198
So where did my mother’s cancer come from? Fallout? Or exposure at Ames where she shared an apartment with my godmother and dated a Manhattan Project scientist before my father swept her off her feet? Was there radioactive dust on their hands or clothes or in the food they ate? Was she just genetically predisposed to cancer? Or could it be all of the above?
While I’ll never know the exact cause, I do know something about the effects. In the 1950s such things were not talked about openly with children, and my mother’s diagnosis was a carefully kept secret from my sisters and me. But no matter how tight the container, fear has a way of seeping out. When my mother went to Boston for her operation, I came down with the flu. One of my first memories is of our favorite babysitter giving me a stuffed cat toy as I lay in bed recovering from fever. All through childhood I clutched that cat at night, even after its fur and button eyes fell off and it looked like a blind old alley cat that had been in one too many fights.
Fortunately, my mother survived. Life went on, though a sense of danger remained, a slight whiff of death that hung in the air. It was like in an upscale nursing home where they do everything they can to keep the floors and patients spotlessly clean, yet visitors can still smell the presence of the Grim Reaper. In our house he hid in the shadows, but I knew he was there. Many years later, my mother told me that, after her operation, her goal in life was to survive until I turned 16.
When I look back now, I see my childhood in a kind of chiaroscuro. When I was seven we moved to Dallas, where my father became headmaster of an elite boys’ school. Affluence surrounded us. The campus was a gigantic playground, and we swam in the pools of millionaires, diving into sky blue chlorinated water and coming up for air in a world baked white by hot sun and racial prejudice. That incredible whiteness of being made my fears seem all the darker by comparison. I struggled for mastery over them by reading mysteries and then imagining and enacting my own as if I were Nancy Drew.
When I got old enough to analyze myself—in those heady college years when we stayed up late into the night discussing Freud and Jung—I latched on to my mother’s cancer as the source of my early fears. But there was still a sense of a darker, deeper mystery left unsolved. It wasn’t until recently that I came to realize the Grim Reaper of my childhood wore more than one hat. I carried within me not only the fear of losing my mother, but of losing the whole world in a nuclear holocaust.
How does one unbury one’s nuclear fears, inspect them, catalogue them, perhaps put them to rest? There’s no easy answer, but sometimes something comes along to cast a little light. While researching the Nevada Test Site, I stumbled on a government photograph taken during a 1952 bomb test that shows two Marines lifting their hands to touch the mushroom cloud.
The caption reads, “The atomic cloud formed by the detonation seems close enough to touch, and tension gone, Poth and Wilson do a little clowning for the camera.”199 The irony is that although they weren’t close enough to touch the cloud, its radioactivity was close enough to touch them. Is that the metaphor I’m searching for—the bomb as magic show, with its illusions become delusions? The tricks up the atomic magicians’ sleeves not only steered us toward apocalypse then, but still delude the American body politic now.
Strictly speaking, the hidden history of the bomb is not hidden any more. Plenty of good books by good historians painstakingly deconstruct our national nuclear myths. The problem is few of us read them. On the nuclear tourist trail, most of the public history on display is unapologetically uncritical. The basic message is that it’s good the Cold War is over, but we should be grateful that the nuclear balance of terror kept us safe.
If all you knew about the atomic bomb you learned from the introductory video The Town that Never Was at the Bradbury Science Museum in Los Alamos, you'd think the brilliant Manhattan Project scientists just brought American soldiers home to the waiting arms of pretty women. The killing in Hiroshima and Nagasaki isn't shown, or even mentioned, in the video, much less in the museum's displays. At the Titan Missile Museum outside Tucson, you can descend into the underground control room of a decommissioned Titan II intercontinental ballistic missile (ICBM) that could deliver nuclear bombs, but that's as deep as the history gets. Visitors are encouraged to feel awe and nostalgia for the clever engineering and the loyal soldiers who worked shifts underground, always at the ready to destroy the world. At the museum's perimeter fence, white Border Patrol trucks wait between desert forays against one of our latest national security threats—poor Mexican migrants.
There are better museums, like the National Museum of Nuclear Science & History in Albuquerque, which has a display about the bombing of Japan, including the burnt remains of a three-year-old victim's tricycle. Arguments for and against the dropping of the bomb encourage visitors to ask whether it was necessary. This museum is the exception, however. The rule is not to question.
That lack of questioning brings us to the first magic trick, the “no choice” illusion: The US had to drop the bomb on Hiroshima and Nagasaki in August 1945 to end World War II and spare hundreds of thousands of American soldiers from a deadly invasion of Japan. I grew up believing this. I was taught that thanks to the bomb, my father didn’t have to return to the Pacific. We had used a weapon of mass destruction to save ourselves from destruction—we had to do bad in order to do good.
If more Americans knew there were other options for ending the war, they might be less inclined toward apocalypse. Other choices could have been made. While historians still debate whether those other choices should have been made, the historical record shows that the fatal and fateful act of dropping the bomb wasn't inevitable. Even President Harry Truman, who called the destruction of Hiroshima the "greatest day in history" and claimed that he never lost sleep over it, knew he had made a decision.200 The decision was less about saving American lives than about punishing Japan, justifying the enormous costs of the Manhattan Project, displaying American power, and letting our erstwhile Soviet allies know who was boss.201
Many World War II American military leaders later admitted that we didn’t have to drop the bomb. The official US Strategic Bombing Survey of 1946 came to the conclusion that Japan would have surrendered before the end of 1945 without the bomb, without the entrance of Russia into the Pacific war, and even without the threat of an invasion.202 In 1963, former General and President Dwight Eisenhower wrote that, when he learned from the Secretary of War Stimson that the bomb would be used, he became depressed and voiced “grave misgivings.” The Japanese would surrender soon, so that “dropping the bomb was completely unnecessary” and “no longer mandatory as a measure to save American lives.”203
A second choice also lay within Truman’s decision to drop the bomb—the deliberate selection of civilian targets. Truman claimed that Hiroshima was targeted because it was a military base, and he wanted to avoid as much as possible the killing of civilians.204 Although Hiroshima had an army base, it was a city of 350,000 people, 140,000 of whom died. Nagasaki had 270,000 inhabitants, and 70,000 died. The Interim Committee that advised Truman on the bomb rejected the idea of a demonstration blast, and bombing a less populated area of Japan apparently wasn’t considered. Instead the committee looked for targets that had many wood-frame buildings in close proximity to each other, since they would be more vulnerable to the blast and ensuing fires.205
What’s extraordinary about the “no choice” illusion is its staying power. Despite the wealth of scholarship dismantling its myths and the blunt statements by men who executed the decision to drop the bomb—even “bombs away” Air Force General Curtis LeMay later admitted it wasn’t necessary, saying he did it “because President Truman told me to do it”206—in the minds of most Americans, the official narrative of my childhood still stands as sacred truth and we owe Japanese civilians no apology.
In 1994, to mark the upcoming 50th anniversary of the bombing, the US Postal Service designed a mushroom cloud stamp with the words, “Atom bombs hasten war’s end, August 1945.” After protests from the Japanese government, President Clinton withdrew the proposed stamp, but he refused to apologize for the bombing.207 The Smithsonian National Air and Space Museum planned an extensive exhibit for the anniversary, but its content was censored when the curators dared to raise a few questions about the A-bomb decision.208 The remains of a watch and a child’s melted lunchbox, found among the ruins of Hiroshima, were excluded from the exhibit, lest they remind visitors of the flesh-and-blood people who perished that day.209
The second magic trick, the illusion of distance, depends on this censorship of the bomb’s human impact. I grew up exposed to vivid pictures of the Holocaust’s human toll, haunting images of emaciated concentration camp survivors and mass graves of crumpled skeletons. These lessened the distance between me and the millions who died. I saw and felt the horror of the Holocaust, knew it was evil, found hope in the words “never again.” But the bomb wasn’t about bodies, it was about a far-off mushroom cloud, more a force of nature than a human creation, eerie and even beautiful like the blue northers that roiled the big Texas sky.
Why didn’t I see the bodies?
The abstraction was calculated. As Nagasaki’s mayor remarked during the postage stamp controversy, the image of the cloud prevented people from seeing that beneath it “hundreds of thousands of noncombatant women and children were killed or injured on the spot.”210 Most Japanese survivors have no memory of seeing a mushroom cloud because it would have been visible only from high in the sky or miles away. At ground zero, all they saw was a blinding flash.211 What happened to civilians was kept carefully from us, censored in fact. When pictures of the bombing were later shown to the American public, they were panoramic views of the destruction, burned buildings but not charred bodies.
It wasn’t that the photographic footage didn’t exist. Japanese photographer Yamahata Yosuke took pictures of the dead and wounded in Nagasaki a day after the blast. Initially, a few photographs of the carnage found their way into the American and Japanese press, but as the US began its military occupation of Japan, most of the visible evidence was seized and locked in a vault so as not to disturb “public tranquility.”212 The extent of radiation injuries was also kept hush-hush.213
With the end of American occupation in 1952, suppressed photographs and films of the atomic victims finally became available in Japan. But they remained largely unseen by American viewers until much later. Not until the late 1960s was film footage that had been confiscated by the American occupation authorities finally released.214 If Americans had been exposed to those pictures earlier, as they were to the terrible realities of the Holocaust, might “never again” have come to include the atomic bomb as well? “Never again” has a healing finality to it, a moral assertiveness that helps keep apocalyptic fears at bay. Instead, the mushroom cloud carried another message: “always possible.”
Shielding Americans from awareness of the bomb's impact on the Japanese prepared the way for the third magic trick, the illusion that the perpetrator is the victim, a key element of the America syndrome. How better to avoid moral responsibility for one's crimes than to assume victim status? The censorship of the Japanese reality left a void in the American imagination, which was filled by government propaganda that the bomb was coming to get us. The Atomic Energy Commission (AEC) distributed frightening images of atomic attacks on major American cities, including a glowing nuclear fireball above the New York skyline.215 While this made it even easier to forget the suffering of the Japanese, it also made the apocalypse seem more imminent, coming soon to a war theater near you.
Not only the government engaged in these tactics. Well before the nuclear arms race between the Americans and the Soviets, popular media conjured up nightmarish scenarios of atomic death and destruction on US soil. In November 1945, scarcely three months after Hiroshima and Nagasaki, Life magazine ran a spread on “The 36-Hour War,” a grim depiction of an atomic attack on 13 American cities, complete with realistic drawings of the ruins left behind. New York is pictured as a tangled mess of radioactive debris, with only the two iconic marble lions of the Public Library left standing watch.216
Initially, it wasn’t clear who our actual enemy was. In Life’s 36-Hour War, the attackers are an unspecified hostile force operating from Africa, of all places. But once the Soviet Union detonated its first atomic bomb in 1949, the new enemy had a face. Already terrified and titillated by atomic thrills, many Americans fell under the spell of the fourth magic trick, the Great Satan illusion. Whatever moral ambiguity might have existed in the immediate aftermath of the war, it was now lost in the latest battle between the forces of Good and Evil.
The Christian dualism that underpins the Great Satan illusion is a potent element in the America syndrome. Add the God/Satan binary to the bomb and the result approaches a kind of national psychosis. This dualism provided important psychological scaffolding for the Cold War and the nuclear arms race, and became a defining characteristic of the age of American empire.
In his book on fear and faith in the arms race, Sheldon Ungar describes how the bomb came to be perceived as a mystical, quasi-religious entity, provoking powerful feelings of awe and transcendent power on the one hand, and dread and fear on the other. These aligned well with the country’s civil religion: Only the Chosen People have the superior moral and spiritual qualities to own and control the bomb. As President Truman put it in 1945, “The possession in our hands of this new power of destruction we regard as a sacred trust.”217
When the Soviets muscled into the action in 1949, it didn’t take long for most Americans to view them as the diabolical foe. Indeed, the existence of precisely such a foe was necessary to create the public fear that sustained the American drive for nuclear supremacy. Periodic moral panics that Satan was about to overtake us—during the Korean War, the Soviets’ Sputnik satellite launch, and the Cuban missile crisis—raised the background level of nuclear anxiety into near hysteria and fueled escalation of the arms race. The bomb became the only guarantor of the “American way of life,” our shield against evil Soviet Communists who threatened the very foundations of our national identity, including that other sacred trust of ours, the “free market.”218 So strong was fear of Communism that a 1961 Gallup Poll found that, given the choice between fighting an all-out nuclear war and living under Communist rule, a whopping 81 percent of Americans chose war. In Britain, the corresponding figure was only 21 percent.219
For those working directly on the bomb, apocalypse became a way of life. In her early 1980s study of Amarillo, Texas, home to the Pantex nuclear weapons plant, A.G. Mojtabai found an extraordinary level of literal belief in the Christian apocalypse among plant employees as well as town inhabitants at all rungs of the social ladder. Rather than disarmament, they put their faith in the Rapture.220
Of course, not everyone succumbed to the Great Satan delusion. The non-believers included my own secular, liberal family. God might or might not exist—Unitarian Sunday school left it up to us to decide—and there was never any scare talk of Satan. My parents weren’t fervent anti-Communists either; they believed in civil liberties and political pluralism and despised Joseph McCarthy. The message I received at home wasn’t so much that the Soviets were evil, but that we had to beat them at their game. I still remember when my parents woke me up on a brisk fall night in 1957 to see Sputnik, the first satellite launched into space, twinkling across the sky. I’d never been outside in the middle of the night and the experience was more fun than scary. We were in a race with the Russians! I understood races—I loved to run and I loved to win. A whole generation of schoolchildren was Sputnikized that night. Study hard and the country can catch up. Run, run, run. I internalized the message well into my teenage years. Wanting to help save the world, I studied Russian and strived to get good grades. Even as it began to dawn on me that the race was absurd, I kept running and running.
In Sputnik’s aftermath, federal money poured into education, scientific research and development, defense reorganization, and the space program. A fictive “missile gap” between the Americans and Soviets was manufactured to justify the expenditures. The military-industrial complex that President Eisenhower had so presciently warned about was birthed and fattened, even over some of his own objections.221 All for the race—but what lay beyond the finish line? Who would win when the end game was mass annihilation on both sides? In the face of such questions, it was no easy matter for the government to sustain public confidence in the crazed competition with the Great Satan. But one more magic trick lay up its sleeve: the illusion of atomic utopia as the counterpoint to nuclear Armageddon. Thanks to atomic energy, the New Jerusalem would be bathed in eternal light.
For every imminent doom forecast by the Cold Warriors, there was also the promise of a New Jerusalem only a few more steps away. It wasn't hard to sell. After all, the postwar years saw the spectacular rise of American consumer capitalism: the economy was booming, living standards were improving, Madison Avenue and the middle class danced in step. The bomb became an edgy part of the mix. In his book By the Bomb's Early Light, historian Paul Boyer describes how entrepreneurs and advertisers quickly seized on its sex appeal. The radioactive dust had barely settled in Hiroshima when "Atom Bomb Dancers" performed in burlesque shows in Los Angeles. Life magazine ran a full-page picture of "The Anatomic Bomb," a new MGM starlet who was featured lying languidly by a swimming pool, soaking up the rays. In 1946, when the US began nuclear tests in the Pacific, the first bomb dropped at Bikini Atoll sported an image of Rita Hayworth painted on its side, and a French designer coined the name "bikini" for a skimpy new bathing suit. For the kids, more wholesome items like the atomic bomb ring were available for 15 cents plus a Kix cereal box top from General Mills.222
In the aftermath of the Bikini tests, however, the public became warier of radiation effects. In the 1948 bestseller No Place to Hide, David Bradley, a physician employed by the Radiological Safety Unit in the islands, wrote powerfully about the ecological damage caused by the bombs and their impact on the displaced natives. The bomb started to lose its sex appeal.223 As Americans began to shed some of their tasteless innocence, Cold Warriors found a new strategy to sell the bomb: the “peaceful atom” that would usher in a fantastic new energy millennium. Appealing to the American dream of eternal abundance, the AEC and corporate partners like General Electric launched a public relations campaign in the late 1940s that included exhibits, comic books, movies, and school textbooks about the wonders of the atom, focusing on its potential as an inexpensive and clean energy source. The 1948 high school study unit, Operation Atomic Vision, informed students that with nuclear power,
You may live to drive a plastic car powered by an atomic engine and reside in a completely air-conditioned plastic house. Food will be cheap and abundant everywhere in the world . . . No one will need to work long hours. There will be much leisure and a network of large recreational areas will cover the country, if not the world.224
In 1953 President Eisenhower delivered his famous "Atoms for Peace" speech before the UN, and in 1957 the first full-scale commercial nuclear power plant in the United States went online in Shippingport, Pennsylvania.
As AEC Chairman David Lilienthal openly acknowledged, there was a symbiotic relationship between nuclear weapons and the atom’s peaceful uses: they were two sides of the same coin.225 Some scientists went even further, promoting nuclear bombs as tools for massive earthworks such as the construction of canals and harbors. In 1957 the AEC initiated Project Plowshare to investigate such possibilities, and a year later H-bomb guru Edward Teller launched Project Chariot, a plan to dig a deep water harbor at Cape Thompson in northwestern Alaska by means of thermonuclear bombs. The ostensible purpose was bringing economic development to the region, but the real motive was to test the latest weapons and, once again, to show the Soviets who was boss.226
While Project Chariot was ultimately foiled by resistance from the region's Inupiat inhabitants as well as critical work by scientists and ecologists (some of whom the government then blacklisted), the utopian promises of nuclear energy lived on. In the 1960s and early 1970s, more than a hundred nuclear power plants were constructed across the country. Their formidable financial costs gave the lie to the claim of cheap energy abundance, and in 1979 the near-disaster at Three Mile Island gave the lie to assurances of public safety. Despite the massive accidents at Chernobyl in 1986 and Fukushima in 2011, there is still talk of a "nuclear renaissance," this time in relation to climate change.
Together the five illusions—that there was no choice, that there were no tangible bodies, that the perpetrator was the victim, that the Soviet enemy was Satan, and that the atomic bomb held within it the promise of a New Jerusalem of plenty—helped put the country in an apocalyptic mood. Ask almost any one of us who grew up in those years what it was like and we’ll tell you about the bomb drills at school—how we had to “duck and cover” under our desks or in the hallways, hands crossed behind our heads as if forcing ourselves to bow and pray. And then next we might tell you about the bomb shelter a neighbor built in his cellar. Drilled into us from an early age was the belief that we could survive a nuclear attack, a prospect that raised a disturbing existential question: Would we want to if our family and friends died and the world around us was destroyed?
If I were to draw a map of nuclear tourist territory, I’d highlight Highway 285 in New Mexico with glow-in-the-dark ink. Where the highway crosses from Texas into New Mexico, it skirts the town of Carlsbad. Famous for its remarkable caverns, Carlsbad also has the distinction of being near the Waste Isolation Pilot Plant (WIPP), one of the world’s few underground dumps for permanent disposal of transuranic radioactive wastes, including plutonium, generated in the research and production of nuclear weapons.
Arriving by truck from more than a dozen government facilities, the waste is stored 2,000 feet deep in a salt rock formation. The route from Los Alamos follows Highway 285 south through Santa Fe, with a new bypass built to keep the trucks out of the city center. Since plutonium is one of the most poisonous substances on Earth, taking over 24,000 years to lose just half of its radioactivity, one of the WIPP's challenges is how to communicate the site's dangers to generations in the distant future who may not speak our languages. The WIPP has consulted with linguists, scientists, science writers, and anthropologists to come up with communication strategies for people who may live in a future as distant from the present as the present is from the Stone Age. One proposed solution is a pictograph modeled on the skull and crossbones that appears on cans of poisonous chemicals.227
The WIPP is not the only underground nuclear wonder along Highway 285. About 30 miles north of Carlsbad is the small town of Artesia, with a population of less than 12,000. Its main claim to fame is the now deserted Abo Elementary School, the first—and, as far as I know, the only—school to be built entirely underground to serve a dual purpose as school and fallout shelter. Designed to withstand a 20-megaton blast ten miles away, the reinforced concrete slab above the school doubled as the playground. I visited it on a parched day in March 2011. You can’t go inside, so I stood on top. Apart from a national historical register plaque and a statue of an eagle with one wing missing, there’s not much to see.
There's more history here than meets the eye, however. Abo was a tomb prepared for the living dead. The zombie apocalypse has nothing on Abo. A 1963 Saturday Evening Post article titled "Nuclear-age School: New Mexico students pursue knowledge underground" began with this vignette:
Betsy Anne Hart, a fourth-grader in Artesia, New Mexico, learned something new at school the other day. “Mother,” she burst out when she got home, “did you know there is a room for dead people at our school?” Having a morgue on the premises is just one of the things that makes Betsy Anne’s school unusual. For Abo Public Elementary School, named for a nearby oil formation, is the only school in the nation which lies entirely underground—and which doubles as a fully equipped fallout shelter.228
While students appreciated the air conditioning, and teachers found students to be "less rambunctious" with no windows to distract them, other features of the school elicited negative responses. As a bomb shelter, it contained enough space and provisions for only 2,160 people. Once that limit was reached, the 1,800-pound steel doors would be bolted shut. The other ten thousand or so people in Artesia would be out of luck.
This worrisome math didn’t escape the notice of Abo’s students. Sixth-grader Martha Terpening told the Post reporter, “What I’m afraid of is that my mother is a teacher and she would be safe, but my daddy works at the post office and he wouldn’t have any place to go.” Studying underground also fostered other nuclear fears. “You think a lot about the danger while you’re here,” one boy confided. “Sometimes I have the feeling that fallout is coming now—that it is out there now—and then I go out and it isn’t.”229
That Artesia was selected as the location for the school-cum-shelter isn’t surprising. The Trinity Site, where the first atomic bomb was exploded in 1945, lies 100 miles to the west on the White Sands Missile Range. Closer by, just up Highway 285 toward Roswell, Atlas nuclear missile silos dotted the landscape, placed there by the Strategic Air Command in the early 1960s. The region was a Cold War epicenter, a potential target. Artesia’s terrorized residents agreed to the school, while in Roswell many people displaced their nuclear fears onto UFOs and space aliens. Roswell now hosts the International UFO Museum and Research Center, another must-see on New Mexico’s nuclear tourist trail. My husband took my picture there with the model of a space alien who looked like a cross between Casper the Ghost and ET.
The Abo school is an extreme example of the shelter craze that seized the country in the early 1960s. Before then, the government’s civil defense preparations centered on the evacuation of city dwellers into rural areas as a short-term response, and suburbanization as a long-term strategy, to reduce likely nuclear death tolls amid major urban targets.230 Released in 1951, the Duck and Cover booklet and animated film featuring Bert the Turtle ushered in an era of bomb drills in schools across the country. A pamphlet called Survival under Atomic Attack instructed adults on how to protect themselves by sheltering in a culvert or their cars.231 In a particularly macabre exercise, New York City school officials experimented with giving pupils dog tags so their bodies could be identified after an atomic blast.232 But with the development of the even more powerful hydrogen bomb, increased knowledge about fallout, and then the introduction in the late 1950s of ICBMs that reduced warning times to 15 minutes, it became more obvious that Bert the Turtle could no longer safely hide under his shell.
Civil defense officials had started some promotion of home shelters in the 1950s, likening nuclear attacks to survivable natural disasters like hurricanes, earthquakes, and tornadoes. But few home shelters were actually built—by March 1960 there were only an estimated 1500 nationwide.233 Then in the summer of 1961 the Cold War suddenly heated up. When Khrushchev threatened to kick the Western allies out of Berlin, President Kennedy responded by raising the specter of a nuclear response. In July he called for a $207 million initiative to fund public and private fallout shelters, claiming that “the lives of those families which are not hit in a nuclear blast and fire can still be saved—if they can be warned to take shelter and if that shelter is available.”234 The Cuban missile crisis the following year brought the world even closer to the brink of destruction.
The home shelter business took off. It was a shady business, complete with scam artists making quick bucks off people's fears. Across the country, shopping centers and trade shows sported models. By 1965 the number of shelters in American homes had risen to about 200,000, but then the craze died out. Historian Kenneth Rose describes the reasons for the demise of the home shelter: The 1963 nuclear test ban treaty eased superpower tensions. A well-designed and well-stocked shelter was too expensive for many families. Psychologically, many people were resistant to the idea of burrowing underground like moles. And a national conversation about "shelter morality" brought up uncomfortable issues, like whether or not you would "Gun Thy Neighbor" if he or she pleaded to be let in. Added to these was a sense of mental fatigue from living so long with the prospect of Armageddon.235
Even if most Americans didn’t burrow in shelters, the idea of the bomb shelter burrowed deep into many of us. Scholars have analyzed the ways Cold War civil defense reflected and reinforced the dominant stereotypes of the time. The families depicted in bomb shelter propaganda and advertisements were invariably white and middle class, with father and son doing the construction and mother and daughter laying in the provisions.236 The nuclear family for the nuclear age.
I came across my first bomb shelter at the age of 10 or 11, in a shopping center a few miles from our house in the burgeoning suburbs of north Dallas. There in the parking lot, a pod-like model was on display, complete with a mannequin nuclear family. The attractive mom was serving a tray of what looked like TV dinners. By the time of the Cuban missile crisis, rich classmates of mine had their own bomb shelters outside town on their daddies’ ranches. I wasn’t exactly jealous of them—I suffer from claustrophobia—but there was something that fascinated me about shelters, their doll-house quality, as if they were made for play, for setting the mannequins in motion.
The image of the mommy mannequin stayed with me for a long time. I’d seen ones like her before in glittery Dallas department stores, where like giant Barbie dolls they offered up their nubile bodies to be clothed in the latest fashions. Mannequins were about buying, not dying, but the two became connected in my mind. It was as if the spectacle of the bomb and the spectacle of consumerism were two sides of a coin. Inside the bomb shelter, food and water would surely run out—I was old enough to make that calculation. But outside, while the world still existed, affluence beckoned like the Christmas star.
I didn't know then that mannequins were also used in experiments at the Nevada Test Site, where the government constructed mock towns and suburbs to see the effects of the blasts. The mannequins who populated them were always white and well-dressed. In a perverse form of product placement, in one test they sported clothes donated by the J.C. Penney corporation.237 A 1953 issue of National Geographic featured a piece on how "Nevada Learns to Live with the Atom . . . Sagebrush State Takes the Spectacular Tests in Stride." In one photograph, a well-heeled female mannequin is driving a Cadillac—lucky for her, the bomb only buckled the top of the car. In another, a "winsome" female dummy, a classy robe slipping provocatively off her shoulders, sits smilingly intact in a cellar shelter.238
The ’50s zeitgeist was of course weird and unsettling in other ways, too. Women who had joined the formal workforce during World War II were pushed back into an overdetermined domesticity. Television came to supersede radio as a form of entertainment, and we were brought up on Leave it to Beaver, Mickey Mouse, the black stereotypes of Amos ’n’ Andy, and the routine slaughtering of Native Americans and Mexicans in popular Westerns. Paranoia spread that Communists and perverts were hiding under every bed. Along with the Cold War, there was a hot war in Korea. Independence struggles were waged across the globe, from Cuba, to Algeria, to Vietnam. We were taught to fear the dangerous forces let loose by decolonization.
Underneath the surface of things, one could also sense the ice cracking, culturally and politically. Alternative music, art, and literary scenes flourished while the civil rights movement started to shake up the foundations of our segregated society. But of all the many influences on us, what really punctuated our childhood psyches was the flaming red exclamation point of nuclear apocalypse.
In the first decades of the Cold War, remarkably little attention was paid to the bomb’s psychological and emotional effects on children, or for that matter on adults, except as they pertained to the efficient execution of civil defense strategy. The government tried to strike a careful balance between scaring the public enough to take civil defense seriously and keeping them from panicking, or even worse, rejecting nuclear weapons altogether.
The psychology profession offered a helping hand in the propaganda operations needed to establish this “nuclear normality,” an early example of the militarization of inner space.239 In the mid-1950s, the National Security Council and the Federal Civil Defense Administration enlisted prominent psychiatrists on a panel to study how to prepare Americans to accept the risk of a nuclear attack. In its 1956 report, the panel recommended “less emphasis on the symbols and images of disaster,” since drawing attention to the possibility of annihilation could cause the public to be “attuned to the avoidance of nuclear war, no matter what the cost,” a pacifist response that would weaken support for the government’s policies. Instead the authors recommended a patriotic call to “our pioneer background and inheritance [which] predispose us to count hardships as a challenge and fortify us against complacency.”240 The choice of words was ironic, for public complacency was exactly what they were aiming for.
It wasn’t until the early 1960s that a few true pioneers in the psychology profession ventured to learn how children were experiencing the nuclear threat. One of them was Sibylle Escalona, a researcher and clinician associated with the prestigious Menninger Clinic in Kansas. Her best known work was on the emotions and play of infants. In 1962–63, Escalona and her colleagues conducted a survey of 311 New York City area schoolchildren between the ages of 10 and 17 to assess their views of the future. Even though the survey made no specific reference to war, more than two-thirds of the students, who were drawn from varied socioeconomic backgrounds, mentioned it. “All the people will die and the world will blow up,” wrote one. Even those who didn’t think war would happen imagined a grim future—for example, one in which everyone was forced to live underground. Only a few perceived the danger as coming from Communism; the vast majority of the kids saw the problem as nations and people just needing to get along better. “I wish Russia and Cuba [would] be our friends,” wrote a ten-year-old in what was a common refrain.241
Psychologist Milton Schwebel at New York University’s School of Education conducted similar research. A peace activist, Schwebel helped to found the field of peace psychology. He led a fight against a New York State plan to build bomb shelters in schools; his efforts brought praise from Eleanor Roosevelt.242 In 1961–62, Schwebel and his colleagues surveyed about 3000 students, mostly of high school age, in the New York City area, upstate New York, and suburban Philadelphia. The questions they asked were more directed than Escalona’s open-ended ones: Did students think there was going to be a war? Did they care? What did they think about fallout shelters?
The researchers found that students cared deeply about the threat of nuclear war. Indeed, students described “the nightmarish horrors with such vividness” that one might think they had read accounts of Hiroshima survivors.243 Most believed that shelters should be available to everyone, and they worried about being separated from their families during an attack. Many wished for peace. “Time and again,” Schwebel wrote, “the students described their universe as a highly uncertain one, its people greedy and irrational, its future questionable. Their great hope lay in the fact that no nation could win and that rational people would not choose suicide, or that, at least conflict would be postponed until they had a chance ‘to live’, i.e. to work, marry, have children.”244 While most functioned normally in their day-to-day lives, these fears gnawed at them, and some turned to denial. Schwebel’s conclusion was that the threat of nuclear disaster should be a focus of “therapeutic collective action”—by helping to build a more peaceful world, students would feel more secure.245
What was the reaction to these remarkable studies? The silence was deafening. For more than a decade, no other researchers touched the topic. Was this because the young people were voicing truths and fears that government and society wanted to avoid? Or because the psychologists themselves couldn’t deal emotionally with the subject, since they too were afraid to plumb the depths of nuclear fear, worried as they were about their own families’ safety?246 It wasn’t until the late 1970s that attention returned to the issue, when the American Psychiatric Association Task Force on the Psychosocial Impacts of Nuclear Developments surveyed over a thousand high school students. They found “a profound dis-ease and uncertainty about the future and a considerable amount of general pessimism” in student attitudes about nuclear war, civil defense, and survival.247
The election of President Ronald Reagan in 1980 brought the Cold War to another boiling point, as US nuclear policy shifted from the status quo of MAD, mutual assured destruction, to the demented Strategic Defense Initiative, popularly known as “Star Wars.” In 1983 Reagan announced the government’s plan to build a ground- and space-based missile defense system that would protect the country from nuclear attack. In reality, Star Wars was more about a first strike offense than a last ditch defense. If we could attack the Soviets first and protect ourselves from a counter-attack, we could win a nuclear war. And if a few Soviet bombs managed to penetrate our shield, we could tough it out. As T.K. Jones, Deputy Under Secretary of Defense for Strategic and Nuclear Forces, advised: “Dig a hole, cover it with a couple of doors and then throw three feet of dirt on top . . . It’s the dirt that does it . . . [I]f there are enough shovels to go around, everybody’s going to make it.”248
Reagan's belligerence galvanized anti-nuclear resistance. The same year that he announced the Star Wars initiative, the widely viewed TV film The Day After graphically represented what life would be like for a family in Lawrence, Kansas, in the aftermath of a nuclear attack. Fears of a nuclear winter enveloping the globe sent shivers down the spine. In the US and Europe, anti-nuclear movements gathered supporters and gained strength.
This sparked a new wave of research on children by psychologists and educators. In addition to undertaking student surveys, groups such as Educators for Social Responsibility and the Union of Concerned Scientists created curricula and sponsored dialogues in schools where children could voice their concerns about nuclear war.249 Anti-nuclear curricula in the schools drew the ire of conservative hawks. The 1983 Congressional hearing on Children’s Fears of War, conducted by the Select Committee on Children, Youth, and Families, provides a window on how this aspect of child psychology became politicized. A Republican representative from Virginia denounced anti-nuclear curricula as a form of “political indoctrination.” A Kansas psychiatrist testified that it could lead to the “devitalization” of America and induce in students despair, hopelessness, and unwillingness to support the military. He offered a different explanation for rising symptoms of psychological distress among children:
As you know, family life in our society is deteriorating at a terrifying rate. The divorce epidemic is the major factor for this deterioration, but the mass exodus of women from the home, often due to economic pressure but also and probably largely to the seductive but false drumbeat of the women’s lib movement are major determinants . . . The developing child pays the highest penalty for the breakup of the home, the part-time or pathological home.250
Prominent anti-nuclear psychiatrists, including Robert Jay Lifton, together with three eloquent students from Iowa, New York, and California, offered opposing views.
I was in my early thirties at the time of that hearing, living in England, soon to become pregnant with my first child. At the time, the sense of impending apocalypse was very real. I remember sitting around with my husband and friends and worrying about the risk of nuclear war in Europe. On the wall of our apartment we had a mock movie poster of Gone with the Wind, picturing Ronald Reagan holding Margaret Thatcher in his arms with a mushroom cloud in the background. “The Film to End all Films,” the caption read, “The Most EXPLOSIVE Love Story Ever.” And below the picture: “She promised to follow him to the end of the earth. He promised to organize it.”
Such gallows humor helped—just as Stanley Kubrick's brilliant film Dr. Strangelove had helped in the 1960s—but I found real therapy, as Dr. Schwebel had recommended two decades before, in collective action. I marched with thousands of protestors against cruise missiles in Amsterdam in 1981, alongside a contingent of Dutch soldiers in uniform who were cheered when they joined the rally. In 1983 I marched with a million people in London in the Campaign for Nuclear Disarmament's protest against the deployment of American cruise missiles in England and Pershing II missiles in West Germany. Almost every day brought news of inspiring actions at peace encampments, such as the famous women's camp near Greenham Common in England where tens of thousands of women joined hands around the Royal Air Force base's perimeter fence. Back in the US, the nuclear freeze movement attracted many new supporters, including Democratic Party leaders. The sense that we were finally waking up from the nuclear nightmare was empowering. And the historical record suggests that the anti-nuclear movement really was powerful—it helped to push the Americans and Soviets toward disarmament.251
With the end of the Cold War, the potential for nuclear Armageddon receded from public consciousness. There has been little research about its lingering psychological effects in the baby boom generation. Is the bomb at the root of some of my generation’s pervasive anxieties? Does it continue to influence our relationships with people, nature, and death? Does it make us more susceptible to apocalyptic fears? To speak in generational terms is surely to over-generalize, but the times you grow up in matter and never stop mattering. They churn the water you swim in, steer the direction of the currents, make it easier or harder to come up for breath.
Of all the psychologists and psychiatrists who have written on the human dimensions of the nuclear bomb, Robert Jay Lifton’s work remains the most relevant today. In the early 1960s, Lifton was the first American to study the psychological legacy of the bomb in Japanese survivors. He went on from there to study its effects at home, becoming a powerful professional and political voice against nuclear weapons.
Lifton uncovered a number of influences. The fact that the bomb was shrouded in secrecy gave it the special power of forbidden knowledge, especially for children. Adults developed psychic numbing as a defensive measure against the ever-present threat of mass annihilation. Indeed, the continued existence of the nuclear weapons complex depended on a high degree of collective numbing. When fear periodically broke through, it often led to feelings of resignation, cynicism, and a bleak view of the human species. “Well, what is so special about man?” is how Lifton describes the syndrome. “Other species have come and gone, so perhaps this is our turn to become extinct.”252 One hears this same refrain today among those despondent about climate change. For many, the bomb represented the final triumph of the machine over humanity, Frankenstein on nuclear steroids.
But numbing and cynicism weren’t universal reactions. Some people were keenly aware that they were leading a double life: going about their day-to-day business when at any moment they and their loved ones, and maybe the whole planet, could be obliterated. In the best of circumstances, the radical absurdity of this contrast inspired political art and organized action against the bomb. “It is when we lose our sense of nuclear absurdity that we surrender to the forces of annihilation and cease to imagine the real,” Lifton wrote.253
The bomb profoundly shaped human relationships with nature too. Images of mushroom clouds and nuclear devastation intensified painful feelings of separation from the ideal of a healing and eternal nature. As explored before, the desire to overcome that separation helps to explain the lure of the 1960s back-to-the-land movement. With a doctrinaire emphasis on self-sufficiency, many back-to-the-landers embraced survivalist strategies, acting “as if the bomb had already been dropped.”254 Writer Janferie Stone vividly captures the mood:
The communal movement must be posed against our sense of the world as a terminally dangerous place. Our dreams were reft by images of nuclear holocaust; we were the generation who had practiced hiding under our desks in the Cuban Missile Crisis. We had bomb shelter visions of a world that, if poisoned, might begin anew. Humanity, nuclearly cleansed, tutored by destruction, might do better in such a future . . . We thought that in a community of scale we could pick up the pieces, we could create if not a new society then an On the Beach fulfillment of each day that we had yet to live.255
The search for purity took other forms as well. No doubt there were many reasons for getting high on drugs, or meditation, or both, but one was a yearning for transcendent, peak experiences as a counterweight to the prospect of atomic extinction. “When the structure of existence is threatened, people seek to do more with or to their bodies, to extend the experience of their total organisms,” observed Lifton.256 For writer Norman Mailer, the radical experientialism of the nuclear age was gloriously embodied in the macho figure of the “hipster” or “the American existentialist” who confronts death, divorces himself from society, and sets out on “that uncharted journey into the rebellious imperatives of the self.”257
The prospect of the bomb dropping also functioned as a metaphor of escape from the boring routines and existential malaise of everyday life. “The malaise has settled like a fallout,” wrote Walker Percy in his novel The Moviegoer, “and what people really fear is not that the bomb will fall but that the bomb will not fall.”258
Among those who chose to toe the government line and embrace nuclear weapons, the result could be a retreat into fundamentalism. Some made sense of the world through rigid categories of good and evil, a nostalgia for the past, and religious convictions about the future. For others of a more secular persuasion, the bomb created a sense of "radical futurelessness," sowing doubts about the authenticity and endurance of individual achievement.259
But of all the psychological effects of the bomb, Lifton saw our changed relationship with death as the “most fundamental psychic deformation.”260 The bomb turned what would otherwise be our normal fears of death—or rather our fears of normal death—into grotesque images of nuclear annihilation, creating profound anxiety about the end of not only our own lives, but of life itself. One of the major psychological challenges of our times, then, was to reclaim “plain old death” and separate it from the insanity of nuclear holocaust.261
Many of Lifton’s insights ring true in my own experience: growing up with the sense of a double life and the absurdity of it all, seeking a purer relationship with nature, doubting the value of my own achievements, fearing death as a violent cataclysm. In a country of abundance, I shared the pervasive but irrational fear of scarcity, maybe rooted in the anxiety that food, water, and fresh air would run out in the bomb shelter and nothing would grow in the barren world outside. In the end, the mannequin mommy, no matter how pretty and resourceful, could only offer me an empty tray. Lifton didn’t write about this particular fear, but I believe it haunts my generation much as the Great Depression haunted our parents.
If I were lying on a therapist's couch, she or he would tell me that I can't blame all of my problems on the bomb. Likewise, historians would remind us that we can't isolate the bomb's impact from the wider drive for American military supremacy after World War II. True enough. But even so, the bomb itself casts a long shadow, accounting in no small measure for the endurance of apocalyptic thought in other realms.
The bomb still threatens world peace, too, even though the prospect of an all-out nuclear war thankfully has diminished. Political instability in Pakistan casts doubt on the safety of its nuclear facilities, North Korea remains a nuclear wildcard, and the open secret that Israel has the bomb ups the ante in the Middle East. Poorly protected sites in the former Soviet Union raise the specter of terrorists acquiring nuclear materials.
Between them, the US and Russia now possess almost 95 percent of the world’s nuclear warheads. The US has nearly 5000. Despite hopeful pronouncements early in Obama’s tenure, the important diplomatic effort to prevent Iran from developing the bomb, and his historic visit to Hiroshima in 2016 in which he called for a world without nuclear weapons, Barack Obama did less to cut the nuclear arsenal than his predecessors Bill Clinton and both Bushes. Over the next 10 years, the US government plans to spend at least $570 billion—the price tag could rise to a trillion—for new nuclear weaponry and related programs, including 12 new nuclear-armed submarines that can carry up to 1000 hydrogen bombs between them. The government’s latest Nuclear Posture Review still holds open the possibility of using nuclear weapons in response to nuclear, biological, or chemical weapons deployment by countries that are judged to be in non-compliance with nonproliferation obligations.262 Donald Trump’s cavalier statements about nuclear weapons are frightening, to say the least. Also alarming is a Pentagon advisory board’s recent suggestion that the US increase its stock of lower-yield nuclear weapons for possible “limited use” in regional conflicts.263 Disarmament, in other words, has a long way to go.
Meanwhile the specter of the bomb continues to work its political magic. “We don’t want the smoking gun to be a mushroom cloud,” National Security Advisor Condoleezza Rice memorably warned in 2002 as the Bush administration ramped up fears of Iraq’s supposed weapons of mass destruction. This ostensible threat helped soften up the American public for the invasion of Iraq the following year.264
In summer 2014, national headlines about Artesia, New Mexico caught my eye. Six hundred mothers and their children seeking asylum from drug war-related violence in Central America were imprisoned in a federal immigration detention facility only a few miles away from the site of the Abo underground elementary school.265 “We will send you back,” Obama’s Homeland Security Secretary Jeh Johnson warned the incarcerated women and children when he visited the facility.266 A lawsuit before the federal District Court in Washington charged officials in Artesia with “egregious” violations of due process, including lack of access to lawyers and rushed video-teleconference deportation hearings.267 The detention facility has now been shut down and the mothers and children coming over the border have been imprisoned elsewhere.
Perhaps it’s not surprising that Homeland Security set up an immigrant detention center in the bleak nuclear landscape of southeastern New Mexico. During the Cold War, the government claimed to be protecting Artesia’s children from the enemy. In 2014 Homeland Security claimed the nation needed protection from the immigrant children they were rushing to deport.
Meanwhile, the real threat to security in the region lies 30 miles down the road at the WIPP nuclear waste dump. It, too, made national headlines in 2014, when a chemical reaction caused a radioactive leak that traveled up the mine shaft into the open environment. It turns out a container of nuclear waste from Los Alamos National Laboratory had been improperly packaged with an organic brand of kitty litter that can generate high heat, instead of the usual inorganic product.268 At first the accident's seriousness was downplayed, but it forced the closure of the WIPP—the site only reopened in January 2017—with the cost of clean-up perhaps topping $2 billion, making it one of the costliest nuclear accidents in US history. A new ventilation system had to be installed, and 35 percent of the underground area was contaminated. At nuclear facilities like Los Alamos and the Hanford Site in Washington state, waste destined for the WIPP has meanwhile gotten dangerously backed up. Originally, the Energy Department calculated that one such incident might occur at the WIPP every 200,000 years. This one happened only 15 years after the facility opened, and the cause was something as simple as the wrong kind of kitty litter.269 It was just lucky that only one barrel ruptured.
And so the deadly gift of the Manhattan Project keeps on giving. The probability of human error underscores the continuing mortal danger posed by the nuclear weapons complex. Nuclear near-misses—and there have been many—turn up the volume on apocalypse. Journalist Eric Schlosser uncovered a January 1961 incident in which two hydrogen bombs were accidentally dropped by a B-52 bomber over North Carolina when the plane broke up in mid-air. Had it not been for one final low-voltage switch, one of the bombs would have detonated, creating a blast 260 times stronger than the bomb that obliterated Hiroshima.270 More recently, the Air Force has been rocked by revelations of cheating on proficiency tests as well as safety and code protection violations among members of nuclear missile launch crews.271 The scandals hardly inspire confidence in the system designed to secure our weapons of mass destruction and prevent their inadvertent deployment.
One of my last stops in New Mexico was the office of Tewa Women United, an indigenous women’s organization in the town of Española, not far from Los Alamos. There I spoke with Beata Tsosie-Peña, the coordinator of the group’s environmental justice program. Tsosie-Peña knows firsthand about the ongoing health threats to tribal communities posed by waste from Los Alamos, including elevated cancer rates. She participated in a project of the US Centers for Disease Control and Prevention (CDC) that documented for the first time the exposure of area residents to high levels of radiation. The preface to the CDC’s final report features a poem by Tsosie-Peña. She calls for a time of healing for “the good of all future generations”:
Let us share the stories that have never been told
And release the pain not even a century old
No longer shamed by accusations of ignorance
Let our diverse voices be our deliverance . . .272
Remembering the past, and believing in the future, can be a powerful antidote to apocalyptic despair. Beata now campaigns for transforming Los Alamos from a nuclear lab to a site of environmental restoration and research into renewable energy.
The bomb was never a given. Its development was, and is, a conscious choice by national security officials. Stripping the bomb of its magic and inevitability positions it differently. We can rid the world of the scourge of nuclear weapons if we choose to do so. As the history of the Cold War shows, sanity backed by the force of collective action can prevail. That may not be enough to rid the mind of apocalyptic fears, but it’s a helpful start toward deliverance.