Thirteen


A WORLD OF KNOWLEDGE

image
In an era of American abundance, TV sets in a store window broadcast Eisenhower’s announcement of his decision to run for reelection in 1956.

THE END OF TIME BEGAN AT EIGHT FIFTEEN ON THE morning of August 6, 1945. “Miss Toshiko Sasaki, a clerk in the personnel department of the East Asia Tin Works, had just sat down at her place in the plant office and was turning her head to speak to the girl at the next desk,” the writer John Hersey reported in The New Yorker. “Just as she turned her head away from the windows, the room was filled with a blinding light. She was paralyzed by fear, fixed still in her chair for a long moment.”

Everything fell, and Miss Sasaki lost consciousness. The ceiling dropped suddenly and the wooden floor above collapsed in splinters and the people up there came down and the roof above them gave way; but principally and first of all, the bookcases right behind her swooped forward and the contents threw her down, with her left leg horribly twisted and breaking underneath her. There, in the tin factory, in the first moment of the atomic age, a human being was crushed by books.1

In the first moment of the atomic age, a human being was crushed by books: the violence of knowledge.

Hiroshima marked the beginning of a new and differently unstable political era, in which technological change wildly outpaced the human capacity for moral reckoning. It wasn’t only the bomb, and the devastation it wreaked. It was the computers whose development had made dropping the bomb possible. And it was the force of technological change itself, a political power unchecked by an eighteenth-century constitution and unfathomed by a nineteenth-century faith in progress.

Truman got word of the bombing on board a cruiser. The White House told the press the next day. The story went out over the radio at noon. Listeners reeled. John Haynes Holmes, a Unitarian minister and avowed pacifist, was on vacation at a cottage in Kennebunk, Maine. “Everything else seemed suddenly to become insignificant,” he said, about how he felt when he heard the news. “I seemed to grow cold, as though I had been transported to the waste spaces of the moon.” Days later, when the Japanese were forced to surrender, Americans celebrated. In St. Louis, people drove around the city with tin cans tied to the bumpers of their cars; in San Francisco, they tugged trolley cars off their tracks. More than four hundred thousand Americans had died in a war that, worldwide, had taken the lives of some sixty million people.2

And yet, however elated at the peace, Americans worried about how the war had ended. “There was a special horror in the split second that returned so many thousand humans to the primeval dust from which they sprang,” one Newsweek editorial read. “For a race which still did not entirely understand steam and electricity it was natural to say: ‘who next?’” Doubts gathered, and grew. “Seldom if ever has a war ended leaving the victors with such a sense of uncertainty and fear,” CBS’s Edward R. Murrow said. “We know what the bombs did to Hiroshima and Nagasaki,” wrote the editors of Fortune. “But what did they do to the U.S. mind?”3

Part of the uncertainty was a consequence of the surprise. Americans hadn’t known about the bomb before it fell. The Manhattan Project was classified. Even Truman hadn’t known about it until after FDR’s death. Nor had Americans known about the computers the military had been building, research that had also been classified, but which was dramatically revealed the winter after the war. “One of the war’s top secrets, an amazing machine which applies electronic speeds for the first time to mathematical tasks hitherto too difficult and cumbersome for solution, was announced here tonight by the War Department,” the New York Times reported from Philadelphia on February 15, 1946, in a front-page story introducing ENIAC, the Electronic Numerical Integrator and Computer, the first general-purpose electronic digital computer. Inside, the Times ran a full-page spread, including a photograph of the computer, the size of a room.4 It was as if the curtain had been lifted, a magician’s veil.

Like the atomic bomb, ENIAC was produced by the American military to advance the cause of war and relied on breakthroughs made by scientists in other parts of the world. In 1936, the English mathematician Alan Turing, who would go on to complete a PhD at Princeton, wrote a paper called “On Computable Numbers,” in which he predicted the possibility of inventing “a single machine that can be used to compute any computable sequence.”5 The next year, Howard Aiken, a doctoral student at Harvard, poking around in the attic of a Harvard science building, found a model of Charles Babbage’s early nineteenth-century Difference Engine; Aiken then proposed, to IBM, to build a new and better version, not purely mechanical but electromechanical. That project began at IBM in 1941 and three years later moved to Harvard, where Aiken, now a naval officer, was in charge of the machine, known as the Mark I; Columbia astronomer L. J. Comrie called it “Babbage’s dream come true.” The Mark I was programmed by a longtime Vassar professor, the brilliant mathematician Grace Murray Hopper. “Amazing Grace,” her colleagues nicknamed her, and she understood, maybe better than anyone, how far-reaching were the implications of a programmable computer. As she would explain, “It is the current aim to replace, as far as possible, the human brain.”6

During the war, the Allied military had been interested in computers for two primary reasons: to break codes and to calculate weapons trajectories. At Bletchley Park, a six-hundred-acre manorial estate fifty miles northwest of London that became a secret military facility, Turing, who would later be prosecuted for homosexuality and die of cyanide poisoning, had by 1940 built a single-purpose computer able to break the codes devised by Germany’s Enigma machine. At the University of Pennsylvania, physicist John Mauchly and electrical engineer Presper Eckert had been charged with calculating firing-angle settings for artillery, work that required iterative and time-consuming calculations. To do that work, American scientists had been using an analog computer called a differential analyzer, invented at MIT in 1931 by FDR research czar Vannevar Bush, an electrical engineer. Numbers were entered into the differential analyzer by people who were known as “computers,” and who were usually women with mathematics degrees, not unlike the “checkers,” women with literature degrees, who worked at magazines. But even when these women entered numbers around the clock, it took a month to generate a single artillery-trajectory table. In August 1942, Mauchly proposed using vacuum tubes to build a digital electronic computer that would be much faster. The U.S. War Department decided on April 9, 1943, to fund it. Construction of ENIAC began in June 1943, but it wasn’t fully operational until July 1945. ENIAC could make calculations a hundred times faster than any earlier machine. Its first assignment, in the fall of 1945, came from Los Alamos: using nearly a million punch cards, each prepared and entered into the machine by a team of female programmers, ENIAC calculated the forces at work in a fusion reaction, for the purpose of devising a hydrogen bomb.7

image
Vassar mathematician Grace Murray Hopper programmed Mark I.

The machines built to plot the trajectories and force of missiles and bombs would come to transform economic systems, social structures, and the workings of politics. Computers are often left out of the study of history and government, but, starting at the end of the Second World War, history and government cannot be understood without them. Democracies rely on an informed electorate; computers, the product of long and deep study and experiment, would both explode and unsettle the very nature of knowledge.

The boundlessness of scientific inquiry also challenged the boundaries of the nation-state. After the war, scientists were among the loudest constituencies calling for international cooperation and, in particular, for a means by which atomic war could be averted. Instead, their work was conscripted into the Cold War.

The decision to lift the veil of secrecy and display ENIAC to the public came at a moment when the nation was engaged in a heated debate about the role of the federal government in supporting scientific research. During the war, at the urging of Vannevar Bush, FDR had created both the National Defense Research Committee and the Office of Scientific Research and Development. (Bush headed both.) Near the end of the war, Roosevelt had asked Bush to prepare a report that, in July 1945, Bush submitted to Truman. It was called “Science, the Endless Frontier.”8

“A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade,” Bush warned. “Advances in science when put to practical use mean more jobs, higher wages, shorter hours, more abundant crops, more leisure for recreation, for study, for learning how to live without the deadening drudgery which has been the burden of the common man for ages past.”9

At Bush’s urging, Congress debated a bill to establish a new federal agency, the National Science Foundation. Critics said the bill tied university research to the military and to business interests and asked whether scientists had not been chastened by the bomb. Scientific advances did indeed relieve people of drudgery and produce wealth and leisure, but the history of the last century had shown nothing if not that these benefits were spread so unevenly as to cause widespread political unrest and even revolution; the project of Progressive and New Deal reformers had been to protect the interests of those left behind by providing government supports and regulations. Could this practice be applied to the federal government’s relationship to science? Democratic senator Harley M. Kilgore, a former schoolteacher from West Virginia, introduced a rival bill that extended the antimonopoly principles of the New Deal to science, tied university research to government planning, and included in the new foundation a division of social science, to provide funding for research designed to solve social and economic problems, on the grounds that one kind of knowledge had gotten ahead of another: human beings had learned how to destroy the entire planet but had not learned how to live together in peace. During Senate hearings, former vice president Henry Wallace said, “It is only by pursuing the field of the social sciences comprehensively” that the world could avoid “bigger and worse wars.”10

Many scientists, including those who belonged to the newly formed Federation of Atomic Scientists, agreed, and two rivulets of protest became a stream: a revision of Kilgore’s bill was attached to a bill calling for civilian control of atomic power. Atomic scientists launched a campaign to enlist the support of the public. “To the village square we must carry the facts of atomic energy,” Albert Einstein said. “From there must come America’s voice.” Atomic scientists spoke at Kiwanis clubs, at churches and at synagogues, at schools and libraries. In Kansas alone, they held eight Atomic Age Conferences. And they published One World or None: A Report to the Public on the Full Meaning of the Atomic Bomb, essays by atomic scientists, including Leo Szilard and J. Robert Oppenheimer, and by political commentators, including Walter Lippmann. Albert Einstein, in his essay, argued for “denationalization.”11

Against this campaign stood advocates for federal government funding of the new field of computer science, who launched their own publicity campaign, beginning with the well-staged unveiling of ENIAC. It had been difficult to stir up interest. No demonstration of a general-purpose computer could have the drama of an atomic explosion, or even of Elektro the Moto-Man, the chain-smoking robot of the 1939 World’s Fair. ENIAC was inert. Its vacuum tubes, lit by dim neon bulbs, were barely visible. When the machine was working, there was no real way to see much of anything happening. Mauchly and Eckert prepared press releases and, in advance of a scheduled press conference, tricked up the machine for dramatic effect. Eckert cut Ping-Pong balls in half, wrote numbers on them, and placed them over the tips of the bulbs, so that when the machine was working, the room flashed as the lights flickered and blinked. It blinked fast. The Times gushed, “The ‘Eniac,’ as the new electronic speed marvel is known, virtually eliminates time.”12

The unintended consequences of the elimination of time would be felt for generations. But the great acceleration—the speeding up of every exchange—had begun. And so had the great atomization—the turning of citizens into pieces of data, fed into machines, tabulated, processed, and targeted, as the nation-state began to yield to the data-state.

I.

THE END OF THE WAR marked the dawn of an age of affluence, a wide and deep American prosperity. It led both to a new direction for liberalism—away from an argument for government regulation of business and toward an insistence on individual rights—and to a new form of conservatism, dedicated to the fight against communism and deploying to new ends the rhetoric of freedom.

The origins of postwar prosperity lay in the last legislative act of the New Deal. In June 1944, FDR had signed the Servicemen’s Readjustment Act, better known as the G.I. Bill of Rights. It created a veterans-only welfare state. The G.I. Bill extended to the sixteen million Americans who served in the war a series of benefits, including a free, four-year college education, zero-down-payment low-interest loans for homes and businesses, and a “readjustment benefit” of twenty dollars a week for up to fifty-two weeks, to allow returning veterans to find work. More than half of eligible veterans—some eight million Americans—took advantage of the G.I. Bill’s educational benefits. Those who did enjoyed average earnings of $10,000–$15,000 more than those who didn’t. They also paid more in taxes. By 1948, the cost of the G.I. Bill constituted 15 percent of the federal budget. But, with rising tax revenues, the G.I. Bill paid for itself almost ten times over. It created a new middle class, changed the face of American colleges and universities, and convinced many Americans that the prospects for economic growth, for each generation’s achieving a standard of living higher than the generation before, might be limitless.13

image
The G.I. Bill made it possible for a generation of Americans to attend college. In September 1947, three jubilant former servicemen leave a student union at Indiana University, waving their notices of admission.

That growth was achieved, in part, by consumer spending, as factories outfitted for wartime production were converted to manufacture consumer goods, from roller skates to color televisions. The idea of the citizen as a consumer, and of spending as an act of citizenship, dates to the 1920s. But in the 1950s, mass consumption became a matter of civic obligation. By buying “the dozens of things you never bought or even thought of before,” Brides magazine told its readers, “you are helping to build greater security for the industries of this country.”14

Critics suggested that the banality and conformity of consumer society had reduced Americans to robots. John Updike despaired: “I drive my car to supermarket, / The way I take is superhigh, / A superlot is where I park it, / And Super Suds are what I buy.”15 Nothing epitomized what critics called the “Packaged Society” so much as Disneyland, an amusement park that had opened in 1955 as a reimagined 1939 World’s Fair, more provincial and more commercial, with a Main Street and a Tomorrowland. In Frontierland, Walt Disney explained, visitors “can return to frontier America, from the Revolutionary Era to the final taming of the great southwest,” riding stagecoaches and Conestoga wagons over dusty trails and boarding the steamship Mark Twain within sight of the park’s trademark turquoise-towered castle, a fairyland that sold itself as “The Happiest Place on Earth.”16

Most of the buying was done by women: housewives and mothers. The home, which had become separated from work during the process of industrialization, became a new kind of political space, in which women met the obligations of citizenship by spending money. Domesticity itself took on a different cast, as changes to the structure of the family that had begun in the Depression and continued during the war were reversed. Before the war, age at first marriage had been rising; after the war, it began falling. The number of children per family had been falling; it began rising. More married women and mothers of young children had been entering the paid labor force; they began leaving it. Having bigger families felt, to many Americans, like an urgent matter. “After the Holocaust, we felt obligated to have lots of babies,” one Jewish mother later explained. “But it was easy because everyone was doing it—non-Jews, too.” Expectations of equality between men and women within marriage diminished, as did expectations of political equality. Claims for equal rights for women had been strenuously pressed during the war, but afterwards, they were mostly abandoned. In 1940, the GOP had supported the Equal Rights Amendment (first introduced into Congress in 1923), and in 1944 the Democrats had supported it, too. The measure reached the Senate in 1946, where it won a majority but fell short of the two-thirds vote required to send an amendment to the states for ratification.17 It would not pass Congress until 1972, after which an army of housewives, the foot soldiers of the conservative movement, would block its ratification.

The G.I. Bill, for all that it did to build a new middle class, also reproduced and even exacerbated earlier forms of social and economic inequality. Most women who had served in the war were not eligible for benefits; the women’s auxiliary divisions of the military branches had been deliberately designated as civilian units, partly to avoid providing veterans’ benefits to women, on the assumption that they would be supported by men. After the war, when male veterans flocked to colleges and universities, many schools stopped admitting women, or reduced their number, in order to make more room for men. And, even among veterans, the bill’s benefits were applied unevenly. Some five thousand soldiers and four thousand sailors had been given a “blue discharge” during the war as suspected homosexuals; the VA’s interpretation of that discharge made them ineligible for any G.I. Bill benefits.18

African American veterans were excluded from veterans’ organizations; they faced hostility and violence; and, most significantly, they were barred from taking advantage of the G.I. Bill’s signal benefits, its education and housing provisions. In some states, the American Legion, the most powerful veterans’ association, refused to admit African Americans, and proved unwilling to recognize desegregated associations. Money to go to college was hard to use when most colleges and universities refused to admit African Americans and historically black colleges and universities had a limited number of seats. The University of Pennsylvania had nine thousand students in 1946; only forty-six were black. By 1946, some one hundred thousand black veterans had applied for educational benefits; only one in five had been able to register for college. More than one in four veterans took advantage of the G.I. Bill’s home loans, which meant that by 1956, 42 percent of World War II veterans owned their own homes (compared to only 34 percent of nonveterans). But the bill’s easy access to credit and capital was far less available to black veterans. Banks refused to give black veterans loans, and restrictive covenants and redlining meant that much new housing was whites-only.19

Even after the Supreme Court struck down restrictive housing covenants in 1948, the Federal Housing Administration followed a policy of segregation, routinely denying loans to both blacks and Jews. In cities like Chicago and St. Louis and Los Angeles and Detroit, racially restrictive covenants in housing created segregated ghettos where few had existed before the war. Whites got loans, had their housing offers accepted, and moved to the suburbs; blacks were crowded into bounded neighborhoods within the city. Thirteen million new homes were built in the United States during the 1950s; eleven million of them were built in the suburbs. Eighty-three percent of all population growth in the 1950s took place in the suburbs. For every two blacks who moved to the cities, three whites moved out. The postwar racial order created a segregated landscape: black cities, white suburbs.20

The New Deal’s unfinished business—its inattention to racial discrimination and racial violence—became the business of the postwar civil rights movement, as new forms of discrimination and the persistence of Jim Crow laws and even of lynching—in 1946 and 1947, black veterans were lynched in Georgia and Louisiana—contributed to a new depth of discontent. As a black corporal from Alabama put it, “I spent four years in the Army to free a bunch of Dutchmen and Frenchmen, and I’m hanged if I’m going to let the Alabama version of the Germans kick me around when I get home.” Langston Hughes, who wrote a regular column for the Chicago Defender, urged black Americans to try to break Jim Crow laws at lunch counters. “Folks, when you go South by train, be sure to eat in the diner,” Hughes wrote. “Even if you are not hungry, eat anyhow—to help establish that right.”21

But where Roosevelt had turned a blind eye, Truman did not. He had grown up in Independence, Missouri, just outside of Kansas City, and worked on the family farm until the First World War, when he saw combat in France. Back in Missouri, he began a slow ascension through the Democratic Party ranks, starting with a county office and rising to the U.S. Senate in 1934. Roosevelt had chosen him as his running mate in 1944 chiefly because he was unobjectionable; neither wing of the Democratic Party was troubled by Truman. He had played virtually no role in White House affairs during his vice presidency, and was little prepared to move into the Oval Office upon Roosevelt’s death. No president had faced a greater trial by fire than the one that fell to Truman: the decision whether or not to use the atomic bomb. Mild-mannered and myopic, Truman had a common touch. Unlike most American presidents, he had neither a college degree nor a law degree. For all his limitations as a president, he had an intuitive sense of the concerns of ordinary Americans. And, from the very beginning of his career, he’d courted black voters and worked closely with black politicians.

Unwilling to ignore Jim Crow, Truman established a commission on civil rights. To Secure These Rights, its 1947 report, demonstrated that a new national consensus had been reached, pointing to a conviction that the federal government not only prevents the abuse of rights but also secures them. “From the earliest moment of our history we have believed that every human being has an essential dignity and integrity which must be respected and safeguarded,” read the report. “The United States can no longer countenance these burdens on its common conscience.”22

Consistent with that commitment, Truman made national health insurance his first domestic policy priority. In September 1945, he asked Congress to act on FDR’s Second Bill of Rights by passing what came to be called a Fair Deal. Its centerpiece was a call for universal medical insurance. The time seemed, finally, right, and Truman enjoyed some important sources of bipartisan support, including from Earl Warren, the Republican governor of California. What Truman proposed was a national version of a plan Warren had proposed in California: compulsory insurance funded with a payroll tax. “The health of American children, like their education, should be recognized as a definite public responsibility,” the president said.23

Warren, the son of a Norwegian immigrant railroad worker, a striker who was later murdered, had grown up knowing hardship. After studying political science and the law at Berkeley and serving during the First World War, he’d become California’s attorney general in 1939. In that position, he’d been a strong supporter of the Japanese American internment policy. “If the Japs are released,” Warren had warned, “no one will be able to tell a saboteur from any other Jap.” (Warren later publicly expressed pained remorse about this policy and, in a 1972 interview, wept over it.) On the strength of his record as attorney general, Warren had run for governor in 1942. Clem Whitaker and Leone Baxter had managed his campaign, which had been notoriously heated. “War-time voters live at an emotional pitch that is anything but normal,” Whitaker had written in his Plan of Campaign. “This must be a campaign that makes people hear the beat of drums and the thunder of bombs—a campaign that stirs and captures the imagination; a campaign that no one who loves California can disregard. This must be A CALL TO ARMS IN DEFENSE OF CALIFORNIA!”24

Warren won, but he didn’t like how he’d won. Just before the election, he fired Whitaker and Baxter. They never forgave him.

Late in 1944, Warren had fallen seriously ill with a kidney infection. His treatment required heroic and costly medical intervention. He began to consider the catastrophic effects a sudden illness could have on a family of limited means. “I came to the conclusion that the only way to remedy this situation was to spread the cost through insurance,” he later wrote. He asked his staff to develop a proposal. After conferring with the California Medical Association, he anticipated no objections from doctors. And so, in his January 1945 State of the State address, Warren announced his plan, a proposal modeled on the social security system: a 1½ percent withholding of wages would contribute to a statewide compulsory insurance program.25 And then the California Medical Association hired Campaigns, Inc.

image
Leone Baxter and Clem Whitaker, who founded Campaigns, Inc., in California in 1933, attained national prominence at the end of the 1940s through their successful defeat of Truman’s health insurance plan.

Earl Warren began his political career as a conservative and ended it as a liberal. Years later, Leone Baxter was asked by a historian what she made of Warren’s seeming transformation. Warren’s own explanation, the historian told Baxter, was this: “I grew up a poor boy myself and I saw the trials and tribulations of getting old without having any income and being sick and not being able to work.” Baxter shot back, “He didn’t see them until that Sunday in 1945.” Then she ended the interview.26

What really changed Earl Warren was Campaigns, Inc. Whitaker and Baxter took a piece of legislation that enjoyed wide popular support and torpedoed it. Fifty newspapers initially supported Warren’s plan; Whitaker and Baxter whittled that down to twenty. “You can’t beat something with nothing,” Whitaker liked to say, so they launched a drive for private health insurance. Their “Voluntary Health Insurance Week,” driven by 40,000 inches of advertising in more than four hundred newspapers, was observed in fifty-three of the state’s fifty-eight counties. Whitaker and Baxter sent more than nine thousand doctors out with prepared speeches. They coined a slogan: “Political medicine is bad medicine.”27 They printed postcards for voters to stick in the mail:

Dear Senator:

Please vote against all Compulsory Health Insurance Bills pending before the Legislature. We have enough regimentation in this country now. Certainly we don’t want to be forced to go to “A State doctor,” or to pay for such a doctor whether we use him or not. That system was born in Germany—and is part and parcel of what our boys are fighting overseas. Let’s not adopt it here.28

When Warren’s bill failed to pass by just one vote, he blamed Whitaker and Baxter. “They stormed the Legislature with their invective,” he complained, “and my bill was not even accorded a decent burial.”29 It was the greatest legislative victory at the hands of admen the country had ever seen. It would not be the last.

II.

RICHARD MILHOUS NIXON counted his resentments the way other men count their conquests. Born in the sage-and-cactus town of Yorba Linda, California, in 1913, he’d been a nervous kid, a whip-smart striver. His family moved to Whittier, where his father ran a grocery store out of an abandoned church. Nixon went to Whittier College, working to pay his way, resenting that he didn’t have the money to go somewhere else. He had wavy black hair; small, dark eyes; and heavy, brooding eyebrows. An ace debater, he’d gone after college to Duke Law School, resented all the Wall Street law firms that refused to hire him when he finished, and returned to Whittier. He went away again, to serve in the navy in the South Pacific. And when he got back, serious and strenuously intelligent Lieutenant Commander Nixon, thirty-two, was recruited by a group of California bankers and oilmen to try to defeat five-term Democratic incumbent Jerry Voorhis for a seat in the House. The man from Whittier wanted to go to Washington.

Voorhis, a product of Hotchkiss and Yale and a veteran of Upton Sinclair’s EPIC campaign, was a New Dealer who’d first been elected to Congress in 1936, but, ten years later, the New Deal was old news. The midterm elections during Truman’s first term—and the fate of his legislative agenda—were tied to heightening tensions between the United States and the Soviet Union. Nixon in California was only one in a small battalion of younger men, mainly ex-servicemen, who ran for office in 1946, the nation’s first Cold Warriors. In Massachusetts, another veteran of the war in the Pacific, twenty-nine-year-old John F. Kennedy, ran for a House seat from the Eleventh District. But, unlike Nixon, he’d been readied for that seat from the cradle.

Kennedy, born to wealth and groomed at Choate and Harvard, represented everything Nixon detested: all that Nixon had fought for, by tooth and claw, had been handed to Kennedy, on a platter decorated with a doily. But both Nixon and Kennedy were powerfully shaped by the rising conflict with the Soviet Union, and both understood domestic affairs through the lens of foreign policy. After Stalin broke the promise he’d made at Yalta to allow Poland “free and unfettered elections,” it had become clear that he was ruthless, even if the West had, as yet, little knowledge of the purges with which he was overseeing the murder of millions of people. Inside the Truman administration, a conviction grew that the Soviet regime was ideologically and militarily relentless. In February 1946, George Kennan, an American diplomat in Moscow, sent the State Department an 8,000-word telegram in which he reported that the Soviets were resolute in their determination to battle the West in an epic confrontation between capitalism and communism. “We have here a political force committed fanatically to the belief that with US there can be no permanent modus vivendi, that it is desirable and necessary that the internal harmony of our society be disrupted, our traditional way of life be destroyed, the international authority of our state be broken, if Soviet power is to be secure,” Kennan wrote. “This political force has complete power of disposition over energies of one of world’s greatest peoples and resources of world’s richest national territory, and is borne along by deep and powerful currents of Russian nationalism.” Two weeks later, Winston Churchill, speaking in Truman’s home state of Missouri, warned of an “iron curtain” falling across Europe.30

The postwar peace had been fleeting. As keenly as Roosevelt and Churchill had wanted to avoid repeating the mistakes of the peace made at the end of the First World War, political instability had inevitably trailed behind the devastation of the Second World War. The Soviet Union’s losses had been staggering: some twenty-seven million Soviet citizens died, more than sixty times the number of Americans killed in the war. Much of Europe and Asia had been ravaged. From ashes and ruins and graveyards, new regimes gathered. In Latin America, Africa, and South Asia, nations and peoples that had been colonized by European powers began to fight to secure their independence. They meant to choose their own political and economic arrangements. But, in a newly bipolar world, that generally meant choosing between democracy and authoritarianism, between capitalism and communism, between the influence of the United States and the influence of the USSR.31

“At the present moment in world history nearly every nation must choose between alternative ways of life,” Truman said. He conceived of a choice between freedom and oppression. Much about this conception derived from the history of the United States, a refiguring of the struggle between “freedom” and “slavery” that had divided nineteenth-century America into “free states” and “slave states” and during which opponents of slavery had sought to “contain” it by refusing to admit “slave states” into the Union. In the late 1940s, Americans began applying this rhetoric internationally, pursuing a policy of containing communism while defending the “free world.”32

The same rhetoric, of course, infused domestic politics. Republicans characterized the 1946 midterm elections as involving a stark choice: “Americanism vs. Communism.” In California, scrappy Richard Nixon defeated the diffident Voorhis by debating him on stage a half-dozen times, but especially by painting him as weak on communism and slaughtering him with innuendo and smear. Nixon adopted, in his first campaign, his signature tactic: making false claims and then taking umbrage when his opponent impugned his integrity. Voorhis was blindsided. “Every time that I would say that something wasn’t true,” he recalled, “the response was always ‘Voorhis is using unfair tactics by accusing Dick Nixon of lying.’” But Nixon, the lunch-bucket candidate, also exploited voters’ unease with a distant government run by Ivy League–educated bureaucrats; he found it took only the merest of gestures to convince voters that there was something un-American about people like Voorhis, people unlike them. His campaign motto: “Richard Nixon is one of us.”33

In November 1946, the GOP won both the House and Senate for the first time since 1928. The few Democrats who were elected, like Kennedy in Massachusetts, had sounded the same themes as Nixon: the United States was soft on communism. As freshmen congressmen, Kennedy and Nixon struck up an unlikely friendship while serving together on the House Education and Labor Committee. Nixon and his fellow Republicans supported the proposed Taft-Hartley Act, regulating the unions and prohibiting certain kinds of strikes and boycotts—an attempt to rein in the power of unions, whose membership had surged before the war, from three million in 1933 to more than ten million in 1941. After Pearl Harbor, the AFL and the CIO had promised to abstain from striking for the duration of the conflict and agreed to wage limits. As soon as the war ended, though, the strikes began. Some five million workers walked out in 1946 alone. Truman opposed Taft-Hartley, and, when Congress passed it, Truman vetoed it. Republicans in Congress began lining up votes for an override. Nixon and Kennedy went to a steel town in western Pennsylvania to debate the question before an audience of union leaders and businessmen. Each man admired the other’s style. On the train back to Washington, they shared a sleeping car. Kennedy’s halfhearted objections would, in any case, hold no sway against Republicans who succeeded in depicting unionism as creeping communism. Congress overrode the president’s veto.34

On foreign policy, Truman began to move to the right. Disavowing the legacy of American isolationism, he pledged that the nation would aid any besieged democracy. The immediate cause of this commitment was Britain’s decision to stop providing aid to Greece and Turkey, which were struggling against communism. In March of 1947, the president announced what came to be called the Truman Doctrine: the United States would “support free peoples who are resisting subjugation by armed minorities or by outside pressures.” (Truman aides later said that the president himself was unpersuaded by the growing fear of communism but was instead concerned about his chances for reelection. “The President didn’t attach fundamental importance to the so-called Communist scare,” one said. “He thought it was a lot of baloney.”) He also urged passage of the Marshall Plan, which provided billions of dollars in aid for rebuilding Western Europe. The Truman Doctrine and the Marshall Plan, the president liked to say, were “two halves of the same walnut.” Abroad, the United States would provide aid; at home, it would root out suspected communists. Coining a phrase, the financier and presidential adviser Bernard Baruch in April 1947 said in a speech in South Carolina, “We are today in the midst of a cold war.”35

Instead of a welfare state, the United States built a national security state. A peace dividend expected after the Allied victory in 1945 never came; instead came the fight to contain communism, unprecedented military spending, and a new military bureaucracy. During Senate hearings on the future of the national defense, military contractors including Lockheed, which had been an object of congressional investigation in the merchants-of-death era of the 1930s and had built tens of thousands of aircraft during the Second World War, argued that the nation required “adequate, continuous, and permanent” funding for military production, pressing not only for military expansion but also for federal government subsidies.36

In 1940, when Roosevelt pledged to make the United States an “arsenal of democracy,” he meant wartime production. A central political question of postwar American politics would become whether the arsenal was, in fact, compatible with democracy.

After the war, the United States committed itself to military supremacy in peacetime, not only through weapons manufacture and an expanded military but through new institutions. In 1946, the standing committees on military and naval affairs combined to become the Armed Services Committee. The 1947 National Security Act established the Central Intelligence Agency and the National Security Council; gave the Joint Chiefs of Staff a permanent, statutory footing; and turned the War Department, now housed for the first time in a building of its own, into part of a unified military establishment that, by 1949, had become the Department of Defense.

In this political climate, the “one world” vision of atomic scientists, along with the idea of civilian, international control of atomic power, faded fast. Henry Stimson urged the sharing of atomic secrets. “The chief lesson I have learned in a long life,” he said, “is that the only way you can make a man trustworthy is to trust him; and the surest way you can make a man untrustworthy is to distrust him and to show your distrust.” Truman disagreed. Atomic secrets were to be kept secret, and the apparatus of espionage was to be deployed to ferret out scientists who might dissent from that view.37

The Bulletin of the Atomic Scientists began publishing a Doomsday Clock, an assessment of the time left before the world would be annihilated in an atomic war. In 1947, its editors set the clock at seven minutes before midnight. Kennan, in a top secret memo to Truman, warned that to use an atomic or hydrogen bomb would be to turn back time. These weapons, Kennan argued, “reach backward beyond the frontiers of western civilization”; “they cannot really be reconciled with a political purpose directed to shaping, rather than destroying, the lives of the adversary”; “they fail to take into account the ultimate responsibility of men for one another.”38

No caution slowed the development of the weapons program, and Soviet aggression and espionage, along with events in China, aided the case for national security and undercut the argument of anyone who attempted to oppose the military buildup. With every step of communist advance, the United States sought out new alliances, strengthened its defenses, and increased military spending. In 1948, the Soviet-supported Communist Party in Czechoslovakia staged a coup, the Soviets blockaded Berlin, Truman sent in support by air, and Congress passed a peacetime draft. The next year, the United States signed the North Atlantic Treaty, joining with Western Europe in a military alliance to establish, in NATO, a unified front against the USSR and any further Soviet aggression. Months later, the USSR tested its first atomic bomb and Chinese communists won a civil war. In December 1949, Mao Zedong, the chairman of China’s Communist Party, visited Moscow to form an alliance with Stalin; in January, Klaus Fuchs, a German émigré scientist who had worked on the Manhattan Project, confessed that he was, in fact, a Soviet spy. Between 1949 and 1951, U.S. military spending tripled.39

The new spending restructured the American economy, nowhere more than in the South. By the middle of the 1950s, military spending made up close to three-quarters of the federal budget. A disproportionate amount of this spending went to southern states. The social welfare state hadn’t saved the South from its long economic decline, but the national security state did. Southern politicians courted federal government contracts for defense plants, research facilities, highways, and airports. The New South led the nation in aerospace and electronics. “Our economy is no longer agricultural,” the southern writer William Faulkner observed. “Our economy is the Federal Government.”40

Nixon staked his political future on becoming an instrument of the national security state. Keen to make a name for himself by ferreting out communist subversives, he gained a coveted spot on the House Un-American Activities Committee, where his early contributions included inviting the testimony of the actor Ronald Reagan, head of the Screen Actors Guild, a Californian two years Nixon’s senior. But Nixon’s real chance came when the committee sought the testimony of Time magazine senior editor and noted anticommunist Whittaker Chambers.

On August 3, 1948, Chambers, forty-seven, told the committee that, in the 1930s, he’d been a communist. Time, pressured to fire Chambers, refused, and published this statement: “TIME was fully aware of Chambers’ political background, believed in his conversion, and has never since had reason to doubt it.” But if Chambers’s past was no real surprise, his testimony nevertheless contained a bombshell: Chambers named as a fellow communist the distinguished veteran of the U.S. State Department, former general secretary of a United Nations organizing conference, and now president of the widely respected Carnegie Endowment for International Peace, forty-three-year-old Alger Hiss—news that, by the next morning, was splashed across the front of every newspaper in the country.

Hiss appeared before the committee on August 25 in a televised congressional hearing. He deftly denied the charges and seemed likely to be exonerated, especially after Chambers, who came across as unstable, vengeful, and possibly unhinged, admitted that he had been a Soviet spy (at that point, Time publisher Henry Luce accepted his resignation). Chambers having presented no evidence to support his charges against Hiss, the committee was inclined to let it pass—all but Nixon, who seemed to hold a particular animus for Hiss.41 Rumor had it that in a closed session, not seen on television, Nixon had asked Hiss to name his alma mater.

“Johns Hopkins and Harvard,” Hiss answered, and then added dryly, “And I believe your college is Whittier?”42

Nixon, who never forgave an Ivy League snub, began an exhaustive investigation, determined to catch his prey, the Sherlock Holmes to Hiss’s Professor Moriarty. Meanwhile, the press and the public forgot about Hiss and turned to the upcoming election, however unexciting it appeared. Hardly anyone expected Truman to win his first full term in 1948 against the Republican presidential nominee, Thomas Dewey, governor of New York. Few Americans were excited about either candidate, but Truman’s loss seemed all but inevitable. “We wish Mr. Dewey well without too much enthusiasm,” Reinhold Niebuhr said days before the election, “and look to Mr. Truman’s defeat without too much regret.”43

Truman had accomplished little of his domestic agenda, with one exception, which had the effect of alienating him from his own party: he had ordered the desegregation of the military. Aside from that, a Republican-controlled Congress had stymied nearly all of his legislative initiatives, including proposed labor reforms. Truman was so weak a candidate that two other Democrats ran against him on third-party tickets. Henry Wallace ran to Truman’s left, as the nominee of the Progressive Party. The New Republic ran an editorial with the headline TRUMAN SHOULD QUIT.44 At the Democratic convention in Philadelphia that summer, segregationists bolted: the entire Mississippi delegation and thirteen members of the Alabama delegation walked out, protesting Truman’s stand on civil rights. These southerners, known as Dixiecrats, formed the States’ Rights Democratic Party and ran a candidate to Truman’s right. They held a nominating convention in Birmingham during which Frank M. Dixon, a former governor of Alabama, said that Truman’s civil rights programs would “reduce us to the status of a mongrel, inferior race, mixed in blood, our Anglo-Saxon heritage a mockery.” The Dixiecrat platform rested on this statement: “We stand for the segregation of the races and the racial integrity of each race.” As its candidate, the States’ Rights Party nominated South Carolina governor Strom Thurmond.45

Waving aside the challenges from Wallace and Thurmond, Truman campaigned vigorously against Dewey, running on his chief campaign pledge: a national health insurance plan. Dewey, on the other hand, proved about as good a campaigner as a pail of paint. From Kentucky, the Louisville Courier-Journal complained, “No presidential candidate in the future will be so inept that four of his major speeches can be boiled down to these historic four sentences. Agriculture is important. Our rivers are full of fish. You cannot have freedom without liberty. Our future lies ahead.”46

Truman might have felt that the crowds were rallying to him, but every major polling organization predicted that Dewey would defeat him. Truman liked to mock leaders who paid attention to polls. “I wonder how far Moses would have gone if he’d taken a poll in Egypt,” he said. “What would Jesus Christ have preached if he’d taken a poll in Israel?”47 The week before Election Day, George Gallup issued a statement: “We have never claimed infallibility, but next Tuesday the whole world will be able to see down to the last percentage point how good we are.”48 Gallup predicted that Truman would lose. The Chicago Tribune, crippled by a strike of typesetters, went to press with the headline DEWEY DEFEATS TRUMAN. A victorious Truman was caught on camera two days later, holding up the paper and wearing a grin as wide as the Mississippi River.

The 1948 election became a referendum on polling, a referendum with considerable consequences because Congress was still debating whether or not to establish a National Science Foundation, and whether such a foundation would provide funding to social science. The pollsters’ error likely had to do with undercounting black votes. Gallup routinely failed to poll black people, on the theory that Jim Crow, voter violence, intimidation, and poll taxes prevented most from voting. But blacks who could vote overwhelmingly cast their ballots for Truman, and probably won him the election.

That was hardly the only problem with the polling industry. In 1944, Gallup had underestimated Democratic support in two out of every three states; Democrats charged that he had engineered the poll to favor Republicans. Questioned by Congress, he’d weakly offered that, anticipating a low turnout, he had taken two points off the projected vote for FDR, more or less arbitrarily.49 Concerned that the federal government might institute regulatory measures, the polling industry had decided to regulate itself by establishing, in 1947, the American Association for Public Opinion Research. But the criticism had continued, especially from within universities, where scholars pointed out that polling was essentially a commercial activity, cloaked in the garb of social science.

The most stinging critiques came from University of Chicago sociologist Herbert Blumer and Columbia political scientist Lindsay Rogers. Public opinion polling is not a form of empirical inquiry, Blumer argued, since it skips over the crucial first step of any inquiry: identifying what it is that is to be studied. As Blumer pointed out, this is by no means surprising, since polling is a business, and an industry run by businessmen will create not a science but a product. Blumer argued that public opinion does not exist, absent its measurement; pollsters created it: “public opinion consists of what public opinion polls poll.” The very idea that a quantifiable public opinion exists, Blumer argued, rests on a series of false propositions. The opinions held by any given population are not formed as an aggregation of individual opinions, each given equal weight, as pollsters suppose; they are formed, instead, “as a function of a society in operation”; we come to hold and express the opinions that we hold and express in conversation and especially in debate with other people and groups, over time, and different people and groups influence us, and we them, in different degrees.50

Where Herbert Blumer argued that polling rested on a misunderstanding of empirical science, Lindsay Rogers argued that polling rested on a misunderstanding of American democracy. Rogers, a scholar of American political institutions, had started out as a journalist. In 1912, he reported on the Democratic National Convention; three years later, he earned a doctorate in political science from Johns Hopkins. In the 1930s, he’d served as an adviser to FDR. In 1949, in The Pollsters: Public Opinion, Politics, and Democratic Leadership, Rogers made clear that he wasn’t sold on polling as an empirical science, but that wasn’t his particular concern. “My criticisms of the polls go to questions more fundamental than imperfections in sampling methods or inaccuracy in predicting the results of elections,” he explained. Even if public opinion could be measured by adding up what people say in interviews over the telephone to people they’ve never met, legislators using this information to inform their votes in representative bodies would be inconsistent with the Constitution.

“Dr. Gallup wishes his polls to enable the United States to become a mammoth town meeting in which yeses and noes will suffice,” Rogers wrote. “He assumes that this can happen and that it will be desirable. Fortunately, both assumptions are wrong.” A town meeting has to be small; also, it requires a moderator. Decisions made in town meetings require deliberation and delay. People had said the radio would create a town meeting, too. It had not. “The radio permits the whole population of a country, indeed of the world, to listen to a speaker at the same time. But there is no gathering together. Those who listen are strangers to each other.” Nor—and here was Rogers’s key argument—would a national town meeting be desirable. The United States has a representative government for many reasons, but among them is that it is designed to protect the rights of minorities against the tyranny of majority opinion. But, as Rogers argued, “The pollsters have dismissed as irrelevant the kind of political society in which we live and which we, as citizens should endeavor to strengthen.” That political society requires participation, deliberation, representation, and leadership. And it requires that the government protect the rights of minorities.51

Blumer and Rogers offered these critiques before the DEWEY-DEFEATS-TRUMAN travesty. But after the election, the Social Science Research Council announced that it would begin an investigation. The council, an umbrella organization, brought together economists, anthropologists, historians, political scientists, psychologists, statisticians, and sociologists. Each of these social sciences had grown dependent on the social science survey, the same method used by commercial pollsters: they used weighted samples of larger wholes to measure attitudes and opinions. Many social scientists subscribed to rational choice theory. Newly aided by the power of computers, they used quantitative methods to search for a general theory that could account for the behavior of individuals. In 1948, political scientists at the University of Michigan founded what became the American National Election Studies, the largest, most ambitious, and most significant survey of American voters. Rogers didn’t object to this work, but he wasn’t persuaded that counting heads was the best way to study politics, and he believed that polling was bad for American democracy. Blumer thought pollsters misunderstood science. But what many other social scientists came to believe, after the disaster of the 1948 election, was that if the pollsters took a fall, social science would fall with them.

The Social Science Research Council warned, “Extended controversy regarding the pre-election polls among lay and professional groups might have extensive and unjustified repercussions upon all types of opinion and attitude studies and perhaps upon social science research generally.” Its report, issued in December 1948, concluded that pollsters, “led by false assumptions into believing their methods were much more accurate than in fact they are,” were not up to the task of predicting a presidential election, but that “the public should draw no inferences from pre-election forecasts that would disparage the accuracy or usefulness of properly conducted sampling surveys in fields in which the response does not involve expression of opinion or intention to act.” That is to say, the polling industry was unsound, but social science was perfectly sound.52

Despite social scientists’ spirited defense of their work, when the National Science Foundation was finally established in 1950, it did not include a social science division. Even before the founding of the NSF, the federal government had committed itself to fortifying the national security state by funding the physical sciences. By 1949, the Department of Defense and the Atomic Energy Commission represented 96 percent of all federal funds for university research in the physical sciences. Many scientists were concerned about the consequences for academic freedom. “It is essential that the trend toward military domination of our universities be reversed as speedily as possible,” two had warned. Cornell physicist Philip Morrison predicted that science under a national security state would become “narrow, national, and secret.”53 The founding of the NSF did not allay these concerns. Although the NSF’s budget, capped at $15 million, was a fraction of the funds provided to scientists engaged in military research (the Office of Naval Research alone had an annual research budget of $85 million), the price for receiving an NSF grant was being subjected to a loyalty test, surveillance, and ideological oversight, and agreeing to conduct closeted research. As the Federation of American Scientists put it, “The Foundation which will thus come into existence after 4 years of bitter struggle is a far cry from the hopes of many scientists.”54

Even without support from the National Science Foundation, of course, social science research proceeded. Political scientists applied survey methods to the study of American politics and relied on the results to make policy recommendations. In 1950, when the distance between the parties was smaller than it has been either before or since—and voters had a hard time figuring out which party was conservative and which liberal—the American Political Science Association’s Committee on Political Parties issued a report called “Toward a More Responsible Two-Party System.” The problem with American democracy, the committee argued, is that the parties are too alike, and too weak. The report recommended strengthening every element of the party system, from national leadership committees to congressional caucuses, as well as establishing a starker difference between party platforms. “If the two parties do not develop alternative programs that can be executed,” the committee warned, “the voter’s frustration and the mounting ambiguities of national policy might set in motion more extreme tendencies to the political left and the political right.”55

The recommendation of political scientists that American voters ought to become more partisan and more polarized did not sit well with everyone. In 1950, in a series of lectures at Princeton, Thomas Dewey, still reeling from his unexpected loss to Truman, damned scholars who “want to drive all moderates and liberals out of the Republican party and then have the remainder join forces with the conservative groups of the South. Then they would have everything neatly arranged, indeed. The Democratic party would be the liberal-to-radical party. The Republican party would be the conservative-to-reactionary party. The results would be neatly arranged, too. The Republicans would lose every election and the Democrats would win every election.”56

Exactly this kind of sorting did eventually come to pass, not to the favor of one party or the other but, instead, to the detriment of everyone. It may have been the brainchild of quantitative political scientists, but it was implemented by pollsters and political consultants, using computers to segment the electorate. The questions raised by Blumer and Rogers went unanswered. Any pollster might have predicted it: POLLSTERS DEFEAT SCHOLARS.

WHEN TRUMAN BEAT DEWEY, and not the reverse, and Democrats regained control of both houses, and long-eared Lyndon B. Johnson took a seat in the Senate, the American Medical Association panicked and telephoned the San Francisco offices of Campaigns, Inc. In a message to Congress shortly before his inauguration, Truman called for the passage of his national health insurance plan.

The AMA, knowing how stunningly Campaigns, Inc., had defeated Warren’s health care plan in California, decided to do exactly what the California Medical Association had done: retain Clem Whitaker and Leone Baxter. The Washington Post suggested that maybe the AMA, at the hands of Whitaker and Baxter, ought to stop “whipping itself into a neurosis and attempting to terrorize the whole American public every time the Administration proposes a Welfare Department or a health program.” But the doctors’ association, undaunted, hired Whitaker and Baxter for a fee of $100,000 a year, with an annual budget of more than a million dollars. Campaigns, Inc., relocated to a new, national headquarters in Chicago, with a staff of thirty-seven. To defeat Truman’s proposal, they launched a “National Education Campaign.” The AMA raised $3.5 million, by assessing twenty-five dollars a year from its members. Whitaker and Baxter liked to talk about their work as “grass roots campaigning.” Not everyone was convinced. “Dear Sirs,” one doctor wrote them in 1949. “Is it 2½ or 3½ million dollars you have allotted for your ‘grass roots lobby’?”57

They started, as always, by drafting a Plan of Campaign. “This must be a campaign to arouse and alert the American people in every walk of life, until it generates a great public crusade and a fundamental fight for freedom,” it began. “Any other plan of action, in view of the drift towards socialization and despotism all over the world, would invite disaster.” Then, in an especially cunning maneuver, aimed, in part, at silencing the firm’s critics, Whitaker had hundreds of thousands of copies of their plan, “A Simplified Blueprint of the Campaign against Compulsory Health Insurance,” printed on blue paper—to remind Americans that what they ought to do was to buy Blue Cross and Blue Shield—and distributed it to reporters and editors and to every member of Congress.58

The “Simplified Blueprint” wasn’t their actual plan; a different Plan of Campaign circulated inside the office, in typescript, marked “CONFIDENTIAL:—NOT FOR PUBLICATION.” While the immediate objective of the campaign was to defeat Truman’s proposal, its long-term objective was “to put a permanent stop to the agitation for socialized medicine in this country by”:

        (a)    awakening the people to the danger of a politically-controlled, government-regulated health system;

        (b)    convincing the people . . . of the superior advantages of private medicine, as practiced in America, over the State-dominated medical systems of other countries;

        (c)    stimulating the growth of voluntary health insurance systems to take the economic shock out of illness and increase the availability of medical care to the American people.

As Whitaker and Baxter put it, “Basically, the issue is whether we are to remain a free Nation, in which the individual can work out his own destiny, or whether we are to take one of the final steps toward becoming a Socialist or Communist State. We have to paint the picture, in vivid verbiage that no one can misunderstand, of Germany, Russia—and finally, England.”59

They mailed leaflets, postcards, and letters across the country, though they were not always well met. “RECEIVED YOUR SCARE LETTER. AND HOW PITYFUL,” an angry pharmacist wrote from New York. “I DO HOPE PRESIDENT TRUMAN HAS HIS WAY. GOOD LUCK TO HIM.” Truman could have used some luck. Whitaker and Baxter’s campaign to defeat his national health insurance plan ended up costing the AMA nearly $5 million and took more than three years. But it worked.60

Truman was furious. As to what in his plan could possibly be construed as “socialized medicine,” he said, he didn’t know what in the Sam Hill that could be. He had one more thing to say: there was “nothing in this bill that came any closer to socialism than the payments the American Medical Association makes to the advertising firm of Whitaker and Baxter to misrepresent my health program.”61

National health insurance would have to wait for another president, another Congress, and another day. The fight would only get uglier.

III.

MOST POLITICAL CAREERS follow an arithmetic curve. Richard Nixon’s rise was exponential: elected to Congress at thirty-three, he won a Senate seat at thirty-seven. Two years later, he would be elected vice president.

He had persisted in investigating Whittaker Chambers’s claim that Alger Hiss had been a communist. In a series of twists and turns worthy of a Hitchcock film—including microfilm hidden in a hollowed-out pumpkin on Chambers’s Maryland farm, the so-called Pumpkin Papers—Nixon charged that Hiss had been not only communist but, like Chambers, a Soviet spy.62

In January 1950, Hiss was convicted of perjury for denying that he had been a communist (the statute of limitations for espionage had expired) and sentenced to five years in prison. Five days after the verdict, on the twenty-sixth, Nixon delivered a four-hour speech on the floor of Congress, a lecture he called “The Hiss Case—A Lesson for the American People.” It read like an Arthur Conan Doyle story, recounting the entire history of the investigation, with Nixon as ace detective. Making a bid for a Senate seat, Nixon had the speech printed and mailed copies to California voters.63

Nixon sought the Senate seat of longtime California Democrat Sheridan Downey, the “Downey” of the “Uppie-and-Downey” EPIC gubernatorial ticket of 1934, who had decided not to run for reelection. Nixon defeated his opponent, Democrat Helen Gahagan Douglas, by Red-baiting and innuendo-dropping. Douglas, he said, was “Pink right down to her underwear.” The Nation’s Carey McWilliams said Nixon had “an astonishing capacity for petty malice.”64 But what won him the seat was the national reputation he’d earned in his prosecution of Alger Hiss, even if that crusade was soon taken over by a former heavyweight boxer who stood six feet tall and weighed two hundred pounds.

On February 9, a junior senator from Wisconsin named Joseph McCarthy stole whole paragraphs from Nixon’s “The Hiss Case—A Lesson for the American People” and used them in an address of his own, in which he claimed to have a list of subversives working for the State Department. In a nod to Nixon, McCarthy liked to say, when he was sniffing out a subversive: “I have found a pumpkin.”65

McCarthy had big hands and bushy eyebrows, and an unnerving stare. During the war, he’d served as a marine in the Pacific. Although he’d seen little combat and sustained an injury only during a hazing episode, he’d defeated the popular incumbent Robert La Follette Jr., in a 1946 Republican primary by running as a war hero, and had won a Senate seat against the Democrat, Howard McMurray, by claiming, falsely, that McMurray’s campaign was funded by communists, as if McMurray wore pink underwear, too.

The first years of McCarthy’s term in the Senate had been marked by failure and duplicity. Like Nixon, he tested the prevailing political winds and decided to make his mark by crusading against communism. In his Hiss speech, Nixon had hinted that not only Hiss but many other people in the State Department, and in other parts of the Truman administration, were part of a vast communist conspiracy. When McCarthy delivered his February 9 speech, before the Ohio County Republican Women’s Club, in Wheeling, West Virginia, he went further than Nixon. “While I cannot take the time to name all of the men in the State Department who have been named as members of the Communist Party,” he said, “I have here in my hand a list of two hundred and five . . . names that were made known to the Secretary of State as members of the Communist Party and who nevertheless are still working and shaping the policy of the State Department.”66 He had no list. He had nothing but imaginary pink underwear.

Three weeks after McCarthy’s Wheeling address, John Peurifoy, deputy undersecretary of state, said that while there weren’t any communists in the State Department, there were ninety-one men, homosexuals, who’d recently been fired because they were deemed to be “security risks” (another euphemism was men whose “habits make them especially vulnerable to blackmail”). It was, in part, Peurifoy’s statement that gave credibility to McCarthy’s charges: people really had been fired. One Republican representative from Illinois, getting the chronology all wrong, praised McCarthy for the purge: “He has forced the State Department to fire 91 sex perverts.”67

The purge had begun years earlier, in 1947, under the terms of a set of “security principles” provided to the secretary of state. People known for “habitual drunkenness, sexual perversion, moral turpitude, financial irresponsibility or criminal record” were to be fired or screened out of the hiring process. Thirty-one homosexuals had been fired from the State Department in 1947, twenty-eight in 1948, and thirty-one in 1949. A week after Peurifoy’s statement, Roy Blick, the ambitious head of the Washington, DC, vice squad, testified during classified hearings (on “the infiltration of subversives and moral perverts into the executive branch of the United States Government”) that there were five thousand homosexuals in Washington. Of these, Blick said, nearly four thousand worked for the federal government. The story was leaked to the press. Blick called for a national task force: “There is a need in this country for a central bureau for records of homosexuals and perverts of all types.”68

The Nixon-McCarthy campaign against communists can’t be separated from the campaign against homosexuals. There had been much intimation that Chambers, a gay man, had informed on Hiss because of a spurned romantic overture. By March of 1950, McCarthy’s charges had been reported in newspapers all over the country. The Senate Foreign Relations Committee convened hearings into “whether persons who are disloyal to the United States are or have been employed by the Department of State.” The hearings, chaired by Millard Tydings, a Democrat from Maryland, proved unilluminating. In the committee’s final report, Tydings called the charges “a fraud and a hoax.” This neither dimmed the furor nor daunted McCarthy, who masterfully manipulated the press and escalated fears of a worldwide communist conspiracy and a worldwide network of homosexuals, both trying to undermine “Americanism.” (So great was McCarthy’s hold on the electorate that, for challenging him, Tydings was defeated when he ran for reelection.)69

Who could rein him in? Few critics of McCarthyism were as forceful as Maine senator Margaret Chase Smith, the first woman to serve in both houses of Congress. In June 1950, she rose to speak on the floor of the Senate to deliver a speech later known as the Declaration of Conscience. “I don’t want to see the Republican Party ride to political victory on the Four Horsemen of Calumny—Fear, Ignorance, Bigotry, and Smear,” said Smith, a moderate Republican in the mold of Wendell Willkie. Bernard Baruch said that if a man had made that speech he would be the next president of the United States. Later, after Smith was jettisoned from the Permanent Subcommittee on Investigations, it was Nixon who took her place.70

In September 1950, Congress passed the Internal Security Act, over Truman’s veto, requiring communist organizations to register with the attorney general and establishing a board to investigate suspected subversives. That fall, Margaret Chase Smith, who, despite her centrist leanings, had no qualms about the purging of homosexuals, joined North Carolina senator Clyde Hoey’s investigation into the “Employment of Homosexuals and Other Sex Perverts in Government.” The Hoey committee’s conclusion was that such men and women were a threat to national security.71

The crusade, at once against communists and homosexuals, was also a campaign against intellectuals in the federal government, derided as “eggheads.” The term, inspired by the balding Illinois Democrat Adlai Stevenson, was coined in 1952 by Louis Bromfield to describe “a person of spurious intellectual pretensions, often a professor or the protégé of a professor; fundamentally superficial, over-emotional and feminine in reactions to any problems.” The term connoted, as well, a vague homosexuality. One congressman described leftover New Dealers as “short-haired women and long-haired men messing into everybody’s personal affairs and lives.”72

One thing McCarthyism was not was a measured response to communism in the United States. Membership in the American Communist Party was the lowest it had been since the 1920s. In 1950, when the population of the United States stood at 150 million, there were 43,000 party members; in 1951, there were only 32,000. The Communist Party was considerably stronger in, for instance, Italy, France, and Great Britain, but none of those nations experienced a Red Scare in the 1950s. In 1954, Winston Churchill, asked to establish a royal commission to investigate communism in Great Britain, refused.73

In 1951, McCarthy’s crusade scored a crucial legal victory when the Supreme Court upheld the Smith Act of 1940, ruling 6–2 in Dennis v. United States that First Amendment protections of free speech, press, and assembly did not extend to communists. This decision gave the Justice Department a free hand in rounding up communists, who could be convicted and sentenced to prison. In a pained dissent in Dennis, Justice Hugo Black wrote, “There is hope, however, that in calmer times, when present pressures, passions and fears subside, this or some later Court will restore the First Amendment liberties to the high preferred place where they belong in a free society.” That calm did not come for a very long time. Instead, McCarthy’s imagined web of conspiracy grew bigger and stretched further. The Democratic Party itself, he said, was in the hands of men and women “who have bent to the whispered pleas from the lips of traitors.” William Jenner, Republican senator from Indiana, said, “Our only choice is to impeach President Truman and find out who is the secret invisible government.”74

Eggheads or not, Democrats failed to defeat McCarthyism. Lyndon Johnson had become the Democratic Party whip in 1950 and two years later its minority leader; the morning after the 1952 election, he’d called newly elected Democrats before sunrise to get their support. “The guy must never sleep,” said a bewildered John F. Kennedy. Johnson became famous for wrangling senators the way a cowboy wrangles cattle. He’d corner them in hallways and lean over them, giving them what a pair of newspaper columnists called “The Treatment.” “Its velocity was breathtaking, and it was all in one direction,” they wrote. “He moved in close, his face a scant millimeter from his target, his eyes widening and narrowing, his eyebrows rising and falling.” Johnson despised McCarthy. “Can’t tie his goddam shoes,” he said. But, lacking enough support to stop him, Johnson bided his time.75

Liberal intellectuals, refusing to recognize the right wing’s grip on the American imagination, tended to dismiss McCarthyism as an aberration, a strange eddy in a sea of liberalism. The historian Arthur Schlesinger Jr., writing in 1949, argued that liberals, having been chastened by their earlier delusions about socialism and even Sovietism and their romantic attachment to the ordinary and the everyday, had found their way again to “the vital center” of American politics. Conservatives might be cranks and demagogues, they might have power and even radio programs, but, in the world of ideas, liberal thinkers believed, liberalism had virtually no opposition. “In the United States at this time, liberalism is not only the dominant but even the sole intellectual tradition,” insisted literary critic Lionel Trilling. “For it is the plain fact that nowadays there are no conservative or reactionary ideas in general circulation.”76

This assessment was an error. McCarthyism wasn’t an eddy; it was part of a rising tide of American conservatism.77 Its leading thinkers were refugees from fascist or communist regimes. They opposed collectivism and centralized planning and celebrated personal liberty, individual rights, and the free market. Ayn Rand, born Alisa Rosenbaum, grew up in Bolshevik Russia, moved to the United States in 1926, and went to Hollywood to write screenplays, eventually turning to novels; The Fountainhead appeared in 1943 and Atlas Shrugged in 1957. Austrian-born Friedrich von Hayek, after nearly twenty years at the London School of Economics, began teaching at the University of Chicago in 1950 (in 1962, he moved to Germany). While engaged in vastly different projects, Hayek and Rand made many of the same rhetorical moves as Whitaker and Baxter, who, like all the most effective Cold Warriors, reduced policy issues like health care coverage to a battle between freedom and slavery. Whitaker and Baxter’s rhetoric against Truman’s health care plan sounded the same notes as Hayek’s “road to serfdom.” The facts, Whitaker said in 1949, were these:

Hitler and Stalin and the socialist government of Great Britain all have used the opiate of socialized medicine to deaden the pain of lost liberty and lull the people into non-resistance. Old World contagion of compulsory health insurance, if allowed to spread to our New World, will mark the beginning of the end of free institutions in America. It will only be a question of time until the railroads, the steel mills, the power industry, the banks and the farming industry are nationalized.

To pass health care legislation would be to reduce America to a “slave state.”78

But perhaps the most influential of the new conservative intellectuals was Richard M. Weaver, a southerner who taught at the University of Chicago and whose complaint about modernity was that “facts” had replaced “truth.” Weaver’s Ideas Have Consequences (1948) rejected the idea of machine-driven progress—a point of view he labeled “hysterical optimism”—and argued that Western civilization had been in decline for centuries. Weaver dated the beginning of the decline to the fourteenth century and the denial that there exists a universal truth, a truth higher than man. “The denial of universals carries with it the denial of everything transcending experience,” Weaver wrote. “The denial of everything transcending experience means inevitably—though ways are found to hedge on this—the denial of truth.” The only way to answer the question “Are things getting better or are they getting worse?” is to discover whether modern man knows more or is wiser than his ancestors, Weaver argued. And his answer to this question was no. With the scientific revolution, “facts”—particular explanations for how the world works—had replaced “truth”—a general understanding of the meaning of its existence. More people could read, Weaver stipulated, but “in a society where expression is free and popularity is rewarded they read mostly that which debauches them and they are continuously exposed to manipulation by controllers of the printing machine.” Machines were for Weaver no measure of progress but instead “a splendid efflorescence of decay.” In place of distinction and hierarchy, Americans vaunted equality, a poor substitute.79

If Weaver was conservatism’s most serious thinker, nothing better marked the rising popular tide of the movement than the publication, in 1951, of William F. Buckley Jr.’s God and Man at Yale: The Superstitions of “Academic Freedom,” in which Buckley expressed regret over the liberalism of the American university. Faculty, he said, preached anticapitalism, secularism, and collectivism. Buckley, the sixth of ten children, raised in a devout Catholic family, became a national celebrity, not least because of his extraordinary intellectual poise.

Russell Kirk’s The Conservative Mind appeared in 1953. Kirk, an intellectual historian from Michigan, provided a manifesto for an emerging movement: a story of its origins. The Conservative Mind described itself as “a prolonged essay in definition,” an attempt at explaining the ideas that have “sustained men of conservative impulse in their resistance against radical theories and social transformation ever since the beginning of the French Revolution.” The liberal, Kirk argued, sees “a world that damns tradition, exalts equality, and welcomes changes”; liberalism produces a “world smudged by industrialism; standardized by the masses; consolidated by government.” Taking his inspiration from Edmund Burke, Kirk urged those who disagreed with liberalism’s fundamental tenets to call themselves “conservatives” (rather than “classical liberals,” in the nineteenth-century laissez-faire sense). The conservative, he argued, knows that “civilized society requires orders and classes, believes that man has an evil nature and therefore must control his will and appetite” and that “tradition provides a check on man’s anarchic impulse.” Conservatism requires, among other things, celebrating the “mystery of human existence.”80

The battle, then, was a battle not so much for the soul of America as for the mind of America, for mystery over facts, for hierarchy over equality, for the past over the present. In 1955, Buckley founded the National Review. Whittaker Chambers joined the staff two years later. Kirk, who decried the “ritualistic liberalism” of American newspapers and magazines, contributed a regular column. In the first issue, Buckley said the magazine “stands athwart history, yelling Stop.”81

But if it was chiefly men who advanced the ideas and wrote the books of the new conservatism, it was women who carried the placards and worked in the precincts, not yelling, but politely whispering, “Stop, please.” Betty Farrington, head of the National Federation of Republican Women’s Clubs, filled those clubs with housewives who were ardent in their opposition to communism and support of McCarthy. After Dewey lost in 1948, Farrington had argued that the GOP needed a strong man: “How thankful we would have been if a leader had appeared to show us the path to the promised land of our hope. The world needs such a man today. He is certain to come sooner or later. But we cannot sit idly by in the hope of his coming. Besides his advent depends partly on us. The mere fact that a leader is needed does not guarantee his appearance. People must be ready for him, and we, as Republican women, in our clubs, prepare for him.” Farrington believed McCarthy was that man. It is no accident that McCarthy’s Wheeling, West Virginia, speech was an address, made by invitation, to a Republican women’s club, nor that his language was the language of the nineteenth-century female crusade. “The great difference between our western Christian world and the atheistic Communist world is not political—it is moral,” McCarthy said.82 Temperance, abolition, suffrage, populism, and prohibition weren’t part of Russell Kirk’s intellectual genealogy of conservatism, but they were the foundational experiences of its core constituency.

image
Suburban housewives served as the foot soldiers of the conservative movement; here, women rally in support of Joseph McCarthy.

Housewives were to the Republican Party infrastructure what labor union members were to the Democrats’. “If it were not for the National Federation of Republican Women, there would not be a Republican Party,” Barry Goldwater admitted. (Nixon couldn’t stand them: “I will not go and talk to those shitty ass old ladies!” he’d fume. All the same, gritting his teeth, he went.)83 By the 1950s, a majority of GOP activists were women, compared to 41 percent of Democratic Party activists. In 1950, Farrington launched the School of Politics, three-day sessions in Washington to train precinct workers; the sessions were open to men and women, but most who attended were women, while, at the same sort of sessions run by the DNC, most attendees were men. In the GOP, party work was women’s work, work that the party explained, structured, and justified by calling it housework. Republican Party aspirants were told to “be proud of the women who work on the home front, ringing the doorbells, filling out registration cards, and generally doing the housework of government so that the principles of the Republican Party can be brought to every home.” Republican women established Kitchen Kabinets, appointing a female equivalent to every member of the president’s cabinet, who shared “political recipes on GOP accomplishments with the housewives in the nation” by way of monthly bulletins on “What’s Cooking in Washington.”84 As a senator speaking to the federation of women’s clubs suggested, the elephant was the right symbol for the GOP because an elephant has “a vacuum cleaner in front and a rug beater behind.”85

By the mid-1950s, the conservative critiques of the academy as godless and of the press as mindless were in place, along with a defense of the family, and of women’s role as housewives, however politicized the role of housewife had become. A moral crusade against homosexuality and in favor of a newly imagined traditional family had begun.

Meanwhile McCarthyism abided: mean-spirited, vulgar, and unhinged. McCarthy’s rise, the lunacy of his conspiracy theory, and the size of his following struck many observers as a symptom of a disease at the very heart of American politics. It left George Kennan with a lasting doubt: “A political system and a public opinion, it seemed to me, that could be so easily disoriented by this sort of challenge in one epoch would be no less vulnerable to similar ones in another.”86 What had made so many Americans so vulnerable to such an implausible view of the world?

INSIDE CBS, the plan was known as “Project X.” It was top secret until, a month before Election Day in 1952, the television network announced that it would predict the winner using a “giant brain.” One local station took out a newspaper ad promising that “A ROBOT COMPUTER WILL GIVE CBS THE FASTEST REPORTING IN HISTORY.”87

That giant brain was called UNIVAC, the Universal Automatic Computer, and it was the first commercial computer in the history of the world. In May 1951, John Mauchly and Presper Eckert, who’d unveiled ENIAC in 1946, invited members of the press to a demonstration of their new machine; they’d built it for the U.S. Census Bureau. Half the size of ENIAC, UNIVAC was even faster. This lickety-split sorting of the population would prove invaluable to the Census Bureau. Soon, all calculations relating to the federal census were completed by UNIVAC, work that was called “data processing.” Commercially applied, UNIVAC and its heirs would transform American business, straightaway cutting costs and accelerating production by streamlining managerial and administrative tasks, such as payroll and inventory, and eventually turning people into consumers whose habits could be tracked and whose spending could be calculated, and even predicted. Politically, it would wreak havoc, splitting the electorate into so many atoms.

The technology that made it possible to sort citizens by “sex, marital status, education, residence, age group, birthplace, employment, income and a dozen other classifications” would make it possible to sort consumers, too. Businesses found that they could both reduce prices and increase profits by sorting markets into segments and pitching the right ad and product to exactly the right consumer. In much the same way that advertisers segmented markets, political consultants would sort voters into different piles, too, and send them different messages.88

When Mauchly and Eckert staged their unveiling in 1951, all of this was in the future, and the press was not excited. In a one-paragraph story on the bottom of page twenty-five, the New York Times only dutifully took notice of the “eight-foot-tall mathematical genius,” as if it were nothing more than a stunt, like Elektro the Moto-Man.89

UNIVAC made its debut at a moment when Americans were increasingly exasperated by automation, the very year that readers waded through White Collar, the sociologist C. Wright Mills’s indictment of the fate of the people who worked, surrounded by telephones and Dictaphones, intercoms and Mimeographs, in fluorescent-lit, air-conditioned offices in steel-and-glass skyscrapers or in suburban office parks. Mills argued that machine-driven office work had created a class of desperately alienated workers and that the new office, for all its gadgets, was no less horrible than the old factories of brick and steam. “Seeing the big stretch of office space, with rows of identical desks,” Mills wrote, “one is reminded of Herman Melville’s description of a nineteenth-century factory: ‘At rows of blank-looking counters sat rows of blank-looking girls, with blank, white folders in their blank hands, all blankly folding blank paper.’” Melville had been describing a New England paper mill in 1855; Mills described a modern office a century later: “The new office is rationalized: machines are used, employees become machine attendants; the work, as in the factory, is collective, not individualized,” he wrote. “It is specialized to the point of automatization.”90 The minutes of white-collar workers’ lives were tapped out by typewriters and adding machines. They had the cheerfulness of robots, having lost the capacity to feel anything except boredom.91

“Robotic” having become a term of opprobrium, the people interested in explaining the truly revolutionary capabilities of the UNIVAC had to do something more than write numbers on Ping-Pong balls. Mauchly, disappointed at the bland coverage of UNIVAC’s unveiling, wrote a paper called “Are Computers Newsworthy?” Given that the novelty of computers as front-page news had worn off, the best approach would be to find ways to showcase their application to real-world problems, he suggested. He hired a public relations firm. “We must aim our publicity at the public in general because our object is to expand the market until computers become as ordinary as telephone switchboards and bookkeeping machines,” he explained. Mauchly’s PR team then came up with the very clever plan of bringing to CBS a proposal to predict the outcome of the upcoming election on live television, on Election Night.92

In 1948, less than 3 percent of American homes had a television; by 1952, the number was up to 45 percent. By the end of the decade, 90 percent of American homes had a television. The election of 1952 would be the first in which television coverage played a central role, and, if Mauchly had his way, the first whose result would be predicted on television.

It looked to be an especially fascinating election. General Dwight D. Eisenhower, a lifelong military man, a five-star general who during the Second World War had served as Supreme Allied Commander in Europe, had refused to run in 1948, on the grounds that professional soldiers ought to abstain from political officeholding. In 1952, at the age of sixty-one, he was finally persuaded to run, in an election expected to amount to a referendum on Truman’s handling of the war in Korea. In June 1950, North Korean communist forces crossed the thirty-eighth parallel to attack South Korea. Truman sent in troops, led by General Douglas MacArthur, who drove the North Koreans nearly back to the border with China. China responded by sending its own troops across the Yalu, and the American forces lost the ground they’d gained. The war was protracted, costly, and unpopular. Eisenhower, a hero from a better war, appeared a perfect candidate for the times.

Clem Whitaker and Leone Baxter managed his campaign. Having worked behind the scenes since its founding in 1933, Campaigns, Inc., had attracted not altogether wanted attention as a consequence of its phenomenal defeat of Truman’s national health insurance plan. A three-part exposé, written by Carey McWilliams, had appeared in The Nation in 1951. McWilliams admired Whitaker and Baxter, and he also liked them. But he believed that they had too much power, and that they were dangerous, and that what they had created was nothing less than “government by Whitaker and Baxter.” After McWilliams’s story ran, a number of notable doctors resigned from the AMA, including the head of Massachusetts General Hospital, who explained, in his letter of resignation, that he was no longer willing to pay dues used to support “an activity, which I consider contrary to public welfare and unworthy of a learned profession.” That fall, the AMA fired Whitaker and Baxter. That’s when Whitaker and Baxter went to work for Eisenhower.93

They decided to put Ike on TV. Republicans spent $1.5 million on television advertising in 1952; Democrats spent $77,000. Polls drove the ads; ads drove the polls. George Gallup chose the themes of Eisenhower’s TV spots, which took the form of fake documentaries. In “Eisenhower Answers America,” a young black man (plucked off the street from Times Square and reading a cue card) says, “General, the Democrats are telling me I never had it so good.” Eisenhower replies, “Can that be true, when America is billions in debt, when prices have doubled, when taxes break our backs, and we are still fighting in Korea?” Then, he looks, sternly, straight into the camera. “It’s tragic, and it’s time for a change.”94

Eisenhower’s politics were moderate, as was his style. He described himself as a “dynamic conservative”: “conservative when it comes to money and liberal when it comes to human beings.” His Democratic opponent, Illinois governor Adlai Stevenson, found that account of Eisenhower’s political commitments wanting: “I assume what it means is that you will strongly recommend the building of a great many schools to accommodate the needs of our children, but not provide the money.” Critics called the bald and effete Stevenson an egghead and “fruity”; rumors spread that he was gay. “Eggheads of the world unite!” Stevenson would joke, “You have nothing to lose but your yolks!,” not quite appreciating the malice of the campaign against him.95

Television became to the 1950s what radio had been to the 1930s. The style of news reporting that had been developed on the radio adapted poorly to the screen, but the audience was so huge that news organizations had every incentive to adapt. In 1949, the Federal Communications Commission established the Fairness Doctrine, a standard for television news that required a “reasonable balance” of views on any issue put before the public. CBS sent Walter Cronkite, a thirty-five-year-old newsman from its Washington affiliate, WTOP-TV, to cover both nominating conventions.

Richard Nixon went to the Republican National Convention in Chicago on board a chartered train from California called the Earl Warren Special, allegedly supporting Warren’s bid for the presidential nomination. Whitaker and Baxter had never forgiven Warren for firing them in 1942, and even scuttling his statewide health insurance plan in 1945 had not slaked their thirst for vengeance. During the train ride to Chicago, Nixon secretly swayed California delegates to throw their support behind Eisenhower—a scheme forever after known as the “great train robbery”—and the general had rewarded him with a spot on the ticket. Warren would later call Nixon a “crook and a thief.” Eisenhower would find a place for Warren the following year, appointing him chief justice of the United States.96

Nixon had managed to secure the GOP vice presidential nomination, but, weeks later, he’d go on television to try to hold onto it. After the convention, the press revealed that Nixon had an $18,000 slush fund. Eisenhower’s advisers urged him to dump Nixon, and asked Nixon to step down. Nixon, facing the end of his political career, decided to make his case to the public. He labored over it, writing the speech of his life. On September 23, 1952, sitting at a pine desk, with his wife looking on from a chintz chair, in what appeared to be his own den but was a stage built at an NBC studio in Los Angeles, he gave a remarkable performance, pained and self-pitying. It reached the largest audience television had ever recorded. Nixon said he intended to do something unprecedented in American politics. He would provide a full financial disclosure, an accounting of “everything I’ve earned, everything I’ve spent, and everything I owe.” Nearly down to the penny, he then listed his modest income, his loans, and his wealth (“this will surprise you, because it is so little”). He had no stocks, no bonds, a two-year-old Oldsmobile, mortgages, debts to banks, and even a debt to his parents that he was paying back, every month, with interest. Yes, he’d accepted gifts to a campaign fund. But no contributors had gotten special favors for their donations, and “not one cent of the eighteen thousand dollars” had gone to him for his private use. He’d spent it on campaign expenses. He covered his face for a moment, as if offering up a final, humiliating confession. There was one gift he must acknowledge: a man in Texas had sent his daughters a black-and-white spotted cocker spaniel puppy, and his six-year-old daughter, Tricia, had named the dog Checkers. “Regardless of what they say about it,” he said, feigning injury, “we’re gonna keep it.”97

Liberals were disgusted, partly because it was something of a sham, but mostly because it was maudlin. Eisenhower was, at the time, president of Columbia University; twenty-three full professors at Columbia, including Allan Nevins, Lionel Trilling, and Richard Hofstadter, issued a statement in which they denounced the Checkers speech, which Nevins described as “so essentially dishonest and emotional an appeal that he confused a great many people as to the issues involved.”98 Walter Lippmann said that watching it was “one of the most demeaning experiences my country has ever had to bear.” But the overwhelming majority of people who watched it loved it. Nixon spoke to their experiences and their quiet lives, and to their grievances, too. Plainly, Nixon had saved his career, and more. “In 30 minutes,” Time reported, “he had changed from a liability to his party to a shining asset.”99

Nixon had accomplished something else, of greater and more lasting importance. Since the days of Harding and Hoover, the Republican Party had been the party of businessmen, of country club members and stockholders. The Democratic Party had been the party of the little guy, from Andrew Jackson’s self-made man to William Jennings Bryan’s farmer to FDR’s “forgotten man.” Nixon, with that speech, reversed this calculus. That was what so galled liberals: they were no longer the party of the people. Populism had shifted to the right.100

The Checkers speech was a landmark in the history of television, and it became a watchword in the history of American politics. Lost in the fog of memory was another epic turn during that election. Nixon decided, after the Checkers speech, that he loved television. As his friend the Hollywood producer Ted Rogers said, “He was the electronic man.”101 But the real electronic man of that year’s political season was the UNIVAC.

After the conventions, all three network television broadcasters were looking for a way to do a better job covering election night than they’d done in 1948, which had been widely seen as a dismal failure. There hadn’t been much to look at. As one critic put it, “Counting ballots is hardly a function which lends itself to much visual excitement.” Added to the clumsiness of the television coverage was the lingering embarrassment of having predicted the outcome wrong. Broadcasters had made the same error as the DEWEY-DEFEATS-TRUMAN Chicago Tribune; by the time Truman pulled ahead, CBS had already closed down for the night.102

CBS agreed to commission UNIVAC as its special guest on Election Night. On November 4, the actual UNIVAC—there was only one—was in Philadelphia, while CBS’s Charles Collingwood sat at a blinking console at the network’s flagship studio in New York, giving viewers the illusion that he was controlling a computer. “A UNIVAC is a fabulous electronic machine, which we have borrowed to help us predict this election from the basis of early returns as they come in,” Collingwood told his audience as the evening’s coverage began. “This is not a joke or a trick,” he went on, “It’s an experiment. We think it’s going to work. We don’t know. We hope it will work.”

Thirty-six-year-old Walter Cronkite read the early, East Coast returns; Edward R. Murrow provided the commentary. Cronkite, born in Missouri, spoke with a gentlemanly midwestern twang. Not long after the East Coast polls closed, CBS announced that Eisenhower was ahead in the popular vote, Stevenson in the electoral vote. Cronkite then said, “And now to find out perhaps what this all means, at least in the electronic age, let’s turn to that electronic miracle, the electronic brain, UNIVAC, with a report from Charles Collingwood.”

image
CBS News, whose team included Walter Cronkite (right), commissioned the first commercial computer, UNIVAC, to predict the outcome of the election of 1952.

UNIVAC had been attempting to calculate the likely outcome of the election by comparing early returns to results from the elections of 1944 and 1948. When the camera turned to Collingwood, though, he could get no answer from UNIVAC. Murrow ventured that perhaps UNIVAC was cautious. After all, it was still early in the night. “It may be possible for men or machines to draw some sweeping conclusions from the returns so far,” Murrow said, “but I am not able to do it.” But then, eyeing the returns from Connecticut, where a great many Democrats had surprisingly voted for the Republican, Murrow, while not offering a sweeping conclusion, suggested that the momentum appeared to be very much in Ike’s favor.

At 10:30, Cronkite turned again to Collingwood. UNIVAC was having “a little bit of trouble,” Collingwood said with evident embarrassment. At one point UNIVAC predicted that Eisenhower would win by a sizable margin, at another that Stevenson might eke out a win. After Murrow called the election for Eisenhower, UNIVAC changed its mind again and said that the race was close. Cronkite turned to Murrow, who said, “I think it is now reasonably certain that this election is over.” Fifteen minutes later, Cronkite offered this update:

And now, UNIVAC—UNIVAC, our electronic brain—which a moment ago, still thought there was a 7 to 8 for Governor Stevenson, says that the chances are 100 to 1 in favor of General Eisenhower. I might note that UNIVAC is running a few moments behind Ed Murrow, however.

Ike won in a landslide. UNIVAC called it right, in the end, and so did George Gallup, who had gotten the vote wrong by 5 percent in 1948, and got it wrong by 4 percent again in 1952, but this time, Eisenhower’s margin of victory was so big that Gallup’s margin of error hadn’t led him to predict the wrong winner.103

The next day, Murrow, speaking on CBS Radio, delivered a sermon about the civic importance of voting, as against the political mischief of polling, political consultants, and electronic brains. “Yesterday the people surprised the pollsters, the prophets, and many politicians,” Murrow said. “They demonstrated, as they did in 1948, that they are mysterious and their motives are not to be measured by mechanical means.” The election, he thought, had returned to the American voter his sovereignty, stolen by “those who believe that we are predictable.” Murrow said, “we are in a measure released from the petty tyranny of those who assert that they can tell us what we think, what we believe, what we will do, what we hope and what we fear, without consulting us—all of us.”104

Murrow’s faith in the American creed, in the triumph of reason over fear, in progress over prophecy, was a hallmark of mid-twentieth-century liberalism. But it was also a shaken faith. Between the unreasoning McCarthy and the coldly calculating computer, where was the independent-minded American voter, weighing facts and searching for truth? The questions about the malleability of public opinion raised by radio were revisited during the rise of television. “Brainwashing” became a household word in the 1950s, when it was used to refer not only to the psychological torture of American prisoners of war during the Korean War but also to the persuasive powers of television.

When Americans talked about “public opinion,” C. Wright Mills argued, they meant the eighteenth-century idea of informed people engaging in free, rational discussion to arrive at truth—the right understanding of an issue—before urging their representatives to take action. But in the middle of the twentieth century, Mills said, this idea had become nothing more than a “fairy tale,” as fanciful as Disneyland, because “the public of public opinion is recognized by all those who have considered it carefully as something less than it once was.” Like many social scientists of his generation, Mills argued that the United States was far along the road to becoming a mass society rather than a community of publics. The way to tell the difference between a mass society and a community of publics is the technology of communication: a community of publics is a population of people who talk to one another; a mass society receives information from the mass media. In a mass society, elites, not the people, make most decisions, long before the people even know there is a decision to be made. The formation of what Mills called the “power elite” was directly related to technological shifts, especially the rise of computing. “As the means of information and of power are centralized,” Mills wrote, “some men come to occupy positions in American society from which they can look down upon . . . and by their decisions mightily affect the everyday lives of ordinary men and women.”105

Yet for all the concern about “mass media”—a term coined in derision—there remained sources of optimism, especially in the undeniable observation that investigative television reporting and broadcast television news were usefully informing the electorate, introducing voters to candidates and issues, and helping Americans keep abreast of national and world affairs. And McCarthy’s own end, after all, came on television.

On February 18, 1954, McCarthy questioned General Ralph Zwicker, a holder of a Purple Heart and a Silver Star. The senator told Zwicker that he didn’t have “the brains of a five-year-old child” and that his testimony was “a disgrace to the army.”106 Eisenhower had long since lost patience with McCarthy and the damage he had done. But going after the army was the last straw. The next month, on CBS Television’s See It Now, Murrow narrated an edited selection of McCarthy’s speeches before the public and during congressional hearings, revealing the cruelty of the man, his moral shabbiness and pettiness, his brutality. Murrow’s thirty-minute presentation of evidence took the form of a carefully planned prosecution. “And upon what meat doth Senator McCarthy feed?” Murrow asked. “Two of the staples of his diet are the investigation, protected by immunity, and the half-truth.” (McCarthy was given an opportunity to reply, which he took up, feebly, two weeks later.) Murrow closed with a sermon. “We will not walk in fear, one of another,” he said. “We will not be driven by fear into an age of unreason, if we dig deep in our history and our doctrine and remember that we are not descended from fearful men.”107

image
U.S. Army Chief Counsel Joseph Welch holds his head in his hand as Joseph McCarthy speaks during the Army-McCarthy hearings in 1954.

Weeks after Murrow’s broadcast, the Senate convened the Army-McCarthy hearings, to investigate charges that McCarthy’s chief counsel, Roy Cohn—later Donald Trump’s mentor—had attempted to obtain military preferment for another McCarthy aide, David Schine. Lyndon Johnson slyly arranged for the hearings to be televised. The hearings lasted fifty-seven days, of which thirty-six were broadcast. On June 9, when Army Chief Counsel Joseph Welch asked McCarthy if, finally, he had any decency, viewers had seen for themselves that he hadn’t. Cohn resigned. Johnson, reelected by a landslide in the fall of 1954, when Democrats regained control of the Senate, decided the moment to strike had finally come. He named a special committee to investigate McCarthy and made sure the committee was dominated by conservatives, so that no one could charge that the investigation had been partisan. The committee recommended disciplining McCarthy. That December, the Senate voted 67–22 to censure him. John F. Kennedy, whose brother Robert worked as a McCarthy aide, and whose father had long supported McCarthy, was the only Democrat to not publicly support censure. McCarthy’s fall had come.108

“It’s no longer McCarthyism,” said Eisenhower. “It’s McCarthywasm.”109 McCarthy, struggling with drinking, died three years later, only forty-eight.

“THIS COUNTRY NEEDS a revival,” House Speaker Sam Rayburn said, “and I believe Billy Graham is bringing it to us.” Against the godlessness of communism, before and after McCarthy’s fall, Americans turned anew to religion. In the decade following the end of the war, church membership grew from 75 million to 100 million.110 Much of the growth was driven by Southern Baptists, like Billy Graham, who asserted a growing influence on American life and politics. Between 1941 and 1961, membership in the Southern Baptist Convention doubled. Over eight weeks in the fall of 1949, Graham preached to more than 350,000 people in Los Angeles.

Broad-shouldered and Brylcreemed, Graham left audiences swooning. But he didn’t only draw new members to the Southern Baptist Convention; he brought together all manner of white conservative Protestants, North and South, into a new evangelism. For Graham, the Cold War represented a Manichaean battle between Christ and communism. “Do you know that the Fifth Columnists, called Communists, are more rampant in Los Angeles than in any other city in America?” he demanded. “The world is divided into two camps!” Communism “has declared war against God, against Christ, against the Bible, and against all religion! . . . Unless the Western world has an old-fashioned revival, we cannot last!” Communists became the new infidels.111

Graham, who’d grown up in North Carolina, romanticized rural America, calling the shepherds of the Bible “hillbillies.” His anti-intellectualism aligned well with a broader critique of liberalism. “When God gets ready to shake America, he might not take the Ph.D. and the D.D. and the Th.D.,” Graham preached. “God may choose a country boy! God may choose a man no one knows . . . a hillbilly, a country boy! Who will sound forth in a mighty voice to America, ‘Thus saith the Lord!’”

image
Reverend Billy Graham, here preaching in Washington, DC, in 1952, reached a nationwide audience but boasted an especially strong following in Congress.

Graham himself, though, traveled in powerful, cosmopolitan circles. In 1950, he began praying before Congress. He held prayer meetings with senators. He met with presidents. He preached evangelism as Americanism. “If you would be a loyal American,” he said, “then become a loyal Christian.” To Graham, the tool of the enemy (and of the devil) was “the sin of tolerance.” “The word ‘tolerant’ means ‘liberal,’ ‘broad-minded,’” he said, and “the easy-going compromise and tolerance that we have been taught by pseudo-liberals in almost every area of our life for years” means nothing so much as appeasement to communism. “My own theory about Communism,” he said, “is that it is master-minded by Satan.”112

As Graham’s influence grew, Eisenhower came to see his lack of membership in any church as a political liability. Raised a Mennonite, he decided to convert to Presbyterianism, becoming the first president to be baptized while in the White House. His administration inaugurated the practice of national prayer breakfasts. “Our form of government has no sense unless it is founded in a deeply religious faith, and I don’t care what it is,” Eisenhower said. During his administration, Congress mandated the inclusion of “In God We Trust” on all money and added “under God” to the Pledge of Allegiance.113

For other reasons, too, conservatives had high hopes for Eisenhower, whose 1952 campaign had included a promise to repeal New Deal taxes that, he said, were “approaching the point of confiscation.”114 Eisenhower’s cabinet included the former president of General Motors. (With Eisenhower’s pro-business administration, Adlai Stevenson said, New Dealers made way for car dealers.) Eisenhower was also opposed to national health care, as was his secretary of Health, Education, and Welfare, a longtime conservative Texas Democrat named Oveta Culp Hobby, who’d recently switched parties. She liked to say she’d come to Washington to “bury” socialized medicine. Both Eisenhower and Hobby considered free polio vaccinations socialized medicine, and Hobby argued against the free distribution of the vaccine, a position that would have exposed millions of children to the disease. In the end, after a related scandal, Hobby was forced to resign.115

But Eisenhower proved a disappointment to conservatives. From the start, he had his doubts about the nature of the Cold War. A decorated general, Eisenhower was nevertheless the child of pacifists who considered war a sin. And, even as he oversaw a buildup of nuclear weapons, he questioned the possibility of the world surviving an atomic war. “There just aren’t enough bulldozers to scrape the bodies off the street,” he said. Nor was he so sure that any part of the manufacture of so many weapons could possibly make any kind of sense. In his first major address as president, delivered on April 16, 1953, weeks after Stalin’s death—when he may have hoped for warmer relations with the Soviet Union—he reckoned the cost of arms. “Every gun that is made, every warship launched, every rocket fired signifies in the final sense a theft from those who hunger and are not fed, those who are cold and not clothed,” he said. “This world in arms is not spending money alone; it is spending the sweat of its laborers, the genius of its scientists, the hopes of its children.” He invoked, of all people, William Jennings Bryan, and his cross-of-gold speech. “This is not a way of life at all in any true sense,” Eisenhower went on. “Under the clouds of threatening war, it is humanity hanging from a cross of iron.”116 It was Eisenhower’s best speech about the arms race, if by no means his last.

“WE’RE ALL IN AGREEMENT on the format,” moderator Quincy Howe said in 1956, introducing the first-ever televised debate between two presidential candidates. “There’s going to be a three-minute opening statement from each of the two gentlemen here and a five-minute closing.” Radio hosts had tried fighting fascism in the 1930s by holding debates over the radio. In the 1950s, television hosts tried to fight communism—and McCarthyism—by doing the same on TV. Howe, a former CBS Radio broadcaster, had been director of the American Civil Liberties Union. In the 1930s, he’d served as a panelist on NBC Radio’s America’s Town Meeting of the Air.117 He cared about the quality of an argument; he cherished public debate. In 1956, he served as moderator of a debate between Adlai Stevenson and another Democratic presidential prospect, former Tennessee senator Estes Kefauver, broadcast on ABC.

The idea had come from Stevenson and his adviser Newton Minow—later the head of the FCC. In the spirit of radio debates hosted by the League of Women Voters since the 1920s, Stevenson and Minow were convinced that television could educate American voters and model the free and open exchange of political ideas. Stevenson challenged Kefauver; Kefauver agreed, and the two met in a one-hour debate at a studio in Miami. In between opening and closing statements, Howe explained, he’d allow “free-wheeling talk in which I act as a kind of a traffic cop, with the power to hand out parking tickets if anyone stays too long in one place or to enforce speed limits if anyone gets going too fast.” The debate took place the day after the United States dropped on the Bikini Atoll a bomb far more powerful than the bomb dropped on Hiroshima. Stevenson said, about the new bomb, “The future is either going to be a future of creativity and of great abundance, or it’s going to be a future of total incineration, death and destruction.”118

The Stevenson-Kefauver debate, like the H-bomb, had been a test. The Republican National Committee chairman called the debate “tired, sorry, and uninspiring.” But debating his opponent didn’t hurt Stevenson, who won the nomination, and began making a case to the nation that presidential candidates ought to debate one another on television regularly. “I would like to propose that we transform our circus-atmosphere presidential campaign into a great debate conducted in full view of all the people,” he later wrote, calling for regular half-hour debates between the major-party candidates.119

Meanwhile, Stevenson squared off against Eisenhower and his running mate, Richard Nixon, who’d drawn inward, convinced that the print press was conspiring against him, even though, for a long time, he’d been something of a media darling. “The tall, dark, and—yes—handsome freshman congressman who has been pressuring the House Un-American Activities committee to search out the truth in the Chambers-Hiss affair,” is how the Washington Post had described him at the beginning of his career. “He was unquestionably one of the outstanding first-termers in the Eightieth Congress.” All the same, newspaper columnists had badly drubbed Nixon after his Checkers speech, and especially after McCarthy’s very bad end, and not always fairly. Syndicated newspaper columnist Drew Pearson had reported that Nixon had taken a bribe from an oil company; the report was based on a letter that turned out to be a forgery. Then there were stories that were simply unwarranted, dumb, and mean. Time had gleefully reported that Checkers was not housebroken and had not been spayed and had gotten pregnant by a neighborhood dog. Nixon, fed up, said he wanted to write a memoir called I’ve Had It. But then, in September 1955, Eisenhower had a heart attack, and Nixon decided to hold on, though he had to fight for a spot on the ticket.120

He won that spot in San Francisco in 1956 at a Republican convention managed by Whitaker and Baxter. “The key political fact about the gathering now breaking up is that it has made Richard M. Nixon the symbol, if not the center, of authority in the Republican Party,” reported Richard Rovere in The New Yorker. Campaigns, Inc., had teamed up with the California firm of Baus and Ross. Whitaker and Baxter wrote the copy; Baus and Ross produced the radio and television spots. That same season, they campaigned on behalf of Proposition 4, a ballot measure favoring the oil industry and giving it more license to drill. The measure was written by attorneys for Standard Oil. Whitaker and Baxter succeeded in getting the referendum’s name changed to the Oil and Gas Conservation Act. “Political campaigns are too important to leave to politicians,” Baus and Ross said.121

In a 1956 campaign speech written by economist John Kenneth Galbraith, Stevenson described “Nixonland” as the “land of slander and scare; the land of sly innuendo, the poison pen, the anonymous phone call and hustling, pushing, shoving, the land of smash and grab and anything to win.” (“I want you to write the speeches against Nixon,” Stevenson had written Galbraith. “You have no tendency to be fair.”)122 But Nixonland was Whitaker and Baxter–land.

In television ads, both the Republican and Democratic presidential campaigns of 1956 acknowledged the confusion that television advertising had itself sown. In one Republican ad, a cartoon voter despairs, “I’ve listened to everybody. On TV and radio. I’ve read the papers and magazines. I’ve tried! But I’m still confused. Who’s right? What’s right? What should I believe? What are the facts? How can I tell?” A comforting narrator calms down the worried voter and convinces him to like Ike.123

Stevenson, in his own television ad, haplessly tried to indict what he considered the callowness and fakery of the medium by exposing the camera, cables, and lights that had been installed in a room in his house in Illinois. He wanted to save Americans from themselves by showing them how what they saw on their screens was produced. “I wish you could see what else is in this room,” he said, speaking directly into the camera. “Besides the camera, and the lights over here, there are cables all over the floor.” The ad is positively postmodern: self-conscious, uncertain, and troubling. “Thanks to television, I can talk to millions of people that I couldn’t reach any other way,” Stevenson said, and then he quavered. “I can talk to you, yes, but I can’t listen to you. I can’t hear about your problems. . . . To do that, I’ve got to go out and see you in person.”124

But when Stevenson did go out on the campaign trail, he proved unpersuasive. In Los Angeles, speaking before a primarily black audience, he was booed when he said, “We must proceed gradually, not upsetting habits or traditions that are older than the Republic.”125 In 1952, Eisenhower had beaten Stevenson in the Electoral College 442 to 89; in 1956, he won 457 to 73.

The parties began to drift apart, like continents, loosed. The Republican Party, influenced by conservative suburban housewives, began to move to the right. The Democratic Party, stirred by the moral and political urgency of the struggle for civil rights, began moving to the left. The pace of that drift would be determined by civil rights, the Cold War, television, and the speed of computation.

How and where would Americans work out their political differences? In Yates v. United States, the Supreme Court gutted the Smith Act, establishing that the First Amendment protected all political speech, even radical, reactionary, and revolutionary speech, unless it constituted a “clear and present danger.” But television broadcasters began to report that their audiences seemed to have an aversion to unpleasant information. “Television in the main is being used to distract, delude, amuse and insulate us,” Murrow complained. Magazine and newspaper writers made much the same complaint, finding that their editors were unwilling to run stories critical of American foreign policy. In Guatemala, when the CIA arranged to overthrow the democratically elected government of Jacobo Árbenz Guzmán, who had seized hundreds of thousands of acres of land owned by the United Fruit Company, an American business, American reporters provided only the explanation given by Secretary of State Dulles, who insisted that Árbenz had been overthrown by a popular uprising. Correspondents from China, including John Hersey, protested at the editing of their own reports. From Luce’s Time, Theodore White threatened to resign.126

In a national security state where dissent was declared un-American and political contests were run by advertising firms, it was hard to know what was true. That bewildered cartoon voter had asked, “Who’s right? What’s right? What should I believe? What are the facts? How can I tell?” Maybe computers could tell. Screenwriters Phoebe Ephron and Henry Ephron toyed with that claim in the 1957 film Desk Set, starring Spencer Tracy and Katharine Hepburn, and made with the cooperation of IBM. Tracy plays an MIT engineer, a modern Frederick Winslow Taylor, who’s invented an “electronic brain.” He turns up with a tape measure in the fact-checking department on the twenty-eighth floor of the Federal Broadcasting Company building. Hepburn, who plays the head of the department, invites him into her office.

“I’m a methods engineer,” he says.

“Is that a sort of efficiency expert?”

“Well, that term is a bit obsolete now.”

“Oh, forgive me,” says Hepburn. “I’m so sorry. I’m the old-fashioned type.”

He’s come to Hepburn’s department to install a giant machine called Electromagnetic MEmory and Research Arithmetical Calculator, EMERAC, or Emmy for short, which requires pushing aside the desks of her assistants. Hepburn expects that her entire staff, replaced with this newest Office Robot, will be fired. Demonstrating how EMERAC works, Tracy makes a speech to a group of corporate executives.

“Gentlemen, the purpose of this machine of course is to free the worker—”

(“You can say that again,” Hepburn mutters.)

“—to free the worker from the routine and repetitive tasks and liberate his time for more important work.” He points to the walls of books. “You see all those books there? And those up there? Well, every fact in them has been fed into Emmy.”

No one will ever need to consult a book again, Tracy promises. In the future, the discovery of facts will require nothing more than asking Emmy. Hepburn, asked what she thinks of Emmy, answers archly: “I think you can safely say that it will provide more leisure for more people.”127

Desk Set played on its audience’s fear of automation, of machines that would make workers redundant. But, more bracingly, it offered a proposal about mass democracy and the chaos of facts. Citizens find it impossible to gather all the information they need to make an informed decision about a political issue; they are easily deluded by television and other forms of mass media and mass advertising; they struggle to sort through fact and fiction. But computers have no problem handling a vast store of knowledge; they are animated only by logic; they are immune to persuasion. It seemed possible—it had certainly been Mauchly’s dream—that computers would help people become better citizens, that the United States would become a techno-utopia. Desk Set wondered, instead, whether computers had about them the whirring mechanical menace of totalitarianism, another cross of iron.

IV.

THOROUGHGOOD MARSHALL WAS born in Baltimore in 1908, the son of a steward who served at an all-white resort and a kindergarten teacher who taught at all-black schools. He knew all about the color line; he knew about it as intimately as a prisoner knows the walls of his cell. Marshall, who started spelling his name “Thurgood” in the second grade because it was simpler, first read the Constitution when he was made to study it as punishment for raising hell at school. “Instead of making us copy out stuff on the blackboard after school when we misbehaved,” Marshall later explained, “our teacher sent us down into the basement to learn parts of the Constitution.” He pored over every word. He figured he’d found the key to the lock on that cell door. His parents wanted him to become a dentist, but after working his way through college as a dining-car waiter on the B&O Railroad, he decided he wanted to be a lawyer. He’d learned his pride, and how to argue, from his father during arguments at the dinner table. Whenever he’d say something smart, his father would say, “Why, that’s right black of you.”128

Unable to attend the segregated University of Maryland Law School—a ten-minute trolley ride from his family’s house—he instead went to Howard, which required riding in segregated railroad cars, forty miles each way. He graduated first in his class in 1933; two years later, as counsel, he successfully sued the state of Maryland, arguing that, because the state provided no law school for African Americans, it had defied the “separate but equal” doctrine of the Supreme Court’s 1896 ruling in Plessy v. Ferguson. By 1950, Marshall had convinced the NAACP to abandon this line of argument—demanding equal facilities—in favor of arguing against separation itself.

Marshall started the NAACP’s legal and educational defense fund right after he won his case against the state of Maryland. As its chief counsel, he argued hundreds of cases across the South as part of a years-long strategy to end Jim Crow, at one point carrying as many as 450 cases at once. He started with higher education—law schools and professional schools—and worked his way down to colleges with the idea of eventually challenging segregation all the way down to the kind of kindergarten classrooms where his mother had taught. It had taken him a long time to convince colleagues at the NAACP to abandon “equalizing” arguments in favor of integration. (Equalizing had always been a means to end segregation, if gradually, the idea being that states would eventually be broken by the cost of maintaining separate schools if they had to be genuinely equal.) But by 1950, African Americans had challenged Jim Crow in the military and in housing and had also gained more political power. The Great Migration of blacks to the north and west meant that, nationally, anyway, large numbers of black men and women could vote, even if 80 percent of blacks in the South were still disenfranchised. By the middle of the decade, television, too, would argue in favor of making a leap in civil rights litigation: southern racial violence and intimidation, long hidden from view outside the South, could now be seen in living rooms across the country.

Aiming to bring a challenge to segregation in the nation’s public schools to the Supreme Court, an objective endorsed by Truman’s Justice Department, Marshall began building a docket of cases in 1951. Several were eventually consolidated under a title case concerning a third grader named Linda Brown, who lived in Topeka, Kansas. Her father, Oliver L. Brown, a welder and part-time pastor, wanted her to go to a school blocks away from their house. But Topeka’s segregated school system assigned Linda to a school a long walk and a bus ride away, an hour of travel each way. Oliver Brown agreed to join a civil suit against the Topeka Board of Education, filed by the NAACP’s Legal Defense Fund. The case was called Brown v. Board of Education.

On the eve of oral arguments in December 1952, Marshall was near to physical collapse from overwork. At the Supreme Court building, a line began to form before dawn, men and women bundled against the morning frost in winter coats and hats. Oral arguments lasted three days. Justice Stanley Reed asked Marshall whether segregation wasn’t in the interest of law and order. Marshall was willing to stipulate, for the purpose of argument, that maybe it had been when the court decided Plessy. But “even if the concession is made that it was necessary in 1895,” he said, “it is not necessary now because people have grown up and understand each other.” Marshall offered the court a singularly hopeful picture of American race relations. “I know in the South, where I spent most of my time,” he said, “you will see white and colored kids going down the road together to school. They separate and go to different schools, and they come out and they play together. I do not see why there would necessarily be any trouble if they went to school together.”

Justice Felix Frankfurter asked Marshall what he meant by “equal.” Marshall, six foot four, his wavy black hair slicked back, his thin mustache as pointed as a punctuation mark—Newsweek once described him as “a rumpled bear of a man”—answered, with his slight southern drawl, “Equal means getting the same thing, at the same time, and in the same place.”

John W. Davis, the seventy-eight-year-old former solicitor general, U.S. ambassador to Britain, and Democratic presidential candidate in 1924, argued the other side, stressing states’ rights and precedent. A formidable opponent, Davis had made 139 appearances before the court; this would be his last. He asked, “Is it not a fact that the very strength and fiber of our federal system is local self-government in those matters for which local action is competent?” And, on tradition: “There is no reason assigned here why this Court or any other should reverse the findings of ninety years.”129

But Marshall’s argument, strenuous and intricate, aimed to lift from the shoulders of African Americans the weight of history. Instead of arguing from precedent, Marshall borrowed from Louis Brandeis: he presented the findings of social science. In establishing the constitutionality of Jim Crow laws, Plessy v. Ferguson had cited the “customs and traditions of the people.” Marshall presented the court with reams of empirical research on the consequences for black children of separate schooling. Jim Crow laws, Marshall told the court, are Black Codes, and the only way the court could possibly uphold them, he said, would be “to find that for some reason Negroes are inferior to all other human beings.”130

As the court was keenly aware, the case to end segregation was aided by the conditions of the Cold War itself. The United States billed itself as the leader of the “free world,” and fought against the Soviets for influence in emerging polities in the third world, but frequently found itself indicted for its racial order at home. When the finance minister of Ghana, on a visit to the United States, stopped at a Howard Johnson’s in Delaware and tried to order orange juice, he was told that blacks were not allowed in the restaurant. When the Haitian secretary of agriculture was invited to Biloxi, Mississippi, for a conference, he was told he was unable to stay at the conference hotel. “Can serious people still speak of American democracy?” asked one Haitian newspaper. Newspapers from Bombay to Manila reported on Jim Crow. “The Negro question” was one of the principal themes of Soviet propaganda, the U.S. embassy in Moscow reported. And so, when the Topeka case first reached the Supreme Court, Truman’s Justice Department urged the court to overturn Plessy, partly on the grounds that legally sanctioned racial discrimination in the United States undermined American foreign policy aims. “Racial discrimination furnishes grist for the Communist propaganda mills,” said Attorney General James P. McGranery, “and it raises doubts even among friendly nations as to the intensity of our devotion to the democratic faith.” In his brief, the attorney general included two pages written by Dean Acheson, the secretary of state, emphasizing the cost of Jim Crow at home to the United States’ reputation around the world. “Racial discrimination in the United States remains a source of constant embarrassment to this Government in the day-to-day conduct of its foreign relations,” Acheson reported, “and it jeopardizes the effective maintenance of our moral leadership of the free and democratic nations of the world.” Desegregation had become a matter of national security.131

As the oral arguments ended, Davis was overheard saying, “I think we’ve got it won, 5–4, or maybe 6–3.” He’d read the bench well. When the justices began their deliberations in closed session, Chief Justice Fred Vinson, a Kentucky Democrat, opened by noting that precedent did indeed support segregation. Vinson thought it would be better if the desegregation of schools came from Congress, and that if the court acted ahead of popular opinion, public schooling in the South might be effectively abolished because segregationists would rather close their schools than admit blacks. Reed, also from Kentucky, said that he thought the time to end segregation would come when the “body of people” thought it was unconstitutional, which hadn’t happened yet. Like Reed, Justice Robert Jackson said he thought that if the court had to decide this question, “then representative government has failed.” Frankfurter, a longtime liberal who, once on the court, had become its most dogged opponent of judicial activism, wanted—like Texan Tom C. Clark—to delay. Frankfurter had served on the NAACP’s Legal Defense Committee and had hired a black law clerk, the court’s first, in 1948, but, as much as Frankfurter wanted segregation to end, Marshall hadn’t convinced him that it was unconstitutional. Roosevelt appointee and former Columbia University law professor William O. Douglas thought the whole thing was “very simple”: the “14th amendment prohibits racial classifications.” Hugo Black, from Alabama, was one of the strongest voices in opposition to segregation, even though he himself had been a member of the Klan in the 1920s—a blot that he strained to scrub clean. Had the justices then taken a straw vote (which they did not), it appears likely that four would have found segregation unconstitutional, two would have reaffirmed Plessy, and three would have been uncertain. Worried about the political consequences of a divided decision—a worry that extended to mass violence—Vinson decided to reschedule the case, to be reargued in December 1953.132

All bets on the outcome of the case were called off, though, when, on September 8, 1953, Vinson died, altogether unexpectedly, of a heart attack. Eisenhower, who had, in an effort to unite the divided Republican Party, named his rival and Nixon’s great political enemy Earl Warren as his solicitor general, had also, at the time, promised Warren a seat on the court. When Vinson died, Eisenhower appointed Warren as chief justice, a position Warren would hold for sixteen years, presiding over the most liberal bench in the court’s history. Brown v. Board was the first case the Warren Court tackled.

Warren, opening the discussion, saw the case entirely differently than had Vinson. “Separate but equal doctrine rests on [the] basic premise that the Negro race is inferior,” he began, agreeing with Marshall that the “only way to sustain Plessy” was to agree with the premise of racial inferiority, which was impossible, he said, because “the argument of Negro counsel proves they are not inferior.” Warren’s vote, added to the four justices who in the earlier session had made clear that they believed segregation to be unconstitutional, meant that his argument would prevail, 5–4, even if no other justices joined his side. The justices’ clerks nearly unanimously supported Warren’s position, all but a young William Rehnquist, as he made plain in a memo to his boss, Justice Jackson. “I realize it is an unpopular and unhumanitarian position, for which I have been excoriated by ‘liberal’ colleagues,” Rehnquist wrote, “but I think Plessy v. Ferguson was right and should be reaffirmed.”133 (Nixon would appoint Rehnquist to the court in 1971.)

The court was scheduled to hand down its decision on May 17, 1954. The NAACP was so uncertain how the court would decide that it prepared two press releases, one for either possible decision. Reporters flooded the galleries. The decision had been made unanimous. Justice Jackson, in the hospital recovering from a heart attack, came to court that day, so committed was the court to a display of unity.134 Warren delivered the opinion he’d written about the nature of change over time. “In approaching this problem, we cannot turn the clock back to 1868, when the Amendment was adopted, or even to 1896 when Plessy v. Ferguson was written,” he insisted. “We must consider public education in the light of its full development and its present place in American life throughout the Nation. Only in this way can it be determined if segregation in public schools deprives these plaintiffs of the equal protection of the laws.” In assessing the evidence not of the past but of the present—the conditions in American schools—he concluded that “separate educational facilities are inherently unequal.”135 At least on paper, Jim Crow was over.

Much of the public greeted the decision with elation and joy, nowhere better captured than in a photograph of a young mother sitting on the steps of the Supreme Court, cradling her young daughter in the crook of her arm, holding in her lap the next day’s newspaper, with its outsized front-page headline HIGH COURT BANS SEGREGATION IN NATION’S SCHOOLS. Warren’s opinion was greeted with near equal pleasure by Cold Warriors, who called it a “blow to communism.” Even the Republican National Committee—granting Eisenhower credit for a decision that Truman’s Justice Department had pursued—celebrated the court’s ruling, stating that “it helps guarantee the Free World’s cause.”136

But not all civil rights activists had supported Marshall and the NAACP’s legal strategy, not all African Americans wanted their schools to be desegregated (which often resulted in black teachers losing their jobs), and many who did nevertheless placed greater priority on other political goals. In a 1935 essay called “Does the Negro Need Separate Schools?” W. E. B. Du Bois had written about something almost ineffable in a teacher’s understanding of the world of her students. Dissenters within the NAACP found that its willingness to bring the fight for civil rights to the courts came at the expense of securing better jobs, equal pay, and fair housing. In Atlanta, home to five historically black colleges and universities, nearly half the city’s public school teachers were black and, of those, three-quarters were black women. Black teachers had been lobbying the legislature for equal pay and for equal funding for black schools. Atlanta lawyer A. T. Walden had begun filing pay equity suits on behalf of the city’s teachers in 1942 and the next year had filed a class action suit with Thurgood Marshall. In 1950, when Marshall turned the NAACP strategy to integration and Walden began pursuing desegregation cases, the editor of the Atlanta Daily World was among the most outspoken of those black leaders who objected, arguing that much would be lost for black children sent into white schools, especially at a time when the legislature, under growing grassroots pressure, was beginning to move on equalizing funds and opening new black schools. The strongest reservations were those of black schoolteachers; even in Topeka, they “wanted no part of the effort to desegregate the schools.” After Brown, they continued to be skeptical. Marshall did not hide a frustration laced with contempt. “We will try to convert them to our way of thinking,” he said, days after the ruling. “But we will walk over them if they get in our way.”137

Among whites, especially in the Jim Crow South, Brown was met with swift and sustained resistance. Eisenhower had been dismayed by the ruling: “I am convinced that the Supreme Court decision set back progress in the South at least fifteen years,” he said privately. “The fellow who tries to tell me you can do these things by FORCE is just plain NUTS.” Segregationists prepared for battle. “There is nothing in the United States Constitution that gives the Congress, the President, or the Supreme Court the right to declare that white and colored children must attend the same public schools,” said Mississippi senator James Eastland. And a new movement began, called “Impeach Earl Warren.”138

The court urged schools to desegregate “with all deliberate speed.” Some schools in cities and towns like Washington and Baltimore complied. The overwhelming majority did not. In some cities, like Atlanta, where many black families were deeply ambivalent about the NAACP’s legal strategy, the school board dragged its feet, and black activists and black teachers’ unions didn’t press them. In other cities, all-white school boards simply refused to budge. In 1955, in eight states in the South, not a single black child attended school with a white child. The Richmond News Leader wrote that year: “In May of 1954, that inept fraternity of politicians and professors known as the United States Supreme Court chose to throw away established law. These nine men repudiated the Constitution, spit upon the Tenth Amendment and rewrote the fundamental law of this land to suit their own gauzy concepts of sociology. If it be said now that the South is flouting the law, let it be said to the high court, You taught us how.”139

The court could disavow Jim Crow, but it would take a fight to dismantle it. Sometimes that fight took place at the very doors of public schools, where black children were placed on the front lines. It also took place on buses and in restaurants, in the acts of defiance that had become commonplace in the 1940s, even if they had rarely been reported. After Brown, reporters took notice. On December 1, 1955, in Montgomery, Alabama, Rosa Parks, a forty-two-year-old seamstress, refused to give up her seat on a bus to a white man. Parks, born in Tuskegee, had joined the NAACP in 1943, when she was thirty; as secretary of her chapter, she’d worked on voter registration and the desegregation of transportation. Parks had made a purposeful decision to challenge segregated seating on the city’s buses. The driver stopped the bus and asked her to move, and when she again refused, he called for police, who arrested her.

The next night, a twenty-six-year-old minister named Martin Luther King Jr. was drafted to lead a citywide protest that would begin the following Monday, December 5. Born in Atlanta in 1929, the son of a minister and NAACP leader, King had been inspired by American evangelical Christianity, by the liberal theologian Reinhold Niebuhr, and by anticolonialism abroad, particularly by the rhetoric and tactics of nonviolence practiced by Mahatma Gandhi. King had wide-set eyes, short hair, and a pencil mustache. Ordained in 1948, he’d attended a theological seminary in Pennsylvania and then completed a doctoral degree at Boston University in 1955 before becoming pastor at the Dexter Avenue Baptist Church in Montgomery. Lean and quiet as a young man, he’d grown sturdier, and more stirring as he mastered the ancient art of preaching.

On the fifth, with less than half an hour to pull together a speech for a mass meeting to be held at Montgomery’s Holt Street Baptist Church, he found himself with a few moments to spare when, on his ride to the church, traffic all but stopped. Cars snaked through the city. More than five thousand people had turned up, thousands more than the church could fit. King climbed to the pulpit. The crowd, while attentive, remained hushed until he found his rhythm. “As you know, my friends,” King said, his deep voice beginning to thrum, “there comes a time when people get tired of being trampled over by the iron feet of oppression.” Pressed into benches, people began stomping their feet and calling out, “Yes!”

“I want it to be known throughout Montgomery and throughout this nation that we are Christian people,” King said as the crowd punctuated his pauses with cries. “The only weapon we have in our hands this evening is the weapon of protest.” Joining a tradition of American oratory that dated back to the day Frederick Douglass concluded that he could make a better argument against slavery if he decided the Constitution was on his side instead of against him, King called this protest an American protest. “If we were incarcerated behind the iron curtains of a communistic nation—we couldn’t do this,” he said, pausing for the thunder of assent. “If we were trapped in the dungeon of a totalitarian regime—we couldn’t do this.” It was as if the roof might fall. “But the great glory of American democracy,” his voice swelled, “is the right to protest for right.”

Parks had been arrested on a Thursday; by Monday, 90 percent of the city’s blacks were boycotting the buses.140 For 381 days, blacks in Montgomery, led by Parks and King, boycotted the city’s buses. King, indicted for violating the state’s antiboycott law, said, “If we are arrested every day, if we are exploited every day, if we are trampled over every day, don’t ever let anyone pull you so low as to hate them.” On November 13, 1956, the Supreme Court ruled that the Montgomery bus law was unconstitutional.141

Early the next year, King founded the Southern Christian Leadership Conference (SCLC). If the civil rights struggle of the 1950s was aided by the Cold War, it was fueled by a spirit of prophetic Christianity. A political movement and a legal argument, civil rights was also a religious revival. “If you will protest courageously, and yet with dignity and Christian love,” King promised his followers, “when the history books are written in future generations, the historians will have to pause and say, ‘There lived a great people—a black people—who injected new meaning and dignity into the veins of civilization.’” The historians have obliged: under King’s leadership, and by the courage of those who followed him, and those who’d paved the way for him, a commitment to civil rights became not only postwar liberalism’s core commitment but the nation’s creed.142

But blood would be shed. Justice William O. Douglas always blamed Eisenhower for the years of violence that followed the court’s ruling in Brown, a decision the president, who did not ask Congress for a stronger civil rights bill, never publicly endorsed. Eisenhower, Douglas said, was a national hero, worshipped and adored. “If he had gone to the nation on television and radio telling people to obey the law and fall into line, the cause of desegregation would have been accelerated,” Douglas said. Instead, “Ike’s ominous silence on our 1954 decision gave courage to the racists who decided to resist the decision, ward by ward, precinct by precinct, town by town, and county by county.”143

Orval Faubus, Democratic governor of Arkansas, wasn’t personally opposed to integration; he sent his own son to an integrated college outside of town. But the sentiment of his constituents—who were nearly all white, in a state where blacks were regularly blocked from voting—led him to consider opposition to school desegregation a political opportunity too good to miss. He sought an injunction against desegregation of the schools, and the state court agreed to grant it. Thurgood Marshall got a federal district court to nullify the state injunction, but on September 2, 1957, Faubus went on television to announce that he was sending 250 National Guardsmen to Central High School in Little Rock. If any black children tried to get into the school, Faubus warned, “blood will run in the streets.”

image
Elizabeth Eckford was turned away from Central High School in Little Rock, Arkansas, in 1957, by order of the state’s governor, Orval Faubus.

The next day, before any black children had even arrived, a white mob attacked a group of black newspaper reporters and photographers. Alex Wilson said, as he was knocked to the ground, “I fought for my country, and I’m not running from you.” On September 4, when fifteen-year-old Elizabeth Eckford tried to walk to the school, the white students cried, “Lynch her! Lynch her!” Television coverage of black students confronted by armed soldiers and a white mob wielding sticks and stones and worse stunned Americans across the country. The state of Arkansas had authorized armed resistance to federal law.144

While Eisenhower dithered over how to handle the crisis in Little Rock, Congress debated the 1957 Civil Rights Act, the first civil rights legislation since Reconstruction. It established a Civil Rights Commission to hear complaints but granted it no authority to do anything about them. It was like “handing a policeman a gun without bullets,” said one Justice Department official. Eleanor Roosevelt, as distinctive and influential an ex–First Lady as she’d been when in the White House, called the law “mere fakery.” One senator said it was about as substantial as “soup made from the shadow of a crow which had starved to death.” Longtime advocates of civil rights, including Richard Nixon, argued for stronger legislation, to no avail. But the 1957 Civil Rights Act set a precedent, and it was galling enough to segregationists that Strom Thurmond, who filibustered against it for more than twenty-four hours, set a new record. The bill was made possible by the wrangling of Lyndon Johnson. True to his Texas constituency, if not to his principles, Johnson had voted against every civil rights bill that had faced him in his career in the House and Senate, from 1937 to 1957. But he’d never been a segregationist, he’d publicly supported the court’s decision in Brown v. Board, and he believed the time had come for the Democratic Party to change direction. Johnson was also eyeing a bid for the presidency, and he needed to be seen as a national politician, not a southern Democrat. He courted and counted votes better than any other Senate majority leader ever had, and the bill passed.145

“Mob rule cannot be allowed to overrule the decisions of our courts,” Eisenhower said on television, and ordered a thousand paratroopers from the 101st Airborne Division to Arkansas, the same division that had dropped from the sky over Normandy on D-Day. On September 25, 1957, U.S. federal troops escorted nine black teenagers to high school. Americans watching on television reeled. They reeled again when, on October 4, 1957, the USSR launched a satellite into orbit. Anyone with a shortwave radio, anywhere in the world, could listen to it, as it made its orbit: it emitted a steady beep, beep, beep, like the ticking of a heart. In the United States, a nation already on edge at the specter of armed paratroopers escorting children into a school, Sputnik also created a political panic: the next obvious step was putting a nuclear weapon in a missile head and firing it by rocket. In both the race to space and the arms race, the Soviets had pulled ahead.

The Cold War would keep overshadowing the civil rights movement, and also propelling it forward. The battle to end segregation in education was far from over. Faubus—who’d earned the nickname “that sputtering sputnik from the Ozarks”—decided to shut down Little Rock’s high schools rather than integrate them. He declared, “The federal government has no authority to require any state to operate public schools.”146

image
On the cover of Life, MIT scientists attempt to calculate the orbit of the Soviet satellite Sputnik while the magazine promises to explain “Why Reds Got It First.”

Two weeks after Sputnik was launched, Eisenhower met with the nation’s top scientific advisers, asking them “to tell him where scientific research belonged in the structure of the federal government.” That meeting led, in 1958, to the creation of the National Aeronautics and Space Administration. NASA would establish operations in Florida and Texas, and fund research in universities across the former Cotton Belt, the science-and-technology, business-friendly New South, the Sun Belt.147 That meeting also led to the creation of the Advanced Research Projects Agency as a branch of the Department of Defense. It would be based in the Pentagon. One day, it would build what became the Internet. In February 1958, after Sputnik, and one month after Eisenhower announced ARPA, the Bulletin of the Atomic Scientists’ Science and Security Board moved the atomic Doomsday Clock to two minutes before midnight.

The hands of time seemed, at once, to be moving both forward and backward. Thurgood Marshall looked back at the late 1950s in dismay. “I had thought, we’d all thought, that once we got the Brown case, the thing was going to be over,” he said bitterly. “You see, we were always looking for that one case to end it all. And that case hasn’t come up yet.”148

That case did not come. Equality was never going to be a matter of a single case, or even of a long march, but, instead, of an abiding political hope.