TOWARD THE END of the 1950s the Soviet Union had about 3.6 million persons under arms, the United States about 2.5 million. Each nation maintained stupendous nuclear arsenals capable of devastating the other. With its more than 500 long-range B-52s, almost 1,800 medium-range bombers, twenty-three aircraft carriers, and three Polaris submarines, the United States was far ahead in the capability for continental attack or massive retaliation, while Soviet troops, tanks, land-based aircraft, and medium-range ballistic missiles confronted the European allies. British and other friendly forces bolstered the power of the West. Amid a remarkable variety of nuclear weapons—from recoilless riflelike Davy Crocketts for close-up infantry support to eight-inch howitzers to ballistic missiles—the Strategic Air Command wielded the most usable military might, for its “bombers could burst through the pervious screen that protected our opponent.”
The nation’s relative economic power was even more marked than its military power. National economic productivity had grown strongly during the 1950s, as had the gross national product. Americans produced almost half the world’s electricity and large shares of its steel, copper, and coal. The nation’s exports and imports dwarfed those of all other countries. In 1955 a congressional report had shown the United States leading the Soviet Union in virtually every dimension of economic power. “For the United States,” economic historian David F. Noble summed it up, “the postwar decades were an expansive time, fertile ground for technological achievement and enchantment.” Assured by their leaders—in this era before Sputnik—of their unrivaled military, economic, and industrial might, “infused with the pride, confidence, and triumphant optimism of victory, relatively unscarred by the actual horrors of war, and with the ruins of failed empires at their feet, Americans embarked upon their own ambiguous fling at empire.”
An empire? Americans hardly thought in such grandiose terms. Yet their influence reached to the Caribbean, to Alaska and Hawaii, both of which became states in 1959, to old and to newly reconquered bases in the Pacific, to their military protectorate Taiwan. Military, economic, diplomatic ties intertwined American power with Latin American nations through the Inter-American Treaty of Reciprocal Assistance of 1947, with European and Middle Eastern nations through the North Atlantic Treaty of 1949, with Australia, New Zealand, Pakistan, Thailand, and other nations through the Southeast Asia Collective Defense Treaty of 1954, with Japan and South Korea through mutual defense treaties. Dulles spent much of his time rushing from capital to capital repairing and refurbishing these ties. On the vast chessboard of Atlantic and Pacific power few doubted which nation held queen, castles, and knights.
European allies viewed American military and technological prowess with awe and fear. The Yanks’ advanced aircraft, their H-bombs and capacity to deliver them, their magnificent flattops, along with their fancy big automobiles, refrigerators, computers, and other gadgetry, were the talk of European capitals. But foreigners feared that Americans with their awesome military power were like children playing with dangerous toys. European pundits wrote scorchingly of American pretensions of leading the “free world,” of American soldiers and other visitors who scattered their dollars, Cokes, slang words, and bastard children across the Continent. French intellectuals attacked the new barbarism, while nourishing the consoling thought that Europe could serve, in Max Lerner’s words, “the role of a cultural Greece to the American Rome—a Greece which, while conquered, takes the conqueror captive.” Still, in the aftermath of the Marshall Plan most people in countries benefiting from the Plan admired America’s “free elections” and other democratic institutions, saw the “real issue” as “communism and dictatorship versus democracy and freedom,” and had faith in the Plan itself as aiding European economic recovery.
The Kremlin responded to Western militancy with its own provocations and interventions, threats and bluster, combined with occasional essays at détente. Behind its actions lay deep fears—of an increasingly independent and hostile China, a turbulent and unpredictable Middle East, unrest in its satellites that might lead to outbreaks of Titoism, a resurgent Germany. Above all Moscow feared renewal of the old threat of Western—now “capitalist”—encirclement. The view from Moscow was of American power stretching from the Aleutians through Korea and Japan to the Philippines, from Southeast Asia to Turkey and Greece up through Europe to Scandinavia. Acutely aware of his strategic nuclear inferiority, Khrushchev resorted even to bluff to conceal it—most notably when he flew the same squadron of his Bison bombers in circles around a reviewing stand at a 1955 Aviation Day ceremony.
Yet even uneasier and more mistrustful than the Russians in the late 1950s were most Americans. If a leader as levelheaded as Eisenhower could have been “obsessed” by fear of a communist Iran on the borders of the Soviet Union, it was hardly strange that many Americans, saturated from school days by talk of Soviet power and Bolshevik evil, should match fear with fear, anger with anger. Thus Khrushchev’s paper-fort deception with his Bisons triggered in Washington a sharp “bomber gap” scare that in turn produced a quick boost in B-52 bomber-building. Visiting the United States, Europeans who viewed themselves as occupying the nuclear front lines were amused to find Americans huddling—intellectually if not physically—in bomb shelters. Americans during the 1950s found their image feared around the globe even as they themselves feared for the future.
Not that most Americans worried about their survival as a nation. Having appointed themselves guardians of liberty, however, they feared for the survival of freedom in the Western world. As in the past, they were far more effective in saluting freedom than in defining it. But definition was crucial. FDR’s Four Freedoms—of speech and religion, from fear and want—were only a starting point. What kind of freedom—individual, civil, economic, religious, ethnic? Freedom for whom—minorities, blacks, women, artists, intellectuals, censors of textbooks, extremists, pornographers, noncitizens? Freedom when—after World War II, after the cold war? Freedom from whom? The reds abroad? The “commies” at home? The “feds”? Corporation chiefs? Foremen? Deans? Religious zealots? Group and community pressures? It would become clear in the 1950s and 1960s that threats to these freedoms emanated from far more complex and numerous sources than the Politburos of Moscow and Peking.
The most striking aspect of freedom in America in the 1950s was its grounding in the nation’s technological might and economic abundance. During the half century past, Americans had proved their capacity to outproduce their rivals in automobiles, domestic appliances, and a host of other manufactures; during the 1940s they had astonished the world with their feats in building ships and weapons. By the early 1950s Americans could—and did—boast that their per capita income of approximately $1,500 was roughly double that of the British and the Swedes, more than four times that of the Russians. With 7 percent of the world population, the United States had over 40 percent of the world’s income.
Would freedom flower in America during the fifties amid such plenty? The prospects were highly mixed. Historically freedom had flourished not where life was nasty, brutish, and short but in expanding economies that fostered equality of opportunity, more sharing, vertical and horizontal mobility. Still horrified by revelations about Nazi mass slaughter, shocked by new revelations about the monstrous “crimes of the Stalin era,” many Americans valued their own liberties all the more. Yet the 1950s, at the height of cold war anxieties, turned out to be a decade of intolerance of other Americans’ ideas. Individualism in the economic marketplace was not matched by individual liberty in the political and intellectual marketplace.
Advocates of the free market were happy with their economic freedoms during the Age of Eisenhower, however, and even more with their economic successes. “We have entered a period of accelerating bigness in all aspects of American life,” proclaimed Eric Johnston confidently in 1957. “We have big business, big labor, big farming and big government.” The former head of the United States Chamber of Commerce even mused whether this was the start of an age of “socialized capitalism.” Certainly American business, if not “socializing,” was consolidating, bureaucratizing, innovating, and proliferating at home and overseas. Over 4,000 mergers and acquisitions of manufacturing and mining concerns occurred during the 1950s—a dramatic number though almost 3,000 fewer than in the 1920s. Large firms took over smaller ones less to run risks than to minimize them; despite Joseph Schumpeter’s warning of the “perennial gale of creative destruction,” the survival rate of large firms during the decade was almost 100 percent. Elaborate systems of recruitment, personnel, information, and leadership training expanded in the big corporations, with the help of complex office machines and business school graduates.
The power of the American economy, however, lay far less in bigness and organization than in technological and scientific advances stemming from a century of experimentation and invention and later propelled by the imperative demands of two world wars and the cold war. And just as nineteenth-century army ordnance needs had promoted such important innovations as interchangeable parts, so world war needs fueled such varied practical achievements as penicillin, jet propulsion, and radar. Massive federal spending for invention and development continued through the cold war years; by the late 1950s Washington was financing nearly 60 percent of the nation’s total research and development budget. In one major respect, however, twentieth-century technology differed from the nineteenth century’s in more than its wider scope and greater variety. In Nathan Rosenberg’s words, “an increasing proportion of technological changes” were now “dependent upon prior advances in systematized knowledge.” Innovators were more dependent than in Edison’s day on scientific disciplines such as physics and chemistry.
Some of the postwar advances in specific fields were spectacular. In October 1947, Captain Charles E. Yeager burst through the invisible barrier that had seemed to fix a limit to the speed of flight by flying the experimental rocket-powered X-1 faster than the speed of sound. In September 1948, an air force Sabre set a world speed record for jet fighters at 670 miles an hour; five years after that the Super Sabre, the first jet fighter capable of supersonic speed in level flight, pushed the record to 755, and in 1957 a Voodoo jet topped 1,200. The nuclear submarine Nautilus was reported to have used 8.3 pounds of uranium fuel to travel 60,000 miles. After the inglorious “Kaputnik” of the first Vanguard, the American space program made steady progress. And by 1960 the X-15 rocket plane was flying almost twice as fast as the Voodoo.
More down-to-earth, but crucial to a wider technology, were advances in the sector in which Yankee tinkerers had pioneered a century earlier with their milling and grinding machines. This was the machine tool industry. Severely depressed after its World War II expansion—300,000 machine tools were dumped onto the market after the war—the industry burgeoned during the cold war. Aircraft manufacture, a voracious consumer of machine tools, also became increasingly interlinked with the electronics industry, which was now producing its own miracles. Though eventually developing its own huge domestic market, for years electronics reflected wartime need for miniaturization of electrical circuits in proximity fuses for bombs, gunfire control mechanisms, radar and sonar. As late as the mid-1960s the federal government was still providing two-thirds of the “R&D” costs of the electrical equipment industry, which included such giants as General Electric and American Telephone and Telegraph.
Earthiest of all—and perhaps most important of all for its worldwide implications—was innovation in farming. Improved harvesters and other machines, combined with better fertilizers and sprays and new plant strains, produced higher output per acre, a vast increase in production, and a steep decrease in the total work hours in the United States devoted to agriculture. By 1960, 8 percent of the labor force was occupied with farming, compared with 63 percent a century before. Hybrid corn, a systematic crossing of selected inbred lines, resulted in an increase in the average yield of corn per acre from 23 bushels in 1933 to 62 bushels in the mid-1960s. Thus hybrid corn research paid off handsomely, returning, it was estimated, seven times its cost by the mid-1950s. The lion’s share of the boost in farm yield came from—and profited—huge family farms, commercial farms, and other components of “agribusiness” that controlled the production and marketing of key foods and fibers through vertical integration, while millions of small farmers and migrant farm workers clung to a precarious livelihood.
Out of the “Enormous Laboratory,” as Max Lerner called it, poured not only new machines and gadgets but the makings of wholly new or immensely enlarged industries—television, antibiotics, electronics, jet aircraft, rocketry. But the actual laboratories that produced this cornucopia of hardware were also the scenes of quiet encounters in one of the oldest intellectual conflicts in America—between the ideal of pure science and the practices of applied science.
Many Americans still venerated the ideal of committed, disinterested science, of free, undirected research, of idle speculation and inspired hunch, of lack of pressure for immediate “practical” results, of a clear separation from the cash nexus—all the more because they could claim only one American in the past century who was comparable to such luminaries of European science as Darwin, Mendel, and Faraday. This was Josiah Willard Gibbs, the Yale mathematician whose work in thermodynamics, vector analysis, and statistical mechanics had belatedly won him an international reputation and whose laws of chemical energetics had enormous impact on processes as varied as the refining of oil, the synthesizing of rubber, and the separation of metals from their ores.
Of scientific eminences the postwar United States had its share—Isidor Isaac Rabi and J. Robert Oppenheimer in physics, Hermann Joseph Muller in genetics, George Gaylord Simpson in evolutionary biology, Harlow Shapley in astrophysics, and scores of others. Yet most of these scientists largely depended on the theoretical work of Europeans. Most notably, it was the transformation of theoretical physics undertaken by Einstein, Heisenberg, and others in Germany that had laid the groundwork for atomic fission. Now, as the United States basked in its world economic supremacy, had the time and occasion come for Americans to make great theoretical contributions to pure science?
A century before, Karl Marx had warned that science could not for long be autonomous, that it was a social activity, that the nature of the demand for science was even more important than the quality of its supply. In America, science had to pay the piper. Giant corporations were eager to put vast sums of money into research, but of a special kind, really research and development. While the firms varied in their toleration of free research, sooner or later they expected a payoff in new inventions, patents, profits. The R&D departments emphasized team research, committee decisions, pooled facilities, narrowly focused investigation. There was little encouragement of idle curiosity, messing around, just looking out the window. “The underlying principle, rarely formulated precisely but ever present,” a study concluded, “has been that originality can be organized; that, provided more people can be equipped with technical knowledge and brought together in larger groups, more new ideas must emerge; that mass production will produce originality just as it can produce sausages.” Military needs created even heavier demands for scientific group-think and the organization man.
Politicians and scientists alike attacked the restrictions on Soviet science, but Americans could hardly be complacent. Aside from confronting seductive commercial and military demands on R&D, scientists had to contend with a popular double impulse to worship them and to fear them—the worship leading to unduly high popular expectations followed by disappointments, the fear leading to suspicion of their unorthodoxy and associations, as witness the classification of Robert Oppenheimer as a “security risk.” Pleased by statements such as that of Harvard’s president, James Conant—subsidies should go to persons, not projects—some scientists sought to protect their freedom of inquiry and communication by remaining in the universities. But scholars in the groves of academe were not free from political and press attacks, outside pressures for directed research, the temptations to undertake team projects and group authorship, the enticements of big corporate and military money.
Perhaps the major obstacle to “free science,” however, was the empirical tradition in American scientific thought. The heroes of American popular science were the Thomas Edisons who disdained formal abstract knowledge or theorizing and preferred to tinker “by guess and by God” in their labs. It was this feet-on-the-ground compulsion that had channeled American genius into technology and engineering. If the nation were now to make a truly substantial contribution to scientific progress as well, greater freedom to reflect and to brood, and freer play for the creative imagination, were crucial.
Possibly some of the applied scientists, ensconced in their big laboratories and snug in their teams, recalled the lot of Professor Gibbs. He had worked at Yale almost alone and undisturbed. He had no team. He had few close friends and few students. He had no wife or children. He had no pay from Yale for a decade or so, until Johns Hopkins in 1880 offered him a professorship with salary, at $3,000 a year. Only then did Yale put him on its payroll, at $2,000, “with prospects of an early increase.”
One controversial application of “science” related to the men and women who in turn related to machines. Initially called “scientific management,” it was first popularized by Frederick W. Taylor. After brilliant inventions of automatic grinding, forging, and tool-feeding mechanisms, Taylor had moved on at the turn of the century to time-and-motion studies designed to fit workers more closely to the imperatives of the machines and thereby increase industrial efficiency. The production process was functionalized and standardized by dividing it into measurable and controllable units of time and motion. Under Taylor’s leadership the idea was picked up by a host of large corporations, including American Locomotive, Brighton Mills, Yale and Towne Lock. Machines, however, proved more easily manageable than men. Most workers preferred to follow their own motivations, rhythms, craft routines, group standards. A strike of molders in 1911 at the huge Watertown arsenal near Boston led to a government investigation and later a ban on Taylorism in government arsenals. A young assistant secretary, Franklin D. Roosevelt, imposed the ban in navy yards.
Turning away from Taylorism as a system of managerial dictation— Taylor himself declared each worker must become “one of a train of gearwheels”—some “industrial scientists” tried to civilize the production process by “human engineering” or “human relations.” Psychologists and other social scientists were enlisted in this cause. Often benign in intent while manipulative in technique, “humanizing” turned out to be an effort to motivate workers through their own psychological processes rather than through managerial controls. Advocates of the method said that it promoted better communication, involved workers in at least minor decisions, enhanced “group feeling” and a sense of teamwork, fostered “leadership” as opposed to “control.” During and after World War II, the idea of human relations in industry flourished.
Still the workers resisted. When Henry Ford II said that solving “the problem of human relations” would immensely speed up “progress toward lower costs,” men and women on the line could wonder whether their welfare or lower costs and higher profits were the goal. Union heads spoke sarcastically of foremen receiving training in the art of convincing workers “that they really are deeply beloved by the boss,” of employers “trooping to the special classes at Harvard” to learn that while the bosses were in business for a fast buck, workers reported to the plant each morning “for love, affection, and small friendly attentions.”
A thirty-seven-year-old worker, interviewed at home, described what real life was like “on the line.” His job was to spot-weld the front cowling onto an automobile underbody.
“I take a jig off the bench, put it in place and weld the parts together.” The jig was all made up in advance. “Takes me one minute and fifty-two seconds for each job. I walk along the line as it moves. Then I snap the jig off, walk back down the line, throw it on the bench, grab another just in time to start on the next car.”
He did this eight hours a day, with a breather in the morning and afternoon and a half-hour for lunch. “Sometimes the line breaks down. When it does we all yell ‘Whoopee!’”
He hated his work. “I like a job where you feel like you’re accomplishing something and doing it right.” But everything was laid out for him. “The big thing is that steady push of the conveyor—a gigantic machine which I can’t control.” He had ideas for improvements but no one asked him. “You go by the bible.”
Why not quit? “I’ll tell you honest. I’m scared to leave.” He was getting good pay, was on the pension plan, the lighting and ventilation were good, he could use the plant hospital. “Sorta trapped—you get what I mean?”
So how did he cope? By sharing the “misery” with his partner. “We gripe about the job 90 percent of the time.” By walking out with the others when something intolerable happened—like when a guy was “bounced” because he was slow on the line. By snapping at his family when he got home, his wife added. The people who ran the plant, the worker said finally, were “pretty good guys themselves.” But “you’re just a number to them. They number the stock and they number you.” He was just so much horsepower. “You’re just a cog in the wheel.”
His wife often wished he’d get another job. “He comes home at night, plops down in a chair and just sits.…”
If workers were not happy with their machines, applied scientists could invent a new machine that had less need of workers. This was automation. Mushrooming during the 1950s, the automatic equipment industry reached annual sales of over $6 billion by the end of the decade. World War II needs had hastened the development of electrical servomechanisms that operated on the principle of input-output flow and feedback in a continuously self-correcting control loop. Stimulated by such advances, the industry took off after the war and was soon integrating digital computers, sophisticated programming techniques, and vast data and memory banks into elaborate remote-control systems, including the automation of whole factories. By 1951 a Ford engine plant was feeding castings, already produced in an automated foundry, into precision broachers that machined the top and bottom of a cylinder block in thirteen seconds. Exclaimed an observer, “It just goes ‘whoosh’ and it is done.”
“Automation is a magical key of creation,” proclaimed the National Association of Manufacturers. “Guided by electronics, powered by atomic energy, geared to the smooth, effortless workings of automation, the magic carpet of our free economy heads for distant and undreamed of horizons.” Others were less euphoric but argued that automation would shrink the number of boring and degrading repetitive tasks, raise educable workers to higher levels of skill and pay, lessen worker fatigue, depression, and unrest.
Still others were not at all enchanted by the “whoosh.” Union leaders stood en garde. The problem was not whether unions were for or against automation, said James B. Carey, president of the International Union of Electrical Workers. “The problem is whether or not the American people and our free society will be subjected to vast dislocations during the coming ten to twenty years, when the automatic operation of many industrial and clerical processes will be introduced.” Fortune had published a photograph of the “old production line”—a vast room full of workers individually tending their machines—followed by drawings of the proposed “automatic factory.” Not a worker was to be seen in the drawings—not even the ornery old “parts inspector.” A photoelectric scanning device would do his job.
At a congressional hearing late in 1955 President Walter Reuther of the Automobile Workers roundly denounced the NAM’s portrayal of automation as part of industrialization’s “Second American Revolution.” Had the NAM forgotten the misery that accompanied the first? Reuther asked. Displaced workers would not give up family ties, local roots, and neighborhood belongingness to go off to new jobs, even if they could find them and were young enough to take them. “Will automation mean the creation of whole new communities in some areas, while others are turned into ghost towns? How can we increase the market for goods and services sufficiently, and quickly enough, to match greatly accelerated increases in productivity?” Industry replied that displaced workers could find better jobs under automation, indeed that automation would create a bigger pie and “everybody’s slice will be larger.”
While the argument waxed, so did automation. Ford helped lead the way, with its partially automated cylinder-block line and automated production of crankshafts and small parts. As the number of workers “on the line” increased and the number doing more skilled “bench work” on parts and subassemblies dropped, auto worker militancy fell. It had been the more skilled workers, such as metal trimmers and torch welders, with their comradeship and critical production role, who had sparked the great strikes and demonstrations. “Automated” workers appeared to be psychologically atomized.
It was this wider impact of automation and of the tendencies that accompanied it—toward bigness, bureaucratization, even bondage—that concerned a wide array of social observers. Deep concern over such tendencies was almost as old as the trends themselves. From the rampaging machine wreckers at the dawn of the industrial revolution to the latest walkout in protest against automation, human beings had feared the machine as a threat to their status, income, security, and pride. Marx had seen that productive forces rising from technological-social change both reinforced the social order and undermined it. A century before Ford’s automation William Morris fought to preserve handicrafts against the ravaging advance of the machine.
The Englishman Samuel Butler wrote in his 1872 anti-utopian novel, Erewhon, that man is supposed to be the master and the machine the servant, but “the servant glides by imperceptible approaches into the master,” and now man is overly dependent on his “servant,” and his very soul is becoming a machine-made thing. Man is in bondage; he can only “hope that the machines will use us kindly.”
By the 1950s there was less concern over the economic and industrial effects of automation and other technological developments than over the psychological and social. Sociologists feared that the obsessive focus on production, combined with the fragmentation of workers’ lives into numbing pressure on the job and emptiness outside it, in the long run would impair both efficiency and the health of the whole culture. Daniel Bell noted Freud’s observation that work was the chief means of binding an individual to reality. “What will happen, then, when not only the worker but work itself is displaced by the machine?” Many social scientists were influenced by the work of Lewis Mumford, who in Technics and Civilization and other writings had graphically pictured the machine as part of a system of power, superfluous production as “purposeless materialism,” and technology as increasingly the master of man. Two technologies existed side by side, Mumford wrote in the wake of the 1950s, “one authoritarian, the other democratic, the first system-centered, immensely powerful, but inherently unstable, the other man-centered, relatively weak, but resourceful and durable.” It was time for human interventions in behalf of human alternatives.
The human use of human beings—this was the particular concern of Norbert Wiener, who published a book with this title at the start of the 1950s. A professor of mathematics at MIT and author of Cybernetics, which examined the dynamic role and implications of feedback in purposeful machines and animals, Wiener shared with a wide public his fears that “thinking machines” would render the human brain obsolete, especially in an era of mammoth war technology.
But what precisely was the impact of the machine, especially automation—and what could be done about it? During the 1950s the social scientists’ diagnosis was twofold: alienation and anomie. Definitions of these phenomena varied widely, and hence diagnosis and prescriptions did as well. Alienation—from work, from family and community, from self? The standard answer was: all of the above. Specialization, compartmentalization, and routine left workers with little sense of accomplishment, fulfillment, or creativity on the job, and this emptiness carried over into life outside the workplace. But was the essential problem—both on the job and off it—the kind of powerlessness that Marx had analyzed, or the kind of “meaninglessness” that Karl Mannheim had seen as robbing persons of the capacity to make decisions, or the kind of normlessness that Emile Durkheim long since had analyzed in studies of anomie, or the sense of isolation and self-estrangement that was becoming the focus of social psychologists in the 1950s? Great disputes arose about these questions, with the social analysts themselves divided by discipline, specialization, and ideology.
The diagnosis of anomie aroused the sharpest concern, for it applied to a person’s whole life. Defined broadly as the collapse of social norms that regulate social attitudes, expectations, and behavior, a condition of anomie could have a variety of effects: a normlessness marked by the feeling that “anything goes”; a hunger for direction and authority that might lead to a turning toward autocratic leaders; a craving for reassurance from peers and superiors; a proclivity to manipulate others in a culture lacking standards of more benign human interaction; even a tendency to rely on what Robert K. Merton called mysticism—“the workings of Fortune, Chance, Luck.” But anomie remained a somewhat amorphous concept, overly extended, as Melvin Seeman complained, to a variety of social conditions and psychic states such as personal disorganization and cultural breakdown.
Inescapably the cardinal question arose—by what standard, what principle, what central value was the impact of technology being measured? Social observers were remarkably agreed: the test was freedom in all its dimensions and in all its equivalents such as liberty, liberation, individuality. Virtually every idea and program was advanced and defended by reference to this overriding value. “In the present situation of material and intellectual culture,” wrote Herbert Marcuse, a philosopher of the émigré Frankfurt school, “the problem of values is, in the last analysis, identical with the problem of freedom.” That one idea covered all that is “good, right and admirable” in the world. “Freedom—and this is the profound result of Kant’s analysis—is the only ‘fact’ that ‘is’ only in its creation; it cannot be verified except by being exercised.”
Marcuse had his own very definite idea, however, as to what freedom was or should be. Freedom was liberation from an increasingly impersonal, bureaucratic, oppressive technology, from the long and oppressive hours of work that drained people of their humanity, from the restrictions on human spontaneity, creativity, erotic fulfillment, and sensuous joy—restrictions of a Freudian as well as Calvinistic origin. The pursuit of happiness was the quest for freedom; indeed, freedom was happiness, in the fullest dimensions of both these noble concepts.
But other acolytes of Freedom saw different dimensions. They were not only like blind men feeling different parts of the elephant; each was loudly touting his part of the elephant as the whole elephant. For over two centuries Americans had debated and squabbled and even warred over the definition of freedom. During the 1950s the quarrel turned into a cacophony.
“We are children of freedom,” Dean Acheson had proclaimed. All agreed, though not all knew what he meant. During the 1950s American leaders proclaimed freedom throughout the world and for all the world. Conflict and confusion over the principles and practices of freedom did not deter the ideologues of freedom from prescribing it for all. Even before Pearl Harbor, Henry Luce, editor-in-chief of Time and Life and Fortune, had urged the British to follow “America as the dynamic center of ever-widening spheres of enterprise, America … as the Good Samaritan, really believing that it is more blessed to give than to receive, and America as the powerhouse of the ideals of Justice and Freedom.” Though Luce had some second thoughts, at the end of the 1950s he struck the same note: “The founding purpose of the United States was to make men free, and to enable them to be free and to preach the gospel of freedom to themselves and to all men.”
Not only pundits but philosophers sounded this theme. Sidney Hook spoke for many of his fellow intellectuals when he urged on them the duty to publicize the “elementary truth” that what divided the world was “the issue of political freedom versus despotism.” Politicians long before had climbed aboard the Freedom bandwagon with alacrity. If Roosevelt’s Four Freedoms appeared a bit tattered and weather-beaten by now, many Americans remembered that his so-called “Economic Bill of Rights” had spelled out those freedoms in a most specific way, that Truman had sought to implement them, and that even Eisenhower was paying more than lip service to them.
To celebrate Freedom was to celebrate America, and vice versa. When Luce proclaimed in 1941 the belief—shared, he said, by “most men living”—that “the 20th Century must be to a significant degree an American Century,” he laid out the peculiarly American ideals and institutions that must be shared with others—“our Bill of Rights, our Declaration of Independence, our Constitution, our magnificent industrial products, our technical skills.” The reaction was not wholly favorable. Reinhold Niebuhr found an “egotistic corruption” in the very title, a critic dubbed Luce the Cecil Rhodes of journalism, and Henry Wallace countered Luce with a proclamation of the century of the common man. Was this the new American imperialism? Luce later talked less about the American Century but still pushed the doctrine.
Embarrassments occasionally marred this glowing portrait of Freedom versus Autocracy. Leading American intellectuals became furious over “party liners’ control” of a 1949 Cultural and Scientific Conference for World Peace at New York’s Waldorf-Astoria Hotel. Sidney Hook himself had been denied the rostrum to offer a paper disputing the Marxist doctrine of “class truth.” In reply such European and American luminaries as André Malraux, John Dos Passos, Ignazio Silone, Tennessee Williams, Arthur Koestler, and Hook met in Berlin in June 1950 to inaugurate the Congress for Cultural Freedom. Supporting messages arrived from Eleanor Roosevelt and Niebuhr. After properly flaying totalitarian thought control, a number of participants proclaimed that the West must take its stand on communism—it was “either-or.” Condemning those who preferred “neither-nor,” the Congress set up a nucleus of internationally known writers who would have no truck with “neutrality” in the struggle for freedom. It would later develop that the activities of the Congress in the 1950s and 1960s were subsidized in part by the CIA, which disbursed funds through fake foundations.
Still, the American intellectuals did not need Washington gold to stiffen their resolve. Their views sprang from the very core of their belief in individual liberty and human rights. And they gained immeasurably both in their self-confidence and in their influence from their conviction that while the other side was ideological, their own position was not. They contended that after the passions of the New Deal era, the struggle with Hitlerism, and the polemics of the cold war, Americans were spurning ideology as the “opium of the intellectuals,” in Raymond Aron’s words, or coming to the “end of ideology,” in Daniel Bell’s. “Looking back from the standpoint of a newly-achieved moderation,” wrote sociologist Edward Shils, “Western intellectuals view the ideological politics of Asia and Africa, and particularly nationalism and tribalism, as a sort of measles which afflicts a people in its childhood, but to which adults are practically immune.”
Picturing ideology as a form of childhood measles was a curious indulgence on the part of intellectuals who themselves were acting as ideologists by any neutral definition of the term. If an ideology consists of a comprehensive set of goals or values, reflecting the mobilized attitudes of a large section of the public, expressed through institutions such as the press and the state, and legitimized by appropriate political, economic, and other establishments, then postwar Americans indeed possessed an ideology that was brilliantly expressed by its pundits and philosophers. It was an ideology of hazy, undefined ends and richly differentiated means—moderate and incremental policy-making machinery, a politics of bargaining and accommodation, a polity rich in voluntary associations and pluralistic groupings, all leading to a mixed economy and a stable, balanced, consensual society.
Ultimately this kind of society reflected a political ideology of consensus and compromise. Men of ideas such as Hook, Bell, Schlesinger, and Daniel Boorstin often differed on specific issues and reforms, but they struck historian Richard Pells as tending “to elevate existing American customs and institutions to a set of normative ideals.” If they were more interested in analyzing society than in reforming it, however, their “retreat from ideology” did allow them to focus on current economic and political realities, Pells granted. “At the same time, their high regard for pragmatism and stability, together with their dread of fanaticism and upheaval, were reasonable and humane reactions to the catastrophic experiences of the twentieth century.”
It was not that the social critics had wholly deserted their old vocation of judging their own culture. Even though their ideas had reflected the ideals of the European Enlightenment—but without passion, as Shils suggested—those ideas continued to arouse disputes within the still compelling trinity of liberty, equality, and fraternity. Amid the relative affluence of the 1950s critics now appeared less troubled by the lack of real equality of opportunity for the less privileged, far more concerned about the meaning of freedom for the middle classes and the threat to that freedom from solidarity of a smothering suburban kind. Most of the critics deplored the vast disparities in income and welfare among Americans—who could not?—but many of them now focused on psychological and cultural trends within the middle class rather than economic and social deprivations within the working class and the poor. Even the anxiety over automation amounted to a worry over psychological deprivation rather than over bread-and-butter issues of take-home pay.
And looking out over the social and physical landscape surrounding the cities, critics felt they had plenty to worry about. Huge eight-lane highways were grinding their way through the working-class sectors into the greener areas beyond, bringing in their wake concrete cloverleafs, shopping malls, towering apartment houses, and—much further out—suburban ranch houses complete with swimming pools, manicured greens, picture windows, and outdoor barbecue pits. These were the baronies of the new middle classes, “from the managerial employees and the ‘idea men’ in the talent professions at the top,” Max Lerner wrote, “to the file clerks and sales girls at the bottom: a formless cluster of groups, torn from the land and from productive property, with nothing to sell except their skills, their personality, their eagerness to be secure, their subservience and silence.”
The new middle classes, bursting with achievers and achievers-to-be, with postwar “baby-boomers,” with creative skills, with ladders of upward mobility, were the source of enormous energy and talent in the America of the 1950s, and a source too of social and political equilibrium. But critics, even aside from the intellectual disdain for picture windows and barbecue pits, worried about more than dreary suburbs and empty lives. They fretted over psyches. A central fear carried over from earlier work by Erich Fromm, a German philosopher and psychoanalyst who had emigrated to the United States after Hitler’s seizure of power. In 1941 Fromm published Escape from Freedom, which held that, upon the lifting of feudal ties and hierarchy, Protestantism had produced fearful and alienated persons, that industrialization had forced on such persons competitive, insecure lives that left them fearful of economic crises, loss of jobs, and imperialistic wars, and that the outcome was a tendency to submit to authoritarian leaders who offered them feelings of involvement, security, and power. This was the road to fascism. While Fromm feared these tendencies in all strata, he and his followers saw the middle classes—especially the lower middle classes—as most vulnerable to the appeal of fascism.
Other social critics eschewed such apocalyptic visions but they had major concerns of their own. In 1950 David Riesman, a University of Chicago social theorist still in his thirties, gained almost instant attention with The Lonely Crowd, a study of the “other-directed” personality that had replaced the “inner-directed” product of the Protestant ethic, which earlier had superseded the “tradition-directed” member of a hierarchical society held in the family embrace of clan, caste, and castle. In the affluent, leisure-minded postwar era, Riesman’s other-directed man, anti-individualistic, group-centered, and conformist, put social solidarity and harmony over his own individuality and was ready to market his personality rather than his skill or creativity. The Lonely Crowd was studded with memorable phrases and insights: the oversteered child—from Bringing Up Children to Bringing Up Father—from craft skill to manipulative skill—from the bank account to the expense account—heavy harmony and lonely success— the automat versus the glad hand—captains of industry and captains of consumption.
In this book and in Individualism Reconsidered, a collection of essays, Riesman explored the implications of the new conformity for freedom. The pressure of the group on most individuals was so profound, he contended, that they doubted both their ability and their right to stand on their own. How then to protect the freedom of the individual, especially for the solitary man? Only by an appeal to resources within the person, to a heightened self-consciousness, an awareness of potentials and inadequacies, the will to exercise a freedom to make choices in a realm somewhere between anarchy and conformity. Ultimately Riesman sought to find a balance between individuality and comradeship.
Another social critic, William H. Whyte, Jr., wrote even more urgently about the influence of the scientific managers, the recruitment bias toward team players and mediocrity, the corporate demand for togetherness, adjustment, compromise, and conformity. A Princeton graduate and longtime Fortune editor, Whyte followed the spoor of his Organization Men in multiversities and suburbia and their flowering in the hierarchies of corporation boards, law factories, hospitals, banks, the military, education, and organized religion. The old-fashioned boss wanted only a man’s sweat, he wrote, the modern boss his soul. “Group-think,” company loyalty, committee decision-making were the order of the day.
Whyte’s central concern was over the personal freedoms of individuals, their liberation from the group, the office, the organization. His solution, like Riesman’s, was to appeal to the individual to be an individual, to broaden freedom within the group or organization. Nonconformists, however, must recognize the genuine needs of the organization even as they explored crevices, escape mechanisms, room at the bottom. Was this enough? Even Whyte seemed unsure.
Hailed as the latest in social criticism, the work of both Riesman and Whyte had old-fashioned assumptions and goals. Their appeal to the individual to win liberation from group, community, and office was hardly more than a sophisticated modernization of the eighteenth- and nineteenth-century search for liberty from church and state. Such Bill of Rights liberties were fundamental but were they enough? What was freedom for? Released from their social or other bonds, how could human beings define and achieve more positive freedoms—even those as simple and fundamental as FDR’s Four Freedoms? Riesman and Whyte and their colleagues grappled with this kind of question too, but at best they could foresee some kind of murky balance between autonomy and togetherness. Hence they appeared to leave humankind in an existential predicament.
Herbert Marcuse did not share such intellectual inhibitions. Borrowing heavily from Hegel as well as Marx, he explored the interiors of freedom and proposed a dialectical consciousness able to perceive an alternative to any given reality and to use reason to judge between true and false needs. To think dialectically was freedom itself, the vital creative act. Marcuse wanted theory geared to human needs; he wanted the productive system to serve the needs of the entire society; he wanted people to test all the possibilities of erotic and intellectual fulfillment; he wanted them to convert their bodies into instruments of pleasure. But the means of reaching this Utopia he left fuzzy.
And so the acolytes of Freedom argued and agonized over its meaning in a modern age. If intellectual progress seemed slow—if the very terms were misty, if the content of Freedom was so comprehensive as to seem all-inclusive, if the priorities among types of freedom lacked precision, if the ways of expanding freedom ranged from the overly utilitarian to the foolishly Utopian—if all this was true, at least the theorists and practitioners of Freedom were bumping up against its wider and deeper dimensions, organizational, industrial, psychological, cultural, and sexual.
At the very heart of the American idea of freedom were still the noble “First Freedoms” of thought and speech, press and religion. And of these freedoms none was more vigorously pressed and expressed in the fifties than the right of newspapers and other media to offer information and opinion without interference by public authority or hostile group. Even as the press proclaimed and practiced freedom, however, the media faced threats to their liberties far less from outside foes than from internal tendencies toward consolidation and conformity.
Automation continued to transform the production of newspapers as it did other industries. Mergenthaler’s Blue Streak Comet bypassed the venerable Linotype keyboard by using teletype tape for matrix assembly. Harris-Intertype’s Monarch speeded up slug-casting by tape that performed over ten operations per second. Old-time newshawks, weaned on their Linotype, looked on in amazement as tape-triggered brass matrices streamed into the assembly, “making a clinking rhythm like that of a poker player at Las Vegas riffling silver dollars,” Editor & Publisher reported, “and quicker than you could say ‘aces back-to-back,’ column-width slugs lined up at the rate of fourteen per minute.” Technology continued apace. By the late 1950s Intertype’s Fotosetter, arraying characters by photography on acetate film for the printing plate, was offering a variety of type sizes. Even the larger dailies were now using the wire services’ teletypesetters that transmitted tape ready for the composing machines.
Enormously expensive, the new machines called for heavy investments that in turn encouraged newspaper consolidations. William Randolph Hearst’s death at the start of the decade and the succession of his five sons as heads of Hearst Consolidated Publications were a reminder that consolidation was one of the oldest habits in the newspaper game, even under the aegis of flamboyant tycoons like the erratic crusader and art collector of San Simeon. By 1960 hundreds of newspapers were organized in “groups,” a term the trade preferred to “chains.” The new Hearst—but far more reserved in manner and provocativeness—was Samuel I. Newhouse, who controlled the Portland Oregonian, the St. Louis Globe-Democrat, and over a dozen other major papers and magazines. Newhouse astutely combined mergers and cost-cutting techniques. The extent to which consolidation led to conformity—or on the other hand provided papers with the financial capacity not to conform—remained a simmering issue in the newspaper world.
What was not in dispute, however, was that newspapers had become big business—and were assuming the views that went with it. The great majority of newspapers—and an even larger proportion of total newspaper circulation—had been Republican for so long that people took the one-party press almost for granted; some analysts concluded that many readers simply discounted the conservative bias of editors and columnists. More insidious and hence more influential was the press’s tendency toward conformity. By using a term like red, Marxist, socialist, or reactionary, Max Lerner wrote, the newspapers “cast it outside the pale of discussion. By a black magic they place a taboo on it.” Equally insidiously, many newspapers played it safe, featuring comics, crossword puzzles, and games and dramatizing sensations of crime and sex at the same time that their editorial pages were monuments of insipidity.
In the decade of the growing cold war the most marked effect of press conformity was to rally around the cold war. A Protestant magazine article charged that the press and radio “first lay down a terrific barrage against the Red Menace. Headlines without a shred of substance shriek of atom bombs, or plots to overthrow our government, of espionage, of high treason, and of other blood-curdling crimes.” This was the prelude to labeling all opposition as “red.” A longtime press-watcher, Curtis MacDougall of Northwestern University’s journalism school, noted that when an Associated Press correspondent found no war fever in Moscow, his dispatch was buried in inside pages; if he had reported the opposite, it would have been headlined. Polls indicated that the percentage of Americans who viewed a third world war as inevitable rose steadily from 32 percent in late 1945 to 73 percent in early 1948. Many Americans, influenced both by official and press hysteria and demagoguery at home and by events abroad, obviously did see a red menace.
Later it seemed apparent that most journalists were not biased in their reporting but rather, emerging as they did from the same class and cultural environment, shared cold war assumptions that dominated their perception of events. These assumptions made it more tempting for journalists not only to report happenings but to take part in them. The austere Walter Lippmann had a notable role in conceiving and shaping as well as reporting and analyzing the Marshall Plan. Some reporters complained of being “used” by government, others boasted of it, but in fact officials and reporters were each using the other, for their own purposes. This intertwining of government and press became so close during the 1950s that Douglass Cater, a journalist himself, described the press as the fourth branch of government, full of journalistic brokers and middlemen, compartmentalized much like the government itself, trading information with publicity-seeking politicians, and exerting an influence on government that astonished visitors from abroad—especially British journalists, who generally maintained a certain deference toward the cabinet government in London.
It was not an era for journalistic innovation. Would-be enterprisers in publishing could hardly forget the sad story of the newspaper PM, which had risen in the early forties and then fallen despite the brilliant editorship of Ralph Ingersoll, heavy advances by Marshall Field III and other investors, and reporting by a stable of top-flight writers. FDR himself had welcomed the new paper, in part because its refusal to take advertising, Roosevelt wrote Ingersoll, “appeals to me as a new and promising formula for freedom of the press.” Strident, opinionated, ranging in its collective views from left liberal to liberal left, PM had to abandon its policy of not running advertising and then was itself abandoned by Field. The Chicago publisher had other problems. He had started the Chicago Sun in 1941, purchased the Chicago Times six years later, combined the two—and found that McCormick’s Tribune had a lock on newsstands, the best comics, and Associated Press membership.
A brilliant innovation of the 1920s, however, was more prosperous—and more controversial—than ever thirty years later. Time magazine’s circulation soared during the 1950s from 1.65 million to 2.54 million at decade’s end. Life, launched by Time Inc. in 1936, with its circulation base up to 5.6 million by the mid-fifties, belonged to the elite family of mass-circulation magazines: Reader’s Digest (10.4 million), which had begun to take advertising; The Saturday Evening Post (4.8 million); Look (4.1 million biweekly); Collier’s (3.8 million biweekly); Woman’s Home Companion (4.1 million monthly). It was estimated that advertisers had put well over a billion dollars into Life during its first twenty years. By 1960 the total assets of Time Inc., which now included several magazines and a variety of other enterprises, had risen to $230 million, almost a tenfold increase in twenty years.
Preacher and sermonizer, innovator and enterpriser, moralizer and manipulator, Henry Luce ran his empire with a sometimes imperious, sometimes gentle hand. He delegated considerable authority to his subordinates, but the compelling suggestions, pithy comments, and measured exhortations that streamed from his office, along with his custom of directly editing each magazine for a week or two while the managing editor stood aside, brought his forceful presence into every editorial department. Luce made no secret of his own political views—he was for God, country, the GOP, free enterprise, and Yale, not necessarily in that order. All the editorial convictions of Time Inc., Luce wrote his editors in a paper titled “The Practice of Freedom,” could be summarized in one word, Freedom, by which Luce meant the Bill of Rights, representative government, and competitive free enterprise.
Luce played high politics in the Republican party, throwing his own and sometimes his magazines’ weight behind favored candidates such as Wendell Willkie. But like many an American President, he saw far less power looking out of his office than others saw looking in. Many of his best writers and editors were New or Fair Dealers. Most could easily find jobs elsewhere. And to the “top performers,” as John Kobler wrote, “Luce developed a strong, deep, possessive attachment.” He would brook considerable independence before he would let go a man like Theodore White, who came to differ with him sharply over China policy, but in the end, of course, it was never Luce who went. Life itself seemed indestructible until it was challenged by a medium that was bringing pictures—up-to-the-minute moving pictures—into the nation’s living rooms.
Television was only beginning to come into its own by 1950, but one thing was already clear—the new medium would be a commercial proposition. Just as a small group of teachers, parents, and others had tried to salvage radio for educational purposes and failed, so in the 1950s a comparable group strove to hold a number of channels for higher intellectual levels and standards of taste. Indeed, in that decade television achieved a kind of “Golden Age”: artists and intellectuals experimented with innovative forms of entertainment, Edward R. Murrow’s See It Now appeared on CBS and the Today Show on NBC, and Laurence Olivier’s feature film Richard III premiered on television before it hit the movie-house circuit.
But even the most optimistic innovators in television could not ignore the millions of “little black boxes” that by the mid-fifties sat in most American homes—radio sets that were also chambers of horrors to the critics. The listener could turn the knob from station to station and hear the same programming—endless popular music dispensed by jabbering disc jockeys and punctuated by a profusion of commercials along with a few news snippets. By 1960, with over 500 commercial television stations, TV appeared to be headed in the same direction that radio had taken. The reformers could not overcome the combined power of advertisers and broadcasters, who resisted invasion of their immensely profitable turf by invoking both the spirit of free enterprise and the sanctity of free speech.
A pervasive complacency, a burgeoning middle class, suburban togetherness, automated workers, widespread anomie and escapism, media pitched to the lowest common denominator, trivialization of thought—all these in different ways helped produce a politics of blandness, conformity, and consensus during the 1950s. To be sure, President Eisenhower had begun his Administration in 1953 with constructive purposes, buttressed by Republican majorities in House and Senate. On the domestic front he had created a new department, Health, Education, and Welfare, established the Air Force Academy, replaced mandatory farm price supports with flexible supports based on “modernized parity,” won a housing act that somewhat expanded public housing and eased the burden of home mortgages, and signed legislation establishing a thirteen-year construction program for a 41,000-mile interstate system of highways. On foreign policy he sought to carry on a bipartisan approach, working with leaders of the Democratic minority like Senator Walter George of Georgia, just as Truman had collaborated with Republican senator Arthur Vandenberg.
Bipartisanship in foreign policy, however, was largely a façade behind which the two parties fought each other and—even more—factions within the parties fought one another. Democrats were as usual splintered into almost as many fragments as there were Democratic leaders, but they broadly fell into a Truman-Acheson-Harriman camp calling for hard-line policies toward the Soviets, and a circle that included Eleanor Roosevelt, Tennessee senator Estes Kefauver, and Connecticut governor Chester Bowles, supporting various forms of détente, with Adlai Stevenson seeking to bridge the gap. The Republicans had their own sharp differences—especially between backers of such ideas as massive retaliation and Dulles’s brinkmanship, and old-time isolationists like Taft and his followers.
Bipartisanship was a skimpy blanket hauled out at every point of military crisis or foreign policy quandary so that it could cloak differences and present a national posture of harmony and unity. “Politics must stop at the water’s edge,” the politicians and pundits would cry out. This was nothing new. In 1948 bipartisanship had been used to thwart the posing of fundamental alternatives to cold war assumptions. Writing a quarter century later, Robert A. Divine found it “a tragedy that the containment policy did not receive the careful analysis and debate it merited. Wallace’s attack on Truman’s policy suffered from emotional charges and flabby rhetoric, but beneath the shrill oratory there were important objections to the ‘get tough with Russia’ policy that deserved thoughtful consideration rather than contemptuous dismissal as Communist propaganda.” It was estimated that less than 1 percent of the nation’s newspapers supported Wallace’s views in 1948, and many gave him short shrift even in their news columns. Neither Dewey in 1948 nor Eisenhower in 1952 challenged the cold war assumptions of the Truman Administration. The argument was over means, not ends, over cold war tactics, not strategy.
Cold war assumptions continued to grip the American mind during the fifties. A soothing bipartisanship discouraged clear, focused debate and shrouded fundamental choices in a cocoon of bland harmony. The most visible impact of consensus was on the election of 1956. Along with Eisenhower’s personal popularity and the rally-’round-the-President impact of the Budapest revolt against Moscow and the Suez crisis, that consensus helped produce Stevenson’s staggering 1956 loss to Eisenhower. The President carried both the electoral college and the popular vote by an even bigger margin than in 1952. His challenger won only Missouri and six southern states. Shifting back and forth between bipartisan agreement and opposition on details but always following cold war assumptions, Stevenson simply could not come to grips with the amiable figure in the Oval Office. The Democratic party—especially the Democratic leaders in Congress—had not built up a credible, consistent, comprehensive foreign policy program resting on premises different from cold war encirclement and containment of Russia.
Part of the trouble was that Americans, unlike the British, had no loyal but militant opposition with an institutional base. The Democratic party as such was too fragmented to provide a coherent alternative program to a strategy of containment, which after all it had bequeathed to the Eisenhower Republicans in 1952. Following the 1956 debacle the Democrats sought to remedy this situation by establishing the Democratic Advisory Committee (later Council). In joining it Stevenson called for “strong, searching, and constructive opposition.” The DAC’s advisory committees on policy included perhaps the most remarkable collection of political and policy brains of the era: William Benton, Bowles, Senators Kefauver and Herbert Lehman, political scientist Hans Morgenthau, Michigan governor G. Mennen Williams on foreign policy; economists John Kenneth Galbraith, Marriner Eccles, Walter Heller, Leon Keyserling, Isador Lubin, and historian Arthur Schlesinger, Jr., on domestic policy; labor leaders James B. Carey, Sidney Hillman, David J. McDonald, Walter Reuther on industrial policy. The DAC issued brilliant position papers but in doing so revealed anew the policy splits within the Democracy—especially the deep divisions between the presidential Democrats and the congressional Democrats. The Democratic leaders of the House and Senate casually boycotted the Council.
The most conspicuous victim of cold war consensus was the twice-defeated Stevenson. A man of such sparkling wit and elegant charm that he made even the most sophisticated “madly for Adlai,” a scintillating speaker who eschewed bombast and banality, a politician with a high sense of responsibility and probity, he was unable to break out of the intellectual cocoon of containment and put forward a comprehensive alternative strategy. Instead he called for specific changes—an end to the draft and a ban on H-bomb testing—but both of these lay in the field of military expertise where the voters preferred to trust the general in the White House. Stevenson’s campaign faults—endless reworking of speeches, immoderate moderation, excessive cautiousness—were not simply personal failings. They were the flaws of a candidate whose party, fellow leaders, and personal philosophy left him in a consensual void in which he desperately cast about for some winning stance and failed to find it.
Could the Democrats have found another leader who could have united the party behind a strategy of détente? It was unlikely, given the divisions in the party. But there was one leader who transcended the divisions, who maintained good personal relations with Truman, Harriman, Stevenson, and the congressional leaders even as she campaigned for peace. This was Eleanor Roosevelt. During these years she threw herself into the war against war and poverty, the campaign for civil rights and social welfare. She proved herself once again a consummate politician, as in the 1956 campaign when her appearance for Stevenson at the Democratic convention proved crucial in stemming a last-minute push by Harry Truman for Harriman. Though she was seventy years old by the mid-1950s, it was not really her age that prevented her from becoming the acknowledged leader of the Democratic party, perhaps even its candidate. The explanation was simpler. She was a woman.
Still, the political failures of the 1950s had much deeper sources than the Democrats’ incapacity to offer a united front behind an alternative foreign policy. They lay in the cold war consensus that in turn stemmed from the spiral of fear and hate that continued to dominate Soviet-American relations. This spiral thwarted the steady, day-to-day diplomacy, the thoughtful and imaginative planning, the flexible but purposeful policies necessary to deal with Khrushchev. In the absence of such determined statecraft, problems festered, tempers rose, crises broke out, Americans rallied behind the President, Congress delegated him gobs of power. Brinkmanship was the inevitable result of drift and indecision. It was not Eisenhower’s fault that the crises of Hungary and Suez erupted at the height of the 1956 campaign. It was simply his luck.
If the Democrats would not challenge cold war assumptions, who would? Not the media as a whole in the climate of the fifties, nor the churches, nor the educators. In Western society the task of political criticism and social dissent, if all other leaders or groups failed, lay with the intelligentsia. Freed of close dependency on institutions, the intellectual, in Richard Hofstadter’s words, “examines, ponders, wonders, theorizes, criticizes, imagines,” rather than seeking to manipulate or accommodate. During the 1950s American foreign policy was the target of the most searching criticism by four men—a journalist, a diplomat, a professor, and a theologian—who in the view of some were reminiscent of the great thinkers of the eighteenth-century constitutional founding era in the breadth and depth of their intellectual power.
If Walter Lippmann, George Kennan, Hans Morgenthau, and Reinhold Niebuhr could even be compared with the men of 1787, it was in part because they, like the Framers, had lived and reflected in a time of almost ceaseless ferment and conflict. Reaching adulthood early in the century, they had witnessed all its traumas—World War I and its aftermath, the great depression, the rise of Nazism, World War II, the slave-labor camps and death factories, the incinerated cities and the atom bomb and hydrogen bomb, the cold war. They had known evil in much the way that J. Robert Oppenheimer could say to President Truman: “In some sort of crude sense, which no vulgarity, no humor, no overstatement can quite extinguish, the physicists have known sin; and this is a knowledge which they cannot lose.”
Lippmann, oldest of the four, was still in his early sixties when Eisenhower first ran for office. Although he backed the general over Stevenson in 1952, the noted columnist was as unsparing of Republican foreign policy as he had been of Democratic. Just as he had attacked Truman’s NATO policy as the work of “zealous cold warriors” who sought to bribe and bully nations into an anti-Soviet alliance, now he questioned the Southeast Asia Treaty Organization, incidentally noting that it included only one Southeast Asian nation. Just as he had confronted Acheson in a blazing argument over the Truman Doctrine and later called for his resignation, now he lectured Dulles face to face—and was lectured back.
Lippmann could consistently criticize Democrats and Republicans because he consistently thought in terms of fixed assumptions: that foreign policy must act always for a carefully defined national interest, that world peace depended on a balancing of national interests, that great nations must wield their power circumspectly amid a network of secondary as well as primary powers, that prudent diplomacy called for professional foreign-policy-makers insulated against the wilder passions and selfish interests of the masses, that apocalyptic visions and abstract “solutions” must be rejected in favor of “practical” arrangements such as old-fashioned spheres of interest and regional neutralization. These ideas were broad and flexible enough to allow for a variety of applications and even contradictions; his support for a global equilibrium of power with the Soviets, for example, appeared to clash with his calls for withdrawals from Berlin and from Taiwan and Southeast Asia.
The traumas of the age of Korea and McCarthyism had indeed left Lippmann so pessimistic about the paralysis and indecision of democracies that his faith in popular rule fell even lower than it had been in the years when he was writing Public Opinion and The Phantom Public. The people, he argued in what he hoped would be his culminating masterwork, The Public Philosophy, had “acquired power they are incapable of exercising, and the governments they elect have lost powers which they must recover if they are to govern.” He called for the restoration of “government strong enough to govern, strong enough to resist the encroachment of the assemblies and of mass opinions, and strong enough to guarantee private liberty against the pressure of the masses.” Jumping on this sentence, Archibald MacLeish, former Librarian of Congress, now a Harvard professor, and always a poet, accused Lippmann of narrowing freedom to fit within the rational, ordered society in which the columnist appeared to believe. He accused Lippmann flatly of being opposed to real freedom and democracy. Lippmann indignantly responded that he did believe in freedom—his kind of freedom.
While Lippmann was seeking a “public philosophy” as the intellectual foundation for the kind of polity and policies he favored, George Kennan was coping with the aftermath of the “X” article that Lippmann had attacked so tellingly. Kennan’s harsh portrait of Soviet power and motivation became so rigid a dogma in Administration councils as to leave the diplomat penitent and fearful. He continued to write and speak against the “legalistic-moralistic” approach to world affairs—and especially its application to Russia—that he felt had crippled Roosevelt’s dealing with wartime problems and possibilities. Kennan continued to preach “realism,” but under Acheson’s State Department leadership it was increasingly evident that once you founded your foreign policy strategy on “realism,” you were on treacherous ground. It could lead to endless ambiguities and self-contradictions, as it did even with Kennan. And tougher and more “realistic” realists could make a dogma out of what Kennan, at least, had seen as a policy of prudence. By 1950 Kennan was so disturbed by the Administration’s cold war extremism as to greet with relief an invitation from Oppenheimer, now head of the Institute for Advanced Study at Princeton, to continue his historical studies there.
In the quiet of the Institute, Kennan hoped to think through his own premises and experiences. He had maintained his friendship with Lippmann despite their differences over containment and over specific policies, and those very differences between two men who prided themselves on their “realism” illustrated the pitfalls of the concept. When it came to military alliances, German policy, disengagement in Europe, summitry, and indeed the whole strategy toward Russia especially after Stalin, what was realistic? Who admitted to being unrealistic? The concept had no intrinsic meaning. The problem was especially acute for Kennan because behind the cool façade of the diplomat and the detachment of the historian lay a deeply humane, sensitive, and indeed moral person who had seen the errors of “realists” in power, viewed diplomacy as the vocation of skilled men of the highest probity and responsibility, and insisted that a nation must seek to live up to its own moral standards even while rejecting the will-o’-the-wisp of global utopianism. And the more Kennan pondered in the groves of academe, the more he questioned the whole philosophy of cold war containment.
What seemed to be lacking in both Kennan and Lippmann was an overriding philosophy of international relations and a coherent strategic concept that could stand the test of repeated shocks like Korea and Berlin, Hungary and Suez, as well as the endless currents of change. And no one seemed readier to supply this need than Hans J. Morgenthau, a professor of international relations at the University of Chicago. A veritable child of conflict, Morgenthau had grown up in a Bavarian city seared by race hatred even in the early 1920s; when, as a young Gymnasium student at the top of his class, he was chosen to give the annual Founder’s Day address, the deposed duke of the region sat in the front row during the speech holding his nose in an obviously anti-Semitic gesture. Morgenthau left Germany as the Nazis moved toward power and later settled in Madrid to teach diplomacy, only to be overwhelmed by the Spanish Civil War; he made his way to Paris during the Popular Front days and then to Brooklyn College, only to be vilified there by young ideologues who were put off by the émigré’s sober and scholarly approach to some of the passionate questions of the day.
Later, at Chicago, Morgenthau produced a series of volumes—Scientific Man vs. Power Politics, Politics Among Nations, In Defense of the National Interest—that marked him as a formidable strategist of international politics. Morgenthau began with the premise that power politics, “rooted in the lust for power which is common to all men,” was inseparable from all social life. There was no escape from it: “whenever we act with reference to our fellowmen, we must sin and we must still sin when we refuse to act.” Power was the central, almost the exclusive, foundation of national interest and criterion of foreign policy. Such power, essentially military, must of course be prudently managed, but managed also quickly and decisively, which required strong executive leadership. The executive branch, however, took little initiative in foreign policy, Morgenthau lamented to the Senate Foreign Relations Committee in 1959, because it feared Congress, and Congress feared public opinion. Yet public opinion, he went on in much the same vein as Lippmann, should be not the cause but the result of “dynamic executive and congressional leadership.” Public-opinion polls measured the impact of past leadership, not the potential of the new. They must not be the yardstick of foreign policy.
“The history of America,” Morgenthau instructed Senators J. William Fulbright, Mike Mansfield, and Wayne Morse among others, “is the story of the enthusiastic responses of the American people to dynamic leadership on behalf of foreign policies which can be shown to have a positive bearing upon the national interest.”
Critics were not slow to challenge Morgenthau’s realism. If the national interest was the foundation of foreign policy, they said, the power underlying it must be clear and measurable, which meant military power—but even this power had often been miscalculated by both its wielders and its targets. The professor himself, the critics added unkindly, had miscalculated the national interest—for example, in his expectation that the Soviets would risk or even start a ground war in Europe once they acquired nuclear power. There were, moreover, other tangible and intangible bases of power—economic, psychological, ideological—that had to be included in assessing the might of nations. Yet if these were included, the “national interest” became such a tangle of multiple, shifting, and dynamic forces as to defy measurement and analysis. Hence to be for the national interest or for realism was no more clarifying than to be for wisdom or common sense or statecraft. Who wasn’t?
Realism, in short, was a necessary but inadequate component of a strategy of international relations. It was a preoccupation with means—the marshaling of power—in a series of world crises that called for a sense of proportion and perspective, a wider comprehension of ends, a philosophy of world politics, even a theology of the human condition. If this was the call, it appeared to have been answered in the 1950s by a theologian-philosopher-politician who had for twenty years been leaving church pulpits and college campuses dazzled by his stabbing oratory and pungent sermonizing. This was Reinhold Niebuhr.
Schooled at Elmhurst College in Illinois and at Yale Divinity School, he had held an evangelical pulpit in Detroit until 1928, when he joined the Union Theological Seminary in New York, much to the dismay of the established theologians there who deplored his lack of a doctorate, his bumptious midwestern manner, and his outspoken radicalism. His years in Detroit had left Niebuhr filled with fierce indignation over the industrial and human wasteland he had witnessed outside his own middle-class parish.
Detroit made Niebuhr a socialist; then for the next thirty years he followed a zigzag route through a series of doctrines and causes as he tried to come to grips with the depression, the New Deal, and the cold war. What kept him from intellectual faddism was his philosophical ambivalence—his tendency to embrace different doctrines at the same time in a kind of continuous internal dialectic; even while proclaiming a thesis he nurtured the seeds of its antithesis. Thus in Detroit he proclaimed the Social Gospel of Walter Rauschenbusch even while he exhibited, in his passion for new ideas and his instinct for irony and political practicality, many of the intellectual traits of the pragmatism of James and Dewey. His Detroit experience and the onset of the depression now moved him toward Marxist ideas, but it was Niebuhr’s own brand of Marxism, shot through with concern over the havoc of capitalism, fear of rising fascism, an ornery repugnance for communist dogma and messianism, and a hatred for Soviet bureaucracy and oppression. During the late 1930s, after holding the New Deal in some distaste for its opportunism and its “whirligig” of reform, he deserted Norman Thomas and the socialists to vote for Roosevelt. This shift too was marked by ambivalence: Niebuhr, who turned to FDR in part because of the President’s resistance to Nazism, had earlier attacked him for expanding the Navy.
By the 1940s Niebuhr could best be described as a liberal realist who was faithful to his earlier Social Gospel compassion in seeking to push the Roosevelt and Truman Administrations further to the left, in the process taking active leadership in Americans for Democratic Action and other liberal groups. At the same time he took a militant stand against Soviet expansionism and left-wing dogmatics. The old tensions and ambivalences remained. During the war he had hoped that “the companionship in a common purpose” with Russia would persuade the Soviets “to disavow political forms and fanaticisms which outrage standards of freedom established in the Western world.” This was the kind of “liberal illusion” that Niebuhr at other times denounced. He inveighed against American pride and self-righteousness but he also educated a rising generation of politicians in “realistic,” hard-nosed politics—in the notion, according to Richard W. Fox, that “moral men had to play hardball.”
By the 1950s Niebuhr had reached the height of his fame. “He was the father of us all,” George Kennan said of him. The ADA in a formal resolution named him its “spiritual father.” The theologian’s words excited so many agnostics, backsliders, and heretics that someone proposed a new group, “atheists for Niebuhr.” How to account for this extraordinary influence? The answer lay less in Niebuhr’s own ideological “whirligig” over the years than in the power of his theology of human nature. Whatever his current credal passion, it was informed by his biblical awareness of original sin, but sin now armed by technology with new destructive power, his rejection of Jeffersonian “illusions” for a Dostoevskian recognition of human evil, his sensitivity to human alienation, anxiety, and the “dizziness of freedom,” his constant reminders of pride, aggressiveness, sinfulness. Always sin—sin as the “narcosis of the soul.”
Audiences would never forget the sight of this man behind pulpit or rostrum, his bald pate gleaming as he pitched his hawklike face forward, his words tumbling out as his whole body seemed to weave and thrust, while his listeners tried frantically to scribble down his dazzling epigrams and polemical outbursts. His written words also had a stunning impact; for Harvey Cox his first reading of Moral Man and Immoral Society, gulped down in one sitting, was an intense revelation that made him “an instant Niebuhrian.” But when the sermons and books were digested, the question remained whether Niebuhr had done much more than clothe liberal realism in a powerful theological frame without resolving the ultimate in his paradoxes—the tension between liberal compassion, hopes and dreams, and hardheaded realism. Was Niebuhr simply one more example of the great Tocquevillian failure in American intellectuals—the failure to connect practical expedient politics informed by human possibility and limitation to lofty but explicit goals that might challenge the best in humankind?
No more than Morgenthau or the others did Niebuhr take on the toughest intellectual task of all: to explore the dimensions of liberty, the structure of freedom, the ambivalences of equality—and the tension among these values—and to link these with the strengths and weaknesses of American institutions, politics, and leadership. What was desperately needed in postwar America was analysis of the intervening linkages between ends and means, but this would have called for an analysis of political parties and electoral processes and public opinion and governmental structures—analysis hardly conducive to evangelical sermonizing and radical rhetoric. It was this failure that—granted the empirical richness and political wisdom of Lippmann and Kennan, Morgenthau and Niebuhr—set them a rung below the intellectual leadership of the 1780s. The Framers had crafted a constitution that superbly fixed their goals of individual liberty to concrete governmental institutions and electoral processes—so superbly that leadership in the 1950s still had to operate through their centuries-old system in seeking to reach twentieth-century goals.
Nor did these four political analysts—in even sharper contrast with the Framers in their time—hold much sway over foreign, even European, opinion. Kennan, a visiting professor at Oxford in 1957-58, lectured for the BBC on “Russia, the Atom and the West.” The stoutly anticommunist newspaper Le Figaro ran Lippmann columns. Lippmann indeed had a fan in General de Gaulle, who found Le Crépuscule des démocraties—the French edition of The Public Philosophy—full of “rare perceptions,” mainly because the two men shared strong doubts about the equation of democracy with parliamentarism and the “usurpation of popular sovereignty by professional politicians,” in de Gaulle’s words. But in general the ideas of the four were hardly exportable, conditioned as those ideas were by America’s geopolitical worldview.
Other things American, however, were most exportable. Europeans found their continent awash in American advertising and consumer goods. Turn the corner near the Beethovenstraat in Amsterdam or push your way through Piccadilly Circus or parade down the Champs-Elysées and you could hardly escape the ads for Kent cigarettes or Coca-Cola or Ford cars. Or escape the products themselves in the shops—Maxwell House coffee and Sea & Ski suntan lotion and Heinz tomato ketchup and Revlon lipstick. American cars seemed to be conquering European streets and mores, crowding highways, requiring car parks, changing suburban and recreation patterns.
Americans were exporting their corporations along with their goods. In the late fifties some two hundred American companies a year were settling in Belgium, Holland, and France, and about the same number in Britain. These enterprises employed tens of thousands of Americans and Europeans. Businessmen and politicians denounced the invaders for paying higher wages and salaries and for “disruption of orderly marketing.” American loans, or American management practices, or American competition would make Britain—or France or Italy or Belgium—the “49th State.” The Yankee traders, moreover, were taking back European art and other treasures. A London antique shop featured in its window a bristling sign: “Americans are not served.”
Other Europeans fought back in ways known to old cultures. Despite much advertising, some American products simply could not make a go of it: Campbell soups had trouble competing in the home of famous potages; the British did not take to motherly Betty Crocker and her cake mixes; General Mills tried to market Cheerios but Londoners stuck with their cornflakes and their kippers. Europeans attacked American economic and cultural “imperialism”: the Marshall Plan as a “dollar noose,” American loans as the work of a “shabby moneylender,” American managers as crass and unknowing, American GIs as “overpaid, oversexed, and over here.” For some Americans the height of indignity was a report by a Russian, the Soviet writer Ilya Ehrenburg, after a trip to the States that Americans suffered from “spiritual standardization”—“the same houses, the same furniture, the same crockery.”
No place on the globe, critics complained, escaped “Coca-colonization.” Arthur Koestler noted: “The motorbus which carries the traveller at 5 A.M. from Bangkok airport to the center of the capital of Thailand has a loudspeaker through which American crooners purr at him, and makes him wonder whether his journey was really necessary. The Arabian desert is ploughed by Cadillacs, and the exhibition of Eskimo handicrafts at the airport of Anchorage, Alaska, bears the same hallmark of the Late Woolworth Period as the idols of Krishna, made of plastic, which are worshipped in Indian homes.”
How could the “other” America be presented abroad—the good America, the America of books and music, of the Bill of Rights and representative government, especially at a time when the Kremlin was reputed to be spending half a billion dollars a year on propaganda? This was the job of the United States Information Agency, the successor to a series of agencies going back to the wartime propaganda units. By the end of the decade the USIA was running a wide range of information and cultural activities—books, films, lectures, radio programs, exhibits, student and teaching exchanges—through 200 posts in over 80 countries. No agency was more vulnerable politically both at home and abroad; while young European radicals were assaulting overseas libraries from the outside, McCarthy’s men were doing so from the inside. Some overseas librarians hid books by Tom Paine and other radicals; a few timid souls actually burned books—only about a dozen, but enough to touch burning memories of the Nazis. “For the free world outside the U.S.,” wrote a Canadian journalist, “McCarthyism is not just a spectacle. It is a tragedy.”
McCarthy’s assault on the Bill of Rights symbolized the USIA’s broader problem. Which America, what kind of America, should it seek to present abroad—America in all its variety, its freedoms and oppressions, its high culture and its barbarism, its noble principles and its often egregious practices? “France was a land, England was a people,” Scott Fitzgerald had written, “but America, having still about it that quality of an idea, was harder to utter.” America was liberty, individual rights, Freedom—these were the foundation stones. But then there was that spectacle of the long-tolerated McCarthy.…
By the 1950s private philanthropic foundations were deeply involved in international affairs, especially in the Third World. The Ford Foundation devoted over $50 million—about a third of its total spending—to international programs from 1951 through 1954. While much of this effort abroad was for practical economic development programs, it also had a strong ideological cast. For years Ford helped finance the Congress for Cultural Freedom, which Ford officials defined as an effort “to combat tyranny and to advance freedom in Europe and Asia.” The tyranny was Marxism, Soviet style, and the freedom was the Bill of Rights, American style, but the implications of extending civil liberties to poverty-stricken peoples rather than helping them achieve social and economic freedoms were left largely unexplored.
The men and women who had the most influence, however unwittingly, on European perceptions of the United States were American writers and artists. The late 1940s and the 1950s brought an Indian summer of the sparkling literary era that had stretched from World War I through the 1930s. Still shining or at least flickering in the afterglow of that era were the giants of the 1920s. Sinclair Lewis died at the start of the fifties but only after publishing a final volume of social criticism, Kingsblood Royal, an attack on racial prejudice. Although the best work of Robert Frost was behind him, he was still the most widely read serious poet in America. Ernest Hemingway published Across the River and into the Trees in 1950 and The Old Man and the Sea two years later, followed by the award of a Pulitzer Prize and, in 1954, the Nobel Prize in literature. William Faulkner, who had won the Nobel five years earlier, published Requiem for a Nun in 1951 and A Fable in 1954, and he completed his trilogy about the Snopes clan with The Town and The Mansion during the late fifties. Soon after the end of the decade, the pens of Hemingway, Faulkner, and Frost would be stayed for good.
Crowding onto the literary scene were younger writers who brought a springtime of creativity even while the Indian summer waned. Between 1947 and 1955 playwright Arthur Miller gave to the stage All My Sons, Death of a Salesman, The Crucible, A View from the Bridge. Ralph Ellison wrote a single stunning novel, Invisible Man; J. D. Salinger published The Catcher in the Rye and Franny and Zooey; Saul Bellow contributed The Adventures of Augie March and Henderson the Rain King; playwright Tennessee Williams, after his brilliant The Glass Menagerie and A Streetcar Named Desire, wrote The Rose Tattoo, Camino Real, Cat on a Hot Tin Roof. These five men were in their late thirties or early forties; even younger was Norman Mailer, who had brought out The Naked and the Dead, a war novel, at the age of twenty-five and wrote two significant works, Barbary Shore and The Deer Park, in the 1950s.
Of all the tests of great literature, two are most clearly measurable—longevity and universality. The permanence of the notable work of the Indian summer could not be tested for another century, but the universality of the older generation of writers had striking demonstration during the 1950s.
France had been the supreme testing ground abroad for American writers, in part because French critics viewed themselves as the ultimate tribunal of international letters. The literati of Paris had been peculiarly generous to American novelists, some of whom they had known during the novelists’ self-imposed exiles in France in the twenties. Lewis’s Babbitt had sold 80,000 copies in France within a few months of publication; at least thirteen of his other works were translated into French by the end of the thirties. During that decade, the “greatest literary development in France,” in the judgment of Jean-Paul Sartre, was the discovery of Faulkner, Dos Passos, Caldwell, and one or two other writers; at once, he added, “for thousands of young intellectuals, the American novel took its place, together with jazz and the movies, among the best of the importations from the United States.” Even André Gide, the grand old man of French letters, said that “no contemporary literature” excited his interest more than that of young America.
And of these “young” Americans, no one excited the French more than Hemingway. His subjects fascinated them—bloody prizefighters, hired killers, disemboweled matadors, crippled soldiers, hunters of wild animals, deep-sea fishermen, as André Maurois summed them up. They liked his style even more—the simplicity of word and deed, the flat unemotional perceptions, the code of courage and personal honor, the clean, hard writing style, the celebration of nada—nothingness. It was a style that was said to have influenced Camus. By 1952 For Whom the Bell Tolls had sold over 160,000 copies in a French-language edition.
In the long run, the reputation of Faulkner in France surpassed even Hemingway’s. The literati liked the sense of tragic pessimism in the Mississippian, his metaphysical approach to time, his “magical, fantastic, and tragic” universe, as one reviewer wrote, inhabited by “a strange music, an unforgettable rhythm of incantation.” If Hemingway influenced Camus, Faulkner, according to Sartre, inspired Simone de Beauvoir’s technique of substituting a more subtle order of time for the usual chronology. Some French critics viewed Faulkner as America’s best—even the world’s best—novelist, much to the discomfiture of Hemingway, who at least had the consolation of vastly outselling Faulkner in France. Americans were wrong to treat Faulkner as a regionalist, Paris critics asserted; he was rather a “universal writer” in the fullest sense.
It was not only Faulkner and Hemingway that France celebrated, and it was not only France that celebrated American writers. Across the Continent there appeared a hunger for Steinbeck, Dos Passos, Fitzgerald, and others, and for Westerns and detective stories as well. A German writer told many years later of how he had cadged books from American GIs during the occupation and built his literary education on crates of Armed Services Editions, “courtesy of the American taxpayer.”
Why this transatlantic appeal of American writers? Italian novelist Cesare Pavese, who translated earlier American classics as well as Faulkner and Hemingway, said he had found “a thoughtful and barbaric America, happy and quarrelsome, dissolute and fruitful, heavy with all the world’s past, but also young and innocent.” Was this all there was to it—Europeans in one of their recurring “discoveries” of a simple, innocent, youthful America, refreshing to jaded continental sensibilities? Were either the established or the rising American writers of the 1950s telling them anything about the heart and mind and soul of America? What ultimately did this big, bustling country stand for?
Hemingway did not answer this question—he had no intention of doing so. He dwelt on men’s—it was almost always men’s—individual fear and bravery, desire and frustration, struggle and death. Like Robert Jordan in For Whom the Bell Tolls, he had no political beliefs except a furious antifascism and an all-embracing individualism. “You believe in Life, Liberty and the Pursuit of Happiness,” Robert is told by “himself,” but himself wants these good things for individuals, not nations. Hemingway’s The Old Man and the Sea superbly portrayed a man’s struggle against a personal adversary and a fated defeat by an inexorable environment, but had nothing to say about collective effort and frustration.
With his closeness to the land, his love of community and region, his feeling for the “presentness of the past,” his old-fashioned sense of religious morality, William Faulkner appeared far more likely than Hemingway to plumb the mind and heart of the country. He was very much in the American literary tradition—indeed, two traditions, as Hyatt Waggoner suggested: the romantic symbolism of Hawthorne and Melville, the naturalism of Howells, Twain, and Dreiser. The people of Yoknapatawpha County—their greed and cunning, their moral and physical vulgarity, and the struggles of some to break the chains of fate and rise to some kind of human stature—were lifted in his charged prose to the level of universality and tragedy that the French critics so praised. But Faulkner, like Hemingway, was far more concerned with private values and personal afflictions than with public substantive values like political and economic freedom. When he did call for individual rights and liberty in his books and public addresses, they were largely his kind of rights and liberty—a sphere of private space, artistic independence that no government could be allowed to invade. The broader picture of “what the country stood for” that emerged from his writings was murky, even muddled.
A marvelous line from Faulkner’s Absalom, Absalom!, noted by Malcolm Cowley, summed up the eloquence and the despair of modern human existence: “moving from a terror in which you cannot believe, toward a safety in which you have no faith.” This was the folly and impotence of America but by no means its essence.
Robert Frost even more than Faulkner defined freedom in his sayings and writings as personal liberty against the state, whether New Deal bureaucracy or the compulsory public school. And his freedom too, on closer inspection, turned out to be the crucial but self-serving independence of the man of letters. “We prate of freedom,” he said. “All I would keep for myself is the freedom of my material—the condition of body and mind now and then to summons aptly from the vast chaos of all I have lived through.” And he wrote:
Keep off each other and keep each other off.
You see the beauty of my proposal is
It needn’t wait on general revolution.
I bid you to a one-man revolution—
The only revolution that is coming.
Many of Frost’s finest poems celebrated the independent, self-reliant, skeptical country man, who could say with the poet, “The freedom I’d like to give is the freedom I’d like to have.” But this was a negative freedom largely irrelevant to the human needs of millions of industrialized, urbanized, automated Americans. There was one thing Frost could not do, Granville Hicks had written earlier. He “cannot give us the sense of belonging in the industrial, scientific, Freudian world in which we find ourselves.” The poet never did, never wanted to.
What writer, then, did manage to present the essence of the American experiment in the realm of ideas and values? Surely not Norman Mailer, who in the 1950s was still preoccupied with self-promotion, self-definition, self-resolution, with orgasm as true love, ideology as illusion, the hipster as the “wise primitive.” Surely not Tennessee Williams, who epitomized the decade’s concern with personal trauma and private values, or J. D. Salinger’s adolescents in constant rebellion against the “phony bastards” around them, or Saul Bellow’s characters—unforgettable but largely preoccupied with their own psyches.
The writer who came closest to dramatizing the great public issues and values was a playwright. Manhattan-born Arthur Miller, the son of a garment manufacturer afflicted by the depression, rebelled against commercial values and middle-class hypocrisy much as his fellow writers did. But he used his characters to dramatize social as well as personal needs and failures. The “right dramatic form,” he wrote in 1956, “is the everlastingly sought balance between order and the need of our souls for freedom; the relatedness between our vaguest longings, our inner questions, and private lives and the life of the generality of men which is our society and our world.” His great plays—notably Death of a Salesman and The Crucible—were in part direct responses to the threat to people’s hopes and dreams from a business civilization that degraded them and the threat to personal liberty from McCarthyism. Predictably Miller was attacked from the right; more significantly, he was criticized from the left for not being radical enough, for not being clear whether it was Willy Loman who was at fault or the society that produced him, for not tying his plays more explicitly to current issues.
Miller easily survived his critics. But even though his plays were produced in Europe, he had only a limited impact abroad. The ambiguities in his dramas—the ambivalences in Miller reflecting those in the larger culture—were enough to blur his powerful portrait of America’s yearnings toward both liberty and order, freedom and security, individualism and solidarity. The portrait was not clear to all Americans either. Miller himself wryly mentioned the man who came out of a performance of Death of a Salesman exclaiming, “I always said that New England territory was no damned good.”
If American writers were unsure of what their nation stood for, it was not surprising that Europeans were equally puzzled. European intellectuals had long labeled the country’s commitment to freedom as either self-indulgence bordering on anarchy or a boorish egalitarianism bending toward class leveling. “I am held to be a master of irony,” George Bernard Shaw had gibed. “But not even I would have had the idea of erecting a Statue of Liberty in New York.” On the other hand, Europeans had to and did admire the American commitment to some notion of freedom in two wars and the cold war. If Europeans, with their long exposure to Americans, were left uncertain, what could be expected of the Soviets, with their very different, very ideological conception of freedom?
A remarkable meeting in San Francisco in September 1959 between Nikita Khrushchev and nine American labor leaders headed by Walter Reuther helped answer this question. For two hours the two sides went at it, Khrushchev reddening, pounding the table, shouting out his arguments, the union men roaring back in a cacophony of indignant voices. More and more the argument narrowed down to the question of freedom—for workers in East Germany, for Hungarian “freedom fighters,” for West Germans, for Americans. Khrushchev was soon on his feet. Suddenly, according to the official record, he gave a burlesque demonstration of the dance he had witnessed during the Hollywood rehearsal of the forthcoming film Can-Can. He turned his back to the table, bent downward, flipped his coat up, and gave an imitation of the cancan.
“This is a dance in which girls pull up their skirts,” the Premier fulminated. “This is what you call freedom—freedom for the girls to show their backsides. To us it is pornography. The culture of people who want pornography. It’s capitalism that makes the girls that way.”
The meeting sputtered toward its end.
“We are interested in how best to advance the interest of workers under freedom,” Reuther said.
“You have your point of view; we have ours,” Khrushchev replied. “They are irreconcilable.”