CHAPTER THREE


RUGGED INDIVIDUALISM (BARELY) SURVIVES MODERNITY

FRANKLIN ROOSEVELT’S NEW DEAL wasted no time in building out a greatly enlarged federal government infrastructure that would curtail both the territory and the freedom of rugged individualism. If the meaning of democracy was now to include extensive federal programs for the forgotten man, then an expansion of the federal infrastructure would surely follow. There are several ways to measure the growth of the federal government and its effect on rugged individualism: the creation of new federal departments, the birth of new agencies, expanded regulations, new fields of federal responsibility, growth in the number of federal employees, and expenditures as a percentage of gross domestic product (GDP). The New Deal moved aggressively on all fronts.

Federal Growth: History Tells an Interesting Story

The timing of new federal departments tells much of that story. In the beginning (1789), there were only three cabinet departments: War, State, and Treasury, corresponding to the primary responsibilities of the new federal government as the founders saw them. Even though the territory of the United States expanded enormously between the founding and the Civil War era—with the Northwest Ordinance, the Louisiana Purchase, the Florida Purchase, and land acquired through the Mexican War—the only cabinet department created to oversee the growth was the Department of the Interior (established in 1849). The Supreme Court also increased from six members to nine during this time, and representation in both the House and the Senate grew as the population grew and new states were added.

It is not surprising that, with the end of the Civil War, the role of the federal government increased as it worked through what a post-slavery society in the South might look like. The Office of the Attorney General, until then a lone cabinet post without a department, was elevated into its own cabinet-level Department of Justice (1870). The Thirteenth, Fourteenth, and Fifteenth Amendments were ratified, ending slavery and expanding the role of the federal government in the areas of civil rights and voting rights. As Herbert Hoover pointed out in his essay on “American Individualism,” rugged individualism in the United States is accompanied by equality of opportunity, and this phase in the growth and reach of the federal government is both explained and justified by expanding opportunity in the aftermath of slavery and the Civil War. The Department of Agriculture was also established in that time frame (1862).

The Progressives—distrustful of free markets, the states, and rugged individualism—wanted to increase the regulatory role of the federal government in business. So the Departments of Commerce (1903) and Labor (1913) were created. When joined with Agriculture, the federal government now had a big say in what happened in the major sectors of the economy. The Progressive Era also saw the creation of major new regulatory agencies, such as the Interstate Commerce Commission (1887) and the Federal Reserve System (1913). Major antitrust laws were passed: the Sherman Act in 1890 and the Clayton Act in 1914. This phase of federal growth inserted Washington, D.C., into the capital-labor battles and hemmed in free markets and businesses with government regulation. Then, too, came the Progressive constitutional amendments: the Sixteenth Amendment with its federal income tax, the Seventeenth allowing the direct election of US senators, the Eighteenth establishing Prohibition, and the Nineteenth giving women the right to vote, the last again expanding equality of opportunity. Overall, the Progressives aggressively inserted the federal government into business and free markets, expanding regulation at the expense of individualism, and illuminating the tension between rugged individualism and democracy.

But it was left to the New Deal to enact the largest expansion of the federal government in our history. Much of this was accomplished in the name of addressing a temporary economic emergency, but nearly all the expanded infrastructure became a permanent addition to the federal bureaucracy. Indeed, the New Deal has become the framework in which American domestic and economic policy operates even today, more than eighty years later. The New Deal was the first, and most important, of three revolutionary periods in modern times that have come to define the relationship between a growing federal government and the freedom of the rugged individual. Each of those three revolutions—Franklin Roosevelt’s New Deal of the 1930s, Lyndon Johnson’s Great Society in the 1960s, and the Reagan Revolution of the 1980s—marks a turning point in the evolving story of how the federal government’s role has grown in such a way as to crowd out and limit rugged individualism.

The New Deal Revolution

In the first one hundred days of the New Deal in 1933, a vast array of emergency legislation was enacted. In all, some forty new administrative agencies, referred to by historians as the alphabet soup agencies, were formed in the first year of the New Deal. What the new measures had in common was a shift in power away from the people and their elected representatives in the Congress toward expert administrators in the executive branch. Roosevelt practiced what President Obama’s former chief of staff, Rahm Emanuel, later preached: “You never want a serious crisis to go to waste. . . . [It’s] the opportunity to do things that you could not do before.”1

The most sweeping law was the 1933 National Industrial Recovery Act, which began—as did its companion, the Agricultural Adjustment Act of 1933—by declaring a national emergency. It then delegated to the president alone the powers to “effectuate the purpose” of the law and to set “codes of fair competition.” Republican Congressman Charles Eaton of New Jersey said at the time that this law was the New Deal’s attempt “to remake the entire structure” of American capitalism.2 Political scientist and historian Ira Katznelson explained the grand scale of the New Deal revolution: “In a decisive break with the old, the New Deal intentionally crafted not just a new set of policies but also new forms of institutional meaning, language, and possibility for a model that had been invented 150 years before,” adding that it “retrofit[ted] capitalism and shap[ed] a welfare state.”3 While the Supreme Court later ruled that Congress’s broad delegation of power to the executive branch was unconstitutional, much of the law itself remained.4 The rugged individual thus became the regulated individual.

Another example of the New Deal revolution with contemporary application is Roosevelt’s 1934 executive order creating the National Labor Relations Board. One might assume that a president could not unilaterally create a board with extensive powers over collective bargaining, labor relations, and labor elections, but Roosevelt felt that the National Industrial Recovery Act was sufficiently broad to authorize this. The president concluded that “this Executive Order . . . marks a great step forward in administrative efficiency and more important in labor matters,” since through this new board the “machinery for adjusting labor relations” would now “be clarified.” To those concerned with the extent of the president’s use of executive orders today, this is a powerful precedent, and to those worried about the health of rugged individualism, a discouraging one, with the president now using the emergency legislation of the economic crisis to establish even greater federal regulation of labor, management, and the marketplace.

It is surprising to look back and see that Roosevelt carried out his revolution without even using two primary sources of power. First, there were no new cabinet departments created by Roosevelt, but instead scores of executive agencies, boards, and commissions. Second, there were no constitutional amendments to effect or codify the New Deal revolution. The only constitutional amendment occasioned by the New Deal presidency was the Twenty-Second, which followed Roosevelt’s unprecedented twelve-year tenure and limited presidents to two terms. Instead there were constitutional interpretations, by both Roosevelt and the courts. In Roosevelt’s Constitution Day speech on September 17, 1937, he asserted that the Constitution is “a layman’s document, not a lawyer’s contract.” But Roosevelt’s constitution was not about American individualism and liberty but, as he said in that speech, about meeting the “insistence of the great mass of our people that economic and social security and the standard of American living be raised . . . to levels which the people know our resources justify.” Therefore, he added, we cannot “seriously be alarmed” when people use “legalistic phrases” to “cry ‘unconstitutional’ at every effort to better the condition of our people.” To Roosevelt, the Constitution was an evolving document that needed to catch up to the social and economic needs of his “forgotten man,” not be caught up in legalism and individual rights. This, of course, continues to be part of the constitutional debate today between “originalists” and those who believe in a “living constitution.”

The New Deal relegated rugged individualism to the back bench and focused public policy instead on the forgotten man. In fact, the rugged individual was now part of the problem, tied to laissez-faire economics and the fat cats on Wall Street, and not part of the solution. In the name of an economic emergency, Roosevelt expanded and redirected the federal government toward massive intervention in the economy and in every kind of federal policy. Only recently, the US Supreme Court found itself asking whether a still-operative New Deal program concerning raisins made sense, with even dissenting Justice Sonia Sotomayor acknowledging that the price support law “may be outdated and by some lights downright silly.”5 It was a complete revolution away from individualism, constitutionalism, and free markets toward government growth and intervention in favor of the forgotten man.

Before leaving the New Deal, however, we should acknowledge one small step for rugged individualism: the blended approach that left room for both the rugged individual and the forgotten man in addressing Social Security and retirement. The 1935 Social Security Act was to “furnish financial assistance . . . to aged needy individuals.”6 The definition of “aged” was provided in the Act: age sixty-five. But “needy” was a more difficult term, calling forth a sliding-scale benefit formula based upon total wages earned, under which higher wage contributions received smaller percentage returns. To pay for these benefits, a tax on employee wages and an excise tax on employers were established.

In a sense, the formula for Social Security established a two-track system, one for the rugged individual and one for the forgotten man. As historian Edward Berkowitz described it, “Public and private pensions expanded together and the compatibility of the two became one of Social Security’s celebrated virtues.”7 The rugged individual who can take care of himself must nevertheless pay into the system and may receive back only a portion of his investment. On a percentage basis, the forgotten man was entitled to receive more. But the rugged individual could have private retirement funds alongside his Social Security payments, and the government would later (in the 1970s) allow certain of those funds to grow on a tax-deferred basis. It was not the one-size-fits-all approach we would later see, for example, in Obamacare. And this synthesis provides one useful way to think about the rugged individual in public policy: carve out a track that makes sense for those who can take care of themselves and a different track—a safety net—for those who cannot.

By now, of course, that distinction has become blurred in Social Security. As Social Security has evolved since 1935, it has become less a safety net and more of an entitlement for both the rugged individual and the forgotten man. Both have merged into a new class: the entitled man. So a promising synthesis in which the rugged individual and the forgotten man were each addressed separately by the Social Security policy has ended up compromised, with everyone now an entitled man, and the costs have skyrocketed well beyond the ability of the system to pay them.

The Postwar Transition

With the end of World War II and the death of Franklin Roosevelt, one might have expected the postwar world to see the massive New Deal and wartime growth of the federal government cut back. Indeed, Herbert Hoover himself was brought out of retirement in 1947 to head up the first Hoover Commission for the reorganization of the executive branch of government, a project promising to be not merely an organizational exercise but an actual reduction in the role of the federal government. But with the surprise election of Democrat Harry Truman in 1948, the wind came out of those sails and the Hoover Commission settled for rearranging the offices rather than returning to a form of democratic republicanism more compatible with the founders’ design.8

When Dwight D. Eisenhower, a Republican, was elected president in 1952, there was new hope among many for a reduction in the size and role of the federal government, and Herbert Hoover was called upon to lead a second Hoover Commission. But once again, a new president turned out to be cool toward declaring war on the federal programs of the New Deal. Eisenhower seemed to acknowledge the political difficulty of rolling back popular entitlement programs and was willing to accept the social safety net constructed by the New Deal.9 When James Reston of the New York Times evaluated Eisenhower’s first term in 1956, he concluded it was “surely one of the great paradoxes of recent American political history” that Republicans had “swallowed” the New Deal measures without attempting to repeal a single one.10 And this becomes part of the larger story of government incursions into the life of the rugged individual: some become accepted, as Eisenhower did with Social Security and other aspects of the safety net, while others remain contentious.

Indeed, Eisenhower proceeded to build the federal government in new directions, establishing the new Department of Health, Education, and Welfare, reorganizing the Defense Department, creating the National Aeronautics and Space Administration, and constructing tens of thousands of miles of interstate highway across the country. It is symbolic and fitting that the large federal executive office building next to the White House is named for Eisenhower. Still, Eisenhower saw the dangers of big government allied with big business, warning in his farewell address about “the military-industrial complex.” He also saw danger to the rugged individual, highlighting the “solitary inventor, tinkering in his shop, [who] has been overshadowed by task forces of scientists” funded by the federal government.

President John F. Kennedy, in his brief term, proceeded to push the federal government into civil rights, the exploration of space, and other aspects of the “new frontier” he advanced. Ironically, Progressives had earlier pronounced that the American frontier, and with it rugged individualism, had disappeared, but Kennedy saw an optimistic and compelling “new frontier.” In his inaugural address he famously called upon Americans to “ask not what your country can do for you—ask what you can do for your country.” But classical liberal economist Milton Friedman objected to Kennedy’s challenge, calling it both “paternalistic” and “organismic.” Friedman, a leading figure of twentieth-century conservatism, urged Americans not to be lulled into accepting Kennedy’s seductive challenge, saying that instead Americans should ask: “What can I and my compatriots do through government . . . to protect our freedoms?”11 As Friedman would often explain, the most important part of that was limiting the government’s power and reach over the individual.

The Great Society Revolution

Thrust into office by the death of President Kennedy in 1963, and elected by a landslide in 1964, Lyndon Johnson had a lot of political capital to spend. And, as the longtime Senate majority leader, he knew how to get things done in the Capitol. As a consequence, despite being weighed down by the advancing conflict in Vietnam, Johnson carried out a massive domestic agenda he called the Great Society. In a line that captured his large personality and expansive vision of government, Johnson often said at campaign stops, “We’re in favor of a lot of things and against mighty few.”12

One of the major contributions of Johnson’s Great Society was an expanded definition of the forgotten man to include not only the poor but racial minorities, children, and anyone else who had been left behind by society. In this sense, Johnson provided a vision for who should be helped and what should be done by government in a Great Society. His landmark speech at the University of Michigan on May 22, 1964, best captured LBJ’s notion of the Great Society. He made it clear that just having wealth did not make a great society: a great and happy society needed to end poverty and racial injustice. But to Johnson, that was “just the beginning.” Johnson argued that three projects would make up the heart of his domestic effort: (1) making the cities and urban areas great; (2) addressing environmental issues in the countryside (water, food, air, parks); and (3) making the educational system excellent so that everyone had a promising future. Not only did the Great Society build on the New Deal, but Johnson sought to out-Roosevelt his mentor and hero if he could.

In a day when relatively little legislation is actually passed by Congress and signed by the president, the scale of the Great Society is difficult to comprehend. In total, the Great Society agenda comprised some 435 bills, one for every member of the House.13 This staggering volume of legislative achievement included such major measures as Medicare, Medicaid, the Civil Rights Act of 1964, the Voting Rights Act of 1965, the Fair Housing Act of 1968, the Elementary and Secondary Education Act, Head Start, Model Cities, and the Truth-in-Lending Act. A new cabinet department for Housing and Urban Development was created.

Political scientist James Q. Wilson later observed that part of the permanent change effected by the Great Society was “lowering the legitimacy barrier” for federal government action.14 Previously there had been a serious argument over whether the federal government had the constitutional legitimacy to act in domestic matters such as welfare, education, urban renewal, etc. But, as Wilson said, thanks to the Great Society, that barrier has fallen and “political conflict takes a very different form. New programs need not await the advent of a crisis or an extraordinary majority because no program is any longer ‘new’—it is seen, rather, as an extension, a modification, or an enlargement of something the government is already doing.”15

In order to accomplish his agenda, Johnson employed what he described as “creative federalism.” In a November 11, 1966, memorandum to his senior officials, Johnson urged joint action on major problems “worked out and planned in a cooperative spirit with those chief officials of state, county, and local governments who are answerable to their citizens.”16 Although the federal government role was expanded to address state and local problems such as education and welfare, this would largely be accomplished by categorical grants, often bypassing the states and going directly to neighborhood organizations or nonprofit groups. Grants-in-aid were the preferred tool for this, with federal grant programs growing from 132 in 1960 to 379 by 1967. Similarly, funding for those grants grew from $8.6 billion in 1963 to $20.3 billion by 1969.17 For the rugged individual, then, the vast growth of the federal government and its ever deeper involvement in people’s lives was somewhat mitigated by channeling funds into local and community groups. But many of these local and state partners would prove to be unreliable and uneven in their approaches, so the federal role lasted whereas the state and local partnerships often did not.

Medicare and Medicaid provide a classic case study in how creative federalism was intended to work in the Great Society. Rather than federalize health care entirely, Medicare sought to work alongside the existing system of private health insurance, with private doctors and hospitals. As Robert Ball, the Social Security commissioner during the Johnson administration, put it, Medicare simply “accepted the going system of the delivery of care and the program structure was molded on previous private insurance arrangements.”18 Health insurance for the working population would continue to be provided by private insurance companies and community health plans such as Blue Cross and Blue Shield. Medicare and Medicaid, established through amendments to the Social Security Act, would provide medical insurance for the elderly and the poor in a kind of safety-net system. As a result, even this sweeping law left room for the rugged individual and the forgotten man to coexist, the former retaining his private or employment-provided insurance and the latter benefiting from the government-created safety net. Even this synthesis, however, was not a massive victory for the rugged individual. Ball later acknowledged that this was only done because otherwise politics would prevent the passage of the Medicare and Medicaid legislation—the goal all along had been to federalize medical care.19

Likewise, the Great Society’s approach to poverty at least held out some hope for rugged individualism. Johnson made it clear that his approach was not cash handouts to the poor, but rather empowering the poor to be qualified and able to find work. The Declaration of Purpose of the war on poverty legislation said the program would work by “opening to everyone the opportunity for education and training, the opportunity to work, and the opportunity to live in decency and dignity.” In the end, Johnson’s reach exceeded his grasp. Poverty, of course, was not eliminated, and the Office of Economic Opportunity that led the war against it was itself dismantled in the next decade. Twenty years later, Charles Murray would argue that the Great Society’s anti-poverty programs actually abetted rather than ameliorated poverty.20 Joseph Califano Jr., Johnson’s chief domestic policy advisor, acknowledged that, during the Great Society era, “the Government simply got into too many nooks and crannies of American life.”21 The notion that the federal government can and should do everything for everyone not only undercut individual liberty but, as a practical matter, did not work.

The Transition of the 1970s

It is not the case that every time a Democrat was elected president the federal government grew, and whenever a Republican came into office the federal government shrank. In fact, between the end of the Great Society, when Lyndon Johnson did not seek reelection in 1968, and the election of Ronald Reagan in 1980, the federal government continued to grow in both size and reach under two Republican presidents, Richard Nixon and Gerald Ford, and one Democrat, Jimmy Carter.

Nixon proposed a new cabinet-level Department of Natural Resources (never enacted), added new federal agencies—the Environmental Protection Agency, the Council on Environmental Quality, and the Occupational Safety and Health Administration—and enacted one of the most sweeping federal environmental laws, the Clean Air Act of 1970. Rather than cut taxes, as conservatives would have wished, Nixon redirected federal money toward states and municipalities. Declaring “I am now a Keynesian,” Nixon completed Franklin Roosevelt’s earlier initiative to take America off the gold standard, and he also imposed federal wage and price controls to combat inflation in 1971. He even proposed a Family Assistance Plan that would have guaranteed a minimum welfare payment for every American, but that was defeated in the Senate. In short, Nixon presided over a significant expansion of the federal welfare and administrative state.

President Ford was left to clean up the Watergate mess and to deal with inflation when wage and price controls did not do the job. And President Carter’s term was dominated by economic concerns—both inflation and unemployment—an energy crisis, and the Iran hostage situation. Carter talked about smaller government but, in the end, pursued big-government policies to address tough problems. His primary organizational accomplishment was to split the former Department of Health, Education, and Welfare into two cabinet-level departments, one for Health and Human Services and a separate Department of Education. The latter enabled far greater federal intrusion into education at all levels. In summary, the 1970s were a continuation of the New Deal and Great Society eras.

The Reagan Revolution

Ronald Reagan had a very different understanding of the federal government and its role than his modern predecessors. He first articulated his view on the national stage in a major televised address called “A Time for Choosing” during Barry Goldwater’s failed campaign for the presidency in 1964. If the University of Michigan address on the Great Society was Johnson’s case statement for his presidency, Reagan’s philosophy was best stated in this message. He began the domestic policy portion of this message, delivered on October 27, 1964, by looking at the failure of the economic pillar of the American system. Who is going to pay for all the Great Society programs? Reagan asked. He was also troubled by the assumption that the federal government was the right level of government, and the executive the right branch, to fix America’s domestic problems. “No government ever voluntarily reduces itself in size,” Reagan said. “So government’s programs, once launched, never disappear. Actually, a government bureau is the nearest thing to eternal life we’ll ever see on this earth.”

Adding more specifics, Reagan noted that the individual income tax rate was too high, the federal budget had not been balanced in twenty-eight of the last thirty-four years, and the federal debt limit had been raised three times in the last twelve months. “We have accepted Social Security,” Reagan acknowledged, but the federal workforce was too large and “proliferating bureaus with their thousands of regulations have cost us many of our constitutional rights.” As a consequence, Reagan concluded, “a perversion has taken place,” our natural rights “are now considered to be a dispensation of government,” and freedom has become fragile. Echoes of Herbert Hoover’s observation in the 1932 election campaign—that this was not a choice between two men but between two philosophies of government—could be heard in the Reagan address. Reagan clearly saw that big government meant less freedom for the individual.

When elected president in 1980, Reagan included this now-famous statement in his first inaugural address: “Government is not the solution to our problem; government is the problem.” Noting that government “has no power except that granted it by the people,” Reagan said it was “time to check and reverse the growth of government which shows signs of having grown beyond the consent of the governed.” In a news conference on August 12, 1986, Reagan said: “The nine most terrifying words in the English language are, ‘I’m from the government and I’m here to help.’ ” And in his January 1989 farewell address, concluding his presidency, he said, “Man is not free unless government is limited.” But Reagan found actually reversing the trend of government growth to be a major challenge. He did succeed in cutting taxes—initially a 23 percent across-the-board cut of individual tax rates phased in over three years, followed later by another cut. In total, he cut the top income tax rate from 70 percent to 28 percent. But under Reagan the federal workforce grew modestly and the national debt went from $907 billion in 1980 to $2.6 trillion in 1988. Some of this increase was a further commitment to national defense and some was the difficulty of getting spending cuts through Congress.

In addition to tax cuts, Reagan made progress in the “devolution” of some federal responsibilities back to state and local governments and in deregulating the airlines and banking. He sought to return financial responsibility for more than sixty federal programs for low-income households to state and local government, funded at least in part by federal grants.22 Much of this “New Federalism” or “devolution” did not actually become law, although the federal budgets under Reagan nevertheless reduced funding for these programs. There was talk of eliminating the federal Department of Education, but this did not occur. Instead, a major national report, “A Nation at Risk,” issued in 1983, led to expanded federal leadership in education.

Still, in addition to tax cuts and some devolution, Reagan’s philosophical and rhetorical attack on big government and his defense of individual freedom provided valuable underpinning for the continued importance of American rugged individualism. In his book The Age of Reagan: The Conservative Counterrevolution, 1980–1989, Steven F. Hayward argued that Reagan restored much of the American founding that had been lost or forgotten. We agree. Even some of his foreign affairs and national defense speeches were fundamentally about individual freedom. His famous Berlin speech on June 12, 1987, with its challenge, “Mr. Gorbachev, tear down this wall,” was fundamentally about the ideal of individual freedom. In his 1982 Westminster speech, he warned about the threat to “human freedom by the enormous power of the modern state.” And in his 1988 State of the Union address, he said that “limited government” is the “best way of ensuring personal liberty.” This clear link between personal liberty and American individualism, on the one hand, and limited government on the other, was an important contribution.

It is not too much to claim that Reagan rhetorically re-centered the nation in its understanding of individualism and democracy. Rather than asking first what government can do to solve problems, Reagan asked the classic “rugged individualism” question: what can the individual do? If he decided that a matter was properly addressed by government, rather than left to the individual, Reagan would then turn to questions of federalism: which level of government and which branch? All of this reordered the thinking of conservatives and of the nation itself. Reagan’s sunny optimism made the individualism and limited government case with greater effectiveness than the sometimes dour Calvin Coolidge or Herbert Hoover of an earlier era or the harder Western persona of Barry Goldwater. One should not underestimate this rhetorical and spiritual reawakening and re-centering accomplished by Ronald Reagan. Indeed, one of the questions to be asked at the end of the Reagan Revolution or the New Deal is not just whether the programmatic outcome was good—whether unemployment or the size of government was tamed—but whether it properly articulated the relationship between the government and the people.

Philosophical Debates about American Individualism

As individualism versus collectivism played out on the political and policy scene, there were also relevant debates about it in the realm of philosophy and ideas. Two of these debates especially informed the larger public policy approach to the subject. One might best be understood as coming from the world of political economy, with two of the leading contenders being libertarian-conservative Milton Friedman, representing the importance of individual liberty, and political scientist Michael Harrington, standing for many of the socialist or collectivist ideals of the student radical movement of the 1960s. A second round of conversations about individualism in America was triggered by sociologist Robert Bellah and political scientist Robert Putnam in their respective books, Habits of the Heart and Bowling Alone.

In Milton Friedman’s introduction to the fiftieth anniversary edition of Friedrich Hayek’s Road to Serfdom,23 Friedman praised Hayek for capturing the timeless choice facing society: collectivism and central direction (serfdom) versus individualism and voluntary cooperation (freedom). Hayek understood that the road to freedom had been abandoned, pointing to the German intellectuals of the 1870s who, convinced that central planning was needed, turned instead toward socialism and ultimately totalitarianism. Hayek sought to reintroduce the alternative of individualism, arguing that central planning not only produced bad policy but also resulted in coercion.

Friedman argued that political freedom, which was embraced widely in America, depended on maintaining economic and social freedom as well.24 He argued, like Hayek, that there were really only two choices: (1) a relatively free market; or (2) government control of the economy. The mixed solution, which later became known as the Nordic or Scandinavian model, was in his view unworkable. First Friedman, then Goldwater, and later Reagan revived the notion of rugged individualism, presenting a free and responsible society as a preferable alternative to collectivism. Whereas Hayek had been fighting European socialism, Friedman was battling the regulatory state of the American Progressives. Both provided an important foundation for individualism.

In addition, Friedman offered two ways of measuring how the size and reach of government reduce the freedom enjoyed by the individual. One measure is government spending as a percentage of national income. His goal was 10 percent for all government spending (not just federal), which he noted was actually in practice in 1928, prior to the Great Depression. What concerned Friedman was that the percentage rose to 20 percent during the New Deal era and up to 36 percent by the time Ronald Reagan came to office. Following the Reagan Revolution, the percentage hovered around 34 percent for the next twenty years. By 2008, however, it rose to 40 percent.25 By Friedman’s accounting, then, the Reagan Revolution at least put a brake on government growth.

Friedman’s second test is the amount and kind of government regulation over people’s daily lives. In 1962 he listed fourteen government activities that he claimed were inconsistent with a free society: parity price supports, tariffs, output controls, rent control, minimum wage laws, industrial regulation (Interstate Commerce Commission), speech regulation (Federal Communications Commission), Social Security, licensing, public housing, military conscription, national parks, the US Post Office, and toll roads.26 Not listed is education, though he devoted a lot of time to arguing against the government monopoly in education, being an early proponent of vouchers. By now, this list from 1962 could be multiplied many times over (see chapter 4).

During the 1960s, an alternative economic view was developed and promulgated by student radicals and intellectuals, one that would further undercut the notion of American individualism in favor of greater collectivism. And, as in the Progressive Era, this would largely be undertaken by reducing individualism to purely economic notions of greed. A good starting point is the Port Huron Statement, a 1962 manifesto developed by Students for a Democratic Society (SDS). In it, the so-called New Left laid out how the comfortable childhoods of a generation of college students had been undone by racism, nuclear threats, the Cold War, poverty, corrupt capitalism, and the decline of American values. The statement actually purported to support a certain kind of anti-establishment individualism, one that would not fall into “egoism.” But in fact it pointed toward reinventing American society, foreign policy, labor relations, university governance, and social structure around New Left principles.

In the same year, 1962, Michael Harrington, a political scientist, activist, and socialist, published the influential book The Other America, with its sweeping indictment of government policy toward poverty. Indeed, Harrington’s book clashed directly with the work of another intellectual of the time, Assistant Secretary of Labor Daniel Patrick Moynihan, who developed the framework for Johnson’s War on Poverty. Whereas Moynihan thought that family structure among the poor, especially the black poor, was the fundamental problem, Harrington argued that it was rooted in unemployment, underemployment, and poor housing. These conditions, Harrington said, had hardened into an entire culture of poverty, one that demanded cultural change.27 Harrington argued that central planning was needed to develop housing and jobs and infuse billions of dollars into social investment in order to solve the problem. Instead, he lamented, “The great opportunity for social change since the New Deal was sacrificed to the tragedy in Vietnam.”28

To Harrington and the student radicals of the 1960s, the welfare state was not enough to save the poor and minorities from the harshness of market capitalism. The welfare state only made the poor into wards of the state; it wasn’t built to address the hopelessness and desperation of poverty. We need social security, housing, and a comprehensive medical program for all, Harrington said.29 The appendices to the book track the ongoing data about poverty since 1962, supporting Harrington’s claim that the same people were poor in 1970 who had been poor in 1960. He felt the War on Poverty had really done nothing.

It is little wonder, then, that Johnson’s Great Society revolution had unraveled by 1968. Under attack from the right by Friedman, Goldwater, and others favoring greater individual rights and less government, and from the left by a New Left demanding a government transformation well beyond the welfare state, Johnson was very much caught in the middle of a heated war. Then, too, came his unpopular buildup and failure abroad in the Vietnam War. What the 1960s essentially added to the debate about rugged individualism and the 1930s welfare state was a new critique from the left that the welfare state had not gone far enough. This debate very much continues today.

The Sociological Debate about Individualism Also Informs Public Policy

Another philosophical debate about individualism, this one involving primarily sociological analysis, addressed individualism itself, specifically whether it was a kind of selfishness that undercut the social cohesion necessary for a free republic. In that sense, the debate reached back at least to Tocqueville, who had pointed out that American individualism was different from European understandings of this idea. It was based on self-interest “rightly understood,” a self-interest that included a commitment to others.

Robert Bellah, a sociologist at the University of California at Berkeley, and several of his colleagues published Habits of the Heart during the Reagan Revolution in 1985. The subtitle of the book raises its core concern: “Individualism and Commitment in American Life.” In the preface to the book, the authors warn that American individualism “may have grown cancerous—it may be destroying those social integuments that Tocqueville saw as moderating its more destructive potentialities, that it may be threatening the survival of freedom itself.”30 In the glossary of the book, the authors acknowledge that individualism carries at least two meanings: (1) “a belief in the inherent dignity and, indeed, sacredness of the human person,” and (2) a belief that the individual “has a primary reality whereas society is a second-order” construct.31 The second understanding—that the individual is primary and voluntarily consents to participation in society in various ways—is the essence of American individualism. But Bellah and his colleagues, fearing the rise of the selfish French “individualism,” call for the rise of communitarian or collective forces, apparently ignoring Tocqueville’s understanding that Americans already do participate voluntarily and extensively.

In 2000, Harvard political scientist Robert Putnam published Bowling Alone: The Collapse and Revival of American Community. Putnam spoke of the need for “social capital” to make a nation work well, including both what he called bonding capital (relationships with people who are alike) and bridging capital (relationships with different kinds of people). The title of his book is drawn from the demise of bowling leagues, a minimal form of civic engagement, showing that individualism has threatened the formation of vital social capital. Putnam’s work triggered important debates about whether social capital has actually declined or whether earlier forms of community engagement, such as Elks Clubs and bowling leagues, had simply given way to new forms of community and civic engagement.32

Fundamental to any debate about individualism versus communitarianism or collectivism is one’s understanding of individualism itself. The battle always seems to result from an effort by communitarians and collectivists to define individualism narrowly, in a selfish, self-interested, even economically advantaged way. Then supporters of individualism respond: No, it’s not that limited; it must be “rightly understood” (Tocqueville); it incorporates equality of opportunity (Hoover); it leaves men free to consent, free to associate voluntarily. Stanford political scientist Francis Fukuyama sees another useful distinction: Americans may be anti-statists, but that is not the same as hostility to community.33 As Fukuyama pointed out, “The same Americans who are against state regulation, taxation, oversight, and ownership of productive resources can be extraordinarily cooperative and sociable in their companies, voluntary associations, churches, newspapers, universities, and the like.”34 Americans also tend to be very loyal and committed to family, which is an extension beyond mere individualism. So perhaps it is a certain kind of forced community, community demanded by the state, that Americans resist, preferring instead to start with individualism and consent to join associations of various kinds. Indeed, Americans are widely known for their philanthropy, volunteerism, church membership, and so on. Do these things not count to the collectivists? As sociologist Seymour Martin Lipset rightly observed, “[I]n America, individualism strengthens the bonds of civil society rather than weakens them.”35

Conclusion

Rugged individualism in the modern era became a tug of war carried out on philosophical, economic, political, and policy grounds. Presidents Franklin Roosevelt and Lyndon Johnson concluded that rugged individualism had left millions of people behind economically and sought to extend the reach of the federal government to assist the forgotten man by constructing a welfare safety net and developing scores of new government programs. By the 1980s, Ronald Reagan won the presidency, arguing that the federal government had grown too large and intrusive and needed to be trimmed back to leave room for more individual liberty, though he found the actual reduction in the size and role of government to be a major challenge.

Despite efforts by the Progressives, the New Left, and other communitarians and collectivists to kill American rugged individualism, it managed to survive, if barely. Although Americans want to help those in need, there remains a preference for individual and voluntary action, in part because growing federal government programs have not seemed to solve the problems. As Tocqueville and others have observed, Americans have their own particular form of individualism that they are not prepared to give up. As the sociologist Herbert Gans put it, “virtually all sectors of the [American] population” continue to pursue individualism, but it “is hardly separation from other people. It is to live out their lives in freedom and engagement with small parts of society, starting with the family, while participating in a wide range of voluntary activities to assist others.”36

Notes

1. Gerald F. Seib, “In Crisis, Opportunity for Obama,” Wall Street Journal, November 21, 2008, http://www.wsj.com/articles/SB122721278056345271.

2. Ira Katznelson, Fear Itself: The New Deal and the Origins of Our Time (New York: Liveright, 2013), 238.

3. Ibid., 6, 36.

4. A.L.A. Schechter Poultry Corp. v. United States, 295 US 495, 55 S. Ct. 837, 79 L. Ed. 1570 (1935).

5. Horne v. Department of Agriculture, 576 US ___, no. 14–275 (2015).

6. Ibid., 216.

7. Edward Berkowitz, “Medicare: The Great Society’s Enduring National Health Care Program,” in The Great Society and the High Tide of Liberalism, ed. Sidney M. Milkis and Jerome M. Mileur (Boston: University of Massachusetts Press, 2005), 335.

8. See Joanna L. Grisinger, The Unwieldy American State: Administrative Politics since the New Deal (New York: Cambridge University Press, 2012).

9. Ibid., 199.

10. James Reston, “Eisenhower’s Four Years: An Evaluation of the Republican Administration in a Complex World,” New York Times, July 22, 1956.

11. Milton Friedman, Capitalism and Freedom (Chicago: University of Chicago Press, 1962), 1–3.

12. Theodore H. White, The Making of the President 1964 (New York: New American Library, 1966), 413.

13. David Shribman, “Lyndon Johnson: Means and Ends, and What His Presidency Means in the End,” in The Great Society and the High Tide of Liberalism, ed. Milkis and Mileur (see note 7), 239.

14. James Q. Wilson, “American Politics: Then and Now,” Commentary, February 1979, 41.

15. Ibid.

16. Lyndon B. Johnson, “Memorandum on the Need for ‘Creative Federalism’ through Cooperation with State and Local Officials,” November 11, 1966, in The American Presidency Project, ed. Gerhard Peters and John T. Woolley, http://www.presidency.ucsb.edu/ws/?pid=28023.

17. Berkowitz, “Medicare,” note 7, 323.

18. Ibid.

19. Ibid., 322.

20. Charles Murray, Losing Ground: American Social Policy, 1950–1980 (New York: Basic Books, 1984).

21. David E. Rosenbaum, “20 Years Later, the Great Society Flourishes,” New York Times, April 17, 1985, http://www.nytimes.com/1985/04/17/us/20-years-later-the-great-society-flourishes.html?pagewanted=all.

22. John M. Quigley and Daniel L. Rubinfeld, “Federalism and Reductions in the Federal Budget,” National Tax Journal 49, no. 2 (June 1996): 189–302.

23. Friedrich Hayek, The Road to Serfdom: Fiftieth Anniversary Edition (Chicago: University of Chicago Press, 1994).

24. Friedman, Capitalism and Freedom, 7–21.

25. http://www.usgovernmentspending.com/total_spending_chart.

26. Friedman, Capitalism and Freedom, 35–36.

27. Michael Harrington, The Other America: Poverty in the United States (New York: Touchstone, 1962), 79.

28. Ibid., 205.

29. Ibid., 167.

30. Robert N. Bellah, Richard Madsen, William M. Sullivan, Ann Swidler, and Steven M. Tipton, Habits of the Heart: Individualism and Commitment in American Life (Berkeley, CA: University of California Press, 2007), xlviii.

31. Ibid., 334.

32. See David Davenport and Hanna Skandera, “Civic Associations,” in Never a Matter of Indifference: Sustaining Virtue in a Free Republic, ed. Peter Berkowitz (Stanford, CA: Hoover Institution Press, 2003).

33. Francis Fukuyama, Trust: The Social Virtues and the Creation of Prosperity (New York: The Free Press, 1995), 51.

34. Ibid.

35. Seymour Martin Lipset, American Exceptionalism: A Double-Edged Sword (New York: W.W. Norton, 1996), 277 (emphasis original). See also Robert Wuthnow, Acts of Compassion: Caring for Others and Helping Ourselves (Princeton, NJ: Princeton University Press, 1992), 22.

36. Herbert J. Gans, Middle American Individualism (New York: The Free Press, 1988), 1–4.