CHAPTER 2

Where Liberalism Went Wrong


The first step on the path to a liberal revival is to understand how liberalism failed in the past. The term “liberal” may refer either to a political philosophy or a specific location along an ideological continuum from left to right. In this book, I concern myself more with the latter than the former. Liberalism, in this sense, stems from a basic human impulse to use the powers of government to promote the common good, to ensure that everyone has the same social, economic, and political opportunities. Liberals believe first and foremost that government should be democratic, composed of freely elected representatives who serve the interests of the many rather than the few. In serving the interests of the many, however, liberals also hold that government must respect the rights of individuals, along with those of racial, ethnic, religious, and political minorities, by guaranteeing them the freedom not only to vote but also to speak, assemble, petition, worship, have a speedy trial, and enjoy due process and equal standing before the law.

Liberal sentiments resound in the words of the founding fathers and the charter documents of the United States. The signers of the Declaration of Independence stated, “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness. That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed. . . .” The U.S. Constitution likewise begins by affirming that “the people of the United States, in order to form a more perfect union, establish justice, insure domestic tranquility, provide for the common defense, promote the general welfare, and secure the blessings of liberty to ourselves and our posterity, do ordain and establish this Constitution. . . .”

What distinguishes liberals from others is the belief that the rights and privileges outlined in the Declaration of Independence and enumerated in the U.S. Constitution are guaranteed to all people regardless of their characteristics, inborn or acquired. Thus, equality of opportunity should be offered to all persons resident in the United States, whether male or female, black or white, gay or straight, rich or poor, owner or worker; and liberals believe that equality of opportunity should exist not only in theory but in reality.

For those holding these beliefs, the creation of a liberal society represents an ongoing challenge, a life calling. The cultivation and extension of liberty within the United States is very much a work in progress. U.S. history is fraught with many paradoxes, but perhaps the greatest is that at its inception, the nation that came into existence to affirm the freedom of the individual and the right to self-government denied those privileges to most of the people who then lived within its boundaries. According to the Constitution, adopted in 1789, full rights of citizenship were not offered to women, most blacks, and, in some states, those without property.1 At the time of the Constitution’s ratification, therefore, only around 43 percent of all U.S. inhabitants—basically free white male citizens over the age of twenty-one—enjoyed full rights under the U.S. Constitution, and in southern states the figure was only 33 percent.2

The Triumphs of Liberalism

Liberalism is thus a living enterprise rather than a static ideology; and as such it has had its triumphs and tragedies in the course of U.S. history. Liberal moments, during which rights expanded and liberty was extended, have alternated with periods of retrenchment during which conservatives have sought to stymie the expansion of rights and roll back prior extensions of liberty. The first liberal moment, of course, came with the founding of the United States, which created a constitutional structure for the cultivation of democracy, the assurance of rights, and the expansion of freedom. Since the adoption of the U.S. Constitution in 1789, the primary goal of liberals has been to realize this theoretical structure more fully and extend its benefits to a widening circle of Americans.

The second liberal moment began with the outbreak of the Civil War in 1861 and ended in 1876 with the withdrawal of federal troops from states of the former Confederacy. During these fifteen years, the Union was preserved at great human cost and amendments to the U.S. Constitution were passed to abolish slavery (the thirteenth), guarantee the rights of due process and equal legal standing to African Americans (the fourteenth), and ensure the right to vote for men of all races (the fifteenth). For a brief time federal troops stationed in the South enforced these constitutional rights and saw to it that they were honored, though often more in the breach than in the observance. Under the presidencies of Abraham Lincoln, Andrew Johnson, and Ulysses S. Grant, the franchise was extended and civil liberties enforced. By the end of the Reconstruction Era, 46 percent of adult Americans enjoyed full constitutional rights.3

In the disputed 1876 Tilden-Hayes election, however, the Party of Lincoln abandoned its historic commitment to enforcing black civil rights in the southern states. In exchange for another term in the White House, Republicans agreed to withdraw federal troops from the South and turn a blind eye as former Confederate states imposed a new system of racial subordination. This caste system, which came to be known as Jim Crow, disenfranchised African Americans politically while subjecting them to separate and decidedly inferior treatment, an arrangement that was shamefully ratified by a conservative Supreme Court in the infamous Plessy decision of 1896.4 The suffering of millions of black citizens and the systematic abrogation of their constitutional rights was swept under the rug and hidden from sight for generations.

With its racial sins out of sight and out of mind, the nation turned to the business of making money, and lots of it. The end of the nineteenth century was a time of unprecedented wealth creation in the United States. Americans of European origin moved steadily westward to colonize the frontier, clear the land of its aboriginal inhabitants, and turn its natural bounty to fortune. Meanwhile, spurred by massive immigration from Europe, the nation urbanized and industrialized, and new manufacturing centers such as Chicago, Cleveland, and Philadelphia generated huge riches for the owners of factories and the businesses that served them and the new industrial work force.

Just after the Civil War, in 1868, the largest personal fortune in the United States—that of the railroad magnate Cornelius Vanderbilt—stood at $40 million. By 1890 the fortune inherited by his son, William, stood at $200 million, and by 1912 the nation’s richest man, John D. Rockefeller, held $1 billion in assets. Over this period, the ratio of the largest to the median fortune in the United States grew from 80,000:1 to 1,250,000:1, a sixteen-fold increase in just over forty years.5 Although the U.S. Constitution, in theory, guaranteed rights and equality of opportunity to all males over the age of twenty-one, in practice the new divisions of wealth translated directly into huge disparities in power and political enfranchisement on the basis of class (not to mention skin color).

The rise of corporate trusts concentrated immense wealth within a small group of interbred families who together owned and managed most of the productive capacity of the United States. In the absence of federal efforts to control corporate power, the task was left to the states, which were forced to compete against one another to attract and retain industries, yielding a race to the bottom with respect to corporate governance and industrial safety and wages. State legislators were notoriously corrupt and were routinely suborned by wealthy interests to secure legislation favorable to themselves but inimical to the general welfare.

The power of families such as the Rockefellers, Vanderbilts, Carnegies, Fricks, Harrimans, Astors, Morgans, and Mellons was immense and amplified by the fact that U.S. senators were not elected by the public, but chosen by state legislatures. With a few modest payments to well-placed political bosses, a cooperative legislature could be purchased to secure the “election” of trusted members of the wealthy elite as U.S. senators. In 1902, one-third of U.S. senators were themselves millionaires, enough to guarantee that any piece of legislation threatening the class interests of the wealthy could be blocked through filibuster.6 During the “Gilded Age” of the late nineteenth century, some men were clearly more equal than others.

The third liberal moment came with the Progressive Era, which I date roughly from Theodore Roosevelt’s ascension to the presidency in 1901 to Woodrow Wilson’s retirement from office in 1921. In the course of these twenty years, the legitimate right of the federal government to control and regulate private business was established and the unbridled power of what Roosevelt called “malefactors of great wealth” was constrained. Roosevelt, a liberal Republican of patrician origins, used the powers of government to arbitrate the bitter struggle between labor and capital, and to assure economic fairness he enforced the 1890 Sherman Anti-Trust Act in earnest, working forcefully to break up corporate monopolies with a zeal and dedication that shocked his wealthy peers.

The Progressive Era reached a crescendo during the presidency of Woodrow Wilson, a Democrat. During his first year in office (1913), the idealistic former Princeton professor presided over the creation of the Federal Reserve banking system, thus establishing federal control over the nation’s money supply. To represent the interests of workers he created the U.S. Department of Labor, and to fund the expanded role of government he imposed a graduated federal income tax. Finally, the power of the wealthy was curbed by the Seventeenth Amendment, ratified that same year, which instituted the direct election of U.S. senators and was shortly followed by a sharp drop in the number of millionaire members.

Later in his administration, Wilson sought to project his liberal ideals into the global arena, calling for the extension of democracy worldwide through national self-determination and leading the charge to create a League of Nations that would preserve international order and avoid the perils of global warfare. Though the latter campaign ultimately failed, during his last year in office progressive forces finally achieved women’s suffrage with the ratification of the Nineteenth Amendment to the U.S. Constitution. At last women could freely vote in all federal, state, and local elections, and the share of the U.S. adult population enjoying full constitutional rights finally exceeded half, reaching 90 percent in that year.7

During the ensuing decade, the nation turned inward and once again focused on the business of making money, closing its doors to immigration, raising barriers to international trade, and relaxing Progressive Era reforms. The 1920s were the heyday of social Darwinism and laissez-faire economics. New fortunes were made in automobiles, entertainment, and electronics, and older treasures grew. By 1929, John D. Rockefeller’s wealth had mushroomed from $1 billion to $6.3 billion, and seven other people joined him in the ranks of the nation’s billionaires.8 On the eve of the stock market crash in late October, the richest 1 percent of the U.S. population earned 15 percent of the income, a share not reached again for nearly seventy years.

The collapse of the great speculative stock bubble of the 1920s ushered in the Great Depression of the 1930s, which set the stage for the fourth liberal moment, the New Deal, which began with Franklin D. Roosevelt’s election to the presidency in 1932 and ended with Dwight D. Eisenhower’s electoral triumph in 1952. Franklin Roosevelt picked up where his distant cousin Theodore had left off, again seeking to use the powers of the federal government to arbitrate the conflict between labor and capital and manage economic affairs in the public interest.

During his first administration he submitted a veritable blizzard of legislation to Congress in an effort to reform the excesses of the 1920s and restore the U.S. economy to health. Contrary to what his conservative critics asserted, he did not seek to impose socialism in the United States but to create more effective, efficient, fair, and transparent markets. The 1933 Banking Act tightened federal control over the banking industry and established the Federal Deposit Insurance Corporation to underwrite the deposits of individual citizens and restore solvency to the nation’s financial system. Additional legislation in 1933 established the Tennessee Valley Authority as a publicly chartered corporation to promote the construction of dams and the generation of cheap electricity that would promote industry and rural electrification. The 1934 Securities Exchange Act established the Securities and Exchange Commission to police the nation’s financial and equities markets and curb insider trading and other forms of crony capitalism. To ensure the commission’s effective operation, Roosevelt named as its first chairman a well-known financier, a crony capitalist familiar with all the tricks of the trade—Joseph P. Kennedy.

Also in 1934, Roosevelt pushed through Congress the National Housing Act, which created the Federal Housing Administration and established the first federal underwriting program to create a national mortgage market that brought home ownership—and for many citizens the first real chance at significant wealth—to a majority of American families. In 1935 the Social Security Act established the first comprehensive federal retirement system, along with a survivor and disability insurance program and a national program of public relief. At the same time, an “alphabet soup” of relief agencies was set up to generate employment, among them the Public Works Administration (PWA), the Civilian Conservation Corps (CCC), the National Youth Administration (NYA), and the Works Progress Administration (WPA), which in turn ran the Federal Arts Project (FAP), the Federal Theater Project (FTP), and the Federal Writers Project (FWP). Finally, the 1935 National Labor Relations Act affirmed the right of workers to join unions and to bargain collectively through freely elected representatives.

Although opponents alleged that Roosevelt sought to destroy capitalism, in reality he saved it from its own excesses and laid the foundations for unprecedented growth and prosperity during the postwar period, creating what the economist John Kenneth Galbraith in 1958 called the “Affluent Society,” one characterized by reliable economic growth and steadily rising material well-being.9 When he became president upon Roosevelt’s death in 1945, Harry Truman worked to maintain the political coalition underlying the New Deal, calling his program the “Fair Deal”; but his greatest success came in fulfilling Woodrow Wilson’s dream of creating a liberal world order.

Under Truman’s leadership, the United States joined with its wartime allies to establish the fundamental institutions that undergird today’s global economy—the United Nations, the North Atlantic Treaty Organization, the World Bank, the International Monetary Fund, and the General Agreement on Tariffs and Trade. Working with his Secretary of State, General George C. Marshall, he channeled U.S. capital through the World Bank to rebuild war-devastated Europe and Japan and contain the militant expansion of the Soviet Union. Then, once these goals were achieved, the World Bank turned its sights to promoting economic growth and development in a decolonizing Third World.

Billed as a fight against fascism, World War II brought the issue of black civil rights out from under the rug. How could the United States claim to fight for freedom abroad while denying it to so many at home? In 1942 the Roosevelt administration secured passage of the Fair Employment Act, which required private firms working on government contracts not to “discriminate against persons of any race, color, creed, or nationality in matters of employment.” In time of war, Southern Democrats felt unable to filibuster legislation that authorized so much spending on behalf of what Studs Terkel later called “the good war.”10

In 1948, President Truman attempted to push the civil rights agenda further by desegregating the U.S. Armed Forces and agreeing to a civil rights plank in the party platform at the Democratic National Convention (passed only after an impassioned speech by Hubert H. Humphrey). In response, Senator Strom Thurmond of South Carolina led angry Southern Democrats out of the party to run separately for the presidency as “Dixiecrats” in a failed attempt to deny Truman—and his civil rights platform—victory by stealing the electoral votes of the South. Truman won anyway, but the New Deal coalition was clearly beginning to come undone over the issue of race, liberalism’s major piece of unfinished business.

The 1950s were a time of peace and prosperity presided over by the revered patriarch and hero of World War II, Dwight D. Eisenhower, who steered a moderate political course that left intact most of what the New Deal had accomplished. He was not an ideological man and, given his immense popularity, was courted by both parties as a presidential candidate. Although not a period of retrenchment in rights and liberties, it was not a time of great forward progress either. The most important advancement in civil rights came not from the executive or legislative branch of government, but from the judiciary in the form of the Supreme Court’s 1954 Brown v. Board of Education of Topeka decision, which declared racial segregation in public schools to be unconstitutional.11 Only two years into the Eisenhower presidency, of course, most Supreme Court justices were liberals who had been appointed during the twenty years of the Roosevelt and Truman administrations.

The last liberal moment of the twentieth century came during the 1960s under presidents John F. Kennedy and Lyndon B. Johnson. They faced the daunting task of honoring liberalism’s moral commitment to civil rights while attending to the practical problem of keeping populist Southern Democrats within the New Deal coalition. Bowing to political exigencies, the Kennedy administration’s support for the civil rights struggle emphasized the moral over the practical. Although Attorney General Robert F. Kennedy enforced the Brown decision in Southern colleges and JFK himself publicly sympathized with Martin Luther King and his Southern Christian Leadership Conference, meaningful civil rights legislation languished in Congress, presumably awaiting Kennedy’s reelection in 1964.

Instead, it was a Southern Democrat, Lyndon B. Johnson, who proved to be the century’s great legislator of civil rights. Capitalizing on the moral authority of a martyred president, he used his formidable parliamentary abilities (honed earlier as Senate Majority Leader) to secure passage of the 1964 Civil Rights Act, which forbade segregation in schools, accommodations, restaurants, stores, transportation, and public services nationwide. It also prohibited racial discrimination in hiring, promotion, and remuneration; and it created separate commissions on Civil Rights and Equal Employment Opportunity to monitor progress in the achievement of racial equality. In one fell swoop, Lyndon Johnson’s signature wiped out the legal basis for Jim Crow in the South and created a framework to combat informal but no less potent discrimination in the North.

With a landslide reelection under his belt in November of 1964, the president gave recalcitrant Southern legislators the “Johnson treatment” (a forceful mixture of reward, punishment, threat, and physical intimidation)12 to secure passage of the 1965 Voting Rights Act. Often heralded as the Second Reconstruction, this legislation empowered the federal government to enter Southern states to supervise and control local elections to ensure that they adhered to the principle of “one person, one vote” regardless of race. In the same year, amendments to the Immigration and Nationality Act eliminated discriminatory ethnic and racial quotas in the nation’s immigration system. Finally, in the wake of Martin Luther King’s assassination in 1968 and the subsequent wave of urban riots, Republican Senator Everett Dirksen brokered a compromise that broke a Southern filibuster to secure passage of a long-delayed Johnson project, the Fair Housing Act, which banned discrimination in the rental and sale of housing.13

The “Johnson treatment” was not reserved only for legislators who opposed civil rights legislation, however. The former Senate majority leader also aspired to be the greatest social legislator in U.S. history and, more particularly, to create a “Great Society” in which poverty no longer existed. By the late 1960s, Galbraith’s affluent society had reached its fullest expression. Poverty had declined steadily since 1945, income inequality had dropped, average income had risen, and the fruits of this abundance had been distributed widely as never before. The only thing that remained was to undertake a frontal assault on “pockets of poverty” that persisted in America’s inner cities and rural areas.

To accomplish this goal, in his first State of the Union address Johnson called for an “unconditional war on poverty” and submitted to Congress what became the Economic Opportunity Act of 1964, which emulated Roosevelt’s New Deal in the “alphabet soup” of agencies that it created to generate employment and foment opportunity, including Head Start, the Job Corps, the Neighborhood Youth Corps, Volunteers in Service to America, and the controversial Community Action Program, which sought to ensure the “maximum feasible participation” of the poor themselves as soldiers in the new War on Poverty.14

The flurry of social legislation also included bills to create Medicare (the federal health insurance system for the aged), Medicaid (the program of subsidized medical assistance for the impoverished), the U.S. Department of Housing and Urban Development (to promote desegregation and coordinate the war on urban poverty), the U.S. Department of Transportation (to channel funds into public mass transit), the National Endowment for the Humanities, the National Endowment for the Arts, the Corporation for Public Broadcasting (to promote wider access to cultural products), and the Model Cities Program (to clear slums and rehabilitate low-income urban neighborhoods). He also secured passage of the Elementary and Secondary Education Act and the Higher Education Act, which made billions of dollars available for public education, as well as the Air Quality Act and the Water Quality Act to combat pollution of the environment.15

Johnson’s civil rights agenda sought to guarantee full rights for all citizens not only theoretically but also practically, and new laws forbade discrimination not only on the basis of race but also of sex, religion, and national origin, a list that was later expanded to include age and disability. With the passage of the Twenty-Sixth Amendment to the Constitution, which lowered the voting age to eighteen, the liberal dream of universal suffrage and full equality before the law reached its zenith. In 1972, 98 percent of the U.S. adult population was eligible to vote.16 At the same time, the War on Poverty was having its effect. From 1963 to 1970, the poverty rate dropped from 22.2 to 12.6 percent, a 43 percent drop in just seven years.17

The Retreat of Liberalism

The Johnson administration proved to be the high water mark of liberalism in the twentieth century. By the early 1970s, more Americans had been enfranchised, both economically and politically, than ever before in U.S. history. Average household income was at its height, income inequality had reached a nadir, and the proportion of families in poverty would never again be so low. Liberalism appeared triumphant. But its moment of triumph was also that of its downfall, as 1968 witnessed the beginnings of a tectonic political realignment that ultimately created a new governing coalition, bringing to power a new cadre of conservatives who were hostile to government and deeply suspicious of the uses to which it had been put under successive Democratic presidents. What brought liberalism’s downfall after decades of stunning political successes that had benefited so many people in so many ways?

The Achilles’ Heel

There are many reasons for liberalism’s demise in the last quarter of the twentieth century, but first and foremost among them is race. Race proved to be the quintessential wedge issue that Republican tacticians could cynically but effectively use to pry apart the New Deal coalition that had steadfastly supported populist, redistributive policies—but only as long as they did not directly benefit blacks and other minorities. From FHA and VA mortgages to Supplemental Security Income, and from Social Security benefits to Aid to Families with Dependent Children, the liberal programs of Roosevelt’s New Deal and Truman’s Fair Deal had been structured either to exclude minorities from participation or to delegate authority to the states to do so.18 Once liberals insisted on cutting blacks and Latinos into the populist action, Southern Democrats bolted and Republicans pounced to hack away at the Achilles’ heel of the New Deal coalition.

Following a blueprint first adumbrated by Kevin Phillips, Richard Nixon put together a “Southern strategy” that appealed in coded, symbolic ways to antiblack sentiment prevalent in the South as well as to socially conservative working-class values in the North.19 This strategy became the bedrock of a new political coalition in which Southerners joined the Republican party and allied themselves with nonmetropolitan, fundamentalist Christians, urban blue-collar workers, and wealthy corporate interests to capture first the White House and ultimately Congress. One by one, Southern Democratic legislators crossed the aisle to switch parties, including former Democratic stalwarts such as Strom Thurmond (South Carolina), John Stennis (Mississippi), Jesse Helms (North Carolina), and James Eastland (Mississippi). Those Southern senators who remained in the Democratic fold (J. William Fulbright, Albert Gore, Sr.) were targeted for electoral defeat and slowly picked off.

In the course of two decades, the party of the Great Emancipator became the party of a revived neo-confederacy. Republican leadership shifted away from moderate figures such as Nelson Rockefeller (New York), Everett Dirksen (Illinois), Gerald Ford (Michigan), and Edward Brooke (Massachusetts) and into the hands of radical Southern conservatives such as Newt Gingrich (Georgia), Trent Lott (Mississippi), Tom DeLay (Texas), John Ashcroft (Missouri), and, of course, George W. Bush (Texas). The Californians Richard Nixon and Ronald Reagan proved to be but way stations on the road to the southernization of the Republican party.20

Though shocking to many, Senate Majority Leader Trent Lott’s unguarded remark in late 2002 should not have been surprising. At the 100th birthday celebration for the original Dixiecrat, Strom Thurmond, Lott crowed that “we voted for him. We’re proud of it. And if the rest of the country had followed our lead, we wouldn’t have had all these problems over all these years.”21 Indeed, as early as 1984 he boasted that “the spirit of Jefferson Davis lives in the 1984 Republican platform” and he expressed sympathy for the Jefferson Davis Society, a nonprofit organization founded in 1994 to honor the late president of the Confederate States of America.22

The political strategy of Republicans from Richard Nixon onward has been quite simple, a variation on Julius Caesar’s classic admonition to “divide and conquer.” Focus groups, political polls, and other research conducted during the 1960s, 1970s, and 1980s repeatedly showed the persistence of considerable animus toward blacks on the part of three crucial New Deal constituencies—ethnic blue-collar workers in the North, lower-class whites in the South, and nonmetropolitan whites in the Great Plains and Rocky Mountains.23 By appealing to antiblack sentiments in these constituencies, Republicans sought to convince lower-income whites to vote against their own economic interests in return for sticking it to “uppity” blacks and their elitist white allies (“limousine liberals”). From Ronald Reagan’s deployment of the “welfare queen” imagery, to George H. W. Bush’s infamous Willie Horton ad campaign, Republicans have successfully appealed to Americans’ baser prejudices and racist sentiments, deliberately dividing white from black to widen a ragged racial schism for short-term political profit.24

From the 1970s onward, conservatives have used race to mount a successful campaign to stop and then turn back the rising tide of civil rights, blocking ratification of the Equal Rights Amendment, launching court challenges and referenda to limit the use of affirmative action, watering down civil rights enforcement, and filling the judiciary with people hostile to the advancement of minorities in American society. In return for this symbolic stand against “undeserving minorities,” middle- and working-class whites acquiesced to a massive and unprecedented upward redistribution of wealth in American society, supporting the limitation of taxes at the state and local levels, the reduction of inheritance and dividends taxes at the federal level, and cuts in corporate and personal income taxes at all levels.25

Between 1970 and 1990, total tax payments became increasingly regressive and inequality surged: the effective tax rate experienced by the top 1 percent of the income distribution dropped from 69 to 26 percent, and from 1970 to 1997 their share of total U.S. income rose from 6 to 16 percent.26 Whereas the incomes earned by the poorest 60 percent of families actually declined from the mid-1970s to the mid-1990s, the top 5 percent saw theirs increase by a third and the top 1 percent saw theirs go up by 70 percent.27 Over the same period, U.S. household income inequality rose by a remarkable 13 percent,28 and inequality of wealth rose even faster as the share of wealth owned by the top 1 percent of households doubled from 20 to 40 percent.29

This rise in inequality would have been even greater if it had not been for the increase in hours worked by men and the unprecedented entry of women into the work force, which bolstered household incomes that were sagging in the face of declining real hourly wages. Disposable real earnings declined from an average of around $10.50 per hour in 1972 to under $9.50 in the mid-1990s. As hourly wages stagnated, households and families supplied more hours of labor simply to maintain their standard of living—Americans were running harder to stay in place. At the same time, the middle class was financing a larger share of total government expenses. From 1970 to 2000, the share of the federal tax burden borne by individual payroll taxes climbed from 18 to 31 percent.30

The Republican realignment also brought a dramatic decline in the regulation of markets and industries, handing the rich new opportunities to get much, much richer. The mean net worth of the Forbes 400 largest fortunes went from $396 million in 1983 to $1.6 billion in 1997, and over the same period the share of assets controlled by the wealthiest 1 percent of households jumped from 30 to 40 percent.31 Whereas the ratio between the income earned by chief executives and production workers stood at around 25 in 1968, by 1999 it reached 419. From 1990 to 1998, the average compensation package earned by executives in the largest corporations increased by 481 percent even though corporate profits rose by only 108 percent.32

If regressive tax policies were not enough to check the expansion of social welfare in the United States, they were accompanied, during the administrations of Ronald Reagan and George W. Bush, by massive increases in federal defense spending, which boosted the profits of government contractors but put severe fiscal pressures on federal discretionary and entitlement funding. The manufactured budget crisis provided the political leverage needed to enact massive cuts in social spending. Whereas indices of U.S. social health had risen in lockstep with increasing GDP through 1970, afterward trends in the social and economic health of the nation diverged from one another at an increasing rate.33 By pandering to the racial animus of white America and using antiblack sentiment as a wedge to force blue-collar Democrats away from their natural party, the southern takeover of the GOP succeeded by 2003 in dismantling much of the New Deal.

Other Reasons for Liberal Decline

As important as race is to understanding the collapse of liberalism in the late twentieth century, it is only part of the story. Whereas liberals might blame conservatives for pandering to Americans’ baser instincts or complain about the lamentable persistence of racial prejudice among whites, other reasons for liberalism’s decline are internal to the movement itself. Aside from the scar of race, the remaining wounds to the liberal agenda were largely self-inflicted and, taken together, they heartily abetted the Republican strategy of using race to divide and conquer the New Deal coalition.

Opposition to the programs of the Great Society was based only partly on race. As paradoxical as it may seem, resistance was also based on class, for by the 1970s the ruling elites of the Democratic party had grown increasingly arrogant, self-righteous, and callous toward the sensibilities of their working-class base. In addition to a racially coded symbolic politics, therefore, conservatives also appealed to simmering class resentments within the Democratic coalition, coining terms such as “pointy-headed intellectuals,” “nattering nabobs of negativism,” “poverty pimps,” “limousine liberals,” and “brie and Chardonnay activists” to conjure up images of affluent liberals who lived and worked in safe enclaves, where they dreamed up new programs to impose on working people, who bore the costs and experienced the consequences of liberals’ decisions.34

As the civil rights movement shifted out of the South and the War on Poverty confronted concentrated disadvantage in the urban North, liberal Democrats naturally encountered resistance from entrenched social and political interests that were threatened by the changes.35 Rather than acknowledging the sacrifices that were being asked of working-class whites and their political bosses, and attempting to reach a political accommodation that offered benefits to counterbalance them, liberal elites treated lower-class opponents as racist obstructionists to be squelched using the powers of government. Rather than outlining a political argument to explain why desegregation was in their interests and providing money to ease the pain of transition, liberals increasingly turned to the courts and executive branch to force working-class whites and local political bosses to accept whatever changes they mandated from above.

During the 1960s and 1970s, well-educated, affluent liberal planners created urban renewal programs that blithely deemed working-class neighborhoods to be “slums,” systematically stripped workers of cherished homes, and, after razing the dwellings, converted the land to “higher” (i.e., middle-class) uses from which the original inhabitants received little or no benefit.36 Without prior consultation or compensation, public housing for poor families was plunked down within stable, working-class neighborhoods to undermine property values and impair safety, something that residents bitterly noted never happened to the neighborhoods where the liberals themselves lived.37 High-priced lawyers, well-educated bureaucrats, and upper-middle-class activists worked assiduously to end the passing of jobs, union memberships, and apprenticeships through family and friendship networks, which perforce excluded minorities.38 These liberals themselves, however, benefited from an institutionalized system of “legacy admissions” that guaranteed places for their children at the nation’s best colleges and universities, thus ensuring the reproduction of their own privileged class position.

The well-schooled children of the baby boom came of age during a period of unprecedented affluence in which material security was taken for granted.39 They believed education and jobs were permanent birthrights rather than fragile gains achieved through hard-won economic and political struggles.40 In the 1972 presidential election, youthful activists took control of the Democratic party and managed to communicate quite clearly the contempt they felt for the patriotism, faith, and social conservatism of the white working class, whom they derided as “hard hats,” “red necks,” and “racists.” Faced with such overt contempt and welcomed as a “silent majority” by Richard Nixon, Northern ethnic voters and Southern whites deserted the Democratic party in droves to produce a landslide for the Republicans. The self-destruction of Richard Nixon in Watergate, followed by the anomalous victory of Jimmy Carter in 1976, sent the wrong signal to liberal Democrats, who preferred to see Nixon’s Southern strategy as a bad dream. They continued to neglect the warning signs of 1972, leading to a predictable return to Republican landslides in the 1980s.

The arrogance and self-righteousness of liberal elites manifested themselves in yet another way that had disastrous effects. The supreme irony of the Great Society is that the same liberal architects who promoted civil rights and social welfare also prosecuted a costly foreign war on the basis of lies, deception, and subterfuges that once again callously abused the faith and trust of the working class. As subsequent tapes and archives have clearly shown, liberals in the Johnson administration—including the president himself—manufactured a supposed attack on U.S. warships in the Gulf of Tonkin to secure a congressional authorization for military intervention in Vietnam. Then they systematically lied to voters about the costs and consequences of that engagement and its ultimate prospects for success.41

Aside from the betrayal of public trust, the Vietnam War also contributed to the demise of liberalism through fiscal means after 1968. Economically, Johnson’s attempt to support guns and butter without raising taxes laid the foundation for inflationary spirals and stagflation in the 1970s. The 1973 oil boycott would have dealt a serious blow to the U.S. economy under any circumstances, but the fiscal excess of the Great Society, combined with the Vietnam War, turned what in Europe and Japan were severe but manageable recessions into a disastrous brew of inflation, unemployment, and long-term recession in the United States.

A particular challenge to liberals stemmed from the fact that high rates of inflation in the 1970s produced rising nominal wages but declining spending power in real terms, causing a serious problem of “bracket creep” in the federal tax system. In the course of the 1970s, more and more Americans were pushed by inflation into income tax brackets that were originally intended to apply only to the very affluent. Middle-income Americans were working harder for less money in real terms, but were being taxed at higher and higher rates.42

High inflation also brought about an escalation in the value of real assets, particularly housing. Families with modest incomes suddenly found themselves owning homes—and paying real estate taxes—far above what they could really afford.43 Rather than sympathizing with the plight of middle-class families struggling to pay taxes in an era of stagflation, however, liberals viewed rising tax revenues as a source of easy money. Bracket creep and asset inflation offered legislators a seemingly costless way to raise taxes steadily without ever voting to do so.

But there were indeed costs. The unwillingness of Democratic legislators to adjust tax brackets or accommodate the inflation of housing prices set the stage for a middle-class tax revolt. As is often the case, the revolution began in California. By a large majority, voters in that state passed Proposition 13 in 1978 to cap property taxes permanently at unrealistically low levels, which led directly to a sustained decline in the quality and quantity of California’s public services, notably education.44

Riding the wave of middle-class anger and resentment, Ronald Reagan promised “morning again in America” and won a landslide victory over the hapless Jimmy Carter in 1980. One of his first acts was to cut tax rates sharply and to reduce their progressivity, which, when combined with a massive increase in defense spending, shut off the flow of money that had financed the expansion of liberalism. Following a path that led from intervention in Vietnam to hyperinflation to bracket creep, liberal Democrats—through a remarkable combination of arrogance and self-righteousness—dug their own graves in the 1970s and created the political conditions whereby conservatives could achieve their cherished goal of “de-funding” the New Deal.

The Vietnam War had one more effect on the American electorate that was less tangible but no less powerful: it forcefully underscored that liberal elites made the decisions while working-class whites paid the price, thus reinforcing a politics of class resentment manipulated so effectively by conservative Republicans. The soldiers who fought and died in Vietnam were disproportionately drawn from America’s working and lower classes.45 The sons and daughters of upper-middle-class professionals—the people who held power, influence, and prestige in the Great Society—by and large did not serve in Vietnam. On the contrary, they evaded the draft through a combination of student deferments, personal connections, and a skillful use of medical disabilities. To add insult to injury, as they sat out hostilities on campus, they very vocally and visibly protested the war in southeast Asia and branded U.S. soldiers as “war criminals” and “baby killers.” Tellingly, once the system of student deferments was abandoned and the children of the upper-middle class faced the real risk of being drafted through random assignment, direct U.S. participation in the war quickly ended.

To blue-collar workers in the North and poor whites in the South (the latter always being over-represented in the U.S. military) it seemed as though liberal lawmakers favored the war as long as someone else’s children were serving and dying as soldiers, but as soon as their precious offspring were put at risk, they quickly ignored the sacrifices of the working classes, forgot about the 60,000 dead, and abandoned hundreds of POWs and MIAs in their haste to leave Vietnam. The ultimate result was the evolution of a working-class mythology of sellout by unpatriotic liberal elites (“America haters”), epitomized cinematically by the movies and roles of Sylvester Stallone, Chuck Norris, and Clint Eastwood, whose tag lines were appropriated to great political effect by Ronald Reagan. Although liberal Hollywood critics turned up their noses at Rambo and his ilk, the movies were immensely successful at the box office and clearly tapped into a rich vein of popular resentment against liberal elites, which liberals once again ignored at their peril.

During the 1980s and 1990s, as liberal Democrats began to be driven from the public sphere by the politics of race, combined with their own self-righteous blindness and arrogance, they responded in two equally unproductive ways. One response sought to remake Democrats as Republicans under the aegis of the Democratic Leadership Council, positioning candidates in the political market as “Republicans lite.”46 This group achieved power and prominence under the charismatic leadership of Bill Clinton; but absent his charisma and lacking a clearly defined ideology to oppose the Republican right, they had the rug pulled out from under them in 2000 and went on to humiliating defeat in 2002. After all, faced with a choice between real and ersatz Republicans, why not pick the real thing? Americans never like phonies.

As the mainstream of the Democratic party turned rightward in a failed bid to emulate Republicans, other liberals retreated to the safe confines of academia, where under the banner of postmodernism, deconstructionism, critical theory, or, more popularly, “political correctness,” they prosecuted what became known as the “culture wars.”47 In the course of this new campaign, liberalism on campus became an Orwellian parody of itself, suppressing free expression to ensure liberal orthodoxy and seeking to instill through indoctrination what it could not achieve politically at the polls.48 To the delight of conservatives everywhere, liberals often ended up attacking each other—seeking to unmask a white male as a closet racist, and ferreting out the last vestiges of racism, sexism, classism, and ageism wherever they might remain, even in the nation’s most liberal quarters. Authors such as Dinesh D’Souza, Allan Bloom, Roger Kimball, and Robert Bork had a field day lampooning the tortured logic, breathless rhetoric, and impenetrable jargon offered up by the priesthood of postmodernism, further alienating liberals from their base among the poor and working classes.49

The essence of postmodern ideology is that there is no such thing as objective truth. There is only a subjective reality constructed by the powerful through a “hegemonic discourse” that serves the narrow interests of the privileged.50 The goal of postmodern scholarship is to “deconstruct” the hegemonic discourse to reveal how it serves the interests of power and privilege and then to substitute a counter-narrative that advances the interests of persons formerly marginalized in terms of class, race, ethnicity, gender, and sexual preference, and that celebrates their characteristics.

There is nothing illiberal about defending the interests of the marginalized and advocating the rights of the oppressed, of course. But whereas the instinct to stand up for the little guy may be liberal, the conceptual apparatus of postmodernism is not. Ultimately it is a neo-fascist ideology that seeks little more than to substitute one tyranny for another. Rather than seeking to advance an open society in which the rights and opinions of all citizens are protected, it seeks to create a new social world in which an alternative hegemonic discourse favorable to the oppressed is imposed by an intellectual vanguard acting in the “interests of the people.”

The vocabulary of postmodernism, however, is so arcane, the prose so dense, and the propositions so convoluted that they can be mastered only with a great investment of time and energy by a leisured class of intellectuals, either independently wealthy or working on someone else’s nickel. Anyone who has ever tried to digest a postmodern tract quickly realizes that contempt for the uninformed and un-elect is built into the corpus of critical social theory. Is it any wonder that right-wing critics like Dinesh D’Souza find a ready mass audience when they rise to oppose the avatars of academic liberalism?

Liberalism at the Crossroads

Liberals accomplished great things during the first three-quarters of the twentieth century. They extended full constitutional rights to women, racial minorities, and young adults, finally giving the right to vote to a true majority of the U.S. population.51 They achieved the direct popular election of U.S. senators and established control over corporate trusts to rein in the power of the wealthy. They established the right of the federal government to manage the U.S. economy in the public interest and secured a place for workers at the economic bargaining table. They created a limited but serviceable system of social welfare that protected citizens from the worst ravages of market failures and provided basic access to health care. They created programs to underwrite markets for credit, capital, and insurance that dramatically increased household wealth and protected family incomes. Finally, they enacted programs to ensure that the civil rights of all Americans were honored and that the natural environment—air, water, land, forests—was protected.

As a result of these achievements, in 1975 Americans were healthier and wealthier than ever before, and collectively they had a greater say in their own governance than at any time in U.S. history. Since that time, however, liberals have stumbled badly and let down key elements of their constituency—and the nation as a whole—in a variety of fundamental ways. When they encountered resistance to black civil rights among poor and working-class whites—some of it racially motivated, some of it not—rather than dealing with the resistance politically, liberal elites sought to impose solutions from above by taking advantage of their privileged access to judicial and executive power. Then, rather than telling Americans honestly about the likely costs and consequences of a military intervention in southeast Asia and trusting them to make the correct decision, they used lies and deception to trick voters into supporting an unwinnable war that was fought mostly by the poor and working classes; and when the war came too close to home, they quickly forgot about the lower-class combatants and the sacrifices that they had made. Then, after liberals’ attempt to fund both guns and butter set off the inflation that eroded the real value of wages, they callously thought up new ways to spend the windfall of tax revenue rather than adjust tax brackets to relieve the unsustainable burden on the middle class. Finally, when faced with political revolt because of these misguided policies, they either sold out and attempted to appropriate conservative positions for the sake of election or they retreated into arcane ideologies to wage a rearguard cultural insurgency from the safety of the ivory tower.

From several perspectives, then, the last quarter of the century was a disaster for the cause of liberalism and a travesty for those constituencies it ostensibly represented. Rather than moving forward to extend and enforce civil rights, liberal strategies since 1975 sparked a counter-revolution that steadily chipped away at earlier gains. Liberal arrogance drove poor and middle-class whites into the waiting hands of conservatives, who used their support to engineer a massive upward redistribution of wealth and income. As a result, since 1975 the rich have become richer, middle-class families have worked harder to stay in place, and the poor have steadily grown poorer. As the needs of the poor and middle class grew in response to the rising inequalities, social services were cut and the safety net trimmed.

As a result, on virtually every indicator of social and economic well-being, the vast majority of Americans are no better off, or are even worse off, than they were in 1975. At the dawn of the twenty-first century, the United States has evolved a political economy that clearly and consistently benefits only the top fifth of the income distribution. These people enjoy unprecedented wealth and magnificent health, and they increasingly live and work in safe, secure, and luxurious enclaves that are removed socially and spatially from the rest of American society. To preserve the status quo that has lavished upon them this beneficence, they vote in large numbers, participate heavily in politics, and donate substantial sums to candidates and political organizations that reflect their interests. Just as earlier in the century, the U.S. Senate has increasingly become a club for millionaires (there are at least forty at last count) instead of a deliberative body that is representative of the American people.52

The obvious question is why and how a political economy that fails to deliver social and economic welfare to four-fifths of the population is allowed to persist in a democratic society. Part of the answer lies in the low rates of political participation and voting among those not in the top fifth of the class distribution. The explanation also lies in the disproportionate resources at the disposal of the favored classes and in the growing importance of money in politics. But a more fundamental problem is the lack of a coherent political alternative, the absence of a clear and convincing program of opposition that the unfortunate 80 percent can believe in, contribute to, and ultimately vote for.

The limitations of current social and economic arrangements are clear in national statistics that chart trends and differentials with respect to income, health, wealth, and work. People are aware of the situation and are eager for change, but liberals have not been able to articulate a salable political vision. Neither the watered-down Republicanism of the Democratic Leadership Council nor the inward-looking identity politics of the campus offers an appealing model for political progress. In politics a failed paradigm is rarely discarded because it is seen to have problems. As in science, a paradigm is rejected only when a better model is available to serve as an alternative. During the past quarter-century, liberals have failed miserably at offering such an alternative.

The first national elections of the millennium were embarrassments. In 2000, Al Gore sought to replicate Clinton’s success without his political skills or personal magnetism; but whereas Clinton’s chameleon personality could read the political winds, shift course, and still offer a convincing message to voters, Al Gore’s wooden attempts to do the same thing simply made him look phony, insincere, and ultimately untrustworthy. With no coherent ideology and a candidate lacking in charm and grace, voters had little reason to support the Democratic platform, creating circumstances in which the election could be stolen against the wishes of most voters. Liberal ineptitude let George W. Bush become a minority president through political chicanery at the state level and suborned justice at the national level.

Given the closeness of the 2000 elections, many looked to the midterm elections of 2002 for a corrective. But it didn’t happen. Despite a weak economy, the steady movement of the nation toward war in Iraq, and little tangible progress in the war on terror, the Republicans retook control of the Senate and increased their margin in the House of Representatives. The problem, once again, was the lack of a coherent, believable platform of opposition. If one needed convincing evidence that the strategy of “Republican lite” was a loser, the elections of 2002 supplied it.

The purpose of this book is to offer a coherent liberal alternative to the reigning conservative ideology. The Republican coalition is weak and vulnerable, out of touch with the mainstream and disconnected from the values, beliefs, and aspirations of most Americans. What is missing to convert this vulnerability to political victory is a convincing platform that tells voters why it is in their interests—and in the interests of the nation—to support liberal causes and candidates. Electoral victory cannot be accomplished by running away from the liberal label, by donning the garb of Republicans. If Americans are good at one thing, it is spotting phonies, and they will never support people who try to be something they are not. The time has come to end the charade and for candidates to step forward and say “I am a liberal and this is what I believe.” This book offers them something to say and a program of action to undertake.