The 1950s was a decade of political retrenchment across the region. The cold war contributed to this trend. Segregationist Southerners denounced civil rights activists as either outright Communists or tools of the Communist conspiracy. Senator James Eastland of Mississippi chaired the Senate Internal Security Subcommittee, which regularly called witnesses to testify about alleged links between the civil rights movement and politically subversive organizations. Southern state legislatures held similar hearings, investigating civil rights groups for alleged ties to subversive organizations; in some cases these investigations became the basis for new laws designed to deter civil rights organizing.
More than anything, however, the Supreme Court’s 1954 decision in Brown v. Board of Education, striking down school segregation laws, precipitated white Southerners’ organized resistance to racial change. The Ku Klux Klan experienced a third wave of revival in the region, following the Reconstruction period and its resurgence in the 1920s. Organized resistance also took a more middle-class form with the Citizens’ Council, which began in 1954 in the Mississippi Delta. The councils styled themselves as a modern political interest group, replete with a monthly publication and a series of television programs featuring interviews with policy makers. Council leaders denounced violence publicly, but they often turned a blind eye or even subtly encouraged violence by working-class whites. This grassroots organizing underwrote high-profile acts of resistance during the late 1950s and early 1960s. In 1957 Governor Orval Faubus of Arkansas, who up to that point had been viewed as a racial moderate, defied federal authorities by blocking the entrance of African American students into Little Rock’s Central High School. Three years later, white resistance to school desegregation in New Orleans led to violent clashes throughout the city. In 1962 Governor Ross Barnett of Mississippi led his state into a constitutional showdown with President John Kennedy over the admission of the African American James Meredith to the University of Mississippi.
After George Wallace won the Alabama governorship in 1962 on a hard-line segregationist platform, he had a Klansman draft his inaugural address. Wallace defied Kennedy administration officials in 1963, standing “in the schoolhouse door” to symbolically block the admission of African American students to the University of Alabama. In 1964 he became the face of the white backlash against civil rights when he ran surprisingly well in Democratic presidential primaries in Wisconsin, Indiana, and Maryland. Four years later, running as the candidate of the American Independent Party, Wallace narrowly missed his goal of throwing the election into the House of Representatives. He lost North Carolina and Tennessee by slim margins; a win in either of those states, accompanied by a shift of less than 1 percent of the vote from Richard Nixon to Hubert Humphrey in New Jersey or Ohio, would have been enough to accomplish that goal. He was reelected as governor in 1970 in a notoriously racist campaign. In May 1972 he was a leading candidate in a chaotic race for the Democratic presidential nomination when he was shot while campaigning in Maryland. He survived the shooting but was confined to a wheelchair for the rest of his life, all but ending his national political aspirations.
Yet Wallace’s impact could be measured in other ways. In one sense, Wallace helped “southernize” national politics. His parodying of government bureaucrats, liberal elites, and anti-American political activists gave a populist bent to the post–World War II American conservative movement. Up to that point, conservatism consisted largely of a loose coalition of corporate executives, renegade intellectuals, and anti-Communist hardliners. Wallace, however, pioneered appeals to white working-class and lower middle-class Americans, pleas that worked equally well outside the South. Wallace was elected governor of Alabama twice more, in 1974 and 1982, and he returned to his racially moderate, economic populist roots. By that time, however, the antigovernment slogans that had sustained his national aspirations had been taken up by a new generation of ideological conservatives in the Republican Party.
For many people, Lyndon Johnson summed up the conventional wisdom on the southern GOP in the 1960s. On the night he signed the 1964 Civil Rights Act, Johnson lamented to an aide, “I think we just delivered the South to the Republican Party for a long time to come.” Yet white racism was not the only factor spurring two-party politics in the South. Urban Southerners showed their distaste for one-party rule in the 1950s, when a majority of the region’s city dwellers twice voted for the modern Republicanism of Dwight Eisenhower. Business-oriented, racially moderate urban and suburban Southerners were an important source for the growing southern GOP. Federal court decisions in the early 1960s that upheld the principle of “one man, one vote” gave a boost to moderate metropolitan Southerners. Thanks to outdated apportionment laws, rural interests had dominated southern politics for years. Under the county unit system in Georgia, for example, each county was assigned “units” that ranged from two units for the smallest county to six for the largest. As a result, residents of rural counties held political power far beyond their proportion of the state population. A vote in tiny Echols County, Georgia, was worth 99 times as much as a vote in Atlanta.
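The 99-to-1 disparity follows from simple arithmetic. As a rough illustration, assuming the approximate 1960 populations cited in Gray v. Sanders, the case that struck down the system (about 1,876 residents in Echols County, which held 2 unit votes, and about 556,326 in Fulton County, home of Atlanta, which held 6), the relative weight of a rural vote was

\[
\frac{\text{residents per unit vote in Fulton}}{\text{residents per unit vote in Echols}}
  = \frac{556{,}326/6}{1{,}876/2}
  = \frac{92{,}721}{938}
  \approx 99 .
\]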
Metropolitan Southerners joined urban and suburban citizens of other expanding areas across the Southwest and far West to make up what commentators came to describe as the Sunbelt. From the 1960s through the end of the century, Sunbelt states from the Carolinas to California were the most economically dynamic areas of the country. Southern state industrial programs attracted new industries through a mix of tax breaks and other economic subsidies, and southern legislatures passed right-to-work laws that suppressed union membership. Cold war military spending benefited the Sunbelt disproportionately; military contracts poured into southern states like Texas, Florida, and Georgia. Powerful southern congressmen directed other defense dollars into their home districts. These new Sunbelt jobs attracted college-educated middle- and upper middle-class migrants to the region. The 1960s was the first decade since the 1870s that more people moved into the South than out of it. Many of these new Southerners settled in expanding suburban neighborhoods. From 1960 to 1968, the suburbs of Houston grew by 50 percent, New Orleans 45.5 percent, Washington 39.2 percent, and Atlanta 33.6 percent. During the 1960s, per capita income in the South increased 14 percent faster than in any other region.
These new Sunbelt residents were a natural fit for the Republican Party. Yet, despite the dynamic social and economic changes in the region, southern GOP advances in the 1960s and 1970s were surprisingly mixed. Republican presidential candidates did well across the region, but the party struggled in state and local elections. One reason was that the hard-line conservatives who built the southern parties were still a minority within a national party in which moderate and liberal Republicans played a major role. Also, in many parts of the South, Republican candidates struggled to shed an image as the party of the country club set.
Most important to Democratic perseverance, however, was the 1965 Voting Rights Act, which restored voting rights to thousands of African American voters across the region. African Americans first started to drop their historic allegiance to the party of Lincoln during the New Deal, and thanks to liberal Democrats’ passionate support for civil rights, these new southern black voters almost uniformly identified as Democrats. In southern states with African American populations that ranged from a quarter to well over a third of the total population, successful Republicans had to amass supermajorities among white voters.
From these electoral dynamics were born the New South Democrats: progressive, racially moderate, practical-minded politicians who assembled coalitions of working-class whites, black voters, and urban liberals. Prominent examples included Reubin Askew and Lawton Chiles of Florida, Jimmy Carter and Sam Nunn of Georgia, Dale Bumpers and David Pryor of Arkansas, John West and Richard Riley of South Carolina, and Bill Waller and William Winter of Mississippi.
No politician better symbolized the regional and national potential of New South Democrats than Jimmy Carter, the Georgia governor who vaulted into national politics in the wake of the Watergate crisis. Carter pursued centrist policies that angered the liberal wing of his party. He insisted on a balanced budget and attempted to cut spending and jobs programs that were bedrocks for liberal Democratic constituencies. He appointed a staunch advocate of deregulation as head of the Civil Aeronautics Board and signed legislation that deregulated a number of industries. But Carter’s major failure was one of timing—he presided over the White House during a period when the historic post–World War II economic boom petered out. His talk of limits and the need for Americans to scale back was rooted in his Southern Baptist faith, his sense of humility and stewardship. Political opponents, however, easily parodied it as rudderless leadership and weak-kneed defeatism.
A notable aspect of Carter’s politics was his open discussion of his religious faith. It reflected his genuine personal devotion, but it also played into his appeal as the post-Watergate antipolitician—a man who would never lie to the American people. It was ironic then that Carter, who spoke sincerely of his personal relationship with Jesus Christ, would come to be so vehemently opposed by other southern Christian conservatives. White Southerners played prominent roles in what came to be known as the Religious Right. Jerry Falwell helped establish the Moral Majority in 1979, a conservative Christian advocacy group that was credited with playing a major role in Ronald Reagan’s successful presidential campaign in 1980. Pat Robertson headed the Christian Broadcasting Network in Virginia Beach, Virginia, which became a focus of Religious Right broadcasting in the 1970s and 1980s. Jesse Helms of North Carolina was the leading voice in the Senate for conservative Christian concerns.
By the 1980s, the Religious Right was a key constituency in Ronald Reagan’s conservative coalition. Reagan’s charisma and Hollywood glamour won over countless white Southerners. No figure did more to encourage white Southerners to shift their political identity from the Democratic to the Republican Party. Reagan articulated in a genial way the reaction against social and political liberalism that had been such a defining part of the region’s modern politics. For the most conservative white Southerners, Reagan’s election provided a sense of vindication that their opposition to the transformations of the 1960s was not so misguided after all. Southern Republicans played leadership roles in the dominant conservative wing of the party. Most prominently, in the 1994 midterm elections, Newt Gingrich of Georgia orchestrated the Republicans’ Contract with America, a set of conservative policy positions that was credited with helping the GOP gain control of the House of Representatives for the first time in 40 years.
Despite Republican successes, moderate southern Democrats continued to exert a powerful influence on national Democratic Party politics. Bill Clinton showed the lingering power of the moderate New South model when he teamed with fellow Southerner Al Gore to become the first twice-elected Democratic president since Franklin Roosevelt. He did so following the same centrist path that Jimmy Carter had blazed in the 1970s. Clinton declared that the era of big government was over, signed the North American Free Trade Agreement, and backed a welfare reform bill that alienated liberals in his own party. He might have continued to provide moderate pragmatic leadership for the Democrats—the only path that had provided any significant electoral gains for the party since the 1960s—had the final years of his presidency not been marred by personal scandal.
By the twenty-first century, the South remained a source of consternation for progressive political forces, the seeming heart of Republican-dominated “red America.” Some Democrats counseled their party to hand the region over to the Republicans, to “whistle past Dixie,” and focus on liberal ideas that would appeal to voters in more traditionally progressive areas of the country. Other Democrats argued that Southerners were no different from other Americans, that they were motivated by the same concerns about economic security, health care, and education. That position was bolstered in the 2008 presidential election, when Barack Obama carried Virginia, North Carolina, and Florida—victories made possible by a huge turnout of African American voters.
In the roughly 135 years since the end of Reconstruction, the South underwent enormous transformations. The differences between the region and the nation had eroded enough to lead many to question what, if anything, remained distinctive about the South. And yet within national politics, many Americans still found it relevant to talk about the South as a discrete entity, a place with a unique past that continued to shape its politics in subtle yet powerful ways.
See also Democratic Party; race and politics; Reconstruction Era, 1865–77; segregation and Jim Crow.
FURTHER READING. Edward Ayers, The Promise of the New South, 1992; Numan Bartley, The New South, 1945–1980, 1995; Numan Bartley and Hugh Davis Graham, Southern Politics and the Second Reconstruction, 1975; Earl Black and Merle Black, Politics and Society in the South, 1987; Idem, The Rise of Southern Republicans, 2002; Joseph Crespino, In Search of Another Country: Mississippi and the Conservative Counterrevolution, 2007; V. O. Key, Southern Politics in State and Nation, 1949; Matthew Lassiter, The Silent Majority: Suburban Politics in the Sunbelt South, 2005; Bruce Schulman, From Cotton Belt to Sunbelt: Federal Policy, Economic Development, and the Transformation of the South, 1938–1980, 1991; George Tindall, The Emergence of the New South, 1913–1945, 1967; C. Vann Woodward, Origins of the New South, 1877–1913, 1951.
JOSEPH CRESPINO
John Hay, then secretary of state, called it the “splendid little war.” Almost 100 years after its start, historian Walter LaFeber called it the first modern war of the twentieth century for the United States. The Spanish-American War, and the more than decade of fighting that followed in the Philippines, reflected the tremendous growth in U.S. power in the late nineteenth century and the changing nature of domestic and international politics involving the United States. This war also provided the occasion for introducing some of the policies that shaped the U.S. polity as it moved into the twentieth century. The Spanish-American War was particularly influential in six areas: (1) development of ideas and practice of the modern presidency, (2) modification of traditional U.S. military doctrine, (3) reflection of new approaches to democratic politics and public opinion, (4) opportunities for state building, (5) creation of layers of empire, and (6) new struggles over the nature of American identity.
Cubans organized as early as the 1860s to achieve independence from Spain. Interest in Cuba was strong because of U.S.-owned sugar and tobacco plantations, but Americans were also altruistically sympathetic to the Cuban cause. Mass-circulation newspapers and U.S. labor organizations supported the Cubans, increasingly so through the 1890s, as Spanish repression of the Cuban independence movement became harsher.
President William McKinley, elected on the Republican ticket in 1896, was cautious but determined to support U.S. interests. He felt compelled in early 1898 to send the USS Maine to Havana harbor, where a few weeks later, it blew up. The “yellow press,” led by William Randolph Hearst’s New York Journal, issued the battle cry “Remember the Maine.” Congress declared war in April 1898, but the U.S. fleet had already begun moving into position near Spanish possessions in both the Caribbean (Cuba and Puerto Rico) and the Pacific (Philippines). The small and under-trained U.S. Army was augmented by numerous enthusiastic volunteers, including the famous Rough Riders, organized by Teddy Roosevelt, who benefited from admiring publicity. The navy performed better than the army, but both performed better than the Spanish, and the fighting was over within a few weeks, costing the United States few dead, and most of those from disease rather than battle. During the war, the United States annexed Hawai‘i. As a result of the war, although not until 1902, Cuba became independent. Puerto Rico, Guam, and the Philippines became U.S. colonies, technically called “unincorporated territories.”
Filipinos also had been fighting for their independence from Spain. While initially working with U.S. forces or tolerating their presence, Filipino independence fighters soon realized U.S. liberation from Spain would not mean independence, and they took up arms against U.S. soldiers. This part of the war was costly for the United States, with more than 70,000 U.S. troops in the islands at the peak of the conflict and U.S. deaths of more than 4,000. At least 20,000 Filipinos were killed as a direct result of fighting. The Philippine Insurrection, which Filipinos call the Philippine-American War, officially lasted until 1902. Fighting continued in various parts of the islands until 1913, especially in the southern island of Mindanao, which has never fully acquiesced in any kind of rule from Manila, the capital of the Philippines.
The narrative above is factual and familiar but obscures more than it reveals. McKinley appears passive in it, reacting to popular media and events in both Cuba and Congress. This image of McKinley prevailed for years among scholars, many of whom repeated Theodore Roosevelt’s claim that he had the backbone of a chocolate éclair. Timing of the declaration of war suggests otherwise, however. Both a majority in Congress and many mass-circulation newspapers had been advocating war for months before McKinley submitted his carefully crafted request for a declaration of war. McKinley drafted the declaration so that it allowed him to pursue almost any policy he wanted, subject only to the promise in the Teller Amendment not to annex Cuba.
The president was more than merely astute, however. First, he was thinking in an integrated manner about U.S. global interests and was working to coordinate the consequences of the war to serve a variety of those interests: maintaining an open system in China for U.S. trade and investment, securing sufficient control over areas such as Cuba, where U.S. investment was substantial and growing, and creating forms of control in both Asia and the Caribbean consistent with the loose, minimally bureaucratic character of twentieth-century U.S. imperialism.
Second, McKinley used new technology effectively to increase his own power. Both the telegraph and telephone allowed him rapid, personal contact with other U.S. officials and the military in the field. He used these advantages to communicate directly and left less of a paper trail than previous presidents, which had the effect of decreasing freedom of action by subordinates while also making it more difficult for historians to trace the ultimate authority for decisions. Finally, McKinley and his closest personal advisors were men who believed in applying principles for the efficient organization of large corporations to running the government. They worked to continue professionalizing, organizing, and making government bureaucracy more effective. This too increased the power of the executive, especially at the expense of the still amateur and small congressional staffs. With or without the Spanish-American War, McKinley would have worked to increase executive power; the war gave him a large canvas on which to work.
The military, both navy and army, naturally was the government entity initially most affected by the war. The traditional narrative emphasizes the effectiveness of the U.S. Navy, which, while not yet impressive in comparison with the British Navy, had several able advocates who had successfully promoted the acquisition of modern and far-ranging ships. Alfred Thayer Mahan epitomized this group, which saw a larger, better-trained navy as essential to projecting U.S. power into the world’s oceans in support of increased U.S. commerce and control. The U.S. Navy handily demonstrated its superiority over the Spanish, even defeating a Spanish fleet at such far reaches as the Philippines. The battle in Cuba, for which the Spanish were more prepared, went scarcely better for them. The Spanish-American War confirmed for these navy advocates that they had been right; the acquisition of far-flung colonies provided them with continuing justification for a large and modern navy.
The U.S. Army generally looks less capable in accounts of the Spanish-American War. Its tiny size of fewer than 30,000 men in 1898 meant that the war could not be fought without calling on thousands of volunteers, no doubt eager but ill-trained, and the militias, perhaps less eager and trained for different tasks. Logistics proved embarrassingly poor: U.S. soldiers lacked proper clothes for the tropics, were fed poisonously bad food, and died of disease or poorly treated wounds in greater numbers than from battle. The “splendid little” part of the war, the fighting against Spain, revealed an army poorly prepared for the type of fighting required. The next task was neither splendid nor little: the U.S. Army was called on to subdue Cubans and Filipinos who had different ideas than did U.S. officials about what the end of the Spanish-American War meant. This fighting often was characterized by brutality, as both regular army and militia employed tactics of repression or even extermination they had learned fighting against Native Americans, and which all too often resembled what the Spanish had done to their former subjects. Simultaneously, however, the army was the first to carry out the “benevolent” components of U.S. rule, including improving sanitation, building infrastructure, and opening schools. Violence and benevolence were intertwined, as they usually are in imperial projects. The U.S. Army began to develop nation-building capacities that have characterized its mission up to the present day.
The long buildup to a declaration of war allowed plenty of political maneuvering and public involvement, displaying key developments in late-nineteenth- and early-twentieth-century domestic political organization. A familiar part of the story of the Spanish-American War is the way the “yellow press” promoted sympathy for Cubans. These mass-circulation newspapers continued the American tradition of a partisan press but depended on technological developments and increased literacy to present ever more realistic, if also lurid and emotional, images to an entranced public. The press did not create the war, but it did create conditions in which Americans enthusiastically accepted a war arguably remote from the interests of ordinary citizens. Public opinion was led in directions that served a variety of interests.
As the importance of political parties began to wane in the early twentieth century, presidents and newspaper publishers began to appeal directly to the mass public, unmediated by the party hierarchy. President McKinley went on a speaking tour with the stated purpose of gauging public support for acquiring a colony in the Philippines but with the hidden intent of promoting public support for that action. Much of the language used in these public appeals and discussions about the war and the responsibilities stemming from it reflected concerns about honor and manliness. Cubans, and later Filipinos, needed chivalrous rescuers; the Spanish deserved punishment for their misdeeds from honorable soldiers; American men could bravely demonstrate their willingness to sacrifice on the battlefield. Roosevelt’s Rough Riders, volunteers from all walks of life from the most rough-and-tumble to the most elite, epitomized for many the benefits of testing American men in battle. At the turn of the twentieth century, American men were less likely than in preceding decades to vote and participate actively in party politics, more likely to work in large, hierarchical organizations, and to learn about the world through the medium of mass-circulation newspapers. Politicians used these developments to shape public attitudes about the war and the consequences of it.
Although the war itself was relatively short and easily won, it posed logistical challenges to an underdeveloped U.S. state. Both the war and the overseas colonies acquired as a consequence provided officials with opportunities to build U.S. state institutions. The military was the most dramatic example. The navy began to develop more far-reaching capacities in the years leading up to the war, and the army followed suit during and after the war. Both branches acquired permanent overseas responsibilities. The logistical requirements of permanent deployment outside the continental United States help explain trends toward professionalization, bureaucratization, and growth of both the army and navy in the early twentieth century. The decision to locate the Bureau of Insular Affairs, the government agency charged with governing the colonies, in the War Department further increased that department’s growth. Even in ways not explicitly related to fighting the war or governing the colonies, U.S. governmental institutions took on new responsibilities as a result of the war, including some related to immigration and the conduct of foreign relations.
A key outcome of the Spanish-American War was the acquisition of overseas territories, arguably for the first time in U.S. history. The United States became an imperial power, owning colonies it had no intention of incorporating into the nation as states. It newly ruled over Hawai‘i, Puerto Rico, Guam, and the Philippines directly. The United States also exercised a large amount of indirect control over Cuba through the mechanism of the Platt Amendment, a U.S. law whose substance was written into the Cuban constitution. It placed limits on Cuban sovereignty regarding financial affairs and the nature of Cuba’s government, mandated U.S. ownership of a base at Guantanamo, and forced Cuban acquiescence in U.S. intervention to guarantee these measures.
The U.S. empire was a layered one. Cuba experienced effective control, but indirectly. Hawai‘i was governed as an incorporated territory, theoretically eligible for statehood, but its racial mix made that an unappealing prospect for many Americans. Hawai‘i did not become a state until 1959. Guam was ruled directly by the U.S. Navy—which used it as a coaling station—and it remains part of the United States, governed by the Office of Insular Affairs in the Department of the Interior. Both the Philippines and Puerto Rico were governed directly as colonies through the Bureau of Insular Affairs, but their paths quickly diverged. Puerto Rico developed close links with the United States through revolving migration, economic and tourism ties, and increased political rights for its citizens. Puerto Rico is still part of the United States, as the Commonwealth of Puerto Rico. The Philippines developed more modest and ambiguous relations with the United States, since Filipinos had restricted migration rights, and U.S. economic investment in the islands was limited. The Philippines achieved independence in 1946. The layered and decentralized nature of the U.S. empire developed out of the particular legal and political processes used to decide how to rule over territories acquired in the Spanish-American War. These decisions were widely and publicly debated in the early twentieth century, as Americans wrestled with the changing nature of territorial expansion involving overseas colonies.
The Spanish-American War and the resulting acquisition of colonies prompted heated debates in the United States about what it meant to be American. These debates may well be among the most important consequences of the war for the nation. One set of arguments revolved around whether the United States should acquire overseas colonies, and if so, how they should be governed. A vocal and prominent anti-imperialist movement had many older leaders, representatives of a fading generation. Most politicians of the day advocated acquiring the colonies as a demonstration of U.S. power and benevolence.
Still, there remained a contentious debate about the status of these new territories, legally settled only by the U.S. Supreme Court in the Insular Cases, beginning in 1901 with Downes v. Bidwell and confirmed subsequently by almost two dozen additional cases. The 1901 decision found that Puerto Rico was “not a foreign country” but “foreign to the United States in a domestic sense.” In other words, not all laws or constitutional protections extended to unincorporated territories such as Puerto Rico and the Philippines. Many Americans were disturbed by these decisions, finding no provision in the Constitution that anticipated ruling land not intended to be part of the United States. They worried that an important part of U.S. political identity was being discarded.
Overriding those concerns, however, was a strong desire on the part of almost all white Americans to avoid the racial implications of incorporating places like the Philippines and especially Cuba into the body politic. Jim Crow segregation was established by the 1890s, and colonial acquisitions promised to complicate an already contentious racial situation in the United States. Cuba was filled with what many commentators called an “unappealing racial mix” of descendants of Spaniards, indigenous peoples, and Africans brought to the island as slaves. U.S. politicians had no desire to bring the racial politics of Cuba into the nation, so Cuba was not annexed. The Philippines was almost as problematic: Filipinos might be the “little brown brothers” of Americans, but in the end they were Asians, including many ethnic Chinese. During these years of Chinese exclusion, the status of Filipinos, U.S. nationals eligible to enter the United States but not eligible to become citizens, was contested and contradictory. Movement toward granting independence seemed a good way to exclude Filipinos altogether. Puerto Ricans were, apparently, white enough. When they moved to the continental United States, they could naturalize as U.S. citizens, and in 1917, citizenship was extended to all Puerto Ricans. For all of these groups, however, racial politics complicated both colonial governance and conceptions of U.S. identity.
Hostilities between Spain and the United States were brief and minor, but this splendid little war changed the United States into a colonial power; provided opportunities for the growth of executive government agencies, both the presidency itself and some departments; highlighted developments in mass media and party politics; and opened new lines of debate about the meaning of American identity.
See also foreign policy and domestic politics, 1865–1933; press and politics; race and politics; territorial government.
FURTHER READING. H. W. Brands, Bound to Empire: The United States and the Philippines, 1992; Kristin L. Hoganson, Fighting for American Manhood: How Gender Politics Provoked the Spanish-American and Philippine-American Wars, 1998; Paul A. Kramer, The Blood of Government: Race, Empire, the United States and the Philippines, 2006; Walter LaFeber, The American Search for Opportunity, 1865–1913, 1993; Louis A. Perez, “Incurring a Debt of Gratitude: 1898 and the Moral Sources of United States Hegemony in Cuba,” American Historical Review 104 (April 1999), 356–98.
ANNE L. FOSTER
State constitutions are more easily amended and address a broader range of issues than the U.S. Constitution and, as a consequence, have often played an important role in American political development. On many occasions, the relative flexibility of state constitutions has permitted political reforms to be more easily adopted by states, and only implemented later, if at all, at the federal level. At times also, the greater range of issues addressed in state constitutions, which is due in part to the nature of the federal system and in part to conscious choices made by state constitution makers, means that many political issues have been regulated primarily or even exclusively by state constitutions. This importance of state constitutions can be seen throughout the course of the American regime but is particularly evident in the founding era, Jacksonian era, Progressive Era, and after the reapportionment revolution of the 1960s.
Eleven of the original thirteen states adopted constitutions prior to the drafting of the U.S. Constitution (Connecticut and Rhode Island retained their colonial charters until 1818 and 1842, respectively), and it was through drafting these state constitutions that Americans first developed principles of constitutionalism and republicanism that were then adopted at the federal level.
State constitution makers were the first to grapple with the appropriate process for writing and adopting a constitution. The first state constitutions drafted in early 1776 in New Hampshire and South Carolina were intended to be temporary. However, drafters of subsequent state constitutions came to view them as enduring charters. Meanwhile, some of the early state constitutions were drafted by legislators or by officials who were not selected specifically for the purpose of constitution making. Eventually, though, it came to be understood that constitutions should be written by delegates who were chosen for this express purpose and who assembled in a convention. A similar evolution took place in the understanding of how a constitution should be approved. Although the earliest state constitutions took effect by proclamation of the drafters, this changed when the Massachusetts Constitution of 1780 was submitted for popular ratification. Not only have state constitution makers generally followed this process in drafting and revising the 146 documents that have been in effect in the 50 states, but this was also the model that federal constitution makers followed in holding a convention in 1787 and submitting their work for ratification by the 13 states.
State constitution makers also had extensive opportunities in the 1770s and 1780s to debate the means of designing governing institutions that would best embody republican principles, and these debates influenced the design of the federal constitution in various ways. Some of the first state constitutions were quite democratic in form. The outstanding example was the Pennsylvania constitution of 1776, which established a unicameral legislature whose members stood for annual election and were subject to term limits and whose work could not be vetoed by the executive. As state constitution making progressed, and as concerns arose over the ineffective performance of governing institutions, efforts were made to limit legislatures and secure a greater degree of deliberation, such as by adopting bicameralism, lengthening legislators’ terms, eliminating term limits, and creating a strong executive. The New York constitution of 1777 was the first to adopt a number of these features; the Massachusetts constitution of 1780 went even further in embodying these developments; and, by the time that Pennsylvanians adopted a revised constitution in 1790, they had eliminated many of the more democratic features of the 1776 document. When delegates assembled at the federal convention of 1787, they drew heavily from the state constitutional experience.
The Jacksonian era brought calls for the democratization of both state and federal governing institutions. Though some changes were made at the federal level, the changes were even greater in the states, and these changes frequently took the form of constitutional amendments. In particular, states retained responsibility for most issues of governance during the nineteenth century, including efforts to expand the suffrage, which required changes in state law, and frequently in state constitutions. Moreover, the flexibility of state constitutions, in contrast to the rigidity of the federal amendment process, meant that certain institutional reforms, such as the popular election of judges, were only adopted at the state level.
In terms of the suffrage, the principal developments during this period were the elimination of freeholder and taxpayer requirements for voting, and these changes were frequently achieved through state constitutional amendments. Additionally, although federal acts would later remove from state discretion other voting qualifications—such as those regarding race, sex, and age—states were the innovators in each of these areas, both during the Jacksonian era and in later years. Thus, New England states permitted blacks to vote well before the Fifteenth Amendment. Wyoming, followed by numerous other states, permitted women to vote long before the Nineteenth Amendment. And Georgia and Kentucky enfranchised 18-year-olds several decades prior to ratification of the Twenty-Sixth Amendment. Moreover, states have retained control over other suffrage requirements, such as those concerning citizenship and felony conviction. Therefore, battles over suffrage requirements continued to be waged to a great degree in state constitutional forums long after the federal government began to establish national suffrage requirements in the Reconstruction era. In fact, although Reconstruction-era state conventions were required by federal law to enfranchise African Americans, these gains were mostly reversed by state conventions in the 1890s that adopted various disenfranchising mechanisms.
State constitution makers during the Jacksonian era also made attempts to democratize governing institutions. Not a single federal amendment was ratified between passage of the Twelfth Amendment in 1804 and the Thirteenth Amendment in 1865. State constitutions, though, were easier to amend, and amendment procedures were made even more flexible during this period. Some states went so far as to require that a popular referendum be held periodically on whether to hold a new constitutional convention.
As a result, constitutional reformers were able to experiment with alternative institutional arrangements at the state level. Some constitutions imposed procedural restrictions on the legislature, such as requiring that bills be read three times and contain a single subject accurately reflected in the title. There were also substantive restrictions on the legislature, such as prohibiting the incurring of debt or the lending of state credit. A number of states also moved during this period to adopt a plural executive of sorts: a wide range of executive department heads were subject to popular election, along with the governor. Most notably, beginning with Mississippi’s adoption in 1832 of popular elections for all state judges and accelerating in the 1840s, states increasingly provided for an elected judiciary. State constitution makers also democratized governing institutions by changing inequitable apportionment plans that had long privileged older tidewater regions over growing piedmont and mountain regions. The Ohio Constitution of 1851 was the first to provide for an apportionment board that removed the decennial task of redistricting from the legislative process. This approach was later emulated by several other states.
Among the chief concerns of Progressive reformers was securing protection for workers in the face of obstructionist legislatures and courts. The flexibility of state amendment processes enabled reformers to adopt numerous constitutional changes introduced to bypass legislatures or overturn court decisions.
Progressive reformers pushed for a variety of protective measures for workers—including an eight-hour day, minimum wage, workers’ compensation, and child labor regulations. But they experienced mixed success in getting these measures approved by legislators and then sustained by the courts. Even when Congress and state legislatures did enact protective laws, federal and state courts occasionally invalidated them. The only federal constitutional amendment formally proposed in this area was a child labor amendment, but it failed to be ratified by the requisite number of state legislatures and so never took effect. Reformers had more success securing enactment of state provisions to protect workers. Some amendments were intended to bypass legislatures, by mandating an eight-hour day or prohibiting child labor. Other state constitutional changes sought to overturn court decisions, by declaring that the legislature was empowered to pass a minimum-wage law or establish a workers’ compensation system or enact other protective measures, regardless of state court rulings to the contrary.
Progressive reformers also tried to restructure governing institutions they viewed as insufficiently responsive to public opinion and overly susceptible to special-interest influence. The only structural change adopted at the federal level in the early twentieth century was direct senatorial elections, as a result of passage of the Seventeenth Amendment. But state reformers had more success. The South Dakota constitution was amended in 1898 to provide for the popular initiative and referendum, and during the twentieth century half of the states adopted similar reforms. A third of the states provided for the constitutional initiative. The Ohio constitution was amended in 1912 to permit judicial review to be exercised only by a supermajority of the state supreme court judges, and two other states soon adopted similar provisions. A number of states in the first two decades of the twentieth century also adopted the recall of judges and other public officials.
After a long period in the mid-twentieth century of relative inattention to constitutional revision, the U.S. Supreme Court’s reapportionment decisions in the early 1960s led to a significant number of changes in state constitutions. States had to bring their redistricting laws into compliance with the Court’s rulings, and constitution makers took the opportunity to modernize other state governing institutions as well. They also added rights provisions that had no counterpart in the federal constitution. For example, several states provided for a right to privacy or a prohibition against sex discrimination. Other states guaranteed a right to a clean environment. Victims’ rights clauses were added to other state constitutions.
Much of the renewed interest in state constitutions in the late twentieth century was generated not by amendments but as a result of liberal state court interpretations of state bills of rights. When the Supreme Court under chief justices Burger and Rehnquist proved less aggressive than the Warren Court in interpreting the federal Bill of Rights in an expansive fashion, state judges began to provide redress for civil liberties claimants. Thus, the U.S. Supreme Court declined to recognize a federal constitutional right to equal school financing, but a number of state supreme courts found such a right in their state constitutions. And whereas the U.S. Supreme Court declined to rule that the death penalty violated the federal cruel and unusual punishment clause, as long as it was administered in a proper fashion, several state supreme courts interpreted their state bills of rights as prohibiting capital punishment in all circumstances. Then, at the turn of the twenty-first century, several state courts interpreted their bills of rights as guaranteeing a right to same-sex civil unions or same-sex marriage. Although state amendment processes permitted citizens to eventually adopt constitutional amendments overturning some of these state court rulings, many of these decisions were left undisturbed, ensuring that state constitutions will be a continuing battleground in civil liberties debates in the years to come.
See also Constitution, federal; state government.
FURTHER READING. Willi Paul Adams, The First American Constitutions: Republican Ideology and the Making of the State Constitutions in the Revolutionary Era, 1980; James Q. Dealey, Growth of American State Constitutions from 1776 to the End of the Year 1914, 1915; John J. Dinan, The American State Constitutional Tradition, 2006; Walter F. Dodd, The Revision and Amendment of State Constitutions, 1910; Daniel J. Elazar, “The Principles and Traditions Underlying State Constitutions,” Publius 12 (Winter 1982), 11–25; Christian G. Fritz, “Fallacies of American Constitutionalism,” Rutgers Law Journal 35 (Summer 2004), 1327–69; James A. Gardner, Interpreting State Constitutions: A Jurisprudence of Function in a Federal System, 2005; Marc W. Kruman, Between Authority and Liberty: State Constitution Making in Revolutionary America, 1997; Donald S. Lutz, “The Purposes of American State Constitutions,” Publius 12 (Winter 1982), 27–44; Laura J. Scalia, America’s Jeffersonian Experiment: Remaking State Constitutions, 1820–1850, 1999; Albert L. Sturm, Thirty Years of State Constitution-Making, 1938–1968, 1970; G. Alan Tarr, Understanding State Constitutions, 1998; G. Alan Tarr, Robert F. Williams, and Frank P. Grad, eds., State Constitutions for the Twenty-first Century, 3 vols., 2006.
JOHN DINAN
History texts have traditionally depicted the evolution of the American republic as progressing from the disunion of the Confederation era to the consolidation of the twentieth century, with state subordination ensured by the triumph of the national regime in the Civil War. According to this scenario, state governments survived in the shadow of policy makers in Washington, D.C., who gradually whittled away at the residual powers of lawmakers in places like Albany and Sacramento. Yet one of the most significant, though often overlooked, features of American political history is the persistent and powerful role of state governments in the lives of citizens. The United States is and always has been a federal republic; its very name makes clear that it is a union of states. Throughout its history the states have provided the lion’s share of government affecting the everyday life of the average American. The professors who write the history texts work at state universities; their students are products of the public schools established, supervised, and to a large degree funded by the states. Every day of the year state police disturb the domestic tranquility of speeders driving along highways built and owned by the states. The municipalities that provide the water necessary for human survival are creations of the states, and these water supplies are subject to state supervision. Though the American nation is united, it is not a seamless polity. In the twenty-first century state government remains a ubiquitous element of American life.
The original 13 states of 1776 were heirs to the governmental traditions of the colonial period. The first state constitutions retained the office of governor, though they reduced executive powers and granted the bulk of authority to a legislative branch that most often consisted of two houses. The federal Constitution of 1787, however, clearly limited the authority of these seemingly all-powerful legislatures. It declared federal laws, treaties, and the federal Constitution itself supreme over state laws. Yet it left the plenary power to govern with the states, creating a national government of delegated powers. The Tenth Amendment, ratified in 1791, reinforced this fact. All powers neither granted to the federal government nor specifically forbidden to the states were reserved to the states and the people thereof.
During the first 80 years of the nation’s history, democratization gradually transformed the structure of state government as the people’s role expanded and legislative supremacy eroded. Property or taxpaying qualifications gave way to white manhood suffrage, legislative appointment of executive and judicial officials yielded to popular election, and voter approval of constitutional amendments and revisions became a prerequisite in most states. Meanwhile, the rise of political parties imposed a limited degree of partisan discipline on legislators. They were no longer responsible only to their constituents but also had to answer to party leaders.
During these same years, state governments expanded their functions, assuming broader responsibility for economic development, education, and treatment of the disabled. Internal improvement programs with ambitious blueprints for new roads, rail lines, and canals drove some states to the brink of bankruptcy but also produced a network of artificial waterways in the Northeast and Midwest. Most notably, New York’s Erie Canal funneled western commerce through the Mohawk Valley and ensured that the Empire State would wield imperial sway over much of the nation’s economy. State governments also funded new universities as well as common schools, laying the foundation for their later dominance in the field of education. State schools for the blind and deaf and asylums for the insane reflected a new confidence that human disabilities could be transcended. Moreover, states constructed penitentiaries to punish and reform malefactors.
Reacting to legislative activism and the public indebtedness incurred by ambitious transportation schemes, mid-nineteenth century Americans demanded a curb on state authority. From 1843 to 1853, 15 of the 31 states drafted new constitutions, resulting in new limits on state debt and internal improvements. In addition, they restricted local and special legislation that was flooding state legislatures and benefiting favored interests. As suspicion of state-chartered business corporations mounted, wary Americans sought constitutional guarantees against legislative giveaways that might enrich the privileged few at the expense of the general public.
Union victory in the Civil War confirmed that the states were not free to withdraw from the nation at will, and the Reconstruction amendments to the federal Constitution seemingly leveled additional blows at state power. Henceforth, the Constitution prohibited the states from depriving persons of life, liberty, or property without due process of the law or denying anyone equal protection of the laws. In addition, states could not limit a person’s right to vote on the basis of race, color, or previous condition of servitude. The United States Supreme Court, however, interpreted the guarantees of racial equality narrowly, allowing the states maximum leeway in discriminating against African Americans. By the early twentieth century, most southern states had disenfranchised most blacks through such devices as literacy tests and poll taxes and had enacted a growing array of segregation laws that mandated separate accommodations for African Americans in schools, on transportation lines, and other facilities. The Civil War had decided that states could no longer impose slavery on blacks, but by 1900 state governments had full authority to ensure that African Americans suffered second-class citizenship.
When confronted with the expanding volume of state economic regulatory legislation, the federal Supreme Court was not always so generous toward the states. In 1869 Massachusetts created a state railroad commission with largely advisory powers; it investigated and publicized the practices of rail companies. Illinois pioneered tougher regulation when, in the early 1870s, it authorized a rail commission with the power to fix maximum rates and thereby protect shippers from excessive charges. Twenty-four states had established rail commissions by 1886; some replicated the advisory function of the Massachusetts body while others, especially in the Midwest, followed the rate-setting example of Illinois. That same year, however, the United States Supreme Court limited state regulatory authority to intrastate rail traffic, reserving the supervision of interstate rates to the federal government. It thereby significantly curbed state power over transportation corporations.
To further protect the public, legislatures created state boards of health, with Massachusetts leading the way in 1869. During the next two decades, 29 other states followed Massachusetts’s example. The new agencies were largely restricted to the investigation of health conditions and collection of data. By the last decade of the century, however, a few state boards were exercising veto powers over local water and sewerage projects, ensuring that municipalities met adequate health standards.
In the early twentieth century, state activism accelerated, resulting in even greater intervention in the economy. From 1907 through 1913, 22 states embarked on regulation of public utilities, extending their rate-fixing authority not only to railroads but to gas, electric, streetcar, and telephone services. Meanwhile, state governments moved to protect injured workers by adopting worker compensation programs; between 1910 and 1920, 42 of the 48 states enacted legislation guaranteeing employees compensation for injuries resulting from on-the-job accidents.
The early twentieth century not only witnessed advances in state paternalism but also changes in the distribution of power within state government. During the late nineteenth century, critics increasingly lambasted state legislatures as founts of corruption where bribed lawmakers churned out nefarious legislation at the behest of lobbyists. The criticisms were exaggerated, but owing to the attacks, faith in the legislative branch declined. Exploiting this popular distrust, a new breed of governors presented themselves as tribunes of the people who, unlike legislators, spoke not for parochial or special interests but for the commonweal. Charles Evans Hughes in New York, Woodrow Wilson in New Jersey, Robert La Follette in Wisconsin, and Hiram Johnson in California all assumed unprecedented leadership in setting the legislative agenda, posing as popular champions who could prod state lawmakers into passing necessary reforms.
Further reflecting popular skepticism about legislative rule was the campaign for initiative and referendum procedures at the state level. In 1898 South Dakota adopted the first initiative and referendum amendment to a state constitution, allowing voters to initiate legislation and approve it by popular vote. The electorate could thereby bypass an unresponsive legislature. Moreover, voters could demand a referendum on measures passed by the legislature, and a majority of the electorate could thus undo the supposed misdeeds of their representatives. Twenty states had embraced both initiative and referendum by 1919, with Oregon and California making especially frequent use of the new procedure.
Among the issues troubling early-twentieth-century voters was state taxation. Traditionally, states had relied on the property tax, which placed an inordinate burden on land-rich farmers. Seeking to remedy this situation, Wisconsin in 1911 adopted the first effective state graduated income tax for both individuals and corporations. It proved a success, and 13 states had enacted income taxes by 1922, accounting for almost 11 percent of the state tax receipts in the nation.
During the early twentieth century the advent of the automobile greatly expanded the role of state government and further necessitated state tax reforms. Throughout most of the nineteenth century, the construction and maintenance of roads had remained almost wholly a local government responsibility. In the 1890s, however, the growing corps of bicyclists lobbied for better roadways. Responding to this pressure, state legislatures began adopting laws that authorized funding for highways that conformed to state construction standards. Nineteen states had established state road agencies by 1905. Over the following two decades, the burgeoning number of automobile owners forced the other states to follow suit, resulting in a sharp increase in highway expenditures. In 1916 and 1921 Congress approved some funding for state highway programs, but the states still shouldered the great bulk of road financing and construction. From 1921 through 1930 state governments appropriated $7.9 billion for highways; federal aid accounted for only $839 million of this total.
To fund this extraordinary expansion of state responsibility, legislatures embraced the gasoline tax. In 1919 Oregon adopted the first gasoline levy, and over the next ten years every other state followed. This tax on fuel consumption was easy to administer, highly lucrative, and popular with motorists, who were willing to pay for better roads.
With the onset of economic depression in the 1930s, however, the states confronted a new financial crisis. Unable to collect property taxes from cash-strapped home and business owners and faced with mounting relief expenditures, local authorities turned to the state governments for help. State legislators responded by adopting yet another new source of revenue, the retail sales tax. Mississippi led the way in 1932, and by the close of the 1930s, 23 states imposed this new levy. As early as 1936, it was second only to the gasoline tax as a source of state tax revenue. A massive influx of federal money also helped states improve services and facilities. For example, the federal Civilian Conservation Corps was instrumental in the development of state park systems.
State governments used their new sales tax revenues to bail out local school districts. From 1930 to 1940 the states’ share of school funding almost doubled from 16.9 percent to 30.3 percent. Throughout the early twentieth century, states had gradually imposed stricter standards on local districts, centralizing control over public education. The financial crisis of the 1930s, however, accelerated the pace, as states were forced to intervene and maintain adequate levels of schooling.
The Great Depression also stirred new interest among the states in economic development. This was especially true in the South, where leaders tried to wean the region from its dependence on agriculture and to lure industrial plants. In 1936 Mississippi adopted its Balance Agriculture with Industry program, permitting local governments, subject to the approval of a state board, to issue bonds to finance the construction of manufacturing facilities. Meanwhile, the Southern Governors Conference, founded in 1937, lobbied the federal Interstate Commerce Commission to revise its rail charges and thereby eliminate the rate discrimination hampering the development of southern manufacturing.
After World War II the southern states fought a two-front war, defending their heritage of racial segregation while taking the offensive on economic development. First the Supreme Court and then Congress began putting teeth into the guarantees of racial equality embodied in the Reconstruction amendments to the Constitution. The Court held state segregation laws unconstitutional, and, in the Voting Rights Act of 1965, Congress dismantled southern state restrictions on black voting. Adding to the North-South clash of the postwar era, however, was the southern economic counteroffensive. Southern governors led raiding parties on northern cities, promoting their states to corporate executives tired of the heavily unionized northern labor force and attracted to more business-friendly climes.
During the last decades of the twentieth century, northern governors responded with economic initiatives of their own. By the 1980s states were not only attempting to steal industrial plants from other parts of the country but were also embarking on high-technology incubator programs aimed at fostering innovative growth industries. Governors preached high-tech venture capitalism and promoted each of their states as the next Silicon Valley. Moreover, economic promotion trips to both Europe and Asia became regular events on governors’ schedules; German and Japanese moguls welcomed one business-hungry state executive after another. With overseas trade offices and careful calculations of state exports and potential foreign markets, state governments were forging international links. State governments were no longer simply domestic units of rule but also international players.
Educational demands were meanwhile pushing the states to assume ever larger obligations. The state share of public education funding continued to rise, reaching 46.8 percent in 1979–80. Moreover, states funded the expansion of existing public universities as well as the creation of new four-year campuses and systems of two-year community colleges. To pay their mounting education bills, states raised taxes, and most of those that had not previously adopted sales or income levies did so.
With heightened responsibilities, the states seemingly needed to upgrade their governmental structures. In the 1960s the Supreme Court mandated reapportionment of state legislatures; representation was to be based solely on population. Moreover, in the 1960s and 1970s there were mounting demands for professionalization of the state legislatures. Historically, state legislators were part-time lawmakers who met for a few months each biennium, earning a modest salary for their service. During the last decades of the twentieth century, annual legislative sessions became the norm, and, in the largest states, law making became a full-time job, with state legislators acquiring a corps of paid staff. Reacting to the increasing number of entrenched professional legislators, a term-limits movement swept the nation in the early 1990s. Twenty-two states had approved term limits for lawmakers by 1995, restricting them to a maximum service of 6, 8, or 12 years.
In the early twenty-first century state governments were confronting everything from stem cell research to smart-growth land-use planning. The United States remained very much a union of states, jealous of their powers and resistant to incursions by the national regime. State governments survived as significant molders of the policies governing the everyday lives of the nation’s 300 million people.
See also state constitutions.
FURTHER READING. Ballard C. Campbell, “Public Policy and State Government,” in The Gilded Age: Essays on the Origins of Modern America, edited by Charles W. Calhoun, 309–29, 1996; Idem, Representative Democracy: Public Policy and Midwestern Legislatures in the Late Nineteenth Century, 1980; James Quayle Dealey, Growth of American State Constitutions from 1776 to the End of the Year 1914, 1915; R. Scott Fosler, ed., The New Economic Role of American States: Strategies in a Competitive Economy, 1988; Leslie Lipson, The American Governor: From Figurehead to Leader, 1939; James T. Patterson, The New Deal and the States: Federalism in Transition, 1969; Colman B. Ransone, Jr., The Office of Governor in the United States, 1956; Charles F. Ritter and Jon L. Wakelyn, American Legislative Leaders, 1850–1910, 1989; Ira Sharkansky, The Maligned States: Policy Accomplishments, Problems, and Opportunities, 1972; Jon C. Teaford, The Rise of the States: Evolution of American State Government, 2002.
JON C. TEAFORD
The United States became a suburban nation during the second half of the twentieth century, according to conventional wisdom, popular consciousness, and scholarly consensus. The most dramatic transformation took place during the two decades after World War II, when the percentage of suburban residents doubled from one-sixth to one-third of the population. Federal policies that promoted both single-family homeownership and racial segregation subsidized the migration of millions of white families from the central cities and the countryside to suburbia, the location of about 85 percent of new residential construction in the postwar era. In 1968 suburban residents cast a plurality of ballots for the first time in a presidential election. In a period of social unrest in cities and on college campuses, Republican candidate Richard Nixon reached the White House by appealing to the “great majority” of “forgotten Americans” who worked hard, paid their taxes, and upheld the principles of the “American Dream.” By 1992 suburban voters represented an outright majority of the American electorate, more than the urban and rural populations combined. During a period of economic recession and downward mobility for blue-collar workers, Democratic presidential nominee Bill Clinton reclaimed the political center with a time-honored populist appeal that championed the “forgotten, hard-working middle-class families of America.”
The bipartisan pursuit of swing voters of “middle America” reveals the pivotal role played by the rise of suburban power in national politics since the 1950s. The broader story of suburban political culture in modern U.S. history also includes public policies that constructed sprawling metropolitan regions, the grassroots influence of homeowner and taxpayer movements, and the persistent ideology of the white middle-class nuclear family ideal. A clear-cut definition of the “suburbs” is elusive because the label simultaneously represents a specific yet somewhat arbitrary census category (everything in a metropolitan statistical area outside of the central city limits), a particular form of land-use development (the decentralized sprawl of single-family houses in automobile-dependent neighborhoods physically separated from shopping and employment districts), and a powerful set of cultural meanings (the American Dream of homeownership, upward mobility, safety and security, and private family life in cul-de-sac settings of racial and economic exclusion). Residents of suburbs have long defined themselves in cultural and political opposition to the urban-industrial center, from the growth of commuter towns along railroad lines in the mid-1800s to the highway-based “edge cities” and exurbs of the late twentieth century. Although scholars emphasize the diversity of suburban forms and debate the extent to which many former bedroom communities have become urbanized, the utopian imagery of the nineteenth-century “garden suburb” maintains a powerful sway. Suburban politics continues to revolve around efforts to create and defend private refuges of single-family homes that exclude commercial functions and the poor while achieving a harmonious synthesis between the residential setting and the natural environment.
Beginning in the late 1800s and early 1900s, the quest for local autonomy through municipal incorporation and zoning regulations emerged as a defining feature of suburban political culture. Garden suburbs outside cities such as Boston and Philadelphia incorporated as separate municipalities in order to prevent urban annexation and maintain local control over taxes and services. The 1898 merger that created the five boroughs of New York City stood as a rare exception to the twentieth-century pattern of metropolitan fragmentation into autonomous political districts, especially in the Northeast and Midwest (pro-urban laws continued to facilitate annexation of suburbs in a number of southern and western states). Municipal incorporation enabled affluent suburbs to implement land-use policies of exclusionary zoning that banned industry and multifamily units from homogeneous neighborhoods of single-family homes. The concurrent spread of private racial covenants forbade ownership or rental of property in particular areas by “any person other than of the white or Caucasian race.” By the 1920s, real estate developers and homeowners associations promoted restrictive covenants in most new subdivisions, often specifically excluding occupancy by African Americans, Asians, Mexicans, Puerto Ricans, American Indians, and Jews. The NAACP repeatedly challenged the constitutionality of racial covenants during the early decades of the modern civil rights movement, but the Supreme Court continued to permit their enforcement until the Shelley v. Kraemer decision of 1948.
By 1930 one-sixth of the American population lived in the suburbs, with public policies and private practices combining to segregate neighborhoods comprehensively by race and class. From the local to the national levels, suburban politics revolved around the protection of white middle-class family life through the defense of private property values. Following the turn to restrictive covenants, the National Association of Real Estate Boards instructed its members that “a realtor should never be instrumental in introducing into a neighborhood a character of property or occupancy, members of any race or nationality, or any individual whose presence will clearly be detrimental to property values in the neighborhood.” During the depths of the Great Depression, the federal government established the Home Owners Loan Corporation, which provided emergency mortgages in single-family neighborhoods while “redlining” areas that contained industry, multifamily housing units, and low-income or minority residents. The National Housing Act of 1934 chartered the Federal Housing Administration (FHA), which insured low-interest mortgage loans issued by private banks and thereby revolutionized the market for suburban residential construction. The FHA also endorsed restrictive racial covenants and maintained the guideline that “if a neighborhood is to retain stability, it is necessary that properties shall continue to be occupied by the same social and racial classes.” Between 1934 and 1960, the federal government provided $117 billion in mortgage insurance for private homes, with racial minorities formally excluded from 98 percent of these new developments, almost all of which were located in suburbs or outlying neighborhoods within city limits.
During the sustained economic boom that followed World War II, suburbs became the primary residential destination for white-collar and blue-collar families alike, although neighborhoods remained stratified by socioeconomic status as well as segregated by race. Suburban development became a powerful validation of the New Deal social contract, embodied in President Franklin Roosevelt’s promise that the national state would secure “the right of every family to a decent home.” The GI Bill of 1944 offered low-interest, government-guaranteed mortgages to millions of military veterans and enabled many white working-class and middle-class families to achieve the suburban dream of a detached house with a yard. The federal government also subsidized suburban growth through interstate highways and other road-building projects that connected bedroom communities to downtown business districts and accelerated the decentralization of shopping malls and office parks to the metropolitan fringe. The share of American families that owned their own homes increased from 40 to 60 percent between the 1940s and the 1960s, the period of the great white middle-class migration to the suburbs.
In 1947 the corporate-designed Levittown on Long Island became the national symbol of this new suburban prosperity, an all-white town of 70,000 residents marketed as “the most perfectly planned community in America.” Social critics mocked Levittown and similar mass-produced developments for their cookie-cutter houses and the allegedly bland and conformist lifestyles of their inhabitants. But defenders credited the suburbs with achieving “the ideal of prosperity for all in a classless society,” the epitome of the consumer-based freedoms that would assure the victory of the United States in the cold war, as Vice President Richard M. Nixon proclaimed in the 1959 “kitchen debate” with Premier Nikita Khrushchev of the Soviet Union.
The suburban political culture of the 1950s celebrated consensus, consumer affluence, and a domestic ideology marked by rigid gender roles within the heterosexual nuclear family. On the surface, in the television family sitcoms and mass-market magazines, postwar America seemed to be a place of white upper-middle-class contentment, with fathers commuting to corporate jobs while stay-at-home mothers watched their baby boomer youth play with toys advertised by Walt Disney and other large corporations. In the presidential elections of 1952 and 1956, a substantial majority of voters twice rejected the reformist liberalism of Adlai Stevenson for the moderate conservatism of President Dwight D. Eisenhower, later labeled “the great Republican hero of the suburban middle class” by GOP strategist Kevin Phillips.
But this consensus ideology of 1950s suburban prosperity existed alongside a growing crisis of domesticity that would soon explode in the social movements of the 1960s. Sociologist William Whyte characterized white-collar managers from the affluent suburbs as the collective “Organization Man,” a generation that had sacrificed individuality to the demands of corporate conformity. Betty Friedan critiqued the “feminine mystique” for promoting therapeutic rather than political solutions to issues of sex discrimination and proclaimed: “We can no longer ignore that voice within women that says: ‘I want something more than my husband and my children and my home.’ ” In 1962 Students for a Democratic Society called for a new left that rejected the utopian promises of suburban tranquility: “We are people of this generation, bred in at least modest comfort, housed now in universities, looking uncomfortably to the world we inherit.”
In the 1960s and 1970s, the greatest challenge to suburban political autonomy came from the civil rights campaign for school and housing integration. A growing white backlash greeted these efforts to open up the suburbs, signaled by the passage of Proposition 14 by California voters in 1964. Three-fourths of the white suburban electorate supported this referendum to repeal California’s fair-housing law and protect the private right to discriminate on the basis of race in the sale and renting of property. The open-housing movement, which had been attacking policies of suburban exclusion for half a century, gained new urgency with the race riots that erupted in American cities in the mid-to-late 1960s. During the summer of 1966, Martin Luther King Jr. led open-housing marches into several of Chicago’s all-white suburbs, but the violent reaction of white homeowners did not persuade Congress to pass a federal open-housing law.
In 1968 the National Advisory Commission on Civil Disorders (Kerner Commission) issued a dire warning that the United States was divided into “two societies; one, largely Negro and poor, located in the central cities; the other, predominantly white and affluent, located in the suburbs and outlying areas.” The Kerner Report also placed blame for the urban crisis on public policies of suburban exclusion: “What white Americans have never fully understood—but what the Negro can never forget—is that white society is deeply implicated in the ghetto. White institutions created it, white institutions maintain it, and white society condones it.” Congress responded by passing the landmark Fair Housing Act of 1968, which banned discrimination on the basis of race, color, religion, and national origin in the sale and renting of property.
The political backlash against the civil rights movement galvanized white voters in working-class and middle-class suburbs alike. In 1966 Republican candidate Ronald Reagan won the California gubernatorial election by denouncing fair-housing legislation, calling for “law and order” crackdowns against urban criminals and campus protesters, and blaming liberal welfare programs for squandering the tax dollars of mainstream Americans. In 1968 Richard Nixon’s pledge to defend middle–American homeowners and taxpayers from the excesses of liberalism carried white-collar suburbs across the nation and made inroads among blue-collar Democrats. During his first term in office, Nixon vigorously defended the principle of suburban autonomy by opposing court-ordered busing to integrate public schools and by resisting inclusionary zoning to scatter low-income housing throughout metropolitan regions. “Forced integration of the suburbs,” Nixon declared in 1971, “is not in the national interest.” In his 1972 reelection campaign, Nixon won 49 states by uniting working-class and middle-class white voters in a populist antiliberal alliance that he labeled the “silent majority.” In the 1980s, Ronald Reagan strengthened the Republican base in the suburbs by blaming the Democrats for economic recession, welfare cheaters, racial quotas, court-ordered busing, urban crime, and high taxes imposed on the hard-working majority to pay for failed antipoverty programs. Capitalizing on grassroots movements such as the California property tax revolt of 1978, Reagan dominated the suburban electorate by a 55-to-35 margin in 1980 and a 61-to-38 landslide in 1984.
Reagan’s victories in the 1980s represented the culmination of a suburban-driven realignment that ultimately destroyed the political base of New Deal liberalism, but the temporary triumph of Republican conservatism soon gave way to new forms of suburban diversity and heightened levels of electoral competitiveness. In 1985 a group of moderate Democrats formed the Democratic Leadership Council (DLC) to expand beyond the party’s urban base by becoming “competitive in suburban areas” and recognizing that “sprawl is where the voters are.” In the 1992 presidential election, Bill Clinton won by turning the DLC agenda into a campaign to honor the values of America’s “forgotten middle class, . . . like individual responsibility, hard work, family, community.” Clinton championed programs such as universal health care to address middle–American economic insecurity while neutralizing the GOP by promising to cut middle-class taxes, enact welfare reform, and be tough on crime. Clinton won a plurality of suburban votes in the three-way elections of 1992 (41 to 39 percent) and 1996 (47 to 42 percent), while maintaining the traditional Democratic base in the central cities. At the same time, the Democratic resurgence reflected the increasing heterogeneity of American suburbia, home to 54 percent of Asian Americans, 49 percent of Hispanics, 39 percent of African Americans, and 73 percent of whites at the time of the 2000 census (based on the 102 largest metropolitan regions). By century’s end, some political strategists were predicting an “emerging Democratic majority” based on the party’s newfound ability to appeal to white swing voters in the middle-class suburbs (a fiscally and culturally moderate electorate) and to capture the high-tech, multiracial metropolises of the booming Sunbelt.
Republican George W. Bush reclaimed the suburban vote by narrow margins in 2000 (49 to 47 percent) and 2004 (52 to 47 percent), but the dynamics of recent elections suggest the problematic nature of viewing contemporary metropolitan politics through the stark urban/liberal versus suburban/conservative dichotomy that took hold in the 1950s. Republican “family values” campaigns have mobilized the outer-ring suburbs that are home to large numbers of white married couples with young children, and Bush won 97 of the nation’s 100 fastest-growing exurban counties in the 2000 election. Democrats have found new bases of support in older inner-ring suburbs, many of which are diversifying as racial and ethnic minorities settle outside the city limits, as well as with middle-income women and white-collar professionals who dislike the cultural agenda of the religious right.
Suburban political culture is also in flux in other ways that challenge the partisan divisions that emerged during the postwar decades. The bellwether state of California, which led the national suburban backlash against civil rights and liberal programs in the 1960s, now combines an anti-tax political culture with a massive prison-industrial complex, a multiracial electorate with a deeply conflicted stance toward immigration, and some of the nation’s most progressive policies on environmental regulation and cultural issues.
Perhaps the most important consequence of the suburbanization of American politics is the way in which the partisan affiliations of voters have often mattered less than the identification of suburban residents as homeowners, taxpayers, and school parents. Regardless of which party controls Washington, America’s suburbs have proved to be quite successful at defending their property values, maintaining middle-class entitlement programs, resisting policies of redistributive taxation, preventing meaningful racial and economic integration, and thereby policing the cultural and political boundaries of the American Dream.
See also cities and politics.
FURTHER READING. Rosalyn Baxandall and Elizabeth Ewen, Picture Windows: How the Suburbs Happened, 2000; Lizabeth Cohen, A Consumers’ Republic: The Politics of Mass Consumption in Postwar America, 2003; Marjorie Connelly, “How Americans Voted: A Political Portrait,” New York Times, November 7, 2004; Robert M. Fogelson, Bourgeois Nightmares: Suburbia, 1870–1930, 2005; David M. P. Freund, Colored Property: State Policy and White Racial Politics in Suburban America, 2007; William Frey, “Melting Pot Suburbs: A Census 2000 Study of Suburban Diversity,” Washington, DC: Brookings Institution, June 2001, downloadable from http://www.brookings.edu/~/media/Files/rc/reports/2001/06demographics_frey/frey.pdf; Stanley B. Greenberg, Middle Class Dreams: The Politics and Power of the New American Majority, revised ed., 1996; Kenneth T. Jackson, Crabgrass Frontier: The Suburbanization of the United States, 1985; Matthew D. Lassiter, The Silent Majority: Suburban Politics in the Sunbelt South, 2006; Lisa McGirr, Suburban Warriors: The Origins of the New American Right, 2001; Stephen Grant Meyer, As Long as They Don’t Move Next Door: Segregation and Racial Conflict in American Neighborhoods, 2000; Becky M. Nicolaides and Andrew Wiese, eds., The Suburb Reader, 2006; William Schneider, “The Suburban Century Begins,” Atlantic Monthly (July 1992), 33–44, downloadable from http://www.theatlantic.com/politics/ecbig/schnsub.htm; Robert O. Self, American Babylon: Race and the Struggle for Postwar Oakland, 2003; Andrew Wiese, Places of Their Own: African American Suburbanization in the Twentieth Century, 2004.
MATTHEW D. LASSITER
See voting; woman suffrage.
There is no doubt that the U.S. Supreme Court has influenced the politics of the country. As a public body, the Court is a highly visible part of the federal government. This has always been so, even when the justices met briefly twice a year in the drafty basement of the Capitol. Yet the idea that the Court itself is a political institution is controversial.
The justices themselves have disputed that characterization. Indeed, the Court has gone to great pains to avoid the appearance of making political decisions. In Luther v. Borden (1849), the Court adopted a self-denying “prudential” (judge-made) rule that it would avoid hearing cases that the legislative branch, or the people, could decide for themselves, the “political questions.” In 1946 Justice Felix Frankfurter reiterated this principle in Colegrove v. Green. In the Federalist Papers, No. 78, Alexander Hamilton assured doubters that the High Court was “the weakest branch” of the new federal government because it did nothing but hear and decide the cases and controversies brought before it, and its decisions affected only the parties to those cases and controversies.
There are other apparent constraints on the Court’s participation in politics that arise from within the canons of the legal profession. Judges are supposed to be neutral in their approach to cases, and learned appellate court judges are supposed to ground their opinions in precedent and logic. For the justices, the high opinion of their peers and of the legal community allegedly means more than popular acclaim.
Such legal, professional, and self-imposed constraints notwithstanding, the Court is a vital part of U.S. politics for three reasons. First, the Court is part of a constitutional system that is inherently political. Even before the rise of the first national two-party system in the mid-1790s, the Court found itself involved in politics. The Court declined to act as an advisory body to President George Washington on the matter of veterans’ benefits, asserting the separation of powers doctrine. In 1793 the Court ordered the state of Georgia to pay what it owed a man named Chisholm, an out-of-state creditor, causing a constitutional crisis that prompted the passage of the Eleventh Amendment. Those kinds of political frictions—among the branches of the federal government and between the High Court and the states—continue to draw the Court into politics.
After the advent of the national two-party system, partisanship became institutionalized, with the result that appointments to the Court have always been political. Nominees often have political careers before they agree to serve. They are almost always members of the president’s political party. The role the Senate plays in consenting to the president’s nominations (or, in slightly under one-fourth of the nominations, refusing to consent) further politicizes the Court, for the Senate divides along party and ideological lines in such votes. The confirmation debates and, after 1916, the hearings are riven with politics, and once on the Court, the justices’ political views are often remarkably accurate predictors of their stances in cases that involve sensitive issues. The controversies surrounding President Ronald W. Reagan’s nomination of Robert Bork and President George H. W. Bush’s nomination of Clarence Thomas are recent examples of this pattern.
Similarly, once on the Court the justices do not necessarily abandon their political aspirations. Salmon P. Chase, Stephen J. Field, Charles Evans Hughes, Frank Murphy, and William O. Douglas all wanted to be president of the United States, and Chief Justice William Howard Taft was an ex-president when he assumed the center chair. Felix Frankfurter and Abe Fortas continued to advise their respective presidents while on the bench.
Finally, the output of the Court has a major impact on the politics of the day. While it is not always true that the Court follows the election returns, it is true that the Court can influence them. In the first 60 years of the nineteenth century, slavery cases fit this description. Even after the Thirteenth Amendment abolished slavery, politically sensitive civil rights cases continued to come to the High Court. Labor relations cases, taxation cases, antitrust cases, and, more recently, privacy cases all had political impact.
Even the self-denying stance the Court adopted on political questions was subject to revision. In a famous footnote to United States v. Carolene Products (1938), the Court announced that it would pay particularly close attention to state actions that discriminated against “discrete and insular minorities” precisely because they were not protected by democratic “political processes.” In the 1940s, the Court struck down state laws denying persons of color the right to vote in election primaries. Later decisions barred states from drawing legislative districts intended to dilute the votes of minority citizens. By the 1960s, the Court’s abstinence from political questions had worn thin. In a series of “reapportionment cases,” the Court determined that states could not frame state or congressional electoral districts unfairly. The High Court’s rulings, sometimes known as the “one man, one vote” doctrine, remade state and federal electoral politics.
Perhaps the most appropriate way to demonstrate the Court’s complex institutional politics is to describe its most prominent cases. The very first of the Court’s great cases, Marbury v. Madison (1803), involved political relations within government, the partisan composition of the Court, and the political impact of a decision. It began when the Republican Party of Thomas Jefferson and James Madison won control of the presidency and both houses of Congress in what Jefferson called the revolution of 1800.
The Jeffersonian Republicans wanted to purge the judiciary of their rivals, the Federalists, and eliminate many of the so-called midnight appointments. In the coming years, Congress would impeach and remove Federalist district court judge Timothy Pickering and impeach Federalist Supreme Court justice Samuel Chase. Into this highly charged partisan arena came the case of William Marbury.
Marbury was supposed to receive a commission as a justice of the peace for the District of Columbia. However, the outgoing secretary of state, John Marshall, had failed to deliver the commission, and the incoming secretary of state, James Madison, with the assent of President Jefferson, did not remedy Marshall’s oversight. When Marbury did not get the commission, he filed suit with the clerk of the Supreme Court under the provisions of the Judiciary Act of 1789, which gave the Court original jurisdiction in such matters.
Thus, the case went directly to the Court. The issue, as framed by Marshall, who was now chief justice, was whether the Court had jurisdiction over the case. He intentionally ignored the political context of the suit. It seems obvious that the issue was political, but in a long opinion for that day (26 pages), Marshall wrote for a unanimous Court that the justices could not issue the writ because issuing such writs was not among the kinds of original jurisdiction given the Court in Article III of the Constitution. The Constitution controlled or limited what Congress could do, and it prohibited Congress from expanding the original jurisdiction of the Court. Congress had violated the Constitution by giving this authority to the Court. In short, Marshall struck down that part of the Judiciary Act of 1789 as unconstitutional.
The power that Marshall assumed in the Court to find acts of Congress unconstitutional, and thus null and void, was immensely important politically within the government structure, for it protected the independence of the Court from Congress, implied that the Court was the final arbiter of the meaning of the Constitution (the doctrine of judicial review), and reminded everyone that the Constitution was the supreme law against which every act of Congress had to be measured. Although critics of Marbury decried judicial tyranny and asserted that the opinion was colored by Marshall’s party affiliation, a political challenge to Marshall’s opinion was not possible because it did not require any action. Marbury went away empty-handed.
Marbury managed to keep the Court out of politics in a formal sense, though it was deeply political; the “self-inflicted wound” of Dred Scott v. Sandford (1857) plunged the Court into the center of the political maelstrom. What to do about slavery in the territories was a suppurating wound in antebellum national politics. By the late 1850s, the controversy had destroyed one national party, the Whigs, and led to the formation of a new party, the Republicans, based wholly in the North and dedicated to preventing the expansion of slavery.
Against this background of intensifying partisanship and sectional passion, the Supreme Court might have elected to avoid making general pronouncements about slavery and stick to the facts of cases, narrowing the precedent. However, in 1856 newly elected Democratic president James Buchanan asked the Court to find a comprehensive solution to the controversy when Congress deadlocked over the admission of Kansas as a slave state. A Democratic majority was led by long-term chief justice Roger B. Taney of Maryland, a dedicated states’ rights Democrat who had been Andrew Jackson’s reliable aide in the war against the second Bank of the United States.
Dred Scott was the slave of U.S. Army doctor John Emerson and was taken with him from Louisiana to posts in free states and free territories. In 1843 Emerson returned to a family home in Missouri, a slave state, and Scott went with him. In 1846, three years after Emerson’s death, Scott sued for freedom for himself and his family. After two trials and four years had passed, the Missouri trial court ruled in his favor. The Missouri Supreme Court reversed that decision in 1852. Northern personal liberty laws, the response to the Fugitive Slave Act of 1850, angered Missouri slaveholding interests, and the new policy that the state’s supreme court adopted in Scott reflected that anger.
But Scott’s cause had also gained new friends, “free soil” and abolitionist interests that believed his case raised crucial issues. Because Emerson’s estate had a New York executor, John Sanford, Scott could bring his suit for freedom in federal court under the diversity clause of the Judiciary Act of 1789. This litigation could only go forward if Scott were a citizen, but the federal circuit court sitting in St. Louis decided to hear the suit. In 1854, however, the federal court agreed with the Missouri supreme court: under Missouri law, Scott was still a slave.
The U.S. Supreme Court agreed to a full dress hearing of Scott’s appeal in 1856. Oral argument took four days, and the Court’s final ruling was delayed another year, after the presidential election of 1856. Joined by six of the other justices, Taney ruled that the lower federal court was correct: under Missouri law, Scott had no case. Nor should the case have come to the federal courts, for Scott was not a citizen. The law behind this decision was clear, and it was enough to resolve the case. But Taney added two dicta, readings of history and law that were not necessary to resolve the case but would, if followed, have settled the political questions of black citizenship and free soil.
Taney wrote that no person of African descent brought to America to labor could ever be a citizen of the United States. Such individuals might be citizens of particular states, but this did not confer national citizenship on them. In a second dictum, Taney opined that the Fifth Amendment to the Constitution, guaranteeing that no man’s property might be taken without due process of law, barred Congress from excluding slavery from the territories. In effect, Taney retroactively declared the Missouri Compromise of 1820, barring slavery in territories north of 36° 30′ north latitude, unconstitutional.
The opinion was celebrated in the South and excoriated in the North. Northern public opinion, never friendly to abolitionism, now found the possibility of slavery moving north frightening. Abraham Lincoln used it to undermine his rival for the Illinois Senate seat, Stephen Douglas. Lincoln lost the race (Douglas and the Democrats controlled the legislature), but he won the debates and found an issue on which to campaign for president in 1860.
In his first inaugural address, President Lincoln issued a subtle warning to the holdover Democratic majority on the Court, and to Taney in particular. The will of the people, embodied in the electoral victory of the Republicans, would not tolerate a Court that defended secession. The justices took the hint. They agreed to the blockade of the Confederate coastline and accepted the administration view that the Confederacy did not legally exist. By the end of the war, Lincoln was able to add enough Republicans to the Court, including a new chief justice, Salmon Chase, to ensure that Republican policies would not be overturned. For example, the majority of the Court found that “greenbacks,” paper money issued by the federal government to finance the war, were legal tender.
The Reconstruction amendments profoundly changed the constitutional landscape, giving the federal government increased supervision over the states. Insofar as the High Court had already claimed pride of place in interpreting the meaning of the Constitution, the Thirteenth, the Fourteenth, and the Fifteenth Amendments, along with the Civil Rights Acts of 1866, 1870, 1871, and 1875, should have led to deeper Court involvement in the politics of the South. Instead, the Court’s repeated refusal to intervene reflected the white consensus that nothing further could be done to aid the newly freed slaves in the South, or the black people of the North, for that matter.
The so-called voting rights cases were inherently political because they touched the most basic rights of citizens in a democracy—the right to participate in the political process. In these cases, the Court deployed the first kind of politics, the politics of federalism, in response to the third kind of politics, the wider politics of party. By 1876, the Radical Republican impulse to enforce an aggressive Reconstruction policy had spent itself. In the election of 1876, the Republican nominee, Rutherford B. Hayes, promised that he would end the military occupation of the former Confederate states, in effect turning over state and local government to the “Redeemers,” former Confederate political leaders, and leaving the fate of the former slaves to their past masters.
In U.S. v. Hiram Reese and U.S. v. William Cruikshank et al., decided in 1875 and 1876, the Court found ways to back the Redeemers. In the former case, a Kentucky state voting registrar refused to allow Garner, an African American, to pay the poll tax. The motive was as much political as racial, as the state was Democratic and the party leaders assumed that every black voter was a Republican. A circuit court had dismissed the prosecutor’s indictments. The High Court affirmed the lower court. In the latter case, a mob of whites attacked blacks guarding a courthouse in New Orleans. Again the federal circuit court had found the indictments wanting. The High Court agreed.
Was the Court concerned about the political implications of the two cases? They were heard in 1875, but the decision was not announced until the next year. In his opinion for the Court in Cruikshank, Chief Justice Morrison R. Waite introduced the concept of “state action,” a limitation on the reach of the Fourteenth Amendment’s due process and equal protection clauses. The New Orleans mob was not an agent of the state, so the Fourteenth Amendment and the civil rights acts did not apply. The door was now wide open for the Redeemers to pass Jim Crow laws, segregating public facilities in the South, and deny freedmen their rights using supposedly neutral restrictions like literacy tests for voting as well as “whites only” primaries for the most important elections—those in the Democratic primary. Outright discrimination received Court approval in the case of Plessy v. Ferguson (1896), in which “equal but separate” laws, more popularly known as “separate but equal,” became the rule of the land.
The politicization of the High Court in the Gilded Age, a period of rapid industrialization, was nowhere more apparent than in a trio of highly political cases that arrived at the Court in 1894 and 1895. The first of the cases arose when the federal government prosecuted the E. C. Knight sugar-refining company and other refining operations, all part of the same sugar trust, for violation of the 1890 Sherman Antitrust Act.
Chief Justice Melville Fuller wrote the opinion at the end of 1894. Congress had the power to regulate interstate commerce but, according to Fuller, the refineries were manufacturing plants wholly within the states of Delaware, Pennsylvania, and New Jersey, and thus not subject to federal law. The Court, by a vote of 8 to 1, refused to let the progressives in the government enjoin (legally stop) the combination of the sugar refineries. It was a victory for the monopolies and the politicians they had lobbied. By the same lopsided vote, in In re Debs (1895) the Court upheld a lower-court injunction sought by the federal government against the American Railway Union for striking. It too was a triumph for conservative political forces.
The third occasion on which the Fuller Court delved into the great political causes of the day was an income tax case. Democratic voters in rural areas favored the reintroduction of an income tax. The tax Congress had passed during the Civil War expired in 1872. In 1894 Congress passed a flat 2 percent income tax on all incomes over $4,000—the equivalent of about $91,000 in 2005 dollars. Defenders of the sacredness of private wealth were aghast and feared that the measure brought the nation one step closer to socialism. In Pollock v. Farmers’ Loan and Trust Company (1895), Fuller and the Court agreed and set aside the entire act of Congress, not just the offending corporate provisions.
All three of the High Court’s opinions angered the Populists and other reformers. William Jennings Bryan, the former congressman who captured the Democratic Party nomination in 1896, won over a much-divided convention in part with an attack on the Court’s dismissive view of the working man. But Bryan sounded like a dangerous extremist in a decade filled with radicalism. Better financed and supported by most of the major newspapers, the Republicans and William McKinley won a landslide victory, with 271 electoral votes to Bryan’s 176.
During U.S. participation in World War I, nothing could have been more political than the antiwar protests of 1917–18, and the Court handled these with a heavy hand. Here the Court acted not as an independent check on the other branches of the federal government, upholding the Bill of Rights, but as the handmaiden of the other branches’ claims to wartime powers. In such cases, the Court was political in the first sense, as part of the larger operation of the federal government.
When pro-German, antiwar, or radical spokesmen appeared to interfere with the draft by making speeches, passing out leaflets, or writing editorials, or when they conspired to carry out any act that might interfere with the draft, the federal government arrested, tried, and convicted them under the Espionage Act of 1917. The High Court found no protection for such speech in the First Amendment. As Justice Oliver Wendell Holmes Jr. wrote in upholding the conviction of Socialist Party leader Eugene V. Debs, Debs’s avowed Socialist commitments could not be tolerated by a nation at war. The government had to protect itself against such upsetting speech. Holmes would reverse himself in Abrams v. United States (1919), but antigovernment political speech in time of war did not receive protection from the Court under the First Amendment until the Vietnam War.
In the New Deal era, the Court thrust itself into the center of the political arena. By first upholding federal and state intervention in the economy, then striking down congressional acts, and then deferring to Congress, the Court proved that external political considerations could be as powerful an influence as the justices’ own political views. The New Deal, from 1933 to 1941, was, in reality, two distinct political and economic periods. The Court (led by the conservative quartet of George Sutherland, Willis Van Devanter, Pierce Butler, and James C. McReynolds, joined by Owen Roberts) struck down most of the more controversial programs of the first New Deal, such as the National Recovery Administration, under the substantive due process doctrine it had originally announced in Lochner v. New York (1905).
With the Depression largely unaffected by the first New Deal, Franklin Roosevelt’s administration and Congress enacted more egalitarian reforms in 1935. Among these were programs to provide jobs (the Works Progress Administration), the Social Security Act, the Rural Electrification Administration, and the National Labor Relations Act. The last of these finally ended the antilabor injunction, in effect overruling the Court’s attempts to protect it. The stage was set for a constitutional crisis between Roosevelt and the Court. In 1937, however, Justice Roberts shifted his stance, joining, among others, Chief Justice Charles Evans Hughes to uphold the constitutionality of the Social Security Administration, the National Labor Relations Board, and other New Deal agencies. What had happened to change the constitutional landscape?
One factor could have been Roosevelt’s plan to revise the membership of the Court. Congress had done this before, adding justices or (at the end of the Civil War) reducing the number of justices. Roosevelt’s plan would have added a justice to the Court for every sitting justice over the age of 70, in effect “packing it” with New Deal supporters. All the justices, housed in their new building, dubbed “the marble palace” by journalists, disliked the packing plan. No one knew what Roosevelt’s plan would bring or whether Congress would accede to the president’s wishes. In fact, the Senate quashed the initiative. But by that time the High Court had shifted its views enough to let key measures of the second New Deal stand, including Social Security, collective bargaining for labor, and minimum wage laws.
With the retirement of one conservative after another, Roosevelt would be able to fill the Court in a more conventional way with New Deal supporters. From 1937 to 1943, turnover in the Court was unmatched. The new justices included Hugo Black, Stanley Reed, Felix Frankfurter, William O. Douglas, Frank Murphy, James F. Byrnes, Robert Jackson, and Wiley Rutledge. All, to one degree or another, believed in deference to popularly elected legislatures.
After World War II, the cold war and the so-called second Red Scare again required the Court to step carefully through a political minefield. At the height of the cold war, the House Un-American Activities Committee and the Senate’s Permanent Subcommittee on Investigations, led by Senator Joseph McCarthy of Wisconsin, sought to uncover Communists in government posts. A wider scare led to blacklists of former Communists and their alleged conspirators in Hollywood, among New York State school teachers, and elsewhere.
Here the Court’s majority followed the election returns in such cases as Dennis v. United States (1951). The case arose out of Attorney General Tom Clark’s orders to prosecute the leaders of the Communist Party-USA (CPUSA) for violating the 1940 Smith Act, which forbade any advocacy or conspiracy to advocate the violent overthrow of the government. Although this was only one of many cases stemming from the “Foley Square Trials” in the federal district court in New York City, the High Court had yet to rule on the First Amendment issues involved.
Chief Justice Fred Vinson warned that the government did not have to wait to act as the Communist Party organized and gathered strength. The Smith Act was clear and constitutional—and the Communist Party, to which Dennis and the others indicted under the act belonged, had as its policy the violent overthrow of the government. Justices Hugo Black and William O. Douglas dissented.
In so doing, they initiated a great debate over whether the Court should adopt an absolutist or more flexible interpretation of the Bill of Rights. Black’s dissent was that the First Amendment’s declaration that Congress shall make no law meant “no law.” Conceding some power to the government, Douglas, an author himself, would be the first to grant that printed words could lead to action, and he made plain his dislike of the Communists’ required reading list. But, he reasoned, “If the books themselves are not outlawed, if they can lawfully remain on library shelves, by what reasoning does their use in a classroom become a crime?” Within a decade, Douglas’s views would triumph.
Civil rights again thrust the Court into the center of political agitation, except this time it spoke not in political terms but in moral ones. The civil rights decisions of the Warren Court elevated it above the politics of the justices, and the politics of the men who put the justices in the marble palace. Earl Warren, chief justice during this “rights revolution,” came to personify the Court’s new unanimity. He was first and foremost a politician. His meteoric rise in California politics, from humble beginnings to state attorney general, and then governor, was accompanied by a gradual shift from conservative Republicanism to a more moderate, centrist position—one that favored government programs to help the poor and regulation of business in the public interest. In return for Warren’s support at the 1952 national convention, newly elected President Dwight D. Eisenhower promised him a place on the Court. The first vacancy was the chief justiceship, and Eisenhower somewhat reluctantly kept his word.
Warren did not have a distinguished civil rights or civil liberties record in California. During World War II, he had been a strong proponent of the forced relocation of Japanese Americans from their homes on the West Coast to internment camps. But on the Court he saw that the politics of civil rights and the Fourteenth Amendment’s plain meaning required the end of racial discrimination in schools, public facilities, and voting.
There can be no doubt that the Court’s decisions in Brown v. Board of Education (1954), Cooper v. Aaron (1958), and subsequent school desegregation cases had a major political impact. Certainly southern members of Congress recognized that impact when they joined in a “manifesto” denouncing the Court for exceeding its role in the federal system and the federal government for its intrusion into southern state affairs. President Eisenhower was so disturbed by the Court’s role in the rights revolution that he reportedly said—referring to Warren and Justice William J. Brennan—that his two worst mistakes were sitting on the Supreme Court.
In more recent times, the nomination process itself has become the beginning of an ongoing politicization of the Court. The abortive nominations of the Nixon and Reagan years proved that the presidency and Congress at last considered the Court a full partner—with the result that every nominee was scrutinized more carefully. There would be no more Earl Warrens, at least in theory. The effect was a hearing process that had become a national spectacle of partisan politics.
The focal point of that spectacle has been another of the Court’s decisions—Roe v. Wade (1973). Every candidate for the Court is asked where he or she stands on this case that legalized abortion for most pregnancies. Oddly enough, it was President Nixon’s Court, which he had constructed in reaction to the Warren Court’s rulings on criminal procedure, that produced this ruling.
Chief Justice Warren Burger knew the importance of Roe v. Wade and its companion case, Doe v. Bolton, from the moment they arrived in 1971 as two class-action suits challenging Texas and Georgia abortion laws, respectively. In both cases, federal district court panels of three judges struck down the state laws as violating the federal Constitution’s protection of a woman’s privacy rights, themselves a politically charged issue after the Court’s decision in Griswold v. Connecticut (1965), which struck down a Connecticut law banning the distribution of birth control materials, largely on privacy grounds.
The majority of the justices agreed with the lower courts but labored to find a constitutional formula allowing pregnant women to determine their reproductive fates. Justice Harry Blackmun, a Nixon appointee, was assigned the opinion for the majority, and based the right on the due process clause of the Fourteenth Amendment, though he clearly had more interest in the sanctity of the doctor-patient relationship than in the rights of women. His formulation relied on the division of a pregnancy into trimesters. In the first of these, a woman needed only the consent of her doctor. In the second and third trimesters, after the twentieth week, the state’s interest in the potential life allowed it to impose increasingly stiff regulations on abortions.
The 7-to-2 decision invalidated most of the abortion laws in the country and nationalized what had been a very local, very personal issue. Roe would become one of the most controverted and controversial of the Court’s opinions since Dred Scott, to which some of its critics, including Justice Antonin Scalia, would later compare it. For women’s rights advocates it was a decision that recognized a right, but only barely, with qualifications, on a constitutional theory ripe for attack. Opponents of abortion jeered a decision that recognized a state interest in the fetus but denied that life began at conception. They would mobilize against the desecration of religion, motherhood, and the family that they felt the decision represented. The position a nominee took on Roe became a litmus test. Congressional and presidential elections turned on the abortion rights question, as new and potent political action groups, in particular religious lobbies, entered the national arena for the first time to battle over Roe.
If more proof of the place of the Court in American politics were needed, it came at the end of the hotly contested 2000 presidential election campaign between Albert Gore Jr. and George W. Bush. As in Marbury, Bush v. Gore (2000) exemplified all three of the political aspects of the Court’s place in U.S. history. First, it was a federalism case. Second, the division on the Court matched the political background of the justices. Finally, no case or opinion could have a more obvious impact on politics in that it determined the outcome of a presidential election.
To be sure, there was a precedent. In 1877 another hotly contested election ended with a decision that was clearly controversial, and five justices of the High Court played the deciding role in that case as well, voting along party lines to seat all the electors for Republican Rutherford B. Hayes and none of the electors for Democrat Samuel J. Tilden as part of the commission set up to resolve the dispute. But in Bush v. Gore, the disputed results in the Florida balloting never reached Congress. Instead, the justices voted to end the Florida Supreme Court–ordered recount and declare Bush the winner in Florida, and thereby the newly elected president. The majority disclaimed any partisan intent.
Whatever stance one takes on Bush v. Gore, the case, like those before it, offers proof that the Court has a role in the institutional politics of the nation, that the members of the Court are political players themselves, and that the Court’s decisions can dramatically affect the nation’s political fate.
See also House of Representatives; presidency; Senate.
FURTHER READING. Henry Abraham, Justices, Presidents, and Senators, rev. ed., 1999; Jan Crawford Greenberg, Supreme Conflict: The Inside Story of the Struggle for Control of the United States Supreme Court, 2007; Kermit Hall et al., eds., The Oxford Companion to the Supreme Court of the United States, 2nd ed., 2005; Peter Charles Hoffer, Williamjames Hull Hoffer, and N.E.H. Hull, The Supreme Court: An Essential History, 2007; Richard Neely, How Courts Govern America, 1981; David M. O’Brien, Storm Center: The Supreme Court in American Politics, 7th ed., 2005; Jeffrey Rosen, The Supreme Court: The Personalities and the Rivalries That Defined America, 2007; Idem, The Most Democratic Branch: How the Courts Serve America, 2006; Christopher Tomlins, ed., The United States Supreme Court: The Pursuit of Justice, 2005; Jeffrey Toobin, The Nine: Inside the Secret World of the Supreme Court, 2007.
WILLIAMJAMES HULL HOFFER