A Sense of We

Our national motto is e pluribus unum—Latin for out of many, one. Constructing that one, what Robert Putnam called a “sense of ‘we,’” is an important task for democracy.1

The logic of a democratic system requires that people accept the legitimacy of contrary views and parties. The world’s foremost students of our attitudes toward each other’s views worry about the corrosive effect of intolerance on democracy. It saps support for the rights of those who disagree, limits the ability to raise issues, let alone solve them, and can lead to violence. Rolling over dissent, mob psychology can open democracy’s door to dictatorship and tragedy.2 In short, intolerance is dangerous, the autoimmune disease of democracy.3

Democracy, however, requires much more than our tolerance of each other’s beliefs; it requires cooperation. Robert Putnam touched a chord in 1993 with a prize-winning book on the development of democracy in Italy. In MAKING DEMOCRACY WORK, Putnam concluded that historically there was far more civic and social activity outside the family in northern than in southern Italy. Confining activity to the family stifles the broader connections needed to run a democratic government. Community activities, by contrast, helped democracy take hold earlier and more firmly in the north.4 He concluded that it is not enough that people be educated about democracy’s virtues, want it, or believe in it; economic incentives and enlightened self-interest are insufficient for democracy to take root. For democracy to thrive, people have to trust, cooperate, and care about one another across the broader community. Democracy is based on “social capital”—the things people do together that develop loyalties to each other, to a larger population, and to the greater good. Putnam used “bowling alone” as a metaphor for what he saw as the contemporary decline in routine social activities that build social capital.5 His conclusion that social capital has declined is very controversial. But the need for social capital is fundamental.

Many encapsulated Putnam as saying democracy depends on mutual trust, although trust is only part of Putnam’s point.6 Trust enables cooperation and makes it possible to let political opponents govern whenever the people vote for them. It is crucial to be able to rely on basic fairness from opponents, on their respect for your vital interests, even while more peripheral matters may be decided against your views or interests. The decency of our behavior toward each other creates either a virtuous cycle that reinforces democracy or a vicious cycle that destroys it.

Social capital does not always cross party, class, gender, religious, ethnic, and racial lines.7 When the population polarizes so that people despise or fear each other, it becomes harder to trust opponents’ fairness when in power. Constitutions and bills of rights attempt to guarantee essential protections to the losers, but they too rest on trust that those in charge will respect the paper guarantees.8

An almost forgotten work adds a crucial thread through these relationships of tolerance, trust, and trustworthiness.9 David McClelland, in a work much discussed when it appeared in 1961, concluded that the flowering of democracy depended on matching people’s desire for power with their desire for the affection of others. People who care only for power would gladly destroy democracy to gain or keep control. Democracy depends on a culture that puts the rules of the political contest above one’s own choices. Successful democratic societies put a high value on getting along and winning the esteem of others. By comparison with other countries, McClelland found that Americans cared deeply about both power and the affection of others. Americans want to be liked, as David Riesman classically described them in THE LONELY CROWD; they crave affection.10

McClelland’s methods were controversial but his insight was brilliant.11 The politicians’ creed—coupling their desire for power with their need to be liked—lies at the foundation of democracy. When we want the approval of others, it is easier to accept democratic values and to respect rather than threaten or coerce the public. The sense of community, of mutual concern and need for approval, is critical to the way power is shared in a democracy. Democracy demands concern for others.

Tolerance, social capital, desire for approval, and concern for the opinions and welfare of others are all part of the way people collaborate with others across society; they are critical for democracy to succeed. Looking around the world, it is clear those needs and relationships are neither genetic nor automatic, but cultural. They have to be nourished and protected.

These conclusions of modern social scientists corroborate the founders’ insistence that democracy depended on a republican frame of mind defined by a shared sense of civic virtue and public responsibility. A republican education was crucial precisely because selfishness could easily destroy democracy.12 A similar insight lay behind the anti-Federalists’ resistance to the Constitution: they doubted that people could develop the necessary loyalties beyond their own states.13

If tolerance, social capital, mutual care, concern and a need for approval are all necessary to sustain democracy, do we have a problem? If democracy needs to bring all of society under its wing, what builds the necessary relationships? And what do law and courts have to do with that?

Do Americans Have a Problem?

Alexis de Tocqueville noticed American intolerance in 1831 while touring the young nation: “I know of no country in which there is so little independence of mind and real freedom of discussion as in America.”14 Other great students of American politics have described the ways public opinion can censor and stifle thought.15

Polls revealed a surprisingly large percentage of Americans ready to silence those with whom they disagree.16 That finding has been robust through significant scientific improvements in understanding tolerance.17 Willingness to have government silence others goes well beyond constitutional and other appropriate limits.18 Tolerance and compromise are in decline.19

Divided Minds

Political scientists have been concerned about those findings since they were documented more than half a century ago.20 There is evidence that Americans are aware of each other’s intolerance. Gibson found that nearly three-quarters of Americans would not put a political sign on their lawns because people might get upset, and a majority would not show their politics in public with a bumper sticker. Americans are reluctant even to take action in private, like writing a congressman.21

Although Americans claim pride in freedom of speech, they often find disagreement inappropriate, even rude, and they consider dissenters an embarrassment. Many resist allowing a forum to those who disagree. As Robert J. Samuelson put it, “Americans disdain fierce moral combat,” and disdain protesters and their grievances.22

Attacking free expression is sometimes worn as a “tough,” honorable stance. Mayor Giuliani conducted a running battle with the New York Civil Liberties Union over whether taxi drivers or others could demonstrate in front of City Hall, or use a megaphone to be heard, without his approval, though he mostly lost in court.23 Mayor Bloomberg created holding pens for demonstrators, out of view of those they demonstrated against, with access blocked to many wanting to join the demonstration, and exits blocked for those seeking a restroom.24 For many, demonstrations became an offense against civil order and obedience. That too is a form of intolerance, and when officially enforced, a violation of the First Amendment. One has to assume that Giuliani and Bloomberg thought their actions would be popular among their supporters.

It may be indicative of our polarization that efforts have been made to impeach four of our last six presidents—Nixon, Reagan, Clinton, and Obama. The level of partisanship led a deeply divided Supreme Court to stop the 2000 presidential ballot count, lest it embarrass one of the candidates,25 ignoring precedent and specific provisions of the federal Constitution which defined how a disputed presidential election should be resolved. Five justices feared leaving the republic in Al Gore’s Democratic hands.26

A 2003 report of the Pew Research Center for the People and the Press found “[t]he extraordinary spirit of national unity that followed the calamitous events of Sept. 11, 2001 has dissolved amid rising polarization and anger.”27 The report found the country “further apart than ever in its political values.”28 Andrew Kohut, the founding director of the Pew Research Center, described “the anger level [as] so high that if the demonstrators of 1968 had felt like this, ‘there would have been gunfire in the streets.’”29 Actually, some put up websites with gun-sight style hash marks for opponents while others lobbed bricks through the windows of congressional representatives; one malcontent shot Congresswoman Gabby Giffords, and several made attempts on Obama’s life. Although clearly not typical, extremist behavior reflects the tension. The Pew Research Center reported that the partisan gap grew by 25 percent between 2003 and 2012, and it continues to deepen.30

The temperate middle ground now has dubious associations as opportunistic, lukewarm, compromising, and vacuous. Writing in THE CHRISTIAN CENTURY, Gary Dorrien described the loss of an appealing and principled Christian middle ground, leaving the religious in the hands of a variety of radical faiths.31 “Maverick” congressmen in this polarized political world are Republicans whose support for George W. Bush reached only 96 percent, and ticket splitting at the polls has declined.32 Richard Tomkins accurately predicted that polarization would have a major impact “on Capitol Hill where changing party demographics and geographics have contributed to hardening partisanship on and off the floor.”33

A Zogby poll portrayed both separate red and blue “nations” and “distinct moral world views”—a divide that continued through the subsequent elections of Presidents Bush and Obama.34 John White, a professor of politics at Catholic University, in a much-discussed essay, wrote, “Not since the Civil War and post-Reconstruction period has the country been so divided.”35

Polarization in national politics—with standoffs over the debt ceiling, the “fiscal cliff,” and executive appointments—brought the government to its knees, lowered the national credit rating, and extended the recession. The parties have alternated their control of Congress. But for all their talk about compromise, the geographical segregation of voters is magnified by the political system and shows no sign of changing.

Congressmen and senators have retired saying the atmosphere has become unhealthy. Their departure continues the trend. Compromise has come to mean selling out, resulting in indecision. As the title of Jim Hightower’s book says, THERE’S NOTHING IN THE MIDDLE OF THE ROAD BUT YELLOW STRIPES AND DEAD ARMADILLOS.

More than the difficulty of legislating, this level of tension and partisanship reflects the loss of many forms of the social capital necessary to sustain democratic government: tolerance, trust, trustworthiness, empathy, and the need for the approval of fellow citizens. As Dr. Jean Twenge put it, “Income inequality seems to be a key predictor of social capital—and thus the health of our nation.”36

The Human Divide Underneath

In 1950 my father put a five-hundred-dollar deposit on a home in Levittown, on Long Island. A few months later, when the Korean War broke out, he got cold feet, and we never made the move. Putting down that original deposit, however, had nothing to do with “white flight.” There were no black families in our neighborhood to flee. If there had been, we would have welcomed them; many of us were already rooting for the Dodgers’ new cleanup hitter, Jackie Robinson.

There were plenty of other reasons to flee the city. Some were trying to reconnect with their families’ pasts, before the move to cities had taken them away from homes in the country. Arthur Miller immortalized the dream of a white house in the country in his DEATH OF A SALESMAN. It had nothing to do with race.

Cities were hot, muggy, and stifling in the summer. We spent our summers upstate. Before the Salk vaccine, our trip was also fueled by fear of polio. The city seemed to be where it spread. The March of Dimes collected coins in little boxes in almost every store. Occasionally, we would see pictures of children in iron lungs, unable to do anything except lie there, enclosed in the machinery that helped them breathe. It was terrifying.

Suburbs separated people, white people, to an extent that cities did not. People from different economic strata typically rubbed shoulders in mid-twentieth-century city neighborhoods. In our neighborhood, apartment houses lined many avenues, while one- and two-family homes lined the side streets. Urban buildings had gone up independently or at most in small groups. Levittown, built right after World War II, encompassed virtually an entire town. When its builders and subsequent developers cleared large tracts and offered a limited number of designs in a single price range, they homogenized large areas by economic class. The Levittown model, built separately for lower, middle, and “upscale” homes, now dominates suburbia. My dad later spent twenty years living in a large, gated Florida retirement community built for people like him.

These new suburbs would be bedroom communities for the city before establishing their own economic cores. It was easy to lead separate lives in the suburbs, separate from people on a different rung of the income ladder, sorted into different suburbs not by fear or dislike, just money.

Local governments use zoning to wall people off by wealth, now with one- and five-acre tracts. Robert Reich, former secretary of labor, observed, “the fortunate fifth is quietly seceding from the rest of the nation,” ticking off the myriad ways they separate themselves. Shopping malls end the need to travel into the older cities. Private health, golf, tennis, and skating clubs replace the need to take advantage of public parks and playgrounds. Condominiums and residential enclaves provide internal streets, sidewalks, landscaping, swimming pools, and security guards for residents only. Wealthy children go to private schools and a large number of middle-class offspring now do the same. The revitalization of inner cities has also been aimed at making them attractive to those with considerable wealth, but the office complexes are self-sufficient—there is no need to leave. Even charity has now detached itself from the needs of the poor. Voluntary organizations often mirror geographical, gender, and class biases. Contributions go largely to the pleasures of wealth: art museums, opera houses, theaters, orchestras, ballet, private hospitals, and elite universities. The wealthy are increasingly unlikely to experience how other Americans live or be aware of their interdependence.37

The suburbs changed more than money. Some came to the United States intending to form separate, segregated, and often isolated religious communities; Mennonites, Amish, Pennsylvania Dutch, Chasidic Jews, and similar groups managed to retain that character. With or without that intention, people came and still come to America looking for people familiar from the old country—family, friends, or just people from the same places. That led to communities defined by nationality, ethnicity, and faith. For most, segregation broke down relatively quickly, partly influenced by the integrative traditions that we have described.

Nevertheless, Thomas Schelling, a Nobel Prize–winning economist, noticed that some suburbs that had originally been religiously mixed soon re-sorted themselves so that each suburb became more homogeneous. In a famous article, Schelling demonstrated that this could be true even if virtually everyone preferred to live in a mixed environment, so long as people’s definitions of a comfortable mixture varied.38
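Schelling’s mechanism is easy to demonstrate with a toy simulation (a sketch of the idea, not Schelling’s original checkerboard model; the line of sixty agents, the eight-neighbor window, and the roughly one-third comfort threshold are illustrative assumptions). Agents of two types sit on a line; any agent with too few like neighbors moves to a spot where it is content; the line sorts itself into solid blocks.

```python
import random

random.seed(1)

def unhappy(grid, i, want=0.34):
    """True if fewer than `want` of agent i's nearby agents share its type."""
    lo, hi = max(0, i - 4), min(len(grid), i + 5)
    neighbors = [grid[j] for j in range(lo, hi) if j != i]
    same = sum(1 for n in neighbors if n == grid[i])
    return same / len(neighbors) < want

def step(grid):
    """Move one randomly chosen unhappy agent to a spot where it is content."""
    movers = [i for i in range(len(grid)) if unhappy(grid, i)]
    if not movers:
        return False              # everyone is content; equilibrium reached
    i = random.choice(movers)
    agent = grid.pop(i)
    spots = list(range(len(grid) + 1))
    random.shuffle(spots)
    for j in spots:               # try positions until the agent is content
        grid.insert(j, agent)
        if not unhappy(grid, j):
            return True
        grid.pop(j)
    grid.insert(i, agent)         # no better spot available; stay put
    return True

def same_type_adjacency(grid):
    """Fraction of adjacent pairs whose members are of the same type."""
    return sum(a == b for a, b in zip(grid, grid[1:])) / (len(grid) - 1)

grid = [random.choice("XO") for _ in range(60)]
before = same_type_adjacency(grid)
for _ in range(2000):             # bounded number of moves
    if not step(grid):
        break
after = same_type_adjacency(grid)
print(f"same-type adjacency: {before:.2f} before, {after:.2f} after")
print("".join(grid))
```

Even though every agent here would happily live with almost two-thirds unlike neighbors, the moves cascade: each departure tips someone else below their comfort threshold, and the final arrangement is far more segregated than anyone individually wanted. That is Schelling’s point.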

Many forces have in fact divided people by faith—some because of affinity for what became red and blue states39—leading people to settle near their houses of worship, whether for convenience or because communities organize social activities around them. The religious revival now taking place in America leads people to organize their lives around religious experience. Many children are schooled at home, or in religious schools, skipping whatever diversity public schools have to offer. Believers form networks and use the institutions they belong to in order to attract other believers. Prayer groups allow people to find each other.

Traditionally, Catholic Charities, Jewish Family Services, and similar organizations were nonsectarian, partly because it was legally required or encouraged by the tax code. But the faith-based initiatives of Clinton and Bush now allow religious groups to be much more explicit about their religious mission in delivering services than merely advertising their sponsorship of charitable activities. Thrivent Financial for Lutherans advertises for Lutheran investors and bars others.40 For recreation we turn to private or religiously affiliated health clubs. Religion is not just about one’s relation to God. It is also about one’s relation to a community. These trends have undercut some of the purposes of the common, or public, school system.

African Americans, of course, were the last to benefit from the deliberate mixing of Americans in schools over the past two centuries, and their participation is still controversial. Some will be surprised that racial divisions grew after the celebrated decision in Brown v. Board of Education, especially in northern states. As blacks went north looking for jobs that opened up as a result of World War II, the Federal Highway Administration was building the roads that opened the suburbs. But those suburbs were “red-lined” by the Federal Housing Administration (FHA), which means that you could have your mortgage guaranteed, but only if you were white.41 Federal officials made sure that blacks were not welcome. The suburbs became lily white—not because of white flight from blacks, but because only whites were now able to realize the American dream of a house in the country. Gradually, as whites left for the suburbs, they took their businesses, and the jobs they funded, with them. As the central cities deteriorated and minority areas became increasingly impoverished, later emigrants may well have had different reasons for leaving but the die had long since been cast.

The Truman and subsequent administrations oversaw a large effort at “slum clearance”; blacks called it “Negro clearance.” Their parts of the cities looked dilapidated to whites who tore them down. Storied neighborhoods, alive with businesses, were torn down to make way for offices and white-owned stores. The demolition and renovation undercut all the work, social, and religious connections that make healthy communities. It would not be the first time white America eyed and took land from racial minorities.

Whether “urban renewal,” “slum clearance,” or “Negro clearance,” tearing down their old neighborhoods cut black businesses off from their customers. Under the rules of eminent domain, businesses get nothing for what lawyers call “good will”—loyal customers. As businesses and their customers were forced out, they had to try to start over.

Before Brown, there was an extensive black economy—from universities, hospitals, and insurance companies to skilled trades and a host of small businesses. For black businesses, the Civil Rights Act of 196442 was a mixed blessing as customers exercised their right to shop and eat at newly integrated white stores and lunch counters. Minority-owned mom-and-pop businesses declined by half from 1960 to 1980.43 Whether forced out by “slum clearance” or loss of business to integration, switching to the white economy meant starting again at the bottom of the ladder, playing by white rules and customs, and hoping for white good faith in hiring, training, educating, and promoting blacks, but running instead into decades of resistance.44 The black community followed precisely the pattern of ingenuity, savvy, and dedication to self-help that other immigrant groups did—but the fruits were repeatedly pulled out from under them. Blacks were caught in a dilemma: Discrimination by the FHA and others was segregating the north, while affirmative action, the most effective remedy for discrimination against blacks, was attacked as discrimination against whites.

Northern segregation, largely the result of official Washington, DC, decisions, has had enormous implications. It is easy to forget how recent the pathologies are that are now routinely associated with black ghettos. In the 1950s, street crime was a product of white gangs—in the street we feared black leather jackets, not black skin. In the 1960s, drugs were a medical problem. Crime in black communities came later. When school boundaries were changed in the early 1960s, teachers like my father reacted with relief because the black children assigned to their schools were easier to handle than the white toughs now assigned elsewhere. Skin color changed on the block of a wealthy friend. No problem—good neighbors came in all colors.

A group led by Gary Orfield, a Harvard professor and expert on desegregation orders, reported in 2003 that although “desegregation of black students . . . increased continuously from the 1950s to the late 1980s,” it had “receded to levels not seen” since 1968. Orfield noted that the South had become “the nation’s most integrated region for both blacks and whites,” but “it is the region that is most rapidly going backwards as the courts terminate many major and successful desegregation orders.”45 Unfortunately that is similar to what the great historian C. Vann Woodward described as “the strange career of Jim Crow” at the end of the nineteenth century when the Northern states lost interest, the KKK instituted a reign of terror, and the South used segregation to turn back the clock.46

Most black Americans live in racially segregated areas where they have little contact with whites.47 Minority communities are “isolat[ed] from social networks,” and “mismatch[ed]” by residence for job opportunities and for financing both for housing and business opportunities.48 Immigration also has increased residential segregation for Asians and Hispanics. In turn, residential segregation in the United States contributes to the subsequent segregation of business leaders, which becomes enormously important in guiding the interactions of the groups.49 On average, what the Bureau of the Census calls “non-Hispanic whites”—whites not of Hispanic origin—have the least interracial exposure of any group in the United States.50

Steven Greenhouse documented workplace segregation of people of Vietnamese and Hispanic origin in different company departments, generating about a five-thousand-dollar annual differential.51 Wards Cove Packing Co., Inc. v. Atonio52 documented the complete segregation of original Alaskan peoples, employed only in assembly line jobs, from the rest of the workforce working in white-collar jobs, in separate buildings. The company accomplished that result by recruiting for the assembly line jobs only among the native peoples in Alaska and recruiting for the white-collar jobs exclusively on the West Coast of the lower forty-eight. The U.S. Supreme Court chose to look the other way, concluding that the company had not intended to discriminate.

Elizabeth Anderson, at the University of Michigan, comments: “Firms located outside black neighborhoods and beyond the reach of public transportation are significantly less likely to hire black employees.” The same is true for Hispanics and other minorities. In other words, government policies that have shaped the suburbs since the 1940s have separated blacks from jobs. But nearly 60 percent of white-owned firms in major biracial metropolitan areas have no minority employees, and even in black neighborhoods, “one-third still have no minority employees.” By contrast, nearly 90 percent of black-owned firms in those metropolitan areas employ minority workers for three-quarters of the positions. The extent of separation goes much deeper because “employers practice occupational segregation” within companies. One survey showed that half of all job titles were occupied only by whites, “and one-quarter of blacks worked in jobs to which only blacks were assigned.” She gives the example of a “slaughterhouse in North Carolina [which] assigns the butchering jobs to black men, knife work to Mexicans, warehouse jobs to Indians, and mechanic and supervisor positions to whites.”53 White Americans may share the building but not much else.

Eleven o’clock on Sunday mornings has been described as the most racially segregated hour in America.54 Curiously, the most mixed congregations are conservative, nondenominational Protestant congregations with little awareness of the systemic problems that face their black congregants. Michael Emerson, codirector of the Kinder Institute for Social Research at Rice University, and Christian Smith, director of the Center for the Study of Religion and Society at the University of Notre Dame, teamed up on an award-winning study of race and religion. They point out that whites in those congregations are unaware of the extent to which blacks lack the social ties that provide “[a]ccess to valued resources—such as jobs, prestige, wealth, and power.”55 Therefore whites believe that blacks are responsible for their own status and that the press is responsible for making an issue of it. According to Emerson and Smith: “[Those congregants] tend to be socially and culturally isolated. So they do not see the hurdles that our segregated society places in front of blacks.”56 By contrast, those congregations in which the membership does see and object to “the hurdles that our segregated society places in front of blacks” tend to be even more racially segregated. Religion does not seem to be playing a significant part in breaking down racial isolation despite the earnest efforts of a number of religiously based organizations of both liberal and conservative orientations.

It is possible that we could get to know each other just by moving around to different places. But two-thirds of those born in the United States live in the state of their birth.57 Most who move stay in the same county. Those who move across state boundaries are likely to be better educated and wealthier. They move for work-related reasons. Those with lower incomes and education are more likely to stay put and move for family rather than work-related reasons.58

Immobility is also generational. About half of all men will have an income like their fathers’.59 The other half will move up or down a quarter of the income scale. Mobility for daughters has been a bit larger.60 Much generational mobility is the result of the availability of a college education since World War II, but financial pressures make that harder to maintain.61

Americans do much less mixing than we want to believe. Most Americans stay put geographically62 and financially63 and remain segregated at work64 and in prayer.65 Solidarity across class lines is for political campaigns only. No American knows a representative cross-section of America.66 Despite the polls that attempt to tell us what each other thinks, we remain divided by birth and by condition. That separation affects our tensions and our understanding.

The Connections

Some of the institutions on which Americans relied to unite as one people have unraveled. The draft brought together Americans from all regions and backgrounds, but it became a casualty of the unpopular war in Vietnam. Schools were designed to bring Americans together, but the growth of large urban areas with segregated suburbs and deteriorated city neighborhoods, plus the push to move students into charter schools or give them vouchers they can take to private schools, undermines the unification that schools used to provide. We have separated the communities we live in, the amenities we have sought outside of our homes and offices, and the people we work, worship, and play with.

Our differences, our red and blue states, urban, suburban and rural environments, ethnic, class, religious, and racial backgrounds—all contribute to mutual incomprehension. Destroying the bonds across families and communities can wreak havoc. The loss of social capital can hollow out the democratic system and corrode its mechanisms.67

Statistics systematically miss the impact of events not captured in prior data. As a result, statistics can underestimate the creativity of politicians and their ability to capitalize on changes in the political environment by trying new strategies. The future of clean elections depends on who thinks democracy more or less important than victory, and what opportunities they have to pursue power. It has been barely half a century since party “machines” stopped functioning as vast vote-falsifying organizations and elections became relatively clean and accurate reflections of public choices. But that decent period is even shorter if one takes into account racial exclusions, the stopped count in the 2000 election, and the demise in 2013 of the crucial sections of the Voting Rights Act at the hands of the U.S. Supreme Court. If politicians choose to subvert the system, by falsifying ballots and counts, or by other means, much depends on the help they can expect from their supporters or the likelihood their fraud will be uncovered with criminal penalties to follow. Democracy has consistently taken second place to the promotion of capitalism by the U.S. government in foreign affairs. It is not clear how well protected it will be domestically.

Conflict over jobs, patronage, values, or principles can lead people to subvert democracy, particularly where means for acceptable resolution are not at hand or valued. Once polarized, people can behave like mobs. The scene of Bush supporters screaming in front of television cameras at the vote counters in Florida, in an obvious attempt to intimidate them, shows how politics have changed. Mobs used in politics were once poor, unemployed, and criminal gangs. The mobs in the 2000 election appeared upscale but they were mobs nonetheless—otherwise respectable people turned into a screaming mass to intimidate vote counters, seeking victory at any cost, as if they believed in a Manichean good versus evil contest in which God had decreed Bush should be president and the Devil was backing Al Gore. Supreme Court justices, tied by blood or marriage to people with much to gain from the election but refusing to recuse themselves, reflected the same determined partisanship. Such polarization threatens all the fundamentals of democracy, the trust and expectation of trustworthy behavior, the tolerance and willingness to accept defeat, the mutual care, concern, respect, and desire for approval on which democracy depends.

The Law’s Role in Dividing America

The contemporary picture of American politics is very different from the Tweedledum-Tweedledee politics of the 1950s and early 1960s that Goldwater and many in his generation decried. Many echoed Goldwater’s call for “a choice, not an echo” of the other party, and as early as 1950, political scientists called for “responsible parties” with clearly distinct platforms and the ability to enforce discipline on representatives, so that voters knew what voting for a party meant.68 Both groups have finally achieved what they wished for even though many dislike the results. Some blame the Bush administration69 or the Democrats70 for polarizing America. Many writers focus on changes in religious leanings,71 the culture wars,72 the decisions of political leaders,73 other national political voices, the availability of computers, and the balkanization of America into separate suburbs, jobs, and faiths.74 Those explanations, however, are incomplete.

What has changed substantially is the way the public is sliced and fed back for political representation. Law organizes how we are informed and represented. Legal changes share the blame for the contentiousness of contemporary politics. They have piled on one another sufficiently to have a powerful effect on the political culture in the United States. They will help polarize politics as long as they remain on the books. Law shapes the bitterness in American politics.

Stanford political scientist Morris Fiorina observes that, over the course of a century, civil service rules, public provision of services like garbage collection, and government procurement rules replaced most of the patronage jobs that had once been among the benefits of politics. That substitution led the majority of the public to withdraw from politics, on the assumption that little was left that affected their daily lives. Fiorina argued that their withdrawal left politics to the most committed and ideological among us.75

Let me add a few more legal changes over the last several decades that have made a difference. We have rewritten the law of speech and politics to substitute insanity for milquetoast. For much of the twentieth century, the rules governing the national media and the political nomination process favored broad appeals to the public. Following the 1968 Democratic Convention, that changed for the nomination process.76 By 1980, that also changed for the national media.77 Now American media and political institutions magnify the differences among us.

Law is a powerful tool; it alters what is possible, sometimes intentionally, sometimes in ways that are unanticipated. Law matters.

Legal Changes to the Shape of the Media

From the birth of radio, national policy created a broadcasting oligopoly. In the 1920s, then Secretary of Commerce Herbert Hoover, and later the new Federal Radio Commission, systematically stripped universities of their radio stations in favor of commercial broadcasters.78 The Federal Communications Commission (FCC), which took over in 1934,79 looked for middle-of-the-road ownership, sometimes with political ties, such as its award of broadcast licenses to newspapers that had supported Eisenhower’s election.80 The FCC prevented unions and political parties from owning or acquiring broadcasting stations, announcing that stations were henceforth to be apolitical.81 It restricted broadcasting to a maximum of three national networks through a policy known as localism, which allocated station licenses with signals that had a short reach to as many different communities and congressional districts as possible. To prevent broadcast signals from interfering with each other, the policy of localism made it impossible until recently to give any but the largest cities more than three licensed broadcasters, and most had fewer.82 The FCC later repeated those policy choices in setting up broadcast television. The result was a large set of locally centrist stations with near monopoly status in their broadcasting markets. In turn, by contracting with the big three networks for prime time and other parts of their daily schedules, they received the benefit of bigger budget national programs.83

Vying for a broad audience, networks excluded whatever would offend any part of their audience. They avoided politics except for equal time to candidates during campaigns and some Sunday news interview programs.84 Network politics were distinctly centrist. Counterculture protest songs made the rounds attacking the war in Vietnam but rarely on the tube.85 Extremism gained a foothold on marginal stations, in communities out of the mainstream and during the Great Depression of the 1930s,86 but centrist incentives kept it to a minimum.

The FCC unintentionally reinforced bland television with the fairness doctrine until it was abandoned.87 It required broadcasters to provide conflicting points of view on controversial issues of public importance. In practice, it encouraged broadcasters either to stress conflict or to avoid anything that looked or sounded like a point of view. A point of view required time for a response making the opposite point. If the disagreement bored or antagonized the audience, ratings would plummet, advertising would leave, and the networks would foot the bill. So they typically did their best to avoid controversy.88 Broadcast centrism shattered after the media were splintered by cable and the Internet.

The networks enlarged news broadcasts in the early 1960s just in time for the signature moments of national political coverage.89 Americans mourned the Kennedy assassination together on national television. Network news covered civil rights demonstrations in the South, showing demonstrators kneeling in prayer where they were barred from registering to vote; kneeling by the Edmund Pettus Bridge in Selma, Alabama, where state troopers attacked with clubs and gas; or marching to protest segregation only to be attacked with dogs and fire hoses. The national network audience for these events played a large role in the bipartisan passage of the Civil Rights Act of 1964 and the Voting Rights Act of 1965. Later in the decade the whole country saw dramatic images of riots across the United States and carnage in Vietnam. News organizations in this era made conscious efforts to be neutral; biases tended to flow from unexamined stereotypes rather than deliberate manipulation of news.90 However, by focusing everyone on the same images, the same ideas, and the same people, these televised events had a huge impact on American values and politics.91

Beginning in 1976, legal changes boosted cable companies. Prior to 1976, the FCC sharply restricted cable carriage of broadcast signals.92 The Copyright Act of 1976 provided for a compulsory license of broadcast signals on behalf of cable companies.93 That statute conflicted with the FCC’s protection of local broadcasters against cable importation of distant signals and led the FCC to dismantle its regulations.94 As cable reached an increasing proportion of American homes, television viewers soon had many more options than the three provided by the national broadcast networks. The FCC then finally authorized new broadcast networks.95 In the 1990s the Internet added more options. As the viewing audience splintered, there was room for networks with strong political points of view. Differences between networks grew. Fox News and MSNBC do not duplicate CBS or ABC. The market for the center began to dwindle. American politics splintered along with the broadcast audience. Many refer to affirmative action as a “wedge issue” that split the Democrats.96 The splintering of the media may have been just as important; it changed the incentive structure within the newsroom. Where the old three-channel broadcast oligopoly tried to avoid angering any part of its audience, the new niche media seek to excite their audiences and deliberately insult others to dramatize disagreement.

Legal Changes to the Responsibility of the Media

Other changes in the law of media liability removed both incentives for and enforcement of responsible journalism. The first significant legal change dates from 1964, when several southern juries were poised to bankrupt the New York Times in a series of libel cases. In New York Times v. Sullivan, a southern jury had found the New York Times Company liable for libel because of several misstatements in an ad placed by leaders of the civil rights movement. The jury awarded $500,000 in damages, a very large sum in the 1960s, equivalent to several million dollars today.

The law then in effect made the press responsible for any misstatement, no matter how innocent. The sizable award was possible because the law allowed the jury to estimate the damage to the sheriff’s reputation regardless of whether he had actually suffered any financial injury; indeed, the errors probably improved his standing with the electorate.97 The Supreme Court responded by blocking legal liability for misstatements about public officials, a protection later expanded to cover public figures, unless the statements were made with knowledge of their falsity or with reckless disregard of the truth.98 The rule, however, immunizes innocent misstatements.99

The second major change, about a decade later, crippled the fairness doctrine, which had required all broadcasters to provide a balanced presentation of controversial issues of public importance.100 Lack of fairness could put demerits in broadcasters’ files to be considered when their licenses came up for renewal. Although nonrenewal was rare,101 broadcasters took that threat very seriously.

NBC broadcast a documentary called “Pensions: The Broken Promise.” It highlighted abuses of pension plans where workers were denied pensions after spending their careers working for a company that had promised retirement benefits. The documentary won awards and was part of the run-up to the passage of pension legislation in Washington, DC. The specific examples were neither controversial nor disputed. However, Accuracy in Media, a conservative watchdog organization, argued that NBC had not made a balanced presentation of whether pension plans were good or bad in general. NBC responded that it had not addressed that issue, and whatever anyone in the documentary said about the topic was balanced. The documentary’s explicit point was that abuses took place and existing law made it possible.102 On those examples there was no controversy—the facts were accurate. With nothing to balance, the fairness doctrine was satisfied.

The FCC sided with Accuracy in Media, and the network appealed. The court of appeals reversed. Any set of facts could cast light on a large number of issues, but it was unreasonable to require broadcasters to address every inference people might draw. The documentary dealt with a significant problem and performed a public service. To require more would discourage networks from addressing significant issues. They would have to take time from otherwise accurate and hard-hitting documentaries to face other issues or run another program. Either alternative would cost them some of the audience for documentaries.

The court of appeals’ approach narrowed the fairness doctrine substantially. Henceforth, only explicit issues would be covered. Following the circuit court decision and questions about the constitutionality of the doctrine, the FCC eliminated the fairness doctrine entirely in 1987. As a result, broadcasters are no longer required to address issues in a fair and balanced way.103

Several years later, the FCC also withdrew from the comparative licensing process intended to license public-spirited broadcasters who would reflect the interests and needs of their communities.104 Licensing had been abused, from official partisanship to favoring middle-of-the-road broadcasters and excluding racial and political minorities.105 Terminating a faulty system was overdue. But that left quality totally dependent on the market. And the arrival of niche broadcasting changed the market’s discipline from rewarding inoffensive broadcasting to rewarding deliberately outrageous programming.

With the development of the Internet, legal changes added additional dodges. One protects Internet providers from responsibility for material posted on their services by someone else.106 The courts held that Internet services are excused from responsibility for what they edit, pay for, or refuse to take down, even if they know the content is false. An Internet posting directed people to call Ken Zeran for T-shirts and other memorabilia of the bombing of the Alfred P. Murrah Federal Building in Oklahoma City, just six days after the attack. The posting implied that Zeran condoned the bombing. As the hoax spread, he received so many calls, including death threats, that he could no longer carry on his business. He asked AOL to take the hoax down, but it did nothing while the story spread around the country. Eventually the courts excused AOL from responsibility. When the media investigated the incident and reported that he had never offered such items, the angry calls declined to some eighteen per day.107

Litigation over inaccuracies in the Drudge Report and other Internet sources has been similarly fruitless; the courts have almost routinely found that the information was delivered by someone else, so no one who could be identified and held liable was responsible.108

As a result of these legal changes, the media marketplace has shifted from a highly regulated market toward one that is legally wide open, with remarkable websites alongside toxic swamps.

Nevertheless, the current marketplace was shaped by government decisions that favored commercial over public or educational broadcasting. The FCC spent half a century protecting the media oligopoly that Hoover had created.109 It was finally forced to provide a spectrum for the public broadcasting system in the late 1960s. Most broadcasting, by decree, has been private, for profit, and dependent on advertising. When the era of niche media arrived and the wraps came off both licensing and content, commercial broadcasters were ready to advocate a point of view. However, in a segmented market there are few brakes on what people say.

Before his appointment to the Supreme Court, Justice Lewis Powell prepared a report for the U.S. Chamber of Commerce arguing that liberals (but not businesses) were getting their point of view across.110 He urged corporations to organize to present their point of view to the public. Following his advice, conservatives funded think tanks to advocate economic laissez faire and other conservative causes. Politically inclined investors and religious institutions established new stations and networks, including several with substantial media interests.111

The market does not put a brake on the partisan use of segmented media.112 Very large rewards are available through the political system. The economic benefits from redirecting the course of legislation, and from convincing people to support claims for deregulation, tax exemptions, friendly legislators, or other causes, can be enormous.113 Segmented markets, however, are not effective in policing the use of facts. Checking information has large costs in time, effort, and sometimes money. Even if markets are eventually self-correcting, the passage of time matters, just as water in the mountains is not instantly available at sea level merely because it flows downhill. In a fractured marketplace, there is plenty of room for angry, partisan broadcasters, yet very little room for those annoyed by a broadcast to object, especially if they are not part of its loyal target audience, or for those who want to encounter competing views without the time and trouble of searching for them.

The legal changes outlined above permit an organized and deliberate ideological use of the media, but there are many reasons for media bias: to persuade, to draw attention, and to save on the costs of news gathering.114 Much of it is happenstance, from press releases, from whoever happens to be president, or from unexamined prejudices and stereotypes.115 A study that appeared in the neoconservative journal THE PUBLIC INTEREST concluded that news had a conservative impact in the middle and late 1960s because of the riots in ghettos around the country, while the entertainment media had a liberal impact because of their concentration on stories about current social issues.

Jettisoning rules of responsibility makes it easier for portions of the press to move toward the extremes. There are always other factors. The press was highly partisan for much of the middle and late nineteenth century, yet it cannot be blamed for all the partisanship of that era. In the first half of the twentieth century, the press deliberately moved toward a “neutral,” nonpartisan format so that advertisers could reach potential buyers on both sides of the political spectrum (although advertisers reinserted themselves in subtler ways).116 The legal changes of the last decades facilitated another shift toward a press that is more partisan and less responsible. And that has an impact on our politics.

In the name of free speech, many liberal voices urged a more diverse press and fought the restrictions that had led to oligopolistic control of the media. That the changes fanned the flames of a conservative counterrevolution is partly the happenstance of a democratic system in which Americans do not believe in controlling the press. Nor are liberals urging that we return to the era of dominant broadcasters and squelch the Internet. The trend of change seems proper, but the justification does not erase the impact.

Journalistic Principles

Journalism also changed. At the start of the twentieth century, the so-called penny press sought a mass audience. Advertisers wanted to appeal to customers without regard to politics. Opinion was confined to the editorial page and news took on a hard facts approach, presenting a record of who said what and what happened where. Joe McCarthy upset that model by using it too well. By reporting his unsupported allegations of disloyalty without comment, the news media gave McCarthy enormous influence. In the wake of McCarthyism, the press looked for ways to report the news without being so vulnerable to unscrupulous allegations. The newsman’s obligation was to get at the truth, to provide a more carefully digested record.

A liberal or a conservative truth was not supposed to exist. Truth should be independent of party and partisanship. But the news media came under attack. Nixon’s first vice president, Spiro Agnew, attacked the national media as “effete” snobs before he was forced out of office for corruption. Bob Woodward and Carl Bernstein’s investigation of the Watergate scandal brought Nixon down. The impeachment proceedings were bipartisan, but the reaction to them was highly partisan. Many on the right wing of the Republican Party blamed the press, rather than Nixon, for hounding him out of office.

Journalists increasingly settled on a new paradigm for reporting. Conflict provided drama, and drama became the engine to sell papers and win viewers. Get an allegation, and then ask someone on the other side for a response. Now you have conflict plus “unbiased” and “responsible” reporting, which does not always make clear what actually happened. The sides are easier to tell apart. But the ubiquitous combat storyline contributes to polarization.117

As a result, a splintered press, partly freed of responsibility, found its theme in the maximization of antagonism on paper, screen, and radio. That reorienting of press content would be matched by a reorientation of politics.

Legal Changes to the Nomination System

Mass meetings replaced nomination by elected officials in the Jackson era and radicalized politics. Political bosses reasserted control with a variety of tools, some legal, some not. Storied political machines—from Tammany Hall and Boss Tweed in New York to the Pendergast machine in Missouri—decided nominations in infamous smoke-filled back rooms of party conventions. In the late nineteenth century, Progressives backed legislation to require parties to nominate by primary elections.118 Over the following decades, primaries helped clean up American politics. Like many reforms, however, they had both intended and unintended consequences.

Conventions, and the professionals who ran them, looked for coalitions to win the general elections. They wanted competitive candidates. Primaries are relatively divisive, particularly where a plurality suffices to win the nomination. Where only party members can vote in primaries, winners reflect the party’s base but not necessarily the general public that votes in general elections. Candidates selected by each party’s base will be far apart. The late V. O. Key documented the pressure of primaries toward the extremes, in work repeatedly corroborated since.119 The mathematics is simple. If each party holds close to 50 percent of the voting public, then it takes just over half of either party, and often less, to nominate a candidate who can go on to win the general election. Just over half of one party, which itself comprises roughly half of the public, amounts to about 25 percent of the general public. In a well-fought election, that 25 percent of the voting population, and often much less, is enough to control the political system.
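The arithmetic above can be sketched directly. The even party split and bare-majority threshold below are the chapter's stylized simplifications, not empirical figures:

```python
# Stylized arithmetic behind the "25 percent" claim: the share of the
# general public that decides a nomination is the share needed to win
# the primary times the party's share of all voters.

def controlling_share(party_share=0.50, winning_share=0.50):
    """Fraction of the general public that controls a nomination."""
    return party_share * winning_share

# Two evenly matched parties: just over half of one party's primary
# voters is about a quarter of the whole electorate.
print(controlling_share())  # 0.25

# Under a plurality rule with several candidates, the winning share can
# fall well below half, shrinking the controlling fraction further.
print(controlling_share(winning_share=0.35))  # 0.175
```

The second call illustrates the chapter's point that a plurality rule lowers the bar even further than the bare-majority case.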

Primaries were largely confined to state offices until the Democratic Party changed its presidential selection rules in the 1970s.120 The party wanted to secure black participation in southern state delegations and to take back the party from bosses like Mayor Daley, who had used the police to abuse demonstrators during the 1968 Convention in Chicago. The Democrats determined to end boss control by substituting more democratic selection systems throughout the country. In practice this meant a national move to primaries for the selection of state delegations to the national party conventions. In the wake of those changes, the Republican nomination process changed as well.

After those changes, party conventions lost control over presidential nominations, which now were decided in the primaries before the conventions opened. The use of primaries to nominate presidential candidates pushed both parties further from the center, driven by the mathematical logic of the primary nominating process. Both parties tried to tilt the national conventions toward centrist delegates by adding so-called superdelegates, existing officeholders who were not chosen in the presidential primaries. The primaries, however, dominated the conventions.

The percentage of the public that it takes to control presidential selection depends somewhat on whether the parties allow each state to use the winner-takes-all system to allocate the states’ votes in the conventions. The controlling proportion has to be adjusted slightly upward to the extent that parties allow independents to vote in party primaries, or divide their delegates in proportion to the votes for each candidate.121

The mathematics is equally applicable to the Republican primaries, which have been the battleground between liberal and conservative Republicans. The dominance of right-wing Republicans over their party also reflected organization: they got their workers out and into every available party position. Nevertheless, the right wing has dominated many elections without the support of more than about a quarter of the general public.

So the extension of the primary process has tended to deepen the political divide.

Legal Changes behind Safe Seats

Law drives the political divide in yet another way. Gerrymandering proliferated after the Supreme Court held malapportionment unconstitutional in the early 1960s.122 Malapportioned legislatures benefitted “geographically protected oligarchies of rural and small-town legislators” by assigning them more representation.123 Gerrymandering works by designing districts that make elections safe from voter judgment, both in individual seats and in the complexion of the state legislative delegation.124 Districts are made safe for one of the parties by excluding voters from the opposite party.125

The lack of competition in safe districts encourages politicians to move toward the extremes since nothing pulls them toward the center. “As a result, members speak more to their parties’ ‘bases,’ which provide most electoral and financial support.”126 The move to the extremes supports a very bitter politics. As Samuelson puts it, “stridency is a strategy.”127 Partisan extremism is reinforced by the overall gerrymandered pattern protecting the safety of one political party’s legislative wing.
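The mechanics of packing and cracking that produce safe seats can be illustrated with hypothetical numbers; the district sizes and party shares below are invented for the sketch, not drawn from any actual map:

```python
# Illustrative sketch: how district lines, not vote totals, decide a
# delegation. 500 voters split evenly between parties A and B.

def seats_won(districts, party):
    """Count districts in which the given party holds a majority."""
    other = "B" if party == "A" else "A"
    return sum(1 for d in districts if d[party] > d[other])

# Neutral map: five competitive districts, 50 voters per party in each.
neutral = [{"A": 50, "B": 50} for _ in range(5)]

# Gerrymandered map: "pack" most B voters into one district and
# "crack" the rest so A holds a safe 60-40 majority in the other four.
gerrymander = [{"A": 10, "B": 90}] + [{"A": 60, "B": 40} for _ in range(4)]

# Both maps have identical statewide totals: 250 voters per party.
assert sum(d["A"] for d in gerrymander) == 250
assert sum(d["B"] for d in gerrymander) == 250

print(seats_won(gerrymander, "A"))  # 4 of 5 seats from half the votes
print(seats_won(gerrymander, "B"))  # 1
```

Each of party A's four districts is also safe (60-40), so the general election poses no threat and the real contest moves to A's primary, reinforcing the pull toward the extremes described above.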

That combination of the constitutional revolution against malapportionment and the subsequent proliferation of gerrymandering, which wrote incumbent and partisan protection into the laws shaping districts, has helped stamp stridency on modern American politics. The Supreme Court has protected legislative gerrymandering, saying it has not found a “manageable, reliable measure.”128

Legal Construction of Campaign Money

In the early 1970s, the federal campaign finance statutes were designed to equalize the positions of the candidates and to give them more control over their own campaigns. The Supreme Court immediately denied the first and reversed the second. It sustained limitations on contributions, but only on the ground that contributions could corrupt politics, functioning like bribes, or could at least appear to do so. The Court also overturned limitations on independent expenditures, saying they did not have the same likelihood of corrupting or appearing to corrupt. By unleashing independent expenditures, the Court took control of campaigns out of the hands of the candidates—a consequence that has been growing in light of subsequent decisions overturning pieces of the campaign funding statutes, coupled with campaigns’ increasing need for financial support.129

The statutes created limits on what could be contributed to individual candidates, to political parties, and in total annual political contributions to all recipients, along with other restrictions on how money could be raised and spent.130 Coordinated expenses would be treated as contributions and would likely exceed contribution limits or violate statutory prohibitions. Candidates and their committees therefore had to do their own fundraising without coordination. Political committees that supported several different races also had to separate their state and federal fundraising.131

This combination of the Supreme Court decisions and the statutory provisions appears to have intensified the forces driving politics away from the center by splintering the campaign among many independent committees,132 and increasing the importance of major donors.133

One consequence of the new campaign finance rules was noticed immediately: they made fundraising much harder, so it consumed far more of the time of members of Congress and presidential candidates than it had before. The increased difficulty of fundraising guaranteed that those who could accomplish it would take on increased importance.134

A succession of legendary fundraisers in both parties kept upping the ante with new and different voter lists and techniques to reach them in both large and small amounts.135 Everything was available for auction—a chance to meet the president, alone or in groups; a visit to the White House, including a stay in the Lincoln bedroom or a tour of other private parts of the building, each for a given size of donation or other service to the candidate or party.136 In what became known as the K Street Project, congressional Republicans under the leadership of Tom DeLay made lobbyist access dependent on exclusive contributions to and work for the Republican Party.137 Fundraising became dominated by so-called bundlers, who raise large sums from many different people, each donation within the statutory limits, and then turn them in together in large aggregates, often totaling more than a million dollars per bundler.138
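The arithmetic of bundling is simple to sketch. The per-donor limit below is purely illustrative, since the actual statutory limits have changed repeatedly over time:

```python
# Hedged sketch of bundling arithmetic: per-donor limits cap individual
# checks, but a bundler who collects many within-limit checks can hand
# over a very large aggregate. The $2,000 figure is hypothetical.

PER_DONOR_LIMIT = 2_000  # illustrative per-donor limit, in dollars

def bundle_total(num_donors, check_size=PER_DONOR_LIMIT):
    """Aggregate a bundle of individually legal checks."""
    if check_size > PER_DONOR_LIMIT:
        raise ValueError("individual check exceeds the statutory limit")
    return num_donors * check_size

# Each check is legal on its own, yet the bundle tops $1 million.
print(bundle_total(501))  # 1002000
```

The point of the sketch is that the limits constrain each donor, not the bundler who aggregates them.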

As noted earlier, the rewards of political success can easily be large enough to justify large donations. Significant changes in taxation and regulation create very large stakes for many. Recent tax cuts, which reduced the progressivity of the tax code, rewarded large donors and shifted the burden further down the income scale.139 The larger the gulf between the parties, the greater the stakes. As the stakes appeared larger, the size of donations followed, and their complexion became more ideological, with donors seeking to move politics further left or right. Two much-discussed examples among many were the Koch brothers’ support of the Tea Party and Sheldon Adelson’s support of Newt Gingrich in the 2012 Republican presidential primaries.

Now a major donor who wanted to influence the campaign could not do it within the party or candidate campaigns. Independent expenditures became the legally required vehicle for major contributions. The statutory requirements also became the justification for donors to work independently of the candidates without earning condemnation for interference. The statute and court decisions changed the culture of campaign finance.

In other words, political scientists found that, far from being taken out of the political equation, money mattered more than it had before the statute was written. Powerful and independent donors—using statutory requirements for separate, independent organizations whose campaigns are uncoordinated with the candidates—have seized the political megaphone and now contribute to the polarization and splintering of American politics.

The polarization may have been increased by the anonymity of the new vehicles for independent expenditures, because they have not been required to disclose their donors. That became possible as the result of a complex interaction of tax and federal election campaign law with Supreme Court, Federal Election Commission, and Internal Revenue Service decisions.140 But the effect is to free contributors from disclosure and from any pushback for their positions. In political terms, organizations may feel free to take more strident positions than some of them would if their donors were disclosed.

Legal Resegregation of the Landscape

Law has also played a significant role in separating us into communities by wealth, faith, and race. Those physical divisions make it harder to trust, collaborate, and empathize because we do not know each other. In those ways they contribute to political extremism. Because the housing market and other private construction projects are involved, people make the mistake of thinking that residential separation is all about private choice, what the courts call de facto segregation. American geographic politics aggravate the separation of Americans.

Government facilitates what Robert Reich called the “secession of the successful.” Public roads make new communities possible, and then encircle and separate them from older, poorer ones.141 State rules for municipal incorporation facilitate formal secession from older cities and communities and prevent central cities from expanding to include newly settled areas.142 State law protects new communities from responsibility for the schools, pools, parks, and other amenities for the children of the people who clean, sweep, guard, and otherwise serve the wealthy, leaving the workers’ own communities to deteriorate.143 Local governments zone to separate poorer from wealthier residents, requiring one- and five-acre tracts, which force people to travel farther for work and other needs.144 In combination, roads and zoning protect residents’ access to some jobs and block access to selected others. The physical divisions allow the wealthy to believe that everyone is on their own, as if they were citizens of separate countries, not entangled in each other’s fortunes.

Government has begun separating us by faith in the ways it handles schools, zoning, public funds, and discrimination. Religious service organizations, like Catholic Charities or Jewish Family Services, long operated in nonsectarian ways, partly to administer government funds and other legal requirements, including the tax code. The faith-based initiatives of Clinton and Bush now allow religious groups to be much more explicit about their religious mission in delivering services than merely advertising their sponsorship of charitable activities.145 Federal law now forces cities and towns to loosen their zoning rules for the benefit of religion.146 Vouchers and tax credits now ease exit from the public schools. Charter schools cannot discriminate, but they make it easier for communities to regroup within the public school system.

The government is also behind the racial resegregation of America. The Court likes to describe contemporary segregation as de facto, as if it were the natural result of lots of individual decisions, beginning with “white flight,” but that is a whitewash. As noted at the beginning of this chapter, people had many reasons to leave the cities that had nothing to do with race, and government legislation made sure blacks would not (or could not) follow.

Decisions in the FHA were intended to ensure different treatment of African American would-be home buyers. Those were clear violations of the federal guarantee of equal protection even under the narrow definitions the Supreme Court gave it after Chief Justice Warren had left the Court. Contemporary with the decisions to red-line black neighborhoods and bar blacks from obtaining FHA-financed mortgages in the suburbs were decisions to tear down black communities, scattering their residents into equally dilapidated housing in neighborhoods less desired by whites or warehousing them in inhospitable “projects.” States facilitated the incorporation of new towns that bore no responsibility for the old neighborhoods their residents had abandoned; those towns zoned blacks out and blocked construction of housing blacks could have rented near jobs they might have held.

Contemporary segregation is the source of many of our social ills, and it bears the imprimatur of government all over it. All the pathologies we associate with poor black ghettos are a product of the past half century and stem from decisions in the white world.

Consequences—A Perfect Storm

Journalistic conventions that emphasize conflict; media that now appeal to narrow segments of the public and are held by ideologically driven owners; gerrymandering that segregates the voting population into safe districts; primary elections and campaign finance rules that pass control to politically active portions of the parties; and a population segregated into separate communities—all come together in a bare-knuckles brawl of extremists. That is what the papers like, what people find on their screens, and the way primary election math adds up. The winners reinforce the point by demonizing everyone who disagrees. The two parties have become fighting faiths. That outcome is facilitated by government residential policies that obscure the stakes of large segments of the population in each other’s welfare.

We have had fighting faiths before—before the Civil War, America’s bloodiest, and perhaps holiest, conflict; during the Populist era, when many died in the struggles of the union movement; and at the dawn of Jim Crow. In that period, segregation was designed largely to divide the Populists and keep power in wealthy hands.147

There are ideological divides behind these contemporary fighting faiths. Political systems can moderate divisions or make them worse. The legal changes support the latter. Changes in law and judicial opinions would not themselves suffice to cause all the polarization we have seen. But it would be foolhardy to conclude that the changes in the law played no part in facilitating and exaggerating the changes we have seen.

Democracy is in peril when the objectives of partisanship override respect for those opposed. It is in peril when the splits among us align so consistently that the same people are always on opposite sides, so that we understand each other as enemies with no stake in each other’s welfare, seek only to take advantage of others or defend ourselves, and define progress as someone else’s sacrifice, with no shared welfare toward which all must jointly contribute and no obligation to protect one another. Democracy is in peril not only when we are pitted against each other as our country’s worst enemies, but also when we lose faith in the care and concern of our own governments, when sensible caution turns into outright hostility, and when “government is the enemy” rather than a friend. When totalitarianism comes to this country, it will surely come dressed in red, white, and blue, “saving” us from ourselves.148

The political process magnifies the differences among the public. No single person or politician has the power to change public opinion, but a chorus does. Once the public is on fire, it pushes politics further; polarization becomes a self-aggravating feedback loop, and therein lies the danger.

There are contrary forces. Presidential politics, for example, can encourage centrifugal or centripetal forces and other changes can undercut the polarization encouraged by a splintered media, primaries, and segregated privilege and poverty. One can hope.

Risks to Democracy

Half a century ago, American political scientists would have welcomed greater polarization of the parties to give Americans a much clearer choice.149 Half a century later, the entire political environment has undergone large changes, and the polarization of the parties seems to have gone too far. For scholars of democracy and its history, polarization and intolerance threaten commitment to democracy.150 The polarization of the leadership is even more threatening than the polarization of the masses.151 Democracy has been brought down repeatedly when competing partisans came to believe each other disloyal to the democratic system.152 Both the reality and the perception are dangerous. Mutual charges of fundamentally undemocratic behavior reflect the depth of modern distrust and have become a trope in modern politics.

Mutual distrust has led the political leadership of numerous countries to believe that winning was more important than respecting the wishes of the public; the parties goaded each other in a kind of death spiral until one seized the reins. Sometimes that dance of death was initiated by economic policies, sometimes by violence, and sometimes by both. In other words, polarization at the top endangers democracy.153

Americans have gone through periods in which they could not in fact trust each other. The election laws in all of the states reflect periods when political success was governed by dirty tricks, fraud, and intimidation. Americans are used to attributing those flaws to party machines of a bygone era.

The contested 2000 presidential election brought these issues back to the fore, and the win-at-all-costs behavior reflected the underlying and poisonous malaise. Invalid absentee ballots were counted in some Florida counties but not others.154 The procedure of voting and the form of the ballots differed by county, and in those counties that used punch cards, the much-publicized hanging chads (incompletely pushed through the card stock) were scored differently, a difference the Florida Supreme Court tried to end with consistent standards until the U.S. Supreme Court barred it from continuing.155 A private computer data company with strong ties to the Republican party was hired without competitive bidding to purge the Florida rolls of people who should not vote, but its sloppy process purged tens of thousands who were qualified, most of them Democrats.156 The counting of ballots itself was interrupted by a mob of screaming party workers reminiscent of the goon squads that steered many an election at the turn of the twentieth century.

Politics in Congress began to show serious cracks, with one party excluding the other from many traditional opportunities to examine legislation, participate in hearings, or gain access to information, and both parties changed the rules and practice of filibustering. Bipartisanship was gone and many in Congress reacted to the change in behavior with dismay.157

Government is all about conflict, sometimes creating it, sometimes resolving it, managing and mismanaging it, adding fuel to the fire. Conflict by itself says very little about whether governments will fail or democracy will be overthrown. How conflict is managed is crucial.158

America is not immune from polarization, paranoia, or demagoguery.159 Despite an honored tradition of people protesting in public, mayors, sheriffs, and presidents have filmed,160 hosed, shot,161 arrested, and penned peaceful demonstrators,162 claiming fear of what the demonstrators might do. The United States has been sufficiently moved by fear of the masses elsewhere to intervene repeatedly against popular democratic leaders in such places as Guatemala, Chile, and Iran.163 That same fear of others threatens to corrode our own democracy at home.

Paranoia and polarization are dangerous.

What Works

Opposite the power to demonize others and divide the population is the need to break down barriers. One effort has been to change political incentives by means of open and blanket primaries, which handed the ballot to wider groups of voters, not just those registered in a single party. Morris Fiorina believes the Rehnquist Court may have made a mistake in striking down one such effort in California, because the reform had the potential to reduce polarization.164 Such a change could have unintended consequences, however, if it weakens the role of parties. As Fiorina also points out, political vacuums are short-lived, and if parties are weakened, still narrower factions and interest groups will control politics.165

One would want to tread carefully on libel law, but Marshall’s suggestion of restricting legal damages to the actual economic cost might strengthen it constructively.166

Land use and discrimination law need to be rejuvenated and the American system of bringing people together needs to be revived for everyone. The American melting pot stimulated by mixing us together in the same schoolrooms and military barracks167 has support in the scientific community, where it is known as the contact hypothesis.168 How well it works depends, among other factors,169 on the speed of change,170 the length of time people have to overcome their differences, support by those in authority,171 and whether the elites of wealth and power are themselves brought together.172

The systematic racial violence of the KKK has been replaced by racial profiling at every level of the criminal process, plea bargaining under the threat of lengthy sentences, exaggerated punitive measures, and the willingness of too many to look the other way while a portion of the population is subjected to a highly destructive system that white Americans do not apply to themselves.173 And that needs to end.

The social capital that Putnam described is not automatic; it did not spring magically from the grass of northern Italy; it depends on us. In a world sinking back into religious and racial violence, our unity at home is precious, and contingent on our commitment to the ideal and the practice. As Putnam eloquently put it, “the central challenge for modern, diversifying societies is to create a new, broader sense of ‘we.’”174