An hour before dinner one evening in May 2004, my fellow national security advisers from Europe and I took a boat ride through the canals of Berlin to view the rebuilding of the “new” German capital. The journey seemed endless as I tried to disguise my discomfort with being on the water and to make amends for a recent rather spirited defense of the prospects for democracy in Afghanistan and Iraq.
Earlier in the day, in response to a comment by one of my colleagues that there was no “tradition” of democracy in either Iraq or Afghanistan, I asked pointedly, “What precisely was the German democratic tradition before 1945? Would that be the ill-fated experiment with the Kaiser? Bismarck? Hitler’s election?” Germany had experienced the Enlightenment, but, obviously, democratic values didn’t exactly take root. To be fair, my response had been provoked by a not too thinly veiled suggestion that Americans were naïve about the prospects for the spread of democracy. I’d rather be naïve than cynical, I had thought to myself.
That evening on the canal, I tried to soften what I had said, explaining that American democracy had taken a long time to mature. “The American Constitution was born of a compromise between slaveholding and non-slaveholding states that counted my ancestors as three-fifths of a man,” I explained. “My father couldn’t register to vote in Birmingham in 1952. And now Colin Powell is secretary of state and I am national security adviser. People can learn to overcome prejudices and govern themselves in democratic institutions.” My colleagues seemed a bit taken aback by the personal nature of my policy comment. Perhaps they didn’t think my race gave me a different perspective on democracy’s challenges—and its opportunities.
Returning to my hotel, I felt so American—with a kind of optimism about the rightness of democracy for everyone, everywhere, at all times. And I realized that much of that conviction proceeded from my own understanding of and experience with American institutions.
Not long before my term as secretary of state ended, I accepted a long-standing invitation from the late Allen Weinstein of the National Archives for a visit and tour. I wanted first to see the Emancipation Proclamation, which freed my ancestors—or, more correctly, some of my ancestors. Like those of most black Americans, they included both slaves and slave owners. My great-great-grandmother Zina on my mother’s side bore five children by different slave owners. She somehow managed to raise them all and keep them together as a family. My great-grandmother on my father’s side, Julia Head, carried the name of the slave owner and was so favored by him that he taught her to read. Her precise relationship to the Head family remains something of a mystery, but you could look at her and see that her bloodlines, like mine, clearly bore slavery’s mark: My DNA is 50 percent African and 40 percent European, and there is a mysterious 10 percent that is apparently Asian.
Reading the Proclamation, I could hear the footsteps of Rices, my father’s ancestors, and Rays, my mother’s. I marveled at their perseverance in the shackles of this most brutal of institutions. I said a little prayer of thanks to them, and moved on to the Constitution, and then all manner of treaties and executive agreements, signed by my predecessors as secretary of state. Preparing to leave, I realized that I had not seen the Declaration of Independence.
How long has it been since I read it? Have I ever read it in full—beginning to end? I didn’t verbalize either question, perhaps a little embarrassed that the answer to the last question was probably yes, as a kid at Brunetta C. Hill Elementary School, but, just as plausibly, never.
So I stopped to read the Declaration in its entirety and contemplate what it says about the moment when people decide that they’ve had enough of tyranny and oppression. After the soaring and familiar rhetoric that enshrines the principles of equality for all (“We hold these truths to be self-evident, that all men are created equal”), the document recounts multiple grievances against the British crown and King George III himself. “He has plundered our seas, ravaged our Coasts, burnt our towns, and destroyed the lives of our people,” it says. “He is at this time transporting large Armies of foreign Mercenaries to compleat the works of death, desolation and tyranny, already begun with circumstances of Cruelty & perfidy scarcely paralleled in the most barbarous ages, and totally unworthy the Head of a civilized nation.” This fist-shaking litany is a reminder that the moment when people seize power is not the most propitious for rational discourse about how to secure newly won rights. A declaration throwing off the old order is most assuredly not the establishment of the new.
The Anglo-Americans were an ethnically homogeneous lot who left England and, having occupied the “New World” for more than a century, came to think of themselves as a people distinct from the British crown. The constant interference in their affairs had united a sizable portion of them in disgust and despair and provoked a strong impulse to separate.
“When in the Course of human events,” the Declaration begins, “it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature’s God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation.”
America was a hardscrabble place where opportunity abounded, but only for those who would work for it, and so it attracted that kind of people. The old social classes and noble orders did not follow them to the New World because the rich and powerful of Europe were comfortable remaining where they were. Wide-open spaces and the possibility of acquiring land—at the expense of native populations—further encouraged social mobility. The country thus developed with a strong tradition of property rights. And without an aristocratic order to overthrow, America was born in as near to a state of tabula rasa as one can imagine.
Compared with today’s highly interconnected world, America’s birth did indeed take place in simpler times. News traveled up and down the Atlantic coast at the pace of weeks, not hours. People and goods moved at the pace of months, not days. Colonial times were simpler, to be sure, but they were not wholly simple, and success in the American experiment was never preordained. Even with their manifold advantages, the Americans stumbled repeatedly along the path to a stable democracy. It is a point worth remembering as people in less favorable circumstances struggle too.
As we know, the revolutionaries came close to losing their war for independence. George Washington’s ragtag forces were challenged not just by superior British military prowess, but also by weak institutions that almost failed to provide for the soldiers’ needs. Rhode Island refused to pay its share of the costs for the army; the Continental Congress constantly interfered in matters of military strategy and tactics (leading Washington’s aide-de-camp, Alexander Hamilton, to lose confidence in the structure of the government under the Articles of Confederation); and in matters of diplomacy to support the war effort, numerous states cut their own deals with European powers. Indeed, the Articles of Confederation had created no executive at all. The system was just too weak to protect the interests of the new republic. The Founders learned from experience that the young country needed a central authority that worked.
Yet, born as the nation was out of a rejection of tyranny, they were suspicious too of a government that was too powerful. So the question was one of how to create institutions strong enough to protect the people’s newly won rights, but not so strong as to threaten them. This essential balance lies at the heart of the rough-and-tumble story of the birth of America’s Constitution. And it remains today the most important challenge in establishing a new order on the ruins of the old in countries across the world.
The narrative of those hot days in Philadelphia—hot both in temperature and in the intensity of debate—that gave rise to the American Constitution has been well chronicled elsewhere.1 Suffice it to say that the great histories of those efforts show clearly that it was extremely hard to strike the balance that the Framers sought. And what they achieved is a remarkable compromise between competing visions and interests. The writing and ratification of the Constitution was an intensely political process, not a divine bolt from the blue that produced perfect institutions. As James Madison put it in Federalist No. 41, “The choice must always be made, if not of the lesser evil, at least of the GREATER, not the PERFECT good.” And in the last of the Federalist Papers, Hamilton wrote, “I never expect to see a perfect work from imperfect man.” This rings true more than 220 years later when we are judging the efforts of those who are trying to establish a new order.
In the pages that follow, we discuss five key aspects of the Founders’ institutional design. I have chosen these particular elements because they appear over and over in stories of democratic transitions of the past and in those that are still unfolding across the world.
First, the Founders tackled the challenge of creating institutional balance—between the states and the center through federalism—and among different parts of the federal government itself. Second, they were determined to limit the role of the armed forces—to maintain control so that they would protect the country but not threaten the political order. They understood that the state had to have a monopoly on the use of force. Third, these men were wary of a marriage between politics, religion, and the power of the state, and they sought to separate them. Fourth, they left space for the private sector and the emergence of civil society. These would be areas where government’s role would be limited and individual initiative would flourish. Finally, they bequeathed a spirit of constitutionalism to their descendants. That has allowed generations of Americans to seek their rights through appeal to its principles. It has been a novel way to deal with the vexing problem of minority rights—even for those whose ancestors came to the country as slaves.
The United States was fortunate to have Founders who both intellectually and emotionally understood the critical importance of institutions. It was, after all, the failure or absence of institutions that led them to rebel in the first place. British subjects had rights on paper and according to tradition, but those rights were being violated by the unchecked powers of other political actors—namely the King. The only way to make those rights meaningful, the Founders believed, was to build an institutional framework to protect them. They knew that if the nation were to be stable, individual rights had to be exercised according to rules that all people could understand and trust.
While tyrants were capricious with nothing to restrain them, democratic governments relied upon and were limited by the will of the people. But the people’s wishes wouldn’t be revealed every day, in every circumstance. Indeed, the Founders were concerned that the will of the people could easily become the preferences of the mob. Thus democratic institutions became a way not only to limit the government but to channel popular passions and interests. Citizens had to come to respect the institutions that would represent and protect their rights. They would be free to associate with others as they wished and there would be a watchful press that could not be abridged, censored, or otherwise checked by the government. The “Fourth Estate,” a free press, would be the eyes and ears of the people, holding their leaders accountable.2
The debates were intense about every aspect of institutional design. The question of how to deal with executive power exposed splits among the Founders. Alexander Hamilton and James Wilson of Pennsylvania were advocates of strength. As Hamilton would say, “A feeble Executive implies a feeble execution of the government. A feeble execution is but another phrase for a bad execution; and a government ill executed, whatever it may be in theory, must be, in practice, a bad government.”
Others worried that a strong presidency could evolve too easily into the very system they had fought to replace. As Patrick Henry warned supporters of the Constitution, “Your President may easily become King.”
The delegates found a way to more or less satisfy both sides—settling on divided government. A scheme of checks and balances divides power between two legislative houses and among three coequal branches. In its first incarnation, Hamilton notwithstanding, the balance clearly favored Congress. The legislative branch is commissioned in Article I, and the executive branch in Article II. And as my colleague at Stanford, American historian David Kennedy, has noted, there are fifty-one paragraphs addressing the role of the Congress and thirteen concerning the presidency—eight of which lay out mechanisms for election, and four of which detail presidential powers. By the way, one provides for impeachment.3
The debate about the executive was in some ways a proxy for the larger issue of the role of central authority in the new country. The Founders may have been chastened by their experiences with the Articles of Confederation, but they were not of the same mind about what to do.
Through a series of compromises they came to a conclusion: The United States would rely on a system of “enumerated powers.” The new authorities that would be given to the central government would be explicitly spelled out in writing (they would be “enumerated”), and all powers not mentioned would be reserved for the states.
Still, federalism was not just a matter of constraining central authority. The Founders and many after them believed that government closer to the people was both more accountable and more effective. Federalism was a practical way to govern over a diverse and massive land.
The Founders established the capital of the new Union in the donated swampland between Maryland and Virginia in 1790, and many returned to their statehouses, thought to be of far greater importance. Early on the federal government was not intended to do very much.
Over time, buffeted by the requirements of continental conquest and defense, a Great Depression, the civil rights struggle, and ultimately the demands of modern governance, the role of the federal government would grow. In particular, the expectations of the president would multiply, and today we are closer to the strong executive that Hamilton favored. Nonetheless, the presidency is still encased in a web of institutional constraints—two houses of Congress made up of 535 people; and an independent judiciary comprising 108 federal courts, including the Supreme Court. And there are 50 governors and state legislatures with strong views about how their states should be run. The proper balance between central authority and the states is as hot a topic today as it was in 1787. Everything from voting requirements to K–12 educational standards to the adoption of environmental policies sooner or later becomes a test of where the writ of the center stops and that of the states begins.
The Founders’ concerns about balance reached even into the realm of military power. The proper use of the military has been a recurring theme throughout our history.
I will never forget President Bush’s exasperation one day in a Situation Room meeting during the Katrina disaster, as survivors were pleading for help. “What in the world is posse comitatus?” he asked. He was frustrated with the constitutional lawyers around the table who were invoking this phrase when telling him why he could not send the American military into the streets to deal with the lawlessness of post-Katrina New Orleans.4 Somebody needed to bring order to the situation—and assistance to the victims—and the military surely would have been capable. But there were other considerations—institutional considerations.
The Posse Comitatus Act of 1878 was passed as part of the compromise that withdrew American troops from the South and ended the period of Reconstruction. The meaning of the phrase is essentially “local law enforcement,” and the 1878 act prevents the president from using federal troops “as a posse comitatus or otherwise to execute the laws” within the United States.
Militaries are necessary for the defense of the republic but are potentially a threat to its democratic governance—this is the paradox. A standing army could be used by the state to undermine the liberty of its citizens. In modern literature on civil-military relations, students of developing countries have frequently asked the question “Why does the military intervene?” The more relevant question is “Why doesn’t the military intervene more often?”5 Confronted with the failure of political institutions and the disorder that ensues, militaries—which are by definition armed and organized—are certainly in a position to take matters into their own hands.
Some of the most robust debates at America’s founding emerged around this issue. Thomas Jefferson wanted simply to arm citizens in response to threats. The “Minute Man,” one day a farmer and the next a fierce defender of the nation, was his ideal. He saw the British redcoats, professional and commanded by an autocrat, as the antithesis of democratic values. Washington, a military leader himself, recognized that standing armies could be “dangerous to a state.” Another Founding Father from Virginia, Richard Henry Lee, said that standing armies “constantly terminated in the destruction of liberty.” It’s hardly surprising that men so preoccupied with the dangers of centralized power would find the idea of an army to protect that power absolutely terrifying. Shays’ Rebellion of 1786, led by a veteran of the Revolutionary War and drawing heavily on ex-military men, was the closest that America would ever get to a military coup—or at least that is the way it came to be portrayed—and it seared into memory the possibility of armed insurrection.6
On the other hand, Hamilton, Madison, and the Federalists were more concerned that the young republic be able to defend itself. And never one to worry much about contradictions (he was both a slaveholder and the author of the language of equality for all), Jefferson would become the father of both the American Navy and West Point, the country’s first military academy to train officers in the art of war—all the while glorifying the citizen soldier, who was “in all manners a superior choice to defend the nation.”
Interestingly, the Framers did not go to great lengths to address questions of civil-military relations or to ensure civilian control of the government in the Constitution. Rather, they tackled the problem by drawing upon the decentralized structure of the government and the ability of the three branches to check one another. The president would be commander in chief, but he could not declare war, nor could he fund the effort. The power to declare war would be vested with Congress. Other authorities would be divided between the House of Representatives (where all funding bills originate) and the Senate (which must ratify treaties and confirm all ambassadors and cabinet officials).
Moreover, the Framers used the federal structure to defend the country by recognizing state militias and constituting them as a National Guard. The states were not allowed to raise troops without explicit congressional authority. But the militias, made up of part-timers who lived at home and worked in civilian jobs until needed, were given the task “to execute the laws of the Union, suppress insurrections and repel invasions.” The Congress was to ensure a certain standardization of procedures and training, but for the first hundred years of the country’s existence, professional soldiers were not the backbone of America’s military force. The Spanish-American War and then the successive world wars would shift us toward a professional military. The National Guard would remain the “citizen-soldiers” that Jefferson so admired.
The Guard (together with a second component, the National Reserve, made up of soldiers recently retired from active duty) is to this day a critical part of America’s fighting force in the country’s wars. These citizen-soldiers also remain the first line of homeland defense against natural disasters and, sometimes, civil strife. That division of responsibility has permitted the United States to make war against external enemies without drawing the military into domestic conflicts—and thus into the country’s politics.
Although the National Guard’s role as a militia in each state leaves it under the command of that state’s governor, the Guard can also be “federalized” under certain circumstances and called into the service of the president. When tensions or disagreements exist between a state governor and the president, the National Guard has sometimes been caught in between, leading to some of the most contentious moments in American history.
That includes the day in June 1963 when the Alabama National Guard was called upon by Governor George Wallace to prevent black students from entering the University of Alabama. Even though Brown v. Board of Education had declared segregation unconstitutional a decade earlier, university and state officials had gone to great lengths to prevent black students from enrolling. Now three admitted black students with impeccable credentials—Vivian Malone of Mobile, James Hood of Gadsden, and Dave McGlathery of Huntsville—approached the door to Foster Auditorium on campus, and the governor, backed by the Alabama National Guard, stood in their way.
My family and I watched the spectacle unfold on television. Standing in the doorway and flouting demands by federal marshals and the deputy U.S. attorney general that he step aside, the governor clearly did not intend to abandon his inaugural promise of “segregation now, segregation tomorrow, and segregation forever.” A few hours later, the commander of the Alabama National Guard unit, General Henry V. Graham, approached Wallace. “Sir,” he said, “it is my sad duty to ask you to step aside under orders of the president of the United States. As a member of the Alabama National Guard, I have been ordered into federal service this morning at approximately 10:30, and it is my duty to ask you to step aside in order that the orders of the court may be accomplished.” Graham’s Guard unit, the 31st Dixie Division, had been federalized and now reported to the president, not the governor. The general had been given conflicting orders and he had to choose which one to follow. Thankfully, he made the right choice. Wallace stepped aside from the doorway and the University of Alabama was integrated.
Americans have come to trust the arrangements that constrain the military’s political role. Many questions that were raw at our founding have receded into history. President Bush’s exasperation that he could not simply deploy the American military into the streets of New Orleans is testament to how distant concerns of a military takeover have become.
The marriage of religion, politics, and the power of the state has perhaps been the single greatest source of worldwide civil strife throughout history. Individual citizens hold multiple associations and loyalties and with varying intensity. Religion, though, makes a claim on the believer that is superior to any other. If that claim is confined to the individual and his right to practice matters of conscience freely, then there is not a problem. But if a group of citizens or the state itself transfers that superior call to the realm of politics, dissenters will by definition be disadvantaged. It is simply not possible to sustain freedom of religion for the individual if the state is committed to a particular set of religious beliefs.
The America about which we learned as schoolchildren was thus one in which freedom of religion and separation of church and state were foregone conclusions at the start. But as with everything in our history, practice has not always matched stated values. In the period before independence, some colonies had official churches, like the Anglican Church in Virginia, while other colonies, such as Rhode Island, imposed a stricter separation.
Many of the earliest settlers came to the new promised land fleeing religious oppression. Their experience imprinted the free exercise of their religion as a fundamental tenet of political life. The Puritans set themselves up in Massachusetts, endeavoring to build a pious society that would be a model for others. But although they were sure to protect their own rights to freely practice their faith, Puritan leaders were not as keen on respecting the rights of minority sects or nonconformists in their midst. Indeed, Roger Williams founded Rhode Island after being banished from Massachusetts for political agitation and not adhering to some aspects of the mainstream faith.
Religious difference was nevertheless tolerated to an almost unprecedented degree for the times. In part due to the bloody religious wars in Britain and Europe, the idea of religious tolerance in the colonies had a natural base of support—if only because the alternative would mean constant strife. Across the colonies there was great variation in both religious sects and religious freedom. Maryland, for example, under the stewardship of Lord Baltimore, a Catholic, passed a resolution that prohibited almost any negative action toward a fellow colonist based on his or her religion. In language that would serve as a model for the First Amendment more than a hundred years later, the Maryland Toleration Act of 1649 stated that “no person or persons whatsoever within this province… professing to believe in Jesus Christ shall from henceforth be in any ways troubled, molested, or discountenanced for or in respect of his or her religion, nor in the free exercise thereof.” The limitation to Christian faiths notwithstanding, the Toleration Act was a groundbreaking document, not least because it came at a time of heightened religious tension back in England, where the Anglican monarch (Charles I) had just been overthrown and executed by Puritan leaders of Parliament, and where Catholics were increasingly being persecuted.
On the other hand, in Virginia, Baptists were hounded and imprisoned, and Presbyterians found it hard to establish churches and were subject to frequent property seizure. Madison was struck by the sight of a Baptist preacher who was convicted for his “insurrectionist” sermons and insisted on continuing to preach from his cell.
The Framers were appalled by such persecution, and when they set about the work of writing the Constitution, they sought to protect religious conscience and separate the church from politics.
Madison and others reserved their harshest criticism for state religions obsessed with earthly wealth and power. Their argument was two-pronged: State religion was bad for the individual citizen, interfering with his most basic and personal of choices, and it was bad for religion, condemning the church to worldliness and corruption.7 Therefore, under the “establishment” and “free exercise” clauses of the First Amendment, the U.S. Congress can make no laws “respecting an establishment of religion or prohibiting the free exercise thereof.” Freedom of religion for the individual thus became closely associated with the separation of church and state.
These high ideals have not prevented religious prejudice in American social life and politics. It was not too long ago that candidate John F. Kennedy had to assure Americans that he would not answer to the Pope when making decisions as president of the United States. Anti-Semitism has a long and dark history in America. And Muslim Americans find themselves constantly professing their loyalty to the United States in answer to those who too easily draw a link between them and the violent extremists in the Middle East. As long as human beings fear those who are “different,” prejudice and suspicion will be a part of the human experience—and America is no exception.
Yet the Constitution gave “We the people” no religious identity. The state is to be blind to the question of the “true way.” That is meant to be the ultimate guarantee that none will be persecuted by the state because of religious beliefs.
Many have made the point that Christians founded America. These men and women lived in a time when at least some expression of Christian belief was an absolute necessity for moral propriety. Like many religious people, I find great comfort in the stories of their personal struggles to find meaning and, in some cases, to find God. But in the final analysis it doesn’t matter whether they were Christian believers, Deists, or atheists: Their intention was to create a system of governance that prohibited the privileging of one set of beliefs over another and allowed citizens the freedom to choose and practice religion without the interference of the state.
And through the constitutional process, Americans have been defining precisely what that means in practical terms. This flexibility has become more crucial as our own diversity has stretched to incorporate every known religious belief and the possibility of no religious faith at all. The questions that have arisen are wide-ranging. Some strike us as fundamental: Can the government compel obedience to a law that a citizen deems to be in contradiction to her religious beliefs? Others may seem more trivial: Do holiday decorations with a religious theme displayed on government property violate the separation of church and state? What is remarkable is that we have a Constitution that gives us a pathway to confront these questions. We do not, therefore, take up arms against one another to defend the claim that God is on our side.
The claim of America’s founding documents that the government should undertake to protect the right of citizens to life, liberty, and the pursuit of happiness is so broad as to be almost absurd. It makes perfectly good sense that citizens should enjoy freedom of speech and of religion, protection from the arbitrary power of the state, and the right to select those who would govern them. But a right to pursue happiness? How in the world can government guarantee that?
The answer lies in the fact that the government’s role was actually limited. There was no guarantee to happiness—only a promise to provide conditions of freedom and liberty that allowed citizens to pursue their goals. That has meant that happiness is pursued through individual initiative and free association with others.
The United States evolved in a way that made unprecedented room for private space and private activity. This is of course true for the economy, where in terms of “value added,” private industries account for more than 87 percent of GDP.8
Then there is the role of civil society in our national life. Civil society strengthens democracy by encouraging citizen participation, fostering democratic values, advancing the general welfare, providing for public goods, and counterbalancing the government. The United States has more than one and a half million non-governmental organizations. Large numbers of them, like the Sierra Club and the Chamber of Commerce, press the government on matters of policy, while others, like Common Cause and Judicial Watch, act avowedly as checks on the power of authorities. Still others allow citizens to organize and pursue good works on behalf of the less fortunate.
In America, civil society often delivers many of the services and societal goods that are wholly the purview of the government in other countries, even other democratic ones. Faith-based groups help resettle immigrants and refugees in their communities. Boys and Girls Clubs provide safe spaces for youth after school. National organizations like the Salvation Army and local soup kitchens and shelters feed, house, and clothe the poorest people. The Boy Scouts and Girl Scouts provide leadership training to young men and women, which the military recognizes with advanced enlistment ranks, and nonprofit blood banks provide lifesaving services to patients in hospitals, many of which are also private nonprofit organizations. All of these services depend on another “private” element that until recently was almost unique to the United States: philanthropy.
Rebuilding the nation after the Civil War, particularly in the area of education, was one of American philanthropists’ first major projects. And the arts have long been sustained largely by private support.
Large-scale giving increased at the turn of the twentieth century, as the economy created a growing number of millionaires (there were one hundred in the 1870s, four thousand in 1892, and forty thousand in 1916).9 And then, a few years after the creation of the income tax in 1913, the government took an innovative step that entrenched philanthropy in American society: It made charitable giving tax-deductible. The philanthropic sector flourished in response. Foundations were established to take on a broader array of objectives and, no longer tied to specific projects, they increasingly had open-ended missions.
Nor was philanthropy the exclusive domain of the wealthy. Indeed, in the modern era, some research suggests that, by certain measures, less affluent Americans give just as much, if not more. In 2011, for example, Americans with earnings in the bottom 20 percent gave 3.2 percent of their incomes to charity, while those in the top 20 percent gave 1.2 percent.10 And as Americans continue to lead the world in philanthropic giving as a percentage of GDP, they continue to give more every year, at an even faster rate than the economy grows. Over the past fifty years, charitable giving per American has increased 190 percent, while GDP per capita has increased 150 percent.11
In sum, civil society plays a role in almost every area of social responsibility in the United States. Some will argue that this constitutes an abdication of government responsibility. They will cite holes in the social safety net that only government can fill. But the relationship of the citizen to the government has become a dialogue mostly about rights and very little about obligations. Yes, one pays taxes, serves on juries, and obeys laws, but everything else is voluntary—even voting and serving in the military. The truth is that the United States has a substantial welfare state, and it has grown immensely over the last five decades. Arguably, therefore, citizenship is finding its deepest expression in this private space where individual citizens or groups of citizens take responsibility for one another. This is one of the strongest pillars of a stable democracy.
The institutional landscape that the Founders built rested on essential principles: a limited executive, balanced by a separately elected legislature and an independent judiciary; federalism as a constraint on the power of the central government; a huge space for independent forces—civil society and a free press; and religious freedom unbound by the preferences of the state.
The system was built for what Alexis de Tocqueville called “ceaseless agitation.” He noted that “Democratic liberty is far from accomplishing all the projects it undertakes with the skill of an adroit despotism… but in the end it produces more than any absolute government. [It] produces… an all-pervading and restless activity… a superabundant force, an energy which cannot be separated from it… and under favorable conditions… begets the most amazing benefits.”12
The American government was designed to require constant engagement, not just by officials but by citizens at multiple levels—local, state, and national. Americans were thus given peaceful means to contest political questions. That battleground has always been the American Constitution itself, before which citizens lay their claims—a document that has by any standard enjoyed a remarkable run. The Founders presciently built in mechanisms for revision, litigation, and evolution. In a sense, the struggle to make America’s democracy a bit better and more inclusive—little by little—is the story at the core of its stability and success.
The experiment didn’t, of course, work perfectly. The early history of America is a story of near misses that almost unraveled the compromises made on behalf of the young republic. The second president, the thin-skinned John Adams, signed into law the Alien and Sedition Acts, purportedly as wartime protection against foreign agents.13 A clear challenge to the First Amendment, the acts’ ban on criticizing the president or his administration was in practice used primarily to stifle attacks by political opponents and the press. Public opposition to the law was strong, however, and helped propel Thomas Jefferson to the presidency. Jefferson allowed the law to expire, but one can speculate that the United States of America would be a very different place had such a law remained in force. Early decisions can mark a country’s institutions permanently. In this case, Adams’s defeat in the 1800 election allowed the country to reverse course before the laws could do permanent damage. Today, America’s protection of free speech and of the press is arguably broader and more far-reaching than that of any other government in the world.
The transition to democracy in America was almost cut short by other close calls. A few decades after the Constitution was written, the influential John C. Calhoun of South Carolina, who was then vice president, put forth the disruptive notion of nullification. In response to the passage of a tariff law, he claimed essentially that the states could cancel (or nullify) the laws of the Union with which they disagreed. By refusing to pay the tariff, South Carolina was challenging the authority of the U.S. government. In late 1832, President Andrew Jackson reinforced deployments of federal troops in the port city of Charleston and positioned the navy off its shores. Jackson threatened to “hang” the members of the nullification movement (including Calhoun). The crisis was averted when Senator Henry Clay crafted a compromise to lower the tariff and undermine local support for the nullification movement, but not before South Carolina had already begun to raise its own army.
The first hundred years of America’s history were marked too by corruption, patronage, and self-dealing that threatened both prosperity and faith in the institutions. Teddy Roosevelt was pivotal in cleaning up this part of the institutional landscape, especially in reforming the federal civil service, which had long been an epicenter of political patronage. It was a cause that Roosevelt had taken up early in his career, first as a state assemblyman and later as a vocal member of the Civil Service Commission. When he became president several years later, he used his powers to help ensure that federal jobs were assigned according to merit, not political connections. Still, Roosevelt’s three-decade-long effort left many problems unresolved.14
Yes, America’s transition to democracy was not so smooth after all. Even with all the country had going for it, the great experiment was threatened several times. And nothing would challenge the young republic like America’s greatest birth defect—the original sin of slavery and its aftermath. Today, it is easy to forget that slavery was initially presented as a question of the proper balance between the power of the states and that of the federal government. The implications of that argument would stretch almost a hundred years beyond the end of the Civil War to the streets and lunch counters of Alabama.
Rarely do people think of the civil rights movement as a moment of democratic transition. But it was. Of all the amazing twists and turns of America’s history, none is more remarkable than the degree to which the Constitution came to serve the cause of overcoming the legacy of slavery and legalized segregation. That the descendants of slaves would embrace the Fourteenth Amendment as a means to push for equal rights is testament to the document’s extraordinary ability to channel and facilitate America’s evolution.
In the view of many Founders, this was an improbable outcome. Thomas Jefferson was convinced that black slaves would not live in chains forever. “Nothing is more certainly written in the book of fate than that these people are to be free,” he once wrote. But he was equally certain that whites and freed blacks “cannot live in the same government.” Tocqueville, in viewing the fate of the “three races” that inhabited America in 1835 (the “whites,” “negroes,” and “Indians,” as he put it), saw no way for them to live together in peace. Madison and other Founders so despaired about the future for freed slaves that they endeavored to return them to Africa, supporting the creation of what would become the country of Liberia. Even after the long and arduous struggle to end slavery, it took almost a hundred years, until the civil rights movement of the 1960s, to accomplish what many of the Founders thought impossible—the extension of “We the people” to black Americans.
The journey was a chaotic one. Certainly the Constitution could not help slaves in the antebellum South. Yet a few had audaciously tried, with little success, to appeal to the courts for their freedom. Under the Slave Codes, slaves had no rights because they were considered property, not people. They could not testify in court against a white person, they could not enter into contracts, and they could not defend themselves against the violence of their masters. But in the North and the new states and territories of the West, legal challenges to slavery met with more success.
Although most cases took place at the state level, a few made it all the way to the U.S. Supreme Court. In one such case, none other than John Quincy Adams defended the kidnapped Africans who were being illegally transported from Africa to slavery on the ship Amistad. They had rebelled, killing members of the crew, and Adams won their freedom in 1841.
A few years later, the Supreme Court heard perhaps its most infamous case. Dred Scott was a slave who had been brought to a free state by his owner and claimed he should therefore be free. Deciding the question of whether the descendants of Africa, free or enslaved, could be considered Americans, a majority of the court said no: “They are not included, and were not intended to be included, under the word ‘citizens’ in the Constitution, and can therefore claim none of the rights and privileges which that instrument provides for and secures to citizens of the United States.”15 Frederick Douglass, the leading abolitionist, who had escaped from slavery in his youth, denounced the 1857 ruling as the “most scandalous and devilish perversion of the Constitution” he had ever seen, calling it “a brazen misstatement of the facts of history.”16 And yet, he said, “My hopes were never brighter than now.” The decision had been a clear setback in the legal battle for freedom, but in the political realm it had an unintended effect, energizing opponents of slavery and hastening the onset of the Civil War, which would settle the issue once and for all.
The immediate question at the end of the Civil War was broader than even how to treat the emancipated slaves; it was how to treat all citizens of the South—both white and black. It is testament to Abraham Lincoln’s greatness that he immediately, even before the last shots were fired, found a formula for inclusion. The famous phrase “with malice toward none, with charity for all” was not just a line in the Second Inaugural Address. It was how Lincoln saw the task of bringing the country back together. And it stands as a remarkable example of one approach to the horrible question that so many emerging democracies face even today: How do you deal with rebels, insurrectionists, and those who are on the losing side of civil wars?
Lincoln, felled by an assassin just five days after the war ended, would not live to see how his vision for reconciliation would—or would not—play out. With the Union’s victory, the federal government took steps to hold some rebel leaders accountable, but many of those actions were temporary or reversed soon thereafter. Only one military leader of the Confederate army was arrested; the others, including General Robert E. Lee, were allowed to go home. Civilian leaders of the Confederacy did not fare much worse. While several were arrested, not one was ever tried. Jefferson Davis, president of the Confederacy and a graduate of West Point, was imprisoned for a number of months but was eventually released without trial along with the others, and would live out his life as a symbol of pride to the most committed Confederates.
Meanwhile, freed slaves had been promised not only freedom but other forms of assistance as they tried to establish new lives. Part of the plan involved the redistribution of land from former slave owners to newly freed slaves, a policy that would have served both as a form of reparation and as a way to undermine the political power of slaveholding interests. But when Andrew Johnson took office after Lincoln’s assassination, he reversed course. The properties that were to have gone to the freedmen never did, depriving the new black American citizens of their “forty acres and a mule.”
It would nevertheless fall to Congress, over Andrew Johnson’s objections and after considerable political debate, to extend the right to vote to freed slaves and to guarantee them, under the Fourteenth and Fifteenth Amendments, the “equal protection of the laws.” The occupation of the South would usher in the era of Reconstruction, a set of policies intended to rebuild the region and reintegrate it into the Union. Reconstruction is considered by almost every historian to have been a failure. Still, there were some favorable elements: Military governors were sent to enforce the new laws; efforts were made to educate blacks; and freedmen were even seated in state legislatures.
Lincoln was spared the spectacle that would follow, as the South, which he had wanted to return to the fold, ultimately rejected the hand extended to it. Proponents of white supremacy began to regain their footing. The Ku Klux Klan was founded in 1865 by Confederate veterans in Tennessee and soon developed a presence throughout the region.17 Violence and voter intimidation against blacks became commonplace. Republican-controlled state governments that were established during Reconstruction, and were supported by newly enfranchised black voters, were overturned in favor of the Democrats. State by state, the southerners who had fought on behalf of slavery became ascendant once again.
By the time of the presidential election of 1876, Washington was losing its appetite for occupying the South with federal troops. Their withdrawal was accelerated as part of the deal that resolved a disputed election and made Rutherford B. Hayes president despite his losing the popular vote. After several recounts and contentious debate, Hayes secured enough Electoral College votes to win. He achieved the necessary margin only by promising to withdraw the remaining federal troops from the South, which he did under the Compromise of 1877. With that, the effort at reconstruction and reconciliation collapsed, and the hated occupation of the South by the North was over.
Out of this dark moment in American history, institutional seeds were sown that would lead to advancement for the descendants of slaves. After the Civil War ended in 1865, the government established an agency called the Freedmen’s Bureau to help former slaves adjust to postwar realities, and the bureau spearheaded the establishment of institutions, including Morehouse College and Howard University in 1868, to help educate the newly freed men (and women). These and other historically black colleges have since educated generation after generation of black Americans, among them some of the most celebrated figures in our national history, including W. E. B. Du Bois, Martin Luther King Jr., and Thurgood Marshall.18
But the Compromise of 1877 left southern lawmakers wide latitude to establish new rules for relations between the races. “Separate but equal” became an Orwellian phrase that defended racial segregation and gave it a legal foundation.
Jim Crow (named for a minstrel-show caricature in which white actors wore blackface to mock people of African descent) would emerge as a violent and painful system of legalized segregation and oppression in the South of my birth. For those of us old enough to have lived through the horror of the Jim Crow period, its gruesome images are indelibly etched in our minds: lynchings and mob violence, burning crosses and hate speech. I never saw the Ku Klux Klan in action, but my parents did. I never saw anyone lynched, but I remember dreaming one night that my father didn’t return home—he had been caught and hanged. That nightmare came shortly after my uncle told me about being pulled over by a Mississippi highway patrolman who told him and my father to have their “black [expletive] gone from this state when I return.”
Sadly, my male relatives experienced many such incidents. My mother’s father ran away from his family because he had beaten a white man who had assaulted his sister. He knew what his fate would be if he stayed around. There were so many martyrs to the cause of gaining equal rights, including my friend Denise McNair and three other little girls killed in a bombing at the Sixteenth Street Baptist Church in Birmingham in September 1963. They had been changing into choir robes in a basement restroom when terrorists detonated more than a dozen sticks of dynamite under the front steps of the church. This horrific attack devastated the community but helped galvanize support for passage of the Civil Rights Act the next year. I was born in 1954. America’s hard times are not that far in the past.
My own personal experience in living under Jim Crow was of a kind of parallel political existence. My family participated in the democratic process as if it mattered, even when, in substance, it didn’t. This inexplicable faith in the rights enshrined in American institutions, shared by countless black families, played a crucial role in finally gaining those rights, because it left open a pathway to change the course of America without resorting to violence.
Democratic transitions do not happen magically; they require people to have a view of a better future and the will to achieve it—and more than that, they require planning and determination. At the forefront of the battle for civil rights in the twentieth century was the NAACP, which used professed American values—hard work, ingenuity, and a belief in equality—to improve American institutions.
Throughout our history there were those black leaders, like Malcolm X of the Nation of Islam, who believed that the constitutional course would never succeed. They sought to overturn the political order by force and violence and equated Martin Luther King Jr.’s doctrine of “nonviolence” with being “defenseless.”
I heard one of those leaders, Stokely Carmichael, speak for the first time in March 1967, when my father, then dean of students at Stillman College in Tuscaloosa, Alabama, invited him to the campus, despite the misgivings of the college administrators and, indeed, the police. As leader of the Student Non-Violent Coordinating Committee (SNCC), Carmichael had made famous the phrase “black power,” which was as stirring for some blacks as it was frightening for some whites. “Reverend, I don’t want to rev up those country boys,” the sheriff told Daddy when he heard about the event. “Nothing will happen,” my father told him, hoping he was right.
When Carmichael came to Stillman, his speech was as fiery as expected. He criticized U.S. foreign policy and the war in Vietnam, recounted the double standards from American history, and called on the four hundred students before him to fight back against the system. “This country has law and order, but it doesn’t know a damn thing about justice,” he said. “If you want to be free, you’ve got to say, ‘To hell with the laws of the United States.’” Carmichael’s rhetoric was of liberation and resistance—not of constitutional change.
Carmichael later left SNCC and became associated with a much more militant group, the Black Panther Party, which acted well beyond rhetoric, with recourse to violence that rocked America even after the great civil rights legislation of the Johnson era. Established in 1966, after Malcolm X’s assassination and the race-fueled Watts riots in Los Angeles, the Black Panthers gained national notoriety when a few dozen armed supporters occupied the California legislature to protest a gun-control bill in 1967. Shortly thereafter the Panthers issued their ten-point platform, which sounded more like a call for revolution than reform. Identifying itself as a Marxist revolutionary group, the party advocated the arming of all black Americans, the release of all blacks from jail, a blanket exemption from the draft, and reparations for years of oppression. In ways both symbolic and real, the Black Panthers embraced militancy and engaged in violence, becoming involved in a number of bloody confrontations with police.
Frankly, they might have gained the upper hand were it not for Martin Luther King Jr. and others like him who used America’s own laws and principles, rather than violence, to create a more equal nation. They summoned America to be what it said it was—using the very words of the Framers in the context of the Constitution that they authored.
The injustice confronted by black Americans in the pre–civil rights era was in many ways akin to the injustices faced by people living in non-democratic regimes around the world. What we have seen in so many of those cases, time and again, is that people will not accept the conditions of tyranny forever. Eventually, even if it takes generations, there comes a point at which they will revolt. Of course, unlike authoritarian regimes, the United States has representative institutions that provide the option of peaceful resistance—through the political process rather than around it. And leaders of the civil rights movement took full advantage of it.
The NAACP led this effort and engaged institutions (chiefly the courts, but also the news media, religious groups, and others) to bring political change. Its success depended on a deliberate strategy, pursued by committed individuals who worked over the course of decades, constantly recalibrating in the face of many setbacks. The people at the NAACP persisted because they knew they were right, and they achieved success because they chose the right path.
There were four constitutional prerequisites for the NAACP’s approach: the Thirteenth, Fourteenth, and Fifteenth Amendments; the independence of the judiciary; judicial supremacy; and the fact of individual rights. The first was the language to which they appealed. The next two meant that there was a chance that judges would act not in the interests of political forces but with a just reading of the Constitution.19 The last made it possible to claim harm in the name of the individual citizen. Nathan Margold, a protégé of Felix Frankfurter, authored a kind of blueprint for the strategy. That effort would become known as the Margold Report, and it suggested using the courts to move the law forward. Secondarily, the activity in the legal system would “incite the passions of black Americans to fight for their rights.”
Not every case was fought in the South; there were landmark decisions, both favorable and unfavorable, in Missouri and Oklahoma. But the old Confederacy was clearly the epicenter. Thurgood Marshall and the lawyers of the NAACP won some cases and lost some, but they kept refining the strategy, filing new cases, and ultimately moving the civil rights struggle into the consciousness of the country. As the late Jack Greenberg, then a young lawyer in the cause, put it in his fascinating autobiographical chronicle of the times, Crusaders in the Courts, “Our job was to exploit favorable decisions and use them to overwhelm the unfavorable ones.”
The legal strategy made incremental progress, and it was aided by the sacrifices of thousands of black Americans in World War II. Their service animated President Harry Truman’s interest in civil rights. Truman is well known for having integrated the armed forces, recognizing the moral absurdity of returning these men to a country steeped in inequality. He is less well known for having created a Committee on Civil Rights, which issued a landmark report in 1947, To Secure These Rights, and set forth a program to overcome injustice, laying the groundwork for the great civil rights legislation passed almost two decades later.
Those fights—in the courts and in the streets, with demonstrations, marches, setbacks, and advances, and too many martyrs to the cause—finally gave meaning to the Fourteenth Amendment through landmark civil rights legislation. The passage of the Civil Rights Act of 1964 and the Voting Rights Act of 1965 can, I believe, be said to be America’s second founding.
One may ask whether society has to change before laws or the other way around. The American experience would suggest that new laws do indeed lay the foundation for a changed society. It suggests too that democratic transitions—and that is the only way to think about the path from slavery to equal rights—require agency. Institutions are not worth the paper they are written on until people are willing to say that they must be what they claim to be and to sacrifice and even die to make the point.
The role of the Constitution in this painful history reminds us of the importance of founding documents—and of their place in evolving a society toward justice. Women have used the Constitution to gain suffrage, and gay people have won the right to marry. But American Indians, Tocqueville’s third race, were left outside of the Constitution’s framework of protections until the Indian Citizenship Act of 1924. They have by and large suffered a much different fate, and their condition remains an ugly stain on modern-day America.20
Even as the Constitution has been used to overcome the legacy of inequality, the arguments have gotten louder and more complicated about its proper purposes, none more so than those about affirmative action. It is a prime example of how the country has tried to balance competing principles in the pursuit of racial equality.
The idea was rather simple at its inception. Years of legal segregation and societal prejudice had led to an imbalance in the opportunities available to American minorities. When the policies first emerged in the 1960s, the Jim Crow era was coming to an end and the integration of the University of Alabama had just taken place. Not surprisingly, blacks were underrepresented in academia, government, and the corporate world. In theory, America was more equal than ever before, but the reality continued to tell a different story.
Lyndon Johnson argued that the country could not be satisfied with this paradox. As he put it, “You do not wipe away the scars of centuries by saying: Now you are free to go where you want, and do as you desire, and choose the leaders you please. You do not take a person who, for years, has been hobbled by chains and liberate him, bring him up to the starting line of a race and then say, ‘you are free to compete with all the others,’ and still justly believe that you have been completely fair.”
A year after signing the Civil Rights Act, Johnson issued an executive order requiring federal contractors to take “affirmative action” to hire qualified minorities. In 1967, he expanded the order to include women. When Richard Nixon became president, he continued these efforts. In 1969, Nixon created the Office of Minority Business Enterprise to promote equal opportunity for minority-owned businesses. Labor Secretary George Shultz then approved the Philadelphia Plan, which required federal contractors to adopt “numerical goals and timetables” to desegregate their workforces. In 1970, the Labor Department issued an order applying the Philadelphia Plan to almost all government contractors, and a year later that order was also extended to include women.
It was not long, however, before tensions emerged between the principle of race-blind equality (the Fourteenth Amendment guarantees equal protection of the laws, and the Civil Rights Act outlawed discrimination based on race, color, or national origin) and the desire to overcome a history of racial exclusion. From Ronald Reagan, who avowedly challenged affirmative action and sought to end it, to Bill Clinton, who promised to “mend it, not end it,” the tug and pull between the two principles continued.
Emotional cases before the courts pitted aggrieved individual white citizens against larger societal concerns. How was it possible that white teachers with many years in service could be laid off simply to assure racial balance in a school district? The white teachers won. Was it really right for black firefighters with lower examination scores to be promoted ahead of their white counterparts? No, not really. These questions frankly had no good answer when seen as a contest between two compelling principles.
Nowhere was the tension more pronounced than concerning the question of race in admissions in higher education. Access to quality education is at the core of fulfilling America’s promise of upward mobility and personal progress. Choosing to advantage one student over another because of race, ethnicity, or gender seemed to some an assault on this promise.
As provost of Stanford in the 1990s, though, I knew we would enroll fewer minority students if we could not take race into consideration. The president of the university, Gerhard Casper, and I defended affirmative action in college admissions before alumni groups, a few skeptical faculty, and the board of trustees. For an elite university, I truly believed we were not making compromises of quality: Stanford, Harvard, and our peers are so selective and small that admissions officers can “handpick” minority students who can succeed, even if in some cases their test scores are slightly lower than those of their white counterparts.
At large state institutions, however, implementing affirmative action policies can be trickier. The University of Michigan had established a point system in admissions that awarded applicants a certain number of additional points if they represented a qualified minority group. A white student sued the school for discrimination in a case that ended up before the Supreme Court in 2003.21 I was national security adviser to President Bush at the time, and he called me into the Oval Office one day. “I’ve got to make a decision in this Michigan case,” he began. He was asking me to opine on an issue that was, of course, well outside of my job description: the language of the administration’s amicus brief in support of the plaintiff in the landmark case.
The president explained that as governor of Texas, he had not supported quotas but had sought to pursue affirmative action through what he called “affirmative access.” In the Texas program, the top 10 percent of every high school class was guaranteed a place in the Texas university system. In the Michigan case, on the other hand, the plaintiff alleged that the point system in undergraduate admissions amounted to a quota system. Quotas had been ruled unconstitutional by the Supreme Court in 1978, in a case called Regents of the University of California v. Bakke, but that decision also upheld the use of race as one of several factors in admissions decisions, given the compelling state interest in promoting diversity. The president had two options. One was simply to support the plaintiff’s claims that the University of Michigan’s point system was unconstitutional. The other was to go beyond that and ask the Court to overturn the last vestiges of affirmative action in college admissions and eliminate any use of race in admissions decisions, in effect overturning the Bakke precedent.
I felt a little odd weighing in on a matter of domestic policy, but knew that as a close adviser who was black and the former provost of Stanford, I should do so. I told the president that I personally would not have joined the amicus brief on behalf of the plaintiff, but he and his advisers had already decided to do that. But I also urged him not to support those who would overturn the Bakke decision. “Mr. President,” I said, “this work isn’t yet done. One day it will be, but not yet.”
Later, I learned that Alberto Gonzales, the White House counsel at the time, who went on to become attorney general, had made the same case to him. Against the wishes of some in the administration, the president took the middle course we recommended. When the Washington Post published a story claiming that I had argued for overturning Bakke, I asked the president to allow me to do something that I had never done before: reveal the contents of our private conversation. He agreed, and I let everyone know that I was—and still am—a supporter of affirmative action. Sometimes when important principles clash, you have to choose: I believe that we still need to choose inclusion even if it collides with our desire to be race-blind.
The time is coming when we, as a country, may make a different choice. In her opinion in the Michigan case, Justice Sandra Day O’Connor wrote that she expected the need for racial preferences to expire in twenty-five years. That would be 2028.
Affirmative action is also being challenged on a state-by-state basis. In 1996, for example, the people of California voted in a referendum to end affirmative action by state agencies in employment, education, and contracting.
But the underrepresentation of minorities in academia, the corporate environment, and the government persists. Affirmative action has caused people to stop and think and make good-faith efforts to diversify outside of traditional channels. I doubt that Stanford would have taken a second look at a young Soviet specialist from the University of Denver on a one-year fellowship without an eye toward diversification. But the university took a chance on me and I joined the faculty. It worked out well for both of us.
When I was secretary of state, I told my aides that it was appalling to me that I could go through an entire day of meetings and never see someone who looked like me. The president of the United States had selected two African Americans in a row, Colin Powell and me, to be the country’s chief diplomats, and yet the Foreign Service was still just 6 percent black, a percentage virtually unchanged since the 1980s and half of what one would expect based on the population.
Obviously, it isn’t easy to know what role choice plays in these and other circumstances of underrepresentation. I have argued to rooms full of minority students that they can’t personally decide against studying a foreign language and expect the diplomatic corps to be diverse. I have told black undergraduates that they can’t personally refuse to go on to PhD study and protest the lack of minority professors.
Moreover, as other groups have rightly claimed the need for relief from underrepresentation, further contradictions have appeared. Asian American representation in college admissions is arguably depressed by efforts to include other minorities. And I am very aware that every admitted minority student faces a kind of stigma due to affirmative action, no matter what universities argue to the contrary. I saw this so often that it ceased to come as a surprise.
One incident sticks in my mind. I asked a colleague at Stanford how his teaching was going. He told me that despite being very busy, he was holding an extra section to help his minority students come up to speed. The quarter had just started and I asked innocently if he had done an evaluation of the students to see if it was needed. He had never thought of doing so. I wanted to ask him, “Have you thought that your white students might need help too?” But I didn’t. He meant well but had fallen into the worst kind of prejudice. In another context, President George W. Bush once called this “the soft bigotry of low expectations.”
These are the tensions and contradictions that the admirable effort to overcome our nation’s birth defect of slavery and prejudice has produced. That we are still struggling with these issues today, after more than two centuries as a nation, is yet another reminder that nothing is smooth on democracy’s path.
As people around the world struggle to build democracies of their own, the effort to protect the rights of ethnic, religious, and other minorities is a daunting challenge. As the American experience shows, it will continue long after the democracy is stable. But it helps to have a “spirit of constitutionalism,” and a belief that the institutions of the nation are in the end just—and that it is worth the trouble to use them.
The United States is a stable democracy today not because the Founders’ institutional design answered every question for all time about how to balance the rights and interests of citizens and their state. Rather, they relied on necessary compromises to create a framework of principles and laws that could guide future generations as they met new challenges. The lesson for young democracies is that not everything can be settled at the start. But if the institutions are put in place and citizens use them, there is at least a way to channel the passions of free people and to resolve the hard questions of governing as they arise in future times.