chapter four

DEMOCRATIC INFRASTRUCTURE

THE IMMENSE ECONOMIC PROBLEMS THAT ARE BEING AGGRAVATED and accelerated by the technological revolution raise fundamental issues that require popular involvement to be resolved in a humane and sustainable manner. Ours is a citizenless democracy, where core political and economic decisions are made by the wealthy few for the wealthy few. Consequential economic issues are off the table. Voting is hard. Money rules the day. Instead of guiding the ship of state, the vast majority of Americans are on individual rafts in the middle of the ocean watching that ship sail off into the distance. Citizens need democratic infrastructure to be effective participants in governance, but the democratic infrastructure of the United States seems to be disappearing faster than the polar ice caps.

The good news is that there is a rich and underappreciated tradition in American history that respects the importance of a credible democratic infrastructure, and that has fought for its development—sometimes with considerable success. Those wealthy few who are on top now, along with their pawns in the political and pundit classes, tell us that serious efforts to expand the democratic infrastructure of the United States would violate the terms laid out in the Constitution, to which they swear unrivaled fealty. The most emphatic defenders of the status quo wrap themselves in the flag and claim that what the United States has today is the freest and best society possible—that contemporary America is exactly what the wisest of the founders intended. All other versions of democracy pale by comparison, and any reform effort to alter the US system would lessen our freedoms. They are wrong about the history, and wrong about democracy.

CONSTITUTIONS AGAINST DEMOCRACY?

Constitutions in democratic societies are important. It is in the writing of these documents that nation-states set the specific terms for how citizens shall direct governance. Constitutions mandate much, though rarely all, of what citizens understand as democratic infrastructure. Constitutions also provide guidance for elected bodies and organized citizens to fill in the rest of the democratic infrastructure, and define how easy it is for the people of a new age to amend old rules. “A constitution,” political scientist Robert Dahl has written, should “maintain political institutions that foster political equality among citizens and all the necessary rights, liberties, and opportunities that are essential to the existence of political equality and democratic governance.”1 Constitutions are not the only place these matters are determined, but they are central to the process.2 So what do we learn by assessing how the democratic infrastructure has been understood in the American constitutional traditions? That is a loaded question, as a close reading reveals; note that we speak of constitutional traditions in the plural. Not only are there multiple ways to understand the US Constitution; there are also fifty state constitutions and hundreds of state constitutional conventions, many of which have dealt with the democratic infrastructure. It is only by considering the range of American constitutional traditions that we can begin to answer the question about democratic infrastructure.

The first point is that the framers of the Constitution miserably flunked Dahl’s test for a democratic constitution. The immediately obvious problem is that African Americans and Native Americans were written out of the picture as potential citizens of the new country. They were only to be used by whites for the advantages of whites. Writer and activist Staughton Lynd is spot-on when he states that the United States “was founded on crimes against humanity directed at Native Americans and African-American slaves.” This was not a regrettable footnote to the nation’s founding and history; it was at the center, a fact that remains underappreciated to this day.3

Most of the founders of the nation were well aware that slavery was morally wrong and intellectually indefensible. Thomas Jefferson’s original version of the Declaration of Independence included a paragraph on slavery in its bill of offenses committed by King George:

He has waged cruel war against human nature itself, violating its most sacred rights of life and liberty in the persons of a distant people who never offended him, captivating and carrying them into slavery in another hemisphere, or to incur miserable death in their transportation hither. . . . And he is now exciting those very people to rise in arms among us, and to purchase that liberty of which he had deprived them, by murdering the people upon whom he also obtruded them: thus paying off former crimes committed against the liberties of one people, with crimes which he urges them to commit against the lives of another.

This contradictory paragraph says it all: simultaneous outrage at the British for enslaving Africans and forcing them to the New World along with outrage that the British were now encouraging the same slaves to revolt. Perhaps that is why the paragraph was deleted from the final draft, and no mention of slavery appears in the document.4 (Indeed, Gerald Horne and other historians make a convincing argument that it was the fear of the burgeoning British anti-slavery movement—with its sympathy for slave uprisings, and the English Court of King’s Bench’s 1772 ruling in Somerset v. Stewart that chattel slavery was unsupported by English common law—that motivated slaveholders and merchants dependent upon slavery to throw in with the American revolution.5) Historians debate the precise role of slavery in the constitutional debates that followed a decade later; the federal Constitution makes no explicit mention of slavery, but deems each slave three-fifths of a person for the purpose of counting state populations and thereby apportioning political representation in the US House of Representatives.6

With but a handful of glorious exceptions, like Thomas Paine—whose 1775 essay “African Slavery in America” was a scathing indictment of the institution and a call for its termination—and the Pennsylvania radicals, the white supremacy that was embedded in the culture was either accepted or actively promoted.7 But the Constitution is also a dubious product for democracy even when looking strictly at the white male population as the relevant group of prospective citizens, which is what the framers did.

While the framers were deeply concerned with preventing tyranny, the intent of the federal Constitution was not to promote democracy, even among those of European descent. This analysis was best put by the scholar and future president Woodrow Wilson in 1893:

The federal government was not by intention a democratic government. In plan and structure it was meant to check the sweep and power of popular majorities. The Senate, it was believed, would be a stronghold of conservatism, if not of aristocracy and wealth. The President, it was expected, would be the choice of representative men acting in the electoral college, and not of the people. The federal Judiciary was looked to, with its virtually permanent membership, to hold the entire structure of national politics in nice balance against all disturbing influences, whether of popular impulse or of official overbearance. Only in the House of Representatives were the people to be accorded an immediate audience and a direct means of making their will effective in affairs. The government had, in fact, been originated and organized upon the initiative and primarily in the interest of the mercantile and wealthy classes. Originally conceived in an effort to resolve commercial disputes between the States, it had been urged to adoption by a minority, under the concerted and aggressive leadership of able men representing a ruling class.

Wilson also observed that “there can be a moneyed aristocracy, but there cannot be a moneyed democracy.” In subsequent generations the forces of democracy would rise up in the United States with frequent success, Wilson noted, but in the first decades of the Republic, “the conservative, cultivated, propertied classes of New England and the South practically held the government as their own.”8 In 1800 in New York City, for example, a mere 25 percent of white men were eligible to vote. The property requirement for white male voters did not end in the nation until North Carolina eliminated it in 1856.9 And the poll tax was not formally eliminated until 1964.10

Wilson, like many other observers to this day, charitably conceded that the framers were mostly distrustful of those without substantial property because they believed the poor were incapable of having a will of their own and could therefore be misled by some artful charlatan to the ruin of the republic; that is, the tyranny of the majority. As people became more educated and responsible owners of property themselves, they would be trusted with full participation in the governing process. As numerous historians have demonstrated, however, the framers were more than genteel patricians on the lookout for their dim-witted and gullible brethren in the lower orders. Their more pressing concern, as historian Alexander Keyssar puts it, was “that men who were financially strapped would band together to defend their own interests.”11

The revolution against Britain had been fought by the poor and those of limited means, and they fought it inspired by the radical democratic words of Thomas Paine in Common Sense. It is for that reason that Benjamin Franklin told Paine that he, Paine, “was more responsible than any other living person on this continent for the creation of what are called the United States of America.”12 The vision of the farmers and artisans who went to battle was not of a war simply to switch flags and end up with a native ruling class over them; it was of a far more egalitarian and fair society that they would participate in governing. Shays’ Rebellion of 1786–1787 is the best known of a number of popular uprisings throughout this period that alarmed the elites who gathered in Philadelphia to draft the Constitution in the summer of 1787.13

Indeed, the federal Constitution was in significant part a reaction to a series of more democratic state constitutions that had already been drafted and approved in the new Republic. The most radical was written in the summer and fall of 1776 in Pennsylvania by a group that included Franklin and Paine. Much of the momentum came from rural farmers, the poor, artisans, and the mid-level merchants of Philadelphia, who by 1775 had become, as William Hogeland puts it, “a powerful street constituency in favor of American independence as a way to promote economic equality.”14 The Pennsylvania constitution allowed for universal male suffrage and the election of a powerful unicameral legislature for one-year terms. The judiciary and executive branches were weak, and no bill could become law after being passed until it had been in print for a year so citizens could respond to it. It also called for the state to promote public education and the establishment of universities. The first assemblies elected under this constitution “began passing laws to regulate wealth and foster economic development for ordinary people.” Specific laws were passed, according to Hogeland, to “restrict monopolies, equalize taxes, abolish slavery, and otherwise pursue, now through legitimate government, the old goals of popular agitation.”15

So when the lawyer Edmund Randolph of Virginia opened the proceedings for the first meeting of the United States Constitutional Convention, he was blunt. “Our chief danger,” he told a group that included George Washington, “arises from the democratic parts of our constitutions.” As Hogeland observes, the founders, like Randolph, used the term democracy pejoratively, “sometimes to mean mob rule, sometimes to mean unchecked representation, and sometimes to mingle the two.” The states with their more popular constitutions and suffrage laws were the problem. “None of the constitutions,” Randolph said, “have provided sufficient checks against the democracy.”16

When today’s “conservatives” demand a return to the original vision of the US Constitution, they often celebrate these antidemocratic elements—its minimizing of the power of one person, one vote; its lack of any “positive rights” outlining what citizens might expect from their government; and, for a few, even its appointment rather than direct election of US senators—as signs of genius and the basis for individual freedom. From there it is a short leap to regarding the Constitution as a full-throttle endorsement of “free market” capitalism with a Gilded Age disregard for the poor and working class, who are “free” to pull themselves up by their bootstraps, as the saying goes. This interpretation is self-serving; it takes the US Constitution out of context. It eliminates the actual, and rich, traditions that do exist. It turns the Constitution into a cardboard rationale for what the contemporary dominant interests favor today and going forward.

Two critical points of context are these: first, the US Constitution dealt with the political economy it knew and anticipated would continue. There was no speculation about what the economy might or might not look like—or should or should not look like—one hundred or two hundred or three hundred years down the road, any more than sane people would do today. It was written in the context of a mercantile economy based overwhelmingly on agriculture and significantly on slave labor. It was an economy with private property, but in economic terms it decidedly predated industrial capitalism.

Capitalism, as we now know it, is based on the never-ending competition of individual capitalists to grow ever richer through the pursuit of profit; it generally requires a propertyless urban working class to provide the labor to generate the profits. Such growth of private capital leads to routine overall economic growth, and that increased economic output continually increases the material wealth of the society. That is what is revolutionary about capitalism and distinguishes it from all its economic predecessors. But capitalism in that sense was not even close to existence in 1787. Ninety percent of adults worked in agriculture, and there was only one bank—Alexander Hamilton’s Bank of New York, founded in 1784—in the entire country. There were few cities, and none had a population of more than 100,000. It took weeks to travel overland from Maine to Georgia, or to cross the Atlantic.17 Routine annual economic growth was largely unknown anywhere in the world before the nineteenth century.18 That explains why, in the US Constitution and the debates surrounding it, terms like corporations, profits, competition, enterprise, growth, and markets are nowhere to be found. Many of those terms would not become widely used for several generations. Indeed, the use of the term capitalism to describe this revolutionary new economic system began in Germany in the 1880s, and only gained common usage in English and French at the beginning of the twentieth century.19

Moreover, the US Constitution was drafted during an era in which the notions that the endless increase of one’s wealth and property is good for society and, as important, is a person’s inherent right—central, even necessary, postulates of a modern capitalist society—were considered dubious, if not folly. As Franklin put it in 1783:

All the Property that is necessary to a Man, for the Conservation of the Individual and the Propagation of the Species, is his natural Right, which none can justly deprive him of: But all Property superfluous to such purposes is the Property of the Publick, who, by their Laws, have created it, and who may therefore by other Laws dispose of it, whenever the Welfare of the Publick shall demand such Disposition. He that does not like civil Society on these Terms, let him retire and live among Savages. He can have no right to the benefits of Society, who will not pay his Club towards the Support of it.20

To the extent Americans got a sense of the industrial capitalism then in its very earliest stages in England, there was no groundswell, let alone a consensus, that this was at all desirable for the United States. In his 1785 Notes on the State of Virginia, Thomas Jefferson wrote, “Let our workshops remain in Europe.”21

Then, as corporations began to proliferate in the first few decades of the nineteenth century, the framers, not to mention many others, while recognizing corporations’ economic advantages, were immediately and deeply concerned about the ability and commercial incentive of corporations to corrupt and destroy the political system.22 Jefferson wrote on this topic with some regularity in the last two decades of his life.23 In 1816, for example, in a letter to Pennsylvanian George Logan, he wrote: “I hope we shall . . . crush in its birth the aristocracy of our monied corporations which dare already to challenge our government to a trial of strength, and to bid defiance to the laws of their country.”24

The second critical point of context is that in 1787 many of the states had constitutions that included or enabled their legislatures to enact positive rights and social welfare provisions. These were striking by the standards of those times, and inspiring even for these times. Shouldn’t this tradition be every bit as important to understanding American history? These state constitutions were generally much more democratic undertakings than the US Constitutional Convention, the products of sometimes significant popular participation. These were the values of the (white male) American people, to the extent they were consulted. The federal Constitution understandably regarded positive rights and social welfare provisions as matters to be handled at the more democratic state level—where most of the work of government was to take place—and not a federal affair. For that reason, as scholar Cass Sunstein puts it, “the Constitution’s framers gave no thought to including social and economic guarantees in the bill of rights.”25 Constitutional historian Emily Zackin writes that because of the centrality of state constitutions at the time, “while the Bill of Rights may reflect a suspicion of the federal government, we cannot infer from this document that even its drafters were suspicious of all government.”26 Indeed, the preamble to the Constitution states that the very purpose of the federal government is “to form a more perfect Union, establish Justice, insure domestic Tranquility, provide for the common defense, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity.” When one looks at the writings of the framers on what the functions of government should be, they were anything but modern-day free-market libertarians.27

Consider the matter of public education. It is not mandated in the federal Constitution. Does that mean the framers did not regard it as essential for the nation, or that it should not be paid for and conducted by the government? Of course not. The Northwest Ordinance of 1787, later reaffirmed by the first Congress in 1789, set the statehood conditions for the territories west of the Appalachian mountains to enter the union on equal footing with the original thirteen states. Chief among the requirements was the establishment of “schools and means of education” which were “necessary to good government and the happiness of mankind.”28 No less than John Adams, among the least sympathetic of the founders to popular suffrage and democratic governance, put the matter of education this way in 1785: “The whole people must take upon themselves the education of the whole people and be willing to bear the expenses of it. There should not be a district of one mile square, without a school in it, not founded by a charitable individual, but maintained at the public expense of the people themselves.”29

The same is true on the matter of economic inequality. Jefferson is on record in 1785 as writing about the use of aggressive government action to reduce economic inequality, an intervention he regarded as highly desirable and necessary for effective governance.30 “I am conscious that an equal division of property is impracticable. But the consequences of this enormous inequality producing so much misery to the bulk of mankind, legislators cannot invent too many devices for subdividing property, only taking care to let their subdivisions go hand in hand with the natural affections of the human mind.”31 Noted educator and Federalist Noah Webster stated in 1787 that political equality required “a general and tolerably equal distribution of landed property.”32 Only a few years following the passage of the Constitution, James Madison argued in an essay titled “On Parties” that it was imperative to establish genuine “political equality among all,” and a main way to accomplish that was by “the silent operation of the laws, which, without violating the rights of property, reduce extreme wealth towards a state of mediocrity, and raise extreme indigence toward a state of comfort” (our emphasis). That this program was not spelled out in the federal Constitution does not mean the founders found it unimportant or improper.

Late in his life, as Madison prepared his notes on the constitutional debates for publication, he wrote that his views on suffrage had matured and changed in the intervening years. “The right of suffrage,” he now wrote, “is a fundamental Article in Republican Constitutions.” He reasserted his belief that, for the republican system to succeed, the government needed to reduce the great wealth of the few to benefit those without property. He no longer thought of the Constitution as primarily necessary to protect the few from the tyranny of the majority. Instead, Madison changed course and now defended majority rule as the single “vital principle of republican government.”33 After reviewing this and other evidence, Dahl writes, “I have little doubt that if the American Constitutional Convention had been held in 1820, a very different constitution would have emerged from the deliberations.”34 That same year, noting the Constitution’s complicity with slavery, future president John Quincy Adams wrote, “The constitution is a compact with Hell, and a life devoted to its destruction would be a life well spent.”35

CONSTITUTIONS ON MILITARISM

It is ironic and colossally tragic that the two areas in which the US Constitution and Bill of Rights anticipated and spoke most directly to raging perennial threats to democracy—threats that preceded capitalism and that will survive it as well—are today almost entirely ignored. These are the limitations on militarism and the recognition of the crucial role of the government in guaranteeing the existence of a viable independent press, or news media. Both relate strongly to the deep concern of the framers to prevent corruption, which they regarded as a deadly threat to any governing system. “My wish,” Madison said, “is that the national legislature be as uncorrupt as possible.” The Articles of Confederation, which preceded the Constitution, strictly prohibited any person working for the United States or any state government from accepting “any present, emolument, office or title of any kind.” As law professor Zephyr Teachout writes, “Charges of corruption and its variants were an essential force in the creation of the Constitution, and part of almost every debate about government structure.”36

The first great imperative for promoting (or protecting) a democratic infrastructure is embodied in the sections of the federal Constitution limiting the capacity of the government to engage in war. At least six of the state constitutions drafted prior to 1787 had phrases similar to this one from the Pennsylvania and North Carolina constitutions: “As standing armies in time of peace are dangerous to liberty, they ought not to be kept up.”37 The US Constitution strictly limits the war-making power of the president and puts the power to declare war and the obligation to raise funds to pay for war in the hands of Congress. The Constitution’s purpose was “clogging rather than facilitating war,” as delegate George Mason stated to broad approval during the Constitutional Convention.38 The point of the Second and Third Amendments—the “military amendments,” as constitutional scholar Akhil Reed Amar terms them—“centrally focuses on the structural issue of protecting civilian values against the threat of an overbearing military. No standing army in peacetime can be allowed to dominate civilian society, either openly or by subtle insinuation.”39

The framers, men like Madison and Jefferson, were classically educated. They saw that warfare, militarism, and endless imperialism had sounded the death knell for Greek democracy, the Roman Republic, and the democratic prospects for the great nations of contemporary Europe. Both Jefferson and Madison were obsessive on this point. “Even under the best forms of government,” Jefferson observed, “those entrusted with power have, in time, and by slow operations, perverted it to tyranny.”40 “Perhaps it is a universal truth,” Madison wrote to Jefferson in 1798, “that the loss of liberty at home is to be charged to the provisions against danger, real or pretended, from abroad.”41 Madison put it best in his timeless declaration that

of all the enemies of true liberty, war is, perhaps, the most to be dreaded, because it comprises and develops the germ of every other. War is the parent of armies; from these proceed debts and taxes; and armies, and debts, and taxes are the known instruments for bringing the many under the domination of the few. In war, too, the discretionary power of the Executive is extended; its influence in dealing out offices, honors and emoluments is multiplied; and all the means of seducing the minds, are added to those of subduing the force, of the people. The same malignant aspect in republicanism may be traced in the inequality of fortunes, and the opportunities of fraud, growing out of a state of war, and in the degeneracy of manners and of morals, engendered by both. No nation can preserve its freedom in the midst of continual warfare.42

In his 1796 “farewell address” to the nation—a letter written with the assistance of Alexander Hamilton and read before at least one branch of Congress annually for the past 150 years, and a great inspiration to the anti-imperialist movements of the late nineteenth and early twentieth centuries—President Washington cautioned his peers and future generations to “avoid the necessity of those overgrown military establishments, which, under any form of government, are inauspicious to liberty, and which are to be regarded as particularly hostile to Republican Liberty.”43

Aside from the numerous continental campaigns to seize lands from the indigenous populations as well as Mexico, the United States heeded the spirit of the Constitution and its framers for much of its history, generally demobilizing immediately after a declared war. “As late as the 1930s,” political scientist Walter Dean Burnham observes, “the U.S. Army contained scarcely more men under arms than the hundred thousand to which the Treaty of Versailles limited Germany.”44 Much of military and naval war production was under state control, and when wars generated vast fortunes, as happened during the Civil War and World War I, scandals ensued. In the 1930s US Senator Gerald Nye of North Dakota denounced the munitions industry as “an unadulterated, unblushing racket.”45 When America entered the Second World War, President Franklin D. Roosevelt stated, “I don’t want to see a single war millionaire created in the United States as a result of this world disaster.”46 FDR was obsessive on this point. In his 1944 State of the Union Address, he observed that “for two long years I have pleaded with the Congress to take undue profits out of war.”47

That seems like the ancient history of some very different nation. Following World War II the United States abandoned its commitment to being a peacetime nation except during formally declared wars and became what Madison feared: a continual warfare society.48 A massive and self-perpetuating military-industrial complex was created in a few short years, largely outside civilian review or control, and it is now hard-wired to some of the largest private corporations. It is a huge profit center for investors. This military-industrial complex has extended the power of the executive branch relative to Congress in ways that are beyond all previous understanding.49 Any effort to strengthen the democratic infrastructure must address the problem of continual warfare head-on, and the wisdom and counsel of the framers provide the necessary starting point.

THE CONSTITUTION AND A FREE PRESS

Everyone is familiar with the First Amendment’s prohibition on government interference with the press. This is something that has inspired advocates of liberty worldwide, and of which Americans are justifiably proud. But understanding the Constitution’s commitment to a free press as a negative right—as something the government cannot do to citizens who wish to engage in journalism—is only half-right. In the haze of the past century of commercially driven news media, nearly all Americans have lost sight of the fact that the American free press constitutional tradition has a second component, every bit as important as the prohibition against prior restraint and censorship: it is the highest duty of the government to see that a free press actually exists so there is something of value that cannot be censored. It is difficult to exaggerate the importance of this issue in the nation’s founding era.

In 1787, as the Constitution was being drafted in Philadelphia, Thomas Jefferson was ensconced in Paris as minister to France. From afar he corresponded on the singular importance of the free press, writing:

The way to prevent these irregular interpositions of the people is to give them full information of their affairs thro’ the channel of the public papers, and to contrive that those papers should penetrate the whole mass of the people. The basis of our governments being the opinion of the people, the very first object should be to keep that right; and were it left to me to decide whether we should have a government without newspapers, or newspapers without a government, I should not hesitate a moment to prefer the latter. But I should mean that every man should receive those papers and be capable of reading them.

For Jefferson, just having the right to speak without government censorship is a necessary but insufficient condition for a free press, and therefore for democracy. A free press also demands a literate public, a viable press system, and easy popular access to that press.

But why, exactly, was this such an obsession for Jefferson? In the same letter, Jefferson praises Native American societies for being largely classless and happy, and criticizes European societies—like the France he was witnessing firsthand on the eve of its revolution—in no uncertain terms for being their opposite. Jefferson also casts the press in stark class terms, describing its role in preventing exploitation and domination of the poor by the rich:

Among [European societies], under pretence of governing they have divided their nations into two classes, wolves and sheep. I do not exaggerate. This is a true picture of Europe. Cherish therefore the spirit of our people, and keep alive their attention. Do not be too severe upon their errors, but reclaim them by enlightening them. If once they become inattentive to the public affairs, you and I, and Congress, and Assemblies, judges and governors shall all become wolves. It seems to be the law of our general nature, in spite of individual exceptions; and experience declares that man is the only animal which devours his own kind, for I can apply no milder term to the governments of Europe, and to the general prey of the rich on the poor.50

In short, a free press has an obligation to call out, to challenge, to undermine the natural tendency of propertied classes to dominate politics, open the doors to corruption, reduce the masses to effective powerlessness, and eventually terminate self-government.

James Madison was every bit Jefferson’s equal in his passion for a free press. Together they argued for it as a check on militarism, secrecy, corruption, inequality, and empire. Near the end of his life, Madison famously observed, “A popular government without popular information or the means of acquiring it, is but a Prologue to a Farce or a Tragedy or perhaps both. Knowledge will forever govern ignorance, and a people who mean to be their own Governors, must arm themselves with the power knowledge gives.”51

An institution this important to the very existence of democracy is not something you roll the dice on and hope you get lucky. There was no sense in this period (and for a long time thereafter) that as long as the government did not censor newspapers, private citizens or businesses would have sufficient incentive to produce a satisfactory press. Indeed, the Constitution’s creation of the Post Office was above all else a commitment to seeing that newspapers were distributed effectively and inexpensively. That is why Jefferson and Madison met with President Washington and urged him to name as the first postmaster general the most prominent and radical of the country’s pamphleteers, Thomas Paine.52 For the first century of American history most newspapers were distributed by the mails, and the Post Office barely charged newspapers anything to be delivered. (In around 10 percent of the cases, newspaper delivery through the mails was free.) It was quite consciously a subsidy by the federal government to make it economically viable for many more newspapers to exist than would otherwise be the case. Newspapers constituted more than 90 percent of the Post Office’s weighted traffic, yet provided only about 10 percent of its revenues. If the United States government subsidized journalism in the second decade of the twenty-first century as a percentage of GDP to the same extent it did in the first half of the nineteenth century, it would spend in the area of $35 billion annually.53

This was the case through much of the nineteenth century. The Post Office was by far the largest and most important branch of the federal government, with 80 percent of federal employees as of 1860. Delivery in major cities was often two or three times daily, six or seven days per week. Newspaper subsidies constituted a policy that worked magnificently; the United States had vastly more newspapers than any other nation on a per capita basis during this period. When Alexis de Tocqueville made his journey across America in the 1830s, he was astounded by the prevalence of newspapers far beyond that to be found anywhere else in the world. “The number of newspapers,” he wrote, “exceeds all belief.” He wrote of the close relationship between newspapers, equality, and democratic governance. “The power of newspapers must therefore increase as men become equal.” This was neither an accident nor the consequence of free markets; it was due to explicit government policies and subsidies.54

Although this second component of the American free press tradition has been largely forgotten since the advent of the commercial era of journalism, the US Supreme Court, in all relevant cases, has asserted its existence and preeminence. In the Supreme Court’s 1927 Whitney v. California case, Justice Louis Brandeis concluded, “Those who won our independence believed that the final end of the State was to make men free to develop their faculties; . . . that the greatest menace to freedom is an inert people; that public discussion is a political duty; and that this should be a fundamental principle of American government.”55 In his majority opinion in perhaps the greatest free press case in Supreme Court history, Associated Press v. United States (1945), Justice Hugo Black wrote that the First Amendment “rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public, that a free press is a condition of a free society.”56 Justice Potter Stewart noted that “the Free Press guarantee is, in essence, a structural provision of the Constitution” (Stewart’s emphasis). It required government policies to guarantee its existence. “The primary purpose of the constitutional guarantee of a free press was,” he added, “to create a fourth institution outside the Government as an additional check on the three official branches.” Stewart concluded: “Perhaps our liberties might survive without an independent established press. But the Founders doubted it, and, in the year 1974, I think we can all be thankful for their doubts.”57 In his opinion in the 1994 case Turner Broadcasting System v. FCC, Reagan appointee Justice Anthony Kennedy concluded, “assuring the public has access to a multiplicity of information sources is a governmental purpose of the highest order.”58

It is understandable why this rich recognition of the constitutional commitment to a free press declined by the end of the nineteenth century: newspaper publishing became extremely lucrative and the subsidies disappeared or came to play a smaller role. But today, with the emergence of the Internet, the commercial journalism model, in which advertising provides the lion’s share of the revenues, is disintegrating. There are far fewer paid reporters and editors on a per capita basis, accounting for both old and new media, than there were twenty-five years ago. Most of governance, and the relationship of governance to commercial interests, goes unreported, and few people have any idea what is happening. It is Jefferson’s and Madison’s worst nightmare, and there is nothing on the horizon to suggest a commercial solution to the problem. It is high time for Americans to embrace their full constitutional rights and demand policies and subsidies to create a viable, competitive, independent, and uncensored news media. Much as the framers understood, it is difficult to imagine how anything remotely close to a democratic society can exist unless this happens.59

CHANGING CONSTITUTIONS

There is one aspect of the Constitution that has not been ignored, to the resounding approval of those who like the status quo: it is notoriously difficult to revise or update. The US Constitution has been amended only sixteen times in the past 220 years; two of those were the prohibition amendments that cancelled each other out, and six other amendments were largely noncontroversial bookkeeping measures.

Consequently, the 1791 document including the Bill of Rights remains largely intact, with a mere eight significant amendments since George Washington left office, and three of those were the great Reconstruction amendments that passed on the heels of the nation’s bloodiest war—when defeated Southern states could not participate.60 Not only is the Constitution out of date, constitutional scholar Daniel Lazare writes, but “by imposing an unchangeable political structure on a generation that has never had an opportunity to vote on the system as a whole, it amounts to a terrible dictatorship of the past over the present.” Because it is “virtually impossible to alter the political structure in any fundamental way,” Lazare adds, Americans have “one of the most unresponsive political systems this side of the former Soviet Union.”61

Before anyone reports Lazare to Homeland Security or the National Security Agency, recall that this was precisely Jefferson’s position as well. “Each generation is as independent as the one preceding, as that was of all that had gone before. It has then, like them, a right to choose for itself the form of government it believes most promotive of its own happiness,” Jefferson wrote in 1816. He called on the Constitution to be amended so that there would be a new constitutional convention every “nineteen or twenty years,” such that every generation would have the opportunity to create its own politics and governance.62

This is, again, where states provide a rich alternative approach in American history; unlike the federal Constitution, state constitutions encourage popular involvement. “The realm of state constitutional law is a beehive of activity,” one scholar writes.63 This began back in the 1770s and 1780s, when states were routinely meeting to draft and redraft constitutions, and has continued to this day.64 By 2005 the fifty states had held a combined 233 constitutional conventions, adopted 146 different constitutions, and ratified over six thousand amendments to their existing constitutions.65 Around ten thousand amendments have been submitted to voters for consideration. As Zackin put it in her cleverly titled book, Americans have been “looking for rights in all the wrong places.”66

If you want to see what democracy looks like, put Washington DC in your rearview mirror and head to the states. It is here that the battles for a democratic infrastructure have been waged. In general, what one finds when examining state constitutions is that “Americans have used their constitutions to demand protective and interventionist government,” beginning in the mid-nineteenth century.67 In the constitutions of the various states one finds provisions to address the concern that Jefferson and Madison highlighted, the need to improve the lives of those without property. It is here, for example, that numerous states, as a result of popular organizing, have put in place constitutional protections for the right to collective bargaining, and, more recently, the right to a clean and sustainable environment.68

The most important area of state constitutional involvement in positive rights and building a democratic infrastructure has unquestionably been education. “Every state constitution,” legal scholar Katherine Twomey writes, “includes a clause requiring the state legislature to establish a free system of public schools for children residing within its borders.”69 Universal public education dates back to Horace Mann and the common school movement of the 1830s, and it was a rejection of the European system of a liberal education for the children of the privileged and vocational training for the children of the masses.70 As Townsend Harris of the New York City Board of Education advocated in 1847, “Open the door to all—let the children of the rich and poor take their seats together and know no distinction save that of industry, good conduct, and intellect.”71 In this sense the vision for American public education was fiercely egalitarian. The notion was best expressed in 1907 by John Dewey: “What the best and wisest parent wants for his own child, that must the community want for all its children. Any other ideal for our schools is narrow and unlovely; acted upon, it destroys our democracy.”72

Every bit as important as promoting equality and fairness, as education historian Diane Ravitch has emphasized, is that “the essential purpose of the public schools, the reason they receive public funding, is to teach young people the rights and responsibilities of citizens . . . to sustain our democracy.” This was the idea from the very beginning; education for a role in the economy was secondary.73 When Texas declared its independence from Mexico in 1836, it stated, “It is an axiom of political science that unless a people are educated and enlightened it is idle to expect the continuance of civil liberty or the capacity for self-government.”74 Public education has been a cornerstone of the democratic infrastructure. So much of what is democratic—indeed, so much of what is truly exceptional—about America can be attributed to it. Its rejuvenation and development are imperative to any vision of a healthy democratic infrastructure.

A US CONSTITUTION FOR MODERN TIMES?

Following the Gilded Age of the final decades of the nineteenth century, the United States became a substantially more democratic nation in the seven decades from 1900 to 1970. For starters, with the 1971 passage of the Twenty-Sixth Amendment to the Constitution allowing eighteen- to twenty-year-olds to vote, the United States finally—195 years after the Declaration of Independence, 182 years after the passage of the Constitution, and 108 years after Lincoln’s Gettysburg Address—theoretically granted suffrage to the entire population of adult citizens. It took decades of ferocious and heroic organizing and political struggle to win the franchise for poor people, women, young people, Native Americans, African Americans, and other people of color. Six or seven decades earlier, suffrage was restricted mostly to white males; and a century before that, to mostly rich white males. Theodore Roosevelt won a massive landslide victory in the 1904 presidential election . . . with the votes of 15 percent of the adult population. That looks like democratic nirvana compared to Thomas Jefferson’s overwhelming landslide reelection to the presidency a century earlier. Jefferson got a whopping 73 percent of the total vote that was cast in 1804—impressive enough until one realizes it accounted for less than 3 percent of the nation’s population. (Part of the reason for this astonishing statistic is that only 11 of the nation’s 17 states even held popular votes for president in 1804.)75

There were ebbs and flows, to be sure, over these seven decades from 1900 to 1970, but the expanded democracy generated a variety of crucial policies that probably would not have been enacted had matters been left to the whims of traditional elites. These include progressive income taxation, inheritance taxation on the wealthy, the right to form trade unions, the abolition of child labor, Social Security, expanded universal public education and higher education, Medicare, unemployment insurance, and a variety of regulations to protect the environment, workplace safety, and consumers. Much of this dealt directly and indirectly with building out the democratic infrastructure, making it possible for all citizens to participate more effectively in the political system. Throughout this period elected bodies worked to limit the role of money in elections. Citizens won the right to elect judges in most states, while initiatives, referenda, and recalls were put into place. In 1913 the Seventeenth Amendment to the US Constitution was ratified, requiring the direct election of senators. The first great wave of this democratic surge took place during the Progressive Era, the first fifteen years of the twentieth century.

We turn now to the 1930s and 1940s, a period of great relevance for Americans today, perhaps more than any other. Then, even more so than today, unemployment was extremely high by historical standards. In such an environment traditional policies fail and are discredited, and the center collapses. Unemployment has a way of getting a person’s attention more than almost any other issue and has inspired new social forces demanding reform or revolution. Some nations responded with sharp lurches toward fascism, like Germany; others enhanced their democracies and moved moderately to the left, like the United States. Fascism was an extraordinary and unprecedented development for democracies. Prior to 1900 no one considered the idea that an advanced industrial society could have popular movements to explicitly end democracy; it went against everything people knew.76

In the late 1930s and early 1940s, as tens of millions of people worldwide would lose their lives in the battle against fascism, the United States wrestled with a fundamental question: what can the nation do, the world do, to see that the specter of fascism never raises its head again to threaten humanity’s survival? What institutions and practices, what democratic infrastructure, are required to lessen the chances of falling into the bottomless pit of fascist tyranny?

In 1938 FDR singled out the economic problems in the United States that he associated with the rise of fascism. It arises in nations where people have “to go without work, or to accept some standard of living which obviously and woefully falls short of their capacity to produce.” It is then that people feel “the oppressive sense of helplessness under the domination of a few, which are overshadowing our whole economic life.” It was the greed, corruption, and power of monopolistic big businesses that Roosevelt saw as the motor force for fascism. And for that reason he authorized a comprehensive study of economic concentration with an eye to breaking up monopolistic power. “The power of a few to manage the economic life of the nation must be diffused among the many or be transferred to the public and its democratically responsible government. If prices are to be managed and administered, if the nation’s business is to be allotted by plan and not by competition, that power should not be vested in any private group or cartel, however benevolent its professions profess to be.”77

This anti-monopoly thrust was curtailed as the nation entered full war preparations with the start of the European war in 1939. The United States entered the war in 1941, and it was at this moment, with clarity and conviction, that President Roosevelt and others laid out a vision of how to ensure the survival and growth of democracy. Of course FDR had incentive to define the war effort in noble terms to galvanize enthusiasm among the mass of the population for the immense sacrifices required during wartime. But historians who have studied the matter are convinced that it was far more than that; this was an issue dear to FDR and to many people in the United States at the time.78 The signal statements were the “Four Freedoms” and the “Second Bill of Rights.” It was a part of the weltanschauung, or dominant world view, that crystallized in the later 1930s—a period with a thriving labor union movement and immense popularity for government programs like Social Security—that saw the United States become a much more progressive and democratic nation. “For the past several years,” Republican presidential candidate Wendell Willkie wrote in 1940, “practically everybody claims to be a liberal.”79

FDR unveiled the Four Freedoms in his January 1941 State of the Union Address to Congress. He offered up four universal principles for a free and democratic world, which he hoped would define the war against the Axis powers:

In the future days, which we seek to make secure, we look forward to a world founded upon four essential human freedoms.

The first is freedom of speech and expression—everywhere in the world.

The second is freedom of every person to worship God in his own way—everywhere in the world.

The third is freedom from want—which, translated into world terms, means economic understandings which will secure to every nation a healthy peacetime life for its inhabitants—everywhere in the world.

The fourth is freedom from fear—which, translated into world terms, means a world-wide reduction of armaments to such a point and in such a thorough fashion that no nation will be in a position to commit an act of physical aggression against any neighbor—anywhere in the world.

That is no vision of a distant millennium. It is a definite basis for a kind of world attainable in our own time and generation.80

In short, the war was not being fought merely to defeat an enemy. It was a pivot point after which nothing would be quite the same. Countries would change, and for the better, recognizing that the best defense against totalitarianism was the shaping of a good society where poverty and militarism were to be eliminated. These were core freedoms that all people deserved. And Roosevelt was certain that, after struggling through the Great Depression and World War II, Americans were ready for the change.

The Four Freedoms became a big deal. “The people of the United States through their President have given the world a new Magna Carta of democracy,” wrote newspaper editor William Allen White.81 By early 1942, after America entered the war, these were the criteria a majority of Americans used to explain why the nation was at war. A May 1942 Intelligence Survey showed that the Four Freedoms “have a powerful and genuine appeal to seven persons in ten.” It became a battle for public opinion, albeit a bit one-sided. Even the conservative Saturday Evening Post—whose president Walter D. Fuller was chairman of the board of the vehemently anti–New Deal National Association of Manufacturers—threw in the towel. It ran FDR-inspired paintings by Norman Rockwell in four consecutive issues, each devoted to one of the Four Freedoms.82

Three years after the Four Freedoms speech, in his 1944 State of the Union Address, as victory in the war was all but certain, FDR introduced the idea of his economic bill of rights, or what has been called the Second Bill of Rights. It is worth reading Roosevelt’s precise words:

It is our duty now to begin to lay the plans and determine the strategy for the winning of a lasting peace and the establishment of an American standard of living higher than ever before known. We cannot be content, no matter how high that general standard of living may be, if some fraction of our people—whether it be one-third or one-fifth or one-tenth—is ill-fed, ill-clothed, ill-housed, and insecure.

This Republic had its beginning, and grew to its present strength, under the protection of certain inalienable political rights—among them the right of free speech, free press, free worship, trial by jury, freedom from unreasonable searches and seizures. They were our rights to life and liberty.

As our Nation has grown in size and stature, however—as our industrial economy expanded—these political rights proved inadequate to assure us equality in the pursuit of happiness.

We have come to a clear realization of the fact that true individual freedom cannot exist without economic security and independence. “Necessitous men are not free men.” People who are hungry and out of a job are the stuff of which dictatorships are made.

In our day these economic truths have become accepted as self-evident. We have accepted, so to speak, a second Bill of Rights under which a new basis of security and prosperity can be established for all regardless of station, race, or creed.

Among these are:

                  The right to a useful and remunerative job in the industries or shops or farms or mines of the Nation;

                  The right to earn enough to provide adequate food and clothing and recreation;

                  The right of every farmer to raise and sell his products at a return which will give him and his family a decent living;

                  The right of every businessman, large and small, to trade in an atmosphere of freedom from unfair competition and domination by monopolies at home or abroad;

                  The right of every family to a decent home;

                  The right to adequate medical care and the opportunity to achieve and enjoy good health;

                  The right to adequate protection from the economic fears of old age, sickness, accident, and unemployment;

                  The right to a good education.

All of these rights spell security. And after this war is won we must be prepared to move forward, in the implementation of these rights, to new goals of human happiness and well-being.

America’s own rightful place in the world depends in large part upon how fully these and similar rights have been carried into practice for our citizens. For unless there is security here at home there cannot be lasting peace in the world.83

Few would question that FDR is one of the three or four most important presidents in American history, and by many accounts he and Lincoln are the greatest. Here he is calling for radical, even revolutionary, additions to the rights guaranteed to all Americans, with the spectacular policy implications that necessarily follow. In effect he is saying that unemployment and poverty should be unconstitutional, that a massive amount of democratic infrastructure must be created, and that monopolistic big business is now officially a dubious force.

FDR did not seek constitutional amendments. Instead he asked Congress to “explore the means for implementing this economic bill of rights—for it is definitely the responsibility of the Congress so to do.” FDR recognized that there were dangers of a “rightist reaction” that would not only oppose the Second Bill of Rights but would seek “to return to the so-called ‘normalcy’ of the 1920’s.” FDR was unsparing in his assessment of his opponents. He famously had said in 1936 that his was the first administration in a generation to directly challenge “business and financial monopoly,” and as a result, “they are unanimous in their hate for me—and I welcome their hatred.”84 Now, he was even more pointed. If the rightist reactionaries triumphed as the nation entered the postwar years, he said, “then it is certain that even though we shall have conquered our enemies on the battlefields abroad, we shall have yielded to the spirit of Fascism here at home.” The enemies of freedom were the businesses and their “selfish pressure groups who seek to feather their nests while young Americans are dying.”85

What makes this moment even more extraordinary was that the Second Bill of Rights was not a partisan undertaking. In 1940 FDR defeated the Republican Willkie, a businessman and self-described champion of free enterprise, for the presidency.86 As the party’s titular head, Willkie remained in the public eye, and he prepared to run for president again in 1944. But Willkie made a point of embracing proposals by FDR that he said made sense for America and the world. By 1942 the Republican was a strong proponent of the Four Freedoms, and he was commissioned by President Roosevelt to travel the world meeting with World War II allies. Not only did Willkie find himself in agreement with FDR, he criticized him from the left, on the grounds that not enough was being done to battle racism against African Americans.87 After Willkie dropped out of the 1944 race for the Republican nomination, he published a book, An American Program, with seven essays on what should be done in America regardless of which party won the White House that November.

An American Program is an astonishing statement, and says quite a bit about the weltanschauung. Willkie fully supports expanding Social Security, making taxation more progressive, and ensuring that the government do what must be done to guarantee a “reasonably high level of employment.” It is necessary that “every child in America grows up with the basic necessities of education, good food, adequate clothing, medical care and a decent home.”88 Most striking, Willkie emphatically supports organized labor and collective bargaining. “Labor’s inherent right to strike,” he wrote, “is the basis of all its rights. . . . Every thoughtful American knows today that a strong labor movement is one of the greatest bulwarks against the growth of fascistic tendencies and consequently is necessary for our democratic way of life.”89 Willkie, the former corporate president, shares FDR’s distaste for large powerful corporations: “Monopolies and monopolistic prices threaten the very existence of the free enterprise system.”90 So close was Willkie to FDR, including on the component parts of the Economic Bill of Rights, that the two of them broached the idea of forming a new political party to unite all the liberals in the nation, and leave the Southern segregationists and big business “reactionaries” to have their own party.91 Alas, Willkie died suddenly in October 1944 at age fifty-two and FDR died six months later, so nothing came of it.

With FDR in poor health and then gone, the Second Bill of Rights never got anywhere in Congress. For Americans, there was one notable program that grew from the Second Bill of Rights: the GI Bill, which was signed into law in June 1944. The bill gave veterans access to vocational training and higher education, as well as housing and medical benefits while in school, and low-interest loans for buying homes and starting businesses. It was no small contribution to postwar society—indeed, it is regarded as a definitional legislative accomplishment—but it was just a fraction of what FDR thought necessary for the entire population.92 Vice President Henry Wallace never wavered; he argued in 1944 that to defeat fascism in the postwar era, the great benefits of the “immense and growing volume of scientific research, mechanical invention and management technique” needed not only to be promoted but also to be shared across society. Wallace argued this required political “inventions” comparable in scope to the economic inventions. His grave warning was that “fascism in the postwar inevitably will push steadily for Anglo-Saxon imperialism and eventually for war with Russia. Already American fascists are talking and writing about this conflict and using it as an excuse for their internal hatreds and intolerances toward certain races, creeds and classes.”93

Coming out of the war, many Americans assumed the battle to permanently defeat fascism would continue, as would the democratic advances of the people. In 1946, for example, Congressman Wright Patman, a Democrat from Texas, wrote that “there are many strong symptoms of fascism in our own democratic society. True, this movement in the United States masquerades under other names than the discredited one of fascism, but whatever it may be called, its peculiar characteristics are alarmingly evident.” Patman, a nationally prominent populist and New Dealer who served nearly five decades in Congress, listed several requirements to minimize the threat of fascism: high-quality public education for all Americans regardless of their economic background; guaranteed full employment and a strong labor movement; and to “make certain that the existing government operates honestly and efficiently.” As Patman concluded: “no really strong democracy has fallen before fascism.”94 Patman requested the Legislative Reference Service of the Library of Congress to make a detailed analysis of the historical record of fascism, so the American people and government could take steps to minimize any prospective success in the United States. The 206-page report concluded by characterizing fascism as a system that “favors big business, strengthens the position of heavy industries, retains enough of the profit system to permit the elite to build up personal fortunes . . . facilitates cartelization, and spends huge sums for military purposes.” Moreover “free collective bargaining and self-government by labor organizations is abolished.”95 The conclusions dovetailed with the general thrust of the Four Freedoms and the Second Bill of Rights.

Over at the Federal Communications Commission, New Dealer Clifford Durr led the campaign to make radio broadcasting more diverse and less a pawn to commercial and big business interests. The “Blue Book” campaign of the immediate postwar years was the most radical effort to rein in the commercial broadcasting system since its inception, and was done with the clear recognition that a powerful democratic media was mandatory to prevent the weakening of democracy and the rise of fascism.96 But very quickly any momentum to directly attack fascism disappeared, and all attention went to addressing the “communist” threat. It was as if World War II had never happened. Anti-fascism was suspect, unless the critic made it abundantly clear that communism was every bit as evil, as Patman did in his foreword to the 1947 fascism report.97 And eventually anti-fascism just became suspect, period. The liberal New Dealer Durr, who had a distinguished career at the FCC, was redbaited out of office and had trouble finding suitable employment for years.98

The last hurrah for the issues in the Second Bill of Rights came in 1948. Wallace bid for the presidency on the ticket of a new Progressive Party, opposing President Harry Truman and more cautious Democrats on a third-party platform that not only embraced the component parts of the economic bill of rights but also included a strong call for demilitarization that was right out of the Four Freedoms. Wallace also called for an end to Jim Crow, recognizing and embracing much of what Willkie and civil rights campaigners such as A. Philip Randolph were saying about how racial division had to be ended for the nation as a whole to progress. Though polls showed him to be competitive early on, Wallace’s numbers declined as the former vice president faced relentless redbaiting attacks. Historians suggest that Wallace’s candidacy forced Truman to support integration of the military and wage hikes.99 But Wallace, and the drive for a Second Bill of Rights, were finished.

By the end of the 1940s the country experienced a massive Red Scare—“fear itself” returned with a vengeance—which ended labor’s surge and put it on the defensive, and established a continual warfare economy. By 1949, if not earlier, to advocate loudly what the president had proposed in January 1944 might be enough to cost a person her job, and it would certainly stigmatize her as insufficiently patriotic, if not a red. The weltanschauung had seemingly turned on a dime. But, crucially, the Red Scare was not a return to the normalcy of the 1920s, as FDR feared; most of the existing New Deal reforms and much of the democratic infrastructure were too popular to be rolled back. They provided the foundation for the next democratic surge in the 1960s.

The Four Freedoms and the Second Bill of Rights were most influential abroad. The United States occupied Japan and oversaw the Allied occupation of Germany. In both cases the occupiers were concerned with putting in place a democratic infrastructure that would minimize if not eliminate the prospect of a return of fascism.** Notorious anti–New Deal General Douglas MacArthur, who directed the postwar US occupation of Japan, boldly advocated for the creation and protection of labor unions as a bulwark against the return of authoritarian rule.100 The Japanese constitution, which the US effectively wrote, included most of the original Bill of Rights as well as the Second Bill of Rights. It even included the strongest anti-militarism provisions ever written into any nation’s constitution, basically banning the existence of armed forces.101 Moreover, the American occupiers spared no expense bankrolling a large and diverse news media, which they insisted had to be critical of the occupiers to be legitimate. Much the same took place in Germany, with regard to labor rights, press support, and the new constitution that Germans wrote, though in Europe anti-communism curtailed the extent of anti-fascist work.102 Neither nation is perfect in 2015, but the constitutions and infrastructure put in place in the late 1940s have certainly contributed to Germany and Japan being at the very top of the lists of the most democratic nations in the world.103

The influence of the Second Bill of Rights, Sunstein notes, also “played a major role in the Universal Declaration of Human Rights, finalized in 1948 under the leadership of Eleanor Roosevelt and publicly endorsed by American officials at the time.”104 The terminology of the Universal Declaration is, at times, almost interchangeable with the Second Bill of Rights. As Sunstein writes, “All over the globe, modern constitutions follow the Universal Declaration in creating social and economic rights, sometimes using its precise words. They guarantee citizens a wide range of social entitlements.”105 “These principles are chiefly rooted in the 1948 Universal Declaration of Human Rights,” constitutional scholar Judith Blau observes, “and became the basis of constitutions written in the last half of the 20th century as colonies gained their independence or were incorporated into older constitutions.” Blau’s review of the 194 existing national constitutions found that some two-thirds of them include most of the planks of the Second Bill of Rights.106

The great political scientist Robert Dahl once asked this rhetorical question: “If our constitution is as good as most Americans seem to think it is, why haven’t other democratic countries copied it?”107 Well, we now know they have copied it and been inspired by it—only it was the one generated in the 1940s that was never quite made part of the federal Constitution here.108

THE DEMOCRATIC SURGE OF THE 1960S AND EARLY 1970S

Perhaps it was a coincidence, but this period of democratization came at the culmination of a long period of unusually strong economic growth running from the late 1940s to the early 1970s. Moreover, as one historian put it, this postwar period “had been the most economically egalitarian period in U.S. history, the point on the graph where the bounty was shared most equitably, and unemployment was at historic lows.” The year 1972 proved to be the apex of real earnings for male workers in American history.109 “America was on a roll during those Eisenhower-Kennedy-Johnson years,” Bob Herbert writes. “Economic, social, and cultural doors were being flung open one after another. There was a buoyancy to the American experience that was extraordinary.”110

The 1960s and early 1970s are a relevant period for us not only because they are so recent and immediately precede the citizenless democracy we now inhabit, but because theirs was the first generation that actually grappled in some way with the notion of a post-scarcity society. In 1964, in his commencement address at the University of Michigan, President Lyndon Baines Johnson launched with great fanfare his plans for a “Great Society,” to be built on the spectacular growth in the nation’s productive capacity wrought by “unbounded invention.”

The Great Society rests on abundance and liberty for all. It demands an end to poverty and racial injustice, to which we are totally committed in our time. But that is just the beginning.

The Great Society is a place where every child can find knowledge to enrich his mind and to enlarge his talents. It is a place where leisure is a welcome chance to build and reflect, not a feared cause of boredom and restlessness. It is a place where the city of man serves not only the needs of the body and the demands of commerce but the desire for beauty and the hunger for community.

It is a place where man can renew contact with nature. It is a place which honors creation for its own sake and for what it adds to the understanding of the race. It is a place where men are more concerned with the quality of their goals than the quantity of their goods.

LBJ proposed a dramatic build-out of the democratic infrastructure to end poverty, build magnificent cities, and repair nature.111 Though he failed—and “continual warfare” was a factor in his downfall—this was arguably the first credible post-scarcity vision from someone in power, and it is the prospect facing Americans and humanity today, at least in terms of what we are technologically capable of accomplishing. Compared to the present historical moment, however, what is striking about the 1960s is the overall optimism, even in turbulent times, that difficult social problems could be solved.

There is a measure of irony that most of the great social movements of the 1960s and early 1970s—civil rights, black power, environmental, student, consumer, antiwar, feminist, Chicano, Native American, and gay—were fueled by the conviction that American society was deeply flawed and required radical transformation. A record percentage of young people were getting college educations and upward mobility was the order of the day. By contemporary standards political corruption barely existed. But, by historical standards, an aroused citizenry was dissatisfied and thought America had to do better, and had to go places no democracy had ever gone before. Militarism, racism, environmental calamity, sexual discrimination, inequality, poverty, mindless materialism, and endless commercialism—all had to end. And an astonishing number of young people—including the sort of upper- and middle-class youth who were the bulwark of conservatism in most societies—thought capitalism was a dying economic system and that corporations were dangerous parasites.

During this period the center of gravity for American politics moved to the left on a number of fundamental issues. In 1954 Republican President Dwight Eisenhower wrote to his brother that “should any political party attempt to abolish social security, unemployment insurance, and eliminate labor laws and farm programs, you would not hear of that party again in our political history. There is a tiny splinter group, of course, that believes you can do these things. Among them are H.L. Hunt (you possibly know his background), a few other Texas oil millionaires, and an occasional politician or business man from other areas. Their number is negligible and they are stupid.”112 Fifteen years later that sentiment was ever more entrenched and extended. The baseline for participation in public discourse—with but a few exceptions, primarily generated by wealthy businessmen and their shock troops of increasingly discredited white supremacists and segregationists—was a commitment to a relatively strong democracy and an enhanced democratic infrastructure. And this happened when capitalism was hitting on all cylinders. The weltanschauung, or dominant worldview, had changed.

Some sense of the shifting tides can be found in the 1972 platform of the Democratic Party, which had nominated George McGovern as its presidential candidate after a successful grassroots insurgency campaign. McGovern, who had supported Henry Wallace’s campaign for president in 1948, was probably the most left-wing major party presidential candidate in US history. “Our traditions, our history, our Constitution, our lives, all say that America belongs to its people. But the people no longer believe it,” the platform began. “They feel that the government is run for the privileged few rather than for the many—and they are right.” It went on to call for (and we paraphrase):

         A guaranteed job for all Americans, with government providing employment if necessary at a living wage

         Huge expansion of public spending projects to rebuild cities, create mass transportation networks, address pollution, and build housing for the poor

         Tax reform to promote a much greater commitment to “progressive taxation” to generate equitable distribution of income and wealth

         Stepped-up antitrust action to break up “shared monopolies” like those found with the massive corporations that dominated the automobile, steel, and tire industries—that is, restructure the oligopolistic basis of American capitalism (our emphasis)

         Establishing a national economic commission to examine the role of large multinational corporations in the economy to see if federal chartering of corporations is necessary to reduce their influence

         Policies to directly attack the concentration of economic power in fewer and fewer hands

         Extension of trade union rights to workers in the nonprofit and public sector

         Support of a grape boycott to assist the farmworkers in their campaign to establish a union

         Establishing universal comprehensive health insurance controlled, financed, and administered by the federal government so all Americans are covered at all times; that is, Medicare for everyone

         Supporting equalization of spending among school districts to end the disparity between the caliber of public education based upon family income

         Actively supporting community policing

         Recognizing human rights of prisoners and fundamentally restructuring prisons to make them effective rehabilitation facilities

         Reestablishing the congressional role in military affairs, reducing military spending, and ending secrecy, except where absolutely necessary

         A total overhaul of the campaign-finance system with clear limits on donations to prevent candidates from being “dependent on large contributors who seek preferential treatment.” Also, an increase in public funding of elections.

         Universal voter registration by postcard, abolition of the Electoral College, and a run-off election for president if no candidate gets at least 40 percent of the vote

That is just a taste of what was included.113 In short, the 1972 Democratic Party platform effectively called for the fulfillment of FDR’s and Henry Wallace’s anti-fascist democratic vision, along with a broader commitment to the great issues that had emerged subsequently. Put another way, it embraced all the economic and social policies raised by A. Philip Randolph and Martin Luther King Jr. in their 1966 Freedom Budget, and the Freedom Budget was taken directly from FDR’s Second Bill of Rights.114

What is almost as striking is the Republican Party response. Its platform certainly positioned itself to the right of the Democrats, but it was nothing like what would become de rigueur for the GOP by the 1980s and thereafter. The basic contours of the welfare state were supported, and when Nixon won his landslide victory over McGovern it was no referendum on the core New Deal policies like Social Security or the right to form trade unions, or on the more recent turns to environmental and consumer regulation. After the 1964 debacle in the presidential election, the Goldwater wing of the party was marginalized and licking its wounds. In the early 1970s the Republican Party had a significant liberal wing in Congress, made up of people like Senators Lowell Weicker, Edward Brooke, and Charles Percy and Representatives Pete McCloskey and John Anderson.

In fact, the Nixon administration (1969–1974) is noted for its passage of trailblazing environmental and consumer legislation, far more sweeping than anything that would follow. “Clean air, clean water, open spaces—these should once again be the birthright of every American,” Nixon said in a lengthy discussion of the environment in his 1970 State of the Union Address. “This requires comprehensive new regulations.” He bragged that his program “will be the most comprehensive and costly program in this field in America’s history.”115 More than a few commentators have observed that on balance the Nixon administration was well to the left of the Democratic administrations of Bill Clinton and Barack Obama. It was not that Richard Nixon was some sort of closet lefty by any stretch of the imagination; it was the nature of the times.116 Civilian (non-military) spending by the government hit a peak never seen before or since.117

To get a better sense of these times it is imperative to get away from the elites. What drove both parties leftward were the social movements of the period, particularly the civil rights, student, antiwar, and black power movements, joined later by the women’s and environmental movements.118 There were numerous other related currents that held considerable importance. Perhaps the most significant of these was the movement inspired by consumer advocate Ralph Nader, who by most polls was among the most respected Americans in that period, for people across the political spectrum and across the generations. Nader, a proponent of strong small-“d” democracy as well as competitive markets, worked tirelessly to advance open governance and effective regulation of big business on behalf of consumers and workers. His work, according to many, had saved more lives than that of any other living American save Dr. Jonas Salk, the creator of the polio vaccine. Nader made the case to young Americans that government could be a positive force in improving both the economy and people’s lives, and that a career in public service was not only honorable, but a superior way to give one’s life value and meaning.

Simultaneously, there was an enormous counterculture that developed among young people—largely, though not exclusively, from the white middle class, outwardly typified by “hippies”—with a vast culture of music, magazines, FM radio, and underground newspapers. To no small extent this tradition has been erased from history, or it has been replaced with caricatures and stereotypes that do it no justice. Sometimes these hippies and the many people they influenced were associated with the activist New Left, but often they were apolitical or only marginally active.119 This counterculture was decidedly radical in values and lifestyles, with an emphasis on creativity, community, egalitarianism, nonviolence, and a rejection of the acquisitive materialism of consumer culture, which was regarded as ethically and morally bankrupt. While most attention has focused on recreational drug use, rock music, loosening sexual mores, or flamboyant fashion, what is most striking for our purposes is the thoroughgoing rejection of dominant social attitudes and the belief that society needed to radically change.120 Nothing expressed this worldview better, or to a larger audience, than Yale law professor Charles Reich’s 1970 bestseller, The Greening of America. Reich projected that the counterculture would ultimately lead to a political revolution, through the sheer power of its eventual replacement of the Corporate State.121

Reich’s vision is worth considering:

There is a revolution coming. It will not be like revolutions of the past. It will originate with the individual and with culture, and it will change the political structure only as its final act. It will not require violence to succeed, and it cannot be successfully resisted by violence. It is now spreading with amazing rapidity, and already our laws, institutions and social structure are changing in consequence. It promises a higher reason, a more human community, and a new and liberated individual. . . .

This is the revolution of the new generation. Their protest and rebellion, their culture, clothes, music, drugs, ways of thought, and liberated life-style are not a passing fad or a form of dissent and refusal, nor are they in any sense irrational. The whole emerging pattern, from ideals to campus demonstrations to beads and bell bottoms to the Woodstock Festival, makes sense and is part of a consistent philosophy. It is both necessary and inevitable, and in time it will include not only youth, but all people in America.122

It is easy to laugh at what seems like the absurdity of the counterculture today, as it appeared to get so easily co-opted by Madison Avenue and corporate America in subsequent years.123 But it certainly scared the bejesus out of older people—especially people in power—who wondered why so many young people were rejecting that which they worked so hard to acquire. In some respects, this was the first “post-scarcity” generation—the heirs to Mill and Keynes—grappling with what the point of life is in an era when accumulating material goods was no longer necessary for survival.

It was the anarchist writer Murray Bookchin who first understood this. He linked the computer revolution to post-scarcity societies, and imagined humanity gaining the ability to create truly egalitarian, self-governing, libertarian, and environmentally sound societies. In a series of essays in the late 1960s, Bookchin wrote that the conditions were ripe, for the first time in history, for what he termed “post-scarcity anarchism.” Rebellious American youth, Bookchin wrote, have

produced invaluable forms of libertarian and utopian affirmation—the right to make love without restriction, the goal of community, the disavowal of money and commodities, the belief in mutual aid, and a new respect for spontaneity. Easy as it is for revolutionaries to criticize certain pitfalls within this orientation of personal and social values, the fact remains that it has played a preparatory role of decisive importance in forming the present atmosphere of indiscipline, spontaneity, radicalism and freedom.

He added: “In the era when technological advances and cybernation have brought into question the exploitation of man by man, toil and material want in any form whatever, the cry ‘Black is beautiful’ or ‘Make love, not war’ marks the transformation of the traditional demand for survival into a historically new demand for life.”

Bookchin concluded that the United States was “at a point in history when the boldest concepts of utopia are realizable.” It was based on the “steady destruction in the United States of the myth that material abundance, based on commodity relations between men, can conceal the inherent poverty of bourgeois life.” Two years after Bookchin penned these words the great uprising in Paris of May-June 1968 took place, with slogans like “All Power to the Imagination” and “Be Realistic: Demand the Impossible” conveying these exact sentiments.124

In a weltanschauung where such views had a relatively widespread presence and where the Black Panther Party had captured the eyes of the world, the democratic socialist Martin Luther King Jr. and eventual Green Party presidential candidate Ralph Nader were the moderates.** Astonishingly, organized labor, the longstanding nemesis of capital and organized wealth, where socialists frolicked, was often cast in the uncharacteristic role of conservative bastion, or at least as a defender of the status quo. To some extent this owed to the immediate postwar purging of the radical and communist organizers who built up many of the great American trade unions in mid-century; this stripped the movement of many of its most principled and visionary activists. Organized labor still aggressively supported liberal candidates for office, and some elements of it were strong proponents of civil rights, but it tended to be uncomfortable with criticism of the military-industrial complex and the war in Vietnam, and had little apparent sympathy for the student left, black militants, or the counterculture. It did not appear to enjoy being outflanked on the political left, and was very cool toward the 1972 McGovern campaign, despite that campaign’s having what was possibly the most pro-labor platform for a major party in US presidential election history.

During the late 1960s and early 1970s several reforms were enacted to expand and deepen the democratic infrastructure. As noted, voting rights were extended to all Americans over eighteen. The primary process was opened up so voters could have more influence over their party’s nomination process. Congress passed the toughest federal campaign-finance laws ever, all with the intent of getting the influence of wealth out of the electoral process.125 Public broadcasting was finally authorized in 1967, with the implicit charter to provide a voice to the underserved parts of the population.126 Colleges and universities were serving an enormous number of first-generation students, with what seems today like virtually free tuition at great public universities. The federal government began funding public education in 1965 to bring poor school districts closer to the standards found in schools in more affluent communities.127 Poverty was in decline and both parties were committed to the war on it. Economic inequality was at an all-time low, assisted by high rates of unionization and progressive income taxation. Only the war, and the stain of militarism, stood as an intractable barrier, and the antiwar movement and New Left had that scourge in their sights.

And that’s not all. Scholars like Cass Sunstein persuasively argue that had Hubert Humphrey won the 1968 presidential election—he came very close—the Supreme Court would have ruled soon thereafter to, in effect, make FDR’s Second Bill of Rights part of the Constitution: that is, the Court would have held that the Constitution required the establishment of a number of democratic infrastructure initiatives and entitled all people to a basic standard of living so they could effectively participate in society, rendering poverty and excessive inequality, in effect, unconstitutional.128 Likewise, it probably would have mandated equal public-education spending for all children, thus ending the class-biased system that had been aggravated by suburbanization.129 Outside the court, activists came close to having television commercials aimed at children prohibited, and even launched a debate about whether prisons were necessary any longer.130

Everywhere one turned it seemed that wealth and privilege were on the defensive and the democratic infrastructure was being built out in an unprecedented manner. “From 1969 to 1972,” political scientist David Vogel writes, “virtually the entire American business community experienced a series of political setbacks without parallel during the postwar period.”131 In his introduction to the 1971 bestseller, America, Inc., Nader said the great issue of the coming decade would be the fight to democratize the control of corporations, so shareholders would not have exclusive power, and big business would no longer be a “mindless, parochial juggernaut.”132 To those atop the economy, the nation’s traditional rulers, it must have seemed like the inmates were running the asylum.

THE 1970S AND THE CRISIS OF DEMOCRACY

The conservative magazine National Review commissioned a poll in 1971 to gauge the opinions of college students on twelve representative campuses. Among the findings was that almost half the students favored “socialization of basic U.S. industries,” and that 75 percent would see no problem with Marxists teaching citizenship courses in public schools.133 Business was on the run, and in its public pronouncements often sounded defensive and conciliatory toward its critics, and eager to establish its commitment to being socially responsible.**

Between 1969 and 1971 a spate of articles appeared in the business press and trade publications addressing the diminished prestige of business and the apparent embrace of socialist ideas by what seemed like a large segment of the population, and especially young people. These were business people talking to each other and strategizing about what best to do. The most influential communication of this period, by a wide margin, was the Lewis Powell Memorandum of August 1971. It was a confidential memo, prepared for the US Chamber of Commerce and only distributed to a few score corporate executives and wealthy investors, but it created ripples that helped define recent history. When Powell was nominated and confirmed for a position on the US Supreme Court later that year, the memo’s existence was unknown and did not arise in his confirmation hearings. Indeed, the Senate hearings were more a coronation than a review, and the collegial tone of the amiable and respected moderate Southern corporate lawyer was contrary to the alarmism and distress of his memo.134

What we are dealing with “is quite new in the history of America,” Powell wrote. “The assault on the enterprise system is broadly based and consistently pursued. It is gaining momentum and converts. . . . Business and the enterprise system are in deep trouble, and the hour is late.” He noted that “the single most effective antagonist of American business is Ralph Nader who—thanks largely to the media—has become a legend in his own time and an idol to millions of Americans.” Powell called for a huge increase in the cash commitment of business, its trade associations, and the wealthy to changing the culture and making the media, universities, and schools much more sympathetic to business and free enterprise. “It is time for American business—which has demonstrated the greatest capacity in all history to produce and influence consumer decisions—to apply their great talents vigorously to the preservation of the system itself.” He called for business to dramatically increase spending in the “neglected political arena,” through increased lobbying and attention to campaigns such that politicians from both parties are beholden to business interests. And he called for business to direct its attention to the judicial system as much as possible, because “the judiciary may be the most important instrument for social, economic and political change.”135

What is striking about Powell’s memo is that it avoids discussion of specific policies—such as income taxation, trade agreements, the Vietnam War, or civil rights—altogether. His entire message is that it is imperative for business to seize the political infrastructure and change the weltanschauung. Soon thereafter much of what Powell called for was in place. By the end of the decade, groups and campaigns had been established to make the news media and higher education more sympathetic to the needs of business. The great pro-corporate think tanks like the Heritage Foundation, the Cato Institute, and the American Enterprise Institute (AEI) were all created within a few years of Powell’s memo. As corporate lobbyist Bryce N. Harlow put it in a speech to business leaders: “We must seek out and liberally support the scholars and the institutions in universities and the AEI kind of private research institutes that are tried-and-true believers in a market-oriented economy and American capitalism. If we fail in that, perhaps all else we attempt will in time be unavailing.”136

Thanks to journalist and author Timothy Noah, we are now aware of the central role of Harlow—who, Henry Kissinger said, “virtually single-handedly created the modern advocacy industry”—in radically transforming corporate political power in Washington, DC, during the 1970s. Harlow worked in the Eisenhower and Nixon administrations, and spent most of his career as Procter and Gamble’s top Washington lobbyist. He began in the early 1960s, when corporate lobbyists were few and far between. Harlow brought a decided class consciousness to the enterprise. “The Achilles heel of every democracy,” he told a group of wealthy businessmen, “has been the drive of the enfranchised to use the mighty weapon of political equality to enforce economic equality. The days of a democracy are numbered . . . when the belly of the system takes charge of the head—when the vagrant on the street corner, resentfully eyeing the passing limousines of the privileged, the talented, and the influential, sets about using his equal vote as he would use a pistol in a bank.”137

Harlow led the campaign in the early 1970s, after leaving the Nixon administration, to get businesses and trade associations to increase their Washington lobbying efforts dramatically, and to coordinate their activities. “The essence of the problem,” he told a group of corporate leaders at the newly formed Heritage Foundation, “is that, unless business can force itself to shape up, and very quickly, in ways that it has been unable to use or has refused to use ever before, it is in for the most disruptive, most disheartening season since the earliest New Deal days of 40 years ago. The important thing to understand from the business point of view is that the old ways of dealing with Congress just won’t hack it anymore.”138

Harlow did have a carrot at the end of his stick: “The hard fact is that, each time American business does unify, does weld together its thunderbolt, it wins hands down in Washington. That very fact gave us the Taft-Hartley Act . . . passed over a veto by a two-thirds vote.”139 Harlow did more than talk. He played a large role in the formation of the Business Roundtable in 1972, a lobbying group with membership restricted to the most powerful corporate CEOs, which is chartered to think in class terms.140

Message received, and then some. There was a tenfold increase in corporate federal lobbying by the 1980s—such that it eventually became a $9 billion industry by 2014—and K Street became synonymous with this burgeoning major industry, much like Madison Avenue or Wall Street.141 Where better to find insider lobbyists than among former members of Congress? In the early 1970s, 3 percent of retiring or exiting members became lobbyists; by 2012 the figure was more like 50 percent, often earning seven-figure incomes after their stint in “public service.”142 Washington, DC, accordingly, went from a sleepy middle-class town of bureaucrats to a booming metropolitan area of high-rolling lobbyists and fat-cat government contractors.143 By 2014, according to Forbes magazine, the greater Washington, DC, metropolitan area housed six of the ten richest counties in the nation.144 And thanks to a process initiated on the Court by Justice Lewis Powell in the late 1970s, beginning in 2010 the US Supreme Court overturned a century of legislation and jurisprudence and allowed, in effect, unlimited and unaccountable corporate and individual donations to political campaigns. With this newly shaped and decidedly less-democratic infrastructure, business domination and control of governance were all but guaranteed.

But that was far from clear in the early 1970s. While the loudest voices of the radical left had disappeared or were speaking in quieter tones, and the student left had all but vanished, two related developments threatened to put US business in an even more precarious position.

First, not all young white workers got the memo that they were supposed to be a bunch of Archie Bunker lunkheads, easily manipulated by politicians using jingoistic and dog whistle racist rhetoric. The Vietnam War had been fought largely by poor and working-class Americans, and that experience had a radicalizing effect on more than a few of them.145 By the early 1970s some of the troops in Vietnam were in semi-open revolt against their commanders, while for many of the rest cynicism abounded, and it was very difficult for the war to be prosecuted. An extraordinary report on the “collapse of the armed forces” by Marine Corps Colonel Robert D. Heinl Jr. was published in a June 1971 edition of the Armed Forces Journal. It begins with the following and then goes into detail, particularly about the widespread “fragging”146 (i.e., murder) of US military officers by the soldiers under their command:

The morale, discipline and battleworthiness of the U.S. Armed Forces are, with a few salient exceptions, lower and worse than at any time in this century and possibly in the history of the United States.

By every conceivable indicator, our army that now remains in Vietnam is in a state approaching collapse, with individual units avoiding or having refused combat, murdering their officers and non commissioned officers, drug-ridden, and dispirited where not near mutinous.

Elsewhere than Vietnam, the situation is nearly as serious.147

Indeed, Army bases were often adjacent to left-wing coffee shops—sometimes affiliated with the FTA148 movement—and African American, Latino, and a significant number of white GIs were radicalized. All of this convinced the brass that the draft was no longer viable and led to the institution of a professional army.149

When these veterans returned to jobs in American factories they brought an entirely new sensibility to a generation of workers already changing with the times. “With all the shoulder-length hair, beards, Afros and mod clothing along the line,” Newsweek observed after a visit to the Lordstown, Ohio, GM plant, “it looks for all the world like an industrial Woodstock.”150 By 1970 the “situation exploded in an upsurge of pent-up rank-and-file militancy,” as historian David Noble explained.151 The early 1970s saw the greatest wave of strike activity, work stoppages, slowdowns, and wildcat strikes since 1946. In 1970 alone 2.4 million workers engaged in large-scale work stoppages of one kind or another. The Wall Street Journal characterized the situation as “the worst within memory.”152 There were aggressive attempts led by young workers to take over the steelworkers and mineworkers unions, among others, and throw out the traditional leadership. Management was “dealing with a workforce,” Fortune informed its readers, “no longer under union discipline.”153

The concerns of the workers went far beyond wages and benefits; after all, real wages for male workers hit their historic peak in 1972. Automation, both in its elimination of jobs and its dehumanization of those that remained, was a huge issue for the young workers. As Noble put it, workers were not happy with “management’s obsession with and struggle for control over workers.”154 “It is imperative for labor,” dissident young longshoremen wrote in a 1971 pamphlet opposing union leadership and management, “to challenge the notion that the employer—in the name of ‘progress’—can simply go ahead and slash his workforce or close his factory or, as is being planned in our industry, close an entire port, and to do so without any regard for the people and community involved.”155

“At the heart of the new mood,” the New York Times reported, “there is a challenge to management’s authority to run its plants, an issue that has resulted in some of the hardest fought battles between industry and labor in the past.” The symbol of this new wave was the three-week-long 1972 strike at the Lordstown plant led by “a group of young, hip, and inter-racial autoworkers” whose primary issue was opposing the “fastest—and most psychically deadening—assembly line in the world.”156 There were efforts to link this working-class radicalism to student and antiwar activists and liberals in general. As progressive journalist Jack Newfield put it in 1971, the way to unite these forces was to build around “the root need to redistribute wealth and the commitment to broaden democratic participation.”157

The crisis in America’s workplaces grew so severe that in 1971 Nixon’s Secretary of Health, Education, and Welfare, Elliot Richardson, appointed a special task force to study and propose recommendations to address the emergence of “blue-collar blues” and “white collar woes.”158 The subsequent January 1973 report, Work in America, began with a quote by Albert Camus—“Without work all life goes rotten. But when work is soulless, life stifles and dies.”—and went from there. “Productivity increases and social problems decrease when workers participate in the work decisions affecting their lives, and when their responsibility for their work is buttressed by participation in profits.” The “keystone” of the report was a call for “the redesign of jobs,” to make them more fulfilling, and there was an important role for unions and workers and government to play in doing this redesign in conjunction with business. “It would give, for the first time, a voice to many workers in an important decision-making process. Citizen participation in the arena where the individual’s voice directly affects his immediate environment may do much to reduce political alienation in America.”159

This extraordinary report, generated by the Nixon administration no less, was in many senses a last hurrah for the era’s weltanschauung.

The second development that made the position of US business more uncertain was that in 1974 the United States entered the worst recession since the 1930s; it lasted until 1975 and the official unemployment rate climbed to 9 percent, the highest it had been since the Great Depression. The Watergate scandal as well as the recession gave the Democrats overwhelming control of the Congress after the 1974 elections. The progressive wing of the Democratic party went on the offensive and in the middle of the 1970s advocated strongly for guaranteed full employment, tax reform to make the system more progressive, excess-profits taxes on large corporations, Ralph Nader’s proposal for a cabinet-level Department of Consumer Protection, same-day voter registration to encourage and increase turnout, labor-law reform to benefit unions, and national health insurance (Medicare for all), among other things.160 Polling revealed large corporations were singularly unpopular institutions deep into the 1970s. Business was in a precarious position as it sought policies to reestablish profitability and make it lucrative to invest, especially as it saw austerity—cutbacks in wages and social services for the many and reductions in taxes on business and the wealthy—as the only possible course. In 1974 Business Week magazine explained the dilemma facing business: “Some people will obviously have to do with less. . . . It will be a bitter pill for many Americans to swallow the idea of doing with less so that big business can have more. Nothing that this nation, or any other nation, has done in modern economic history compares with the selling job that must be done to make people accept this reality.”161

There was growing elite consensus on the importance of this point. In 1973 the Trilateral Commission was established by business interests to examine the “crisis of democracy” facing the leaders of advanced capitalist nations as they attempted to deal with the problems associated with the ongoing “democratic surge,” especially in the United States. This was far from a wingnut operation; it was from the heart of the establishment and was thoroughly bipartisan. Its 1975 report on the United States written by Harvard scholar Samuel P. Huntington stated that “Al Smith once remarked that ‘the only cure for the evils of democracy is more democracy.’ Our analysis suggests that applying that cure at the present time could well be adding fuel to the flames. Instead, some of the problems in the United States today stem from an excess of democracy.”

So the crisis of democracy was . . . too much democracy: “The effective operation of a democratic political system usually requires some measure of apathy and noninvolvement on the part of some individuals and groups.”162 F. A. Hayek, the noted free-market economist, agreed. “Our system of unlimited democracy,” he noted despondently, forces “persons at the head of government . . . to do things that they know to be permissive” and wrong, but they must do them to retain their positions.163

But how realistic was it to believe such a world-class sales job was possible? The strategy had emerged: an effective sales job by business and its allies on the glories and primacy of “free enterprise” and the evils of big government should be complemented by ongoing efforts to shrink the democratic infrastructure—generating the necessary amount of “apathy and noninvolvement”—such that people would be less likely to interfere with governance. “The effort was undertaken,” as political scientist Sheldon Wolin puts it, “to hammer home the astounding principle that a democratically chosen government was the enemy of ‘the people.’”164

As for the sales campaign, any timidity was cast aside and the operating logic by the late 1970s was that the best defense is a good offense. A tidal wave of material not only promoted the genius of free enterprise, but it also found a new scapegoat to blame for all the nation’s increasing economic and social problems: liberals, especially liberal intellectuals. All problems in the nation could be traced to the loony, half-baked ideas of liberals and their effete supporters in the media and academia. Now that the economy was broken, the problems there could also be attributed to labor unions and their lazy, pampered, overweight chieftains. Business was a heroic, All-American, job-creating institution that all these deadbeat parasites could never appreciate and wrongly stigmatized. Entrepreneurs and corporate CEOs were the real heroes of society. Milton Friedman, fresh off his 1976 Nobel Prize in economics, argued that liberals, like the kind he was forced to cohabit with at the University of Chicago, were the “intellectual architects of the suicidal course” the country was on.165 While the Powell Memo was confidential, former Treasury Secretary William E. Simon made the same case publicly in a 1978 bestseller that said what was required was “nothing less than a massive and unprecedented mobilization of the moral, intellectual and financial resources” from those who understood the life-and-death imperative to defend the free enterprise system.166

The campaign was a smashing success.167 The very term liberal, which had only a few years earlier been embraced by most Democrats and many Republicans, went from a place of respect for a great political tradition to being filed between drug dealer and pedophile in the popular lexicon. It became unmentionable; the “L” word. And on the congressional front, the entire package of progressive legislation which had seemed likely to pass the day Jimmy Carter was inaugurated in January 1977 went down to ringing defeat within two years as the rejuvenated business lobbies flexed their now gargantuan muscles.168

While Reagan defeated Jimmy Carter in the 1980 presidential race to mark the ascension of this “neoliberal” approach to dominance, it has since been forgotten that only a few short years earlier the prospect that someone with Reagan’s views might win a national election was seen as preposterous. But to get the full measure of the transition still ongoing, consider the 1980 platform of the Libertarian Party, which featured billionaire David Koch as its vice-presidential candidate. The platform called for, among other things:

         Repeal of all campaign-finance laws, permitting unlimited corporate and individual donations

         Abolition of Medicare, Medicaid, the Postal Service, and Social Security

         Abolition of the Environmental Protection Agency, and an end to most consumer regulation

         Privatization of the water system, railroads, public roads, and the highway system

         Abolish all income and capital-gains taxation

         End all government funding and operation of public schools

         Abolish all social-welfare programs

         Make labor unions “voluntary” for employees, and collective bargaining possible only if employers agree; prohibit the government from enforcing collective bargaining rights

         Repeal of antitrust laws and any government efforts to break up monopolies, as well as abolition of the Federal Trade Commission169

This was Milton Friedman’s vision of a “free” society with no democratic infrastructure. It is a society where most citizens get nothing of value from the government, and are told they can never get anything of value from the government, so they logically lose their interest in it. As Wolin writes, when politicians proceed “methodically to reduce or eliminate social programs, the result is tantamount to a deliberate strategy of encouraging political apathy among the poor and needy.”170

Even in 1980 these were all regarded as such batshit crazy ideas that Ronald Reagan wanted nothing to do with them. Yet their proponents were at last in the on-deck circle. Today some of them have been accomplished, at least in part or informally, and all of them are legitimate issues for discussion. They have become mainstream political issues, and some of them will be the defining political battles of the coming generation. The weltanschauung has been turned upside down.

This was reflected in political realignments. The Republican Party has moved steadily to the right since the 1970s, purging its entire liberal and moderate wings. Economic and political elites who were associated with the party, and who might still be reasonably liberal on issues such as reproductive rights, made a marriage of convenience with a new entity that came to be known as the “religious right.” This uncomfortable but politically potent alliance—a union first conceived by big business as a way to counter the popularity of the New Deal—sought from the start to win votes and elections without dwelling on the self-serving (and unpopular) economic policies of the elites.171

The Democratic Party has moved rightward as well. The Democratic Leadership Council was founded by people like Bill Clinton, and it successfully remade the Democrats into a far more pro-business party—a champion of deregulation, lower taxes on business and the rich, cutbacks in social services, and secretive trade deals that benefit large corporations and investors but have dubious value for everyone else. The concerns of organized labor and social movements, now reclassified as “special interest groups,” were marginalized. The degree to which Democrats moved rightward was obscured by the “polarization” that occurred after congressional white southern Democrats all became Republicans or were defeated by Republicans. With the Democrats losing their southern wing, and the Republicans purging their moderate and liberal wings, there was almost no overlap between the parties, for the first time in history. This has made effective governance by a two-party system increasingly difficult. Yet the structural constraints on the process—the lack of a sufficient democratic infrastructure—have prevented the development of viable third, fourth, and fifth parties that exist as meaningful alternatives in themselves or in coalitions. Consequently, as the parties have both moved well to the right since the 1970s, so too have political discourse and the governance of the country.172

For four decades the right wing has led the charge to dismantle or enfeeble the democratic infrastructure. In some of these cases the Democrats offer resistance, usually tepid. In others, they go with the flow. And, in too many instances, they join in the deconstruction.

The leaderships of the parties march in lockstep in two areas where grave damage is being done to the democratic infrastructure. The first is with regard to militarism. After the defeat in Vietnam and a rash of scandals involving illegal government surveillance of American citizens in the 1960s, Senator Frank Church’s Senate select committee held unprecedented (and never repeated) hearings in 1975–1976 on the crimes of the federal government, especially by the unaccountable intelligence community. Congress consequently passed laws to reassert its control.173 This mid-1970s moment of honest reflection about militarism and empire was a nod to the old weltanschauung—it was termed the “Vietnam syndrome”—and quickly forgotten.174 Both parties now effectively pursue the same policies with regard to the US military role in the world, and are committed to a permanent wartime budget, with minimal civilian or congressional oversight. The United States is in a permanent war against a faceless enemy that only ends when people in power say it will end, and they have no incentive to ever end it. When the Soviet Union collapsed, some innocent dreamers imagined the United States would be able to return to a level of military spending found in other nations, or in our own pre–Cold War history. America would enjoy a massive “peace dividend” and could build up its civilian (and democratic) infrastructure. Instead, after what seemed like a thirty-second pause, military spending remained unchanged and then increased, for no coherent reason. Continual warfare is now hard-wired into the political economy, a part of the informal constitution.

Second, beginning in the 1980s, for the first time in US history, the federal government began to systematically “privatize” public services and “outsource” to private firms what had traditionally been government activities.175 States and local governments have followed suit, and both parties participate in the process.176 The purported reason for privatization and outsourcing was to bring market efficiency to the public sector; it followed from what Tony Judt described as “the intellectual shift that marked the last third of the 20th century . . . the worship of the private sector and the cult of privatization.”177 Research suggests that politics and greed had the most to do with what the government privatized, and that the efficiency claims were rarely realized and often flat wrong. Instead, this became a cash cow for large corporations and wealthy investors and has fanned the flames of corruption.178 For investors and corporations hard-pressed to locate profitable investments in the sainted “free market,” having a chance to grab a fistful of taxpayers’ dollars and take over military functions, prisons, public schools, and anything else that isn’t nailed down is a gift from the heavens, especially when the terms are invariably generous, with all-but-guaranteed high rates of return. This also creates powerful lobbies with a decided interest in more militarism, more prisons, and more privatization of schools, so more public money can go into their coffers.179 The US government, under Republicans and Democrats, seems to be dedicated to fattening the bank accounts of crony capitalists above all else.

But the corruption surrounding privatization and outsourcing is only the beginning of the damage they do to the democratic infrastructure. By removing the government from important functions, privatization and outsourcing lessen the ability of the citizenry to play a role in the economy and lock in business domination. They lessen the ability of government to solve social problems and therefore generate cynicism toward it. And, to top it off, evidence suggests privatization has contributed to the rapid escalation of economic inequality.180 Ironically, the administration of the government by the “free market” crowd proves their very point: government is corrupt, incompetent, and not to be trusted.181 The end result is a great demoralization and depoliticization. The weltanschauung has changed, precisely as intended.182 The cancer of the “excess of democracy” or “unlimited democracy” has been surgically removed and destroyed. Corporations and the wealthy have won, with all the economic benefits outlined in Chapter 2.

For some time now scholars and writers have wrestled with a paradox. In the past, when the United States has had great periods of conservatism where elite interests dominated—such as the original planter/merchant aristocracy, Southern slavery, the Gilded Age, the 1920s—they were followed by major reform periods dedicated to lessening inequality and corruption.183 By historical standards, the United States is long past due, by a good two decades, for such a reform moment. In our view, the evidence points to the deterioration of the democratic infrastructure as a—perhaps the—key factor in delaying or preventing a new era of reform; people have little way to effectively participate in the governing process and they respond (or opt out of responding) accordingly.184 Until that changes, the paradox will only continue and deepen.

THERE IS NO MYSTERY TO DEMOCRACY

The United States faces a great crisis of unemployment and underemployment, which will be exacerbated by revolutionary developments in technology. It is part of a broader economic malaise, which is made worse by a political system that is mired in corruption and does the bidding of society’s very wealthiest inhabitants and largest corporations.

The people who dominate the political economy at present are determined to use their considerable resources and influence to prevent the development and expansion of democratic infrastructure. Indeed, at many turns, they consciously seek the actual deconstruction of that infrastructure. And they will work harder to do so as the social pressures created by technological change, automation, and joblessness are felt more acutely.

Furthermore, the present rulers have spent the past forty years trying to convince everyone that becoming part of an aroused and engaged and organized citizenry is unnecessary and a waste of time. Arguably their greatest victory of the past four decades has been converting the longstanding American optimism that democracy can lick any problem before it into a morose pessimism that there is no alternative and that resistance is futile.

Of course it is frustrating for citizens to be fighting old fights for rights that should have been secured long ago. But the elites know something that should give us all encouragement: the current rulers cannot win a fair fight, so they must rig the game. In times of crisis, like the 1970s, their contempt for democracy comes to the surface. In their hands, the United States has become a nation that, as FDR warned it might, shares far too many attributes with fascist societies. Unless there are major structural changes, even those liberties and privileges we enjoy today may be in jeopardy. This is a frightening proposition. But the world the current rulers have made is ill-equipped to address the crisis of unemployment and underemployment, and in no position to advance democratic practices and values. It has to go.

The humane and effective solution to the economic crisis requires that (1) the political system be rejuvenated into a powerful democratic infrastructure that (2) draws people into public policy debates as effective participants. That is the route to the best possible outcomes. Then a frank and effective debate over how best to restructure the economy to serve human interests can occur. In that process the weltanschauung will change, the crisis will appear as more of an opportunity than a threat, and human imaginations will be unchained. We can use the technologies to build an egalitarian, humane, sustainable, and democratic society such as has never before been seen.

The good news is that nearly all the elements of a democratic infrastructure that we list in Chapter 1 and return to in Chapter 6 have deep roots in American political history. Indeed, what is required to have a credible democracy is well known across the planet.

The other good news is that there is no mystery about what creates democracy and democratic infrastructure. They advance primarily with energy from dynamic popular social movements, as we discuss next, in Chapter 5. Social activism changes everything.

Just as history tells us much about how a democratic infrastructure can and should be constructed, so it tells us much about the formation of those dynamic social movements in times of great economic and social turbulence. In that history, and in the movements that have already taken shape in contemporary America, we see the outlines for the movements to establish the necessary democratic infrastructure for our times.

* In the early days of the American experiment, each state was required to establish its own constitution. Since then, it has been standard for every state to have a constitution. As we will explain later in this chapter, state constitutions are much more dynamic documents than the US Constitution. Many states have held, and continue to hold, constitutional conventions not just to write these documents but to update them. Others have established relatively easy processes for amending their constitutions, allowing citizens to petition for referendum votes on particular issues.

* The capitalism roundly celebrated today as commanded from on high through the vessel of the US Constitution proved to be an acquired taste deep into America’s history. In his 1861 State of the Union Address—in the era when capitalism’s contours were indeed becoming visible—President Abraham Lincoln closed his speech, which had focused on the raging Civil War, by stating, “In my present position I could scarcely be justified were I to omit raising a warning voice against this approach of returning despotism.” The despotism that so concerned Lincoln was “the effort to place capital on an equal footing with, if not above, labor in the structure of government.” Lincoln elaborated on the notion: “Labor is prior to and independent of capital. Capital is only the fruit of labor, and could never have existed if labor had not first existed. Labor is the superior of capital, and deserves much the higher consideration.” This only scratches the surface of Lincoln’s remarkable statement about the relationship of capital and labor to democracy. Abraham Lincoln, State of the Union Address, December 3, 1861, http://www.presidentialrhetoric.com/historicspeeches/lincoln/stateoftheunion1861.html.

* There were obviously other motives as well, of a geopolitical nature. In particular, the obsession with communism and the Left altered the nature of the occupations, especially in Germany, where de-Nazification was slowed so that all hands could be on deck to battle the threat ostensibly posed by the Soviet Union and its supporters.

* Here is a sample of the voice of moderation, from King’s 1967 speech announcing his opposition to the war in Vietnam: “The war in Vietnam is but a symptom of a far deeper malady within the American spirit, and if we ignore this sobering reality, we will find ourselves organizing ‘clergy and laymen concerned’ committees for the next generation. They will be concerned about Guatemala and Peru. They will be concerned about Thailand and Cambodia. They will be concerned about Mozambique and South Africa. We will be marching for these and a dozen other names and attending rallies without end unless there is a significant and profound change in American life and policy. . . . This is the role our nation has taken, the role of those who make peaceful revolution impossible by refusing to give up the privileges and the pleasures that come from the immense profits of overseas investments. I am convinced that if we are to get on to the right side of the world revolution, we as a nation must undergo a radical revolution of values. We must rapidly begin the shift from a thing-oriented society to a person-oriented society. When machines and computers, profit motives and property rights, are considered more important than people, the giant triplets of racism, extreme materialism, and militarism are incapable of being conquered. A true revolution of values will soon cause us to question the fairness and justice of many of our past and present policies.” Martin Luther King Jr., “Beyond Vietnam,” New York, New York, April 4, 1967, http://mlk-kpp01.stanford.edu/index.php/encyclopedia/documentsentry/doc_beyond_vietnam/.

* It was left to the entirely unrepentant Milton Friedman to grab big business by the collar, much like when Don Corleone shook Johnny Fontane and told him to stop crying and to “act like a man” in The Godfather. In a 1970 article in the New York Times, Friedman wrote that businessmen who think they are protecting free enterprise by accepting that their businesses have a social responsibility to solve social problems like discrimination and pollution “and whatever else may be the catchwords of the contemporary crop of reformers” are only making matters worse. “Businessmen who talk this way are unwitting puppets of the intellectual forces that have been undermining the basis of a free society these past decades.” He introduced, instead, the Friedman doctrine: “The Social Responsibility of Business Is to Increase Its Profits.” Period. See Milton Friedman, “The Social Responsibility of Business Is to Increase Its Profits,” New York Times, September 13, 1970, p. 33.