2

Gatekeeping in America

In The Plot Against America, American novelist Philip Roth builds on real historical events to imagine what fascism might have looked like in prewar America.

An early American mass-media hero, Charles Lindbergh, is the novel’s central figure: He skyrockets to fame with his 1927 solo flight across the Atlantic and later becomes a vocal isolationist and Nazi sympathizer. But here is where history takes a fantastic turn in Roth’s hands: Rather than fading into obscurity, Lindbergh arrives by plane at the 1940 Republican Party convention in Philadelphia at 3:14 A.M., as a packed hall finds itself deadlocked on the twentieth ballot. Cries of “Lindy! Lindy! Lindy!” erupt for thirty uncontained minutes on the convention floor, and in a moment of intense collective fervor, his name is proposed, seconded, and approved by acclamation as the party’s nominee for president. Lindbergh, a man with no political experience but unparalleled media savvy, ignores the advice of his advisors and campaigns by piloting the Spirit of St. Louis, the plane from his celebrated solo flight, from state to state, wearing his flight goggles, high boots, and jumpsuit.

In this world turned upside down, Lindbergh beats Franklin Delano Roosevelt, the incumbent, to become president. And Lindbergh, whose campaign is later revealed to be linked to Hitler, goes on to sign peace treaties with America’s enemies. A wave of anti-Semitism and violence is unleashed across America.

Many Americans have found parallels between the 2016 presidential election and Roth’s work of fiction. The premise—an outsider with dubious democratic credentials comes to power with the aid of a foreign nation—cannot help but resonate. But the comparison raises another striking question: Given the severity of the economic crisis in 1930s America, why didn’t this happen here?

The reason no extremist demagogue won the presidency before 2016 is not the absence of contenders for such a role. Nor is it the lack of public support for them. To the contrary, extremist figures have long dotted the landscape of American politics. In the 1930s alone, as many as eight hundred right-wing extremist groups existed in the United States. Among the most important figures to emerge during this period was Father Charles Coughlin, an anti-Semitic Catholic priest whose fiery nationalist radio program reached up to forty million listeners a week. Father Coughlin was openly antidemocratic, calling for the abolition of political parties and questioning the value of elections. His newspaper, Social Justice, adopted pro-fascist positions in the 1930s, naming Mussolini its “Man of the Week” and often defending the Nazi regime. Despite his extremism, Father Coughlin was immensely popular. Fortune magazine called him “just about the biggest thing ever to happen to radio.” He delivered speeches to packed stadiums and auditoriums across the country; as he traveled from city to city, fans lined his route to see him passing by. Some contemporary observers called him the most influential figure in the United States after Roosevelt.

The Depression also gave rise to Louisiana governor and senator Huey Long, who called himself “the Kingfish.” Long was described by the historian Arthur M. Schlesinger Jr. as “the great demagogue of the day, a man who resembled…a Latin American dictator, a Vargas or a Perón.” The Kingfish was a gifted stump speaker, and he routinely flouted the rule of law. As governor, Long built what Schlesinger described as “the nearest approach to a totalitarian state the American republic has ever seen,” using a mix of bribes and threats to bring the state’s legislature, judges, and press to heel. Asked by an opposition legislator if he had heard of the state constitution, Long replied, “I’m the constitution just now.” Newspaper editor Hodding Carter called Long “the first true dictator out of the soil of America.” When Franklin Roosevelt’s campaign manager, James A. Farley, met Mussolini in Rome in 1933, he wrote that the Italian dictator “reminded me of Huey Long.”

Long built a massive following with his call to redistribute wealth. In 1934, he was said to have “received more mail than all other senators combined, more even than the president.” By then his Share Our Wealth movement had more than 27,000 cells across the country and a mailing list of nearly eight million names. Long planned a presidential run, telling a New York Times reporter, “I can take this Roosevelt….I can out-promise him. And he knows it.” Roosevelt viewed Long as a serious threat but was spared when Long was assassinated in September 1935.

America’s authoritarian tendency persisted through the post–World War II golden age. Senator Joseph McCarthy, who used the Cold War fear of communist subversion to promote blacklisting, censorship, and book banning, enjoyed wide backing among the American public. At the height of McCarthy’s political power, polls showed that nearly half of all Americans approved of him. Even after the Senate’s 1954 censure of him, McCarthy enjoyed 40 percent support in Gallup polls.

A decade later, Alabama governor George Wallace’s defiant segregationist stance vaulted him to national prominence, leading to surprisingly vigorous bids for the presidency in 1968 and 1972. Wallace engaged in what journalist Arthur Hadley called the “old and honorable American tradition of hate the powerful.” He was, Hadley wrote, a master at exploiting “plain old American rage.” Wallace often encouraged violence and displayed a casual disregard for constitutional norms, declaring:

There is one thing more powerful than the Constitution….That’s the will of the people. What is a Constitution anyway? They’re the products of the people, the people are the first source of power, and the people can abolish a Constitution if they want to.

Wallace’s message, which mixed racism with populist appeals to working-class whites’ sense of victimhood and economic anger, helped him make inroads into the Democrats’ traditional blue-collar base. Polls showed that roughly 40 percent of Americans approved of Wallace in his third-party run in 1968, and in 1972 he shocked the establishment by emerging as a serious contender in the Democratic primaries. When Wallace’s campaign was derailed by an assassination attempt in May 1972, he was leading George McGovern by more than a million votes in the primaries.

In short, Americans have long had an authoritarian streak. It was not unusual for figures such as Coughlin, Long, McCarthy, and Wallace to gain the support of a sizable minority—30 or even 40 percent—of the country. We often tell ourselves that America’s national political culture in some way immunizes us from such appeals, but this requires reading history with rose-colored glasses. The real protection against would-be authoritarians has not been Americans’ firm commitment to democracy but, rather, the gatekeepers—our political parties.

On June 8, 1920, as Woodrow Wilson’s presidency was winding down, Republican delegates gathered to choose their nominee in the flag-draped but poorly ventilated Chicago Coliseum, where the withering heat reached over one hundred degrees. After nine ballots over four days, the convention remained undecided. On Friday evening, in Suite 404 on the thirteenth floor of the nearby Blackstone Hotel, Republican National Committee Chairman Will Hays and George Harvey, the powerful publisher of Harvey’s Weekly, hosted a rotating group of U.S. senators and party leaders in the original “smoke-filled room.” The Old Guard, as journalists called them, poured themselves drinks, smoked cigars, and talked late into the night about how to break the deadlock and get a candidate the 493 delegates needed for the nomination.

The leading contender on the convention floor was Major General Leonard Wood, an old ally of Theodore Roosevelt who had generated popular enthusiasm in the primaries and led the balloting earlier in the week, with 287 delegates. He was followed by Illinois governor Frank Lowden, California senator Hiram Johnson, and Ohio senator Warren G. Harding, trailing in a distant fourth place with only 65½ delegates. From the convention floor, reporters wrote, “Nobody is talking Harding…[He is] not even considered as among the most promising dark horses.” But as reporters heard rumors about the discussions taking place at the Blackstone, the most motivated of them found their way to the thirteenth floor of the hotel and quietly gathered in the hallways outside Suite 404 to catch a glimpse as leading senators—including Henry Cabot Lodge of Massachusetts, McCormick of Illinois, Phipps of Colorado, Calder of New York, former senator Crane of Massachusetts, and others—came and went.

Inside Suite 404, the upsides and downsides of each candidate were carefully reviewed and debated (Knox was too old; Lodge didn’t like Coolidge). At one in the morning, seven members of the Old Guard remained in the room and took a “standing vote.” Called in at 2:11 A.M. by George Harvey, a stunned Harding was informed that he had been selected. Word spread. By the next evening, on the tenth ballot and to the great relief of the sweltering delegates, Warren G. Harding received an overwhelming 692½ convention delegates amid rousing cheers. Though he garnered just over 4 percent of the primary vote, he was now the Republican Party’s 1920 presidential nominee.

Nobody likes smoke-filled rooms today—and for good reason. They were not very democratic. Candidates were chosen by a small group of power brokers who were not accountable to the party rank and file, much less to average citizens. And smoke-filled rooms did not always produce good presidents—Harding’s term, after all, was marked by scandal. But backroom candidate selection had a virtue that is often forgotten today: It served a gatekeeping function, keeping demonstrably unfit figures off the ballot and out of office. To be sure, the reason for this was not the high-mindedness of party leaders. Rather, party “bosses,” as their opponents called them, were most interested in picking safe candidates who could win. It was, above all, their risk aversion that led them to avoid extremists.

Gatekeeping institutions go back to the founding of the American republic. The 1787 Constitution created the world’s first presidential system. Presidentialism poses distinctive challenges for gatekeeping. In parliamentary democracies, the prime minister is a member of parliament and is selected by the leading parties in parliament, which virtually ensures that he or she will be acceptable to political insiders. The very process of government formation serves as a filter. Presidents, by contrast, are not sitting members of Congress, nor are they elected by Congress. At least in theory, they are elected by the people, and anyone can run for president and—if he or she earns enough support—win.

Our founders were deeply concerned with gatekeeping. In designing the Constitution and electoral system, they grappled with a dilemma that, in many respects, remains with us today. On the one hand, they sought not a monarch but an elected president—one who conformed to their idea of a republican popular government, reflecting the will of the people. On the other, the founders did not fully trust the people’s ability to judge candidates’ fitness for office. Alexander Hamilton worried that a popularly elected presidency could be too easily captured by those who would play on fear and ignorance to win elections and then rule as tyrants. “History will teach us,” Hamilton wrote in the Federalist Papers, that “of those men who have overturned the liberties of republics, the great number have begun their career by paying an obsequious court to the people; commencing demagogues, and ending tyrants.” For Hamilton and his colleagues, elections required some kind of built-in screening device.

The device the founders came up with was the Electoral College. Article II of the Constitution created an indirect election system that reflected Hamilton’s thinking in Federalist 68:

The immediate election should be made by men most capable of analyzing the qualities adapted to the station, and acting under circumstances favorable to deliberation, and to a judicious combination of all the reasons and inducements which were proper to govern their choice.

The Electoral College, made up of locally prominent men in each state, would thus be responsible for choosing the president. Under this arrangement, Hamilton reasoned, “the office of president will seldom fall to the lot of any man who is not in an eminent degree endowed with the requisite qualifications.” Men with “talents for low intrigue, and the little arts of popularity” would be filtered out. The Electoral College thus became our original gatekeeper.

This system proved short-lived, however, due to two shortcomings in the founders’ original design. First, the Constitution is silent on the question of how presidential candidates are to be selected. The Electoral College goes into operation after the people vote, playing no role in determining who seeks the presidency in the first place. Second, the Constitution never mentions political parties. Though Thomas Jefferson and James Madison would go on to pioneer our two-party system, the founders did not seriously contemplate those parties’ existence.

The rise of parties in the early 1800s changed the way our electoral system worked. Instead of electing local notables as delegates to the Electoral College, as the founders had envisioned, each state began to elect party loyalists. Electors became party agents, which meant that the Electoral College surrendered its gatekeeping authority to the parties. The parties have retained it ever since.

Parties, then, became the stewards of American democracy. Because they select our presidential candidates, parties have the ability—and, we would add, the responsibility—to keep dangerous figures out of the White House. They must, therefore, strike a balance between two roles: a democratic role, in which they choose the candidates that best represent the party’s voters; and what political scientist James Ceaser calls a “filtration” role, in which they screen out those who pose a threat to democracy or are otherwise unfit to hold office.

These dual imperatives—choosing a popular candidate and keeping out demagogues—may, at times, conflict with each other. What if the people choose a demagogue? This is the recurring tension at the heart of the presidential nomination process, from the founders’ era through today. An overreliance on gatekeeping is, in itself, undemocratic—it can create a world of party bosses who ignore the rank and file and fail to represent the people. But an overreliance on the “will of the people” can also be dangerous, for it can lead to the election of a demagogue who threatens democracy itself. There is no escape from this tension. There are always trade-offs.

For most of American history, political parties prioritized gatekeeping over openness. There was always some form of a smoke-filled room. In the early nineteenth century, presidential candidates were chosen by groups of congressmen in Washington, in a system known as the congressional caucus. The system was soon criticized as too closed, so beginning in the 1830s, candidates were nominated in national party conventions made up of delegates from each state. Delegates were not popularly elected; they were chosen by state and local party committees, and they were not bound to support particular candidates. They generally followed the instructions of the state party leaders who sent them to the convention. The system thus favored insiders: candidates backed by the party leaders who controlled the delegates. Candidates who lacked support among their party’s network of state and local politicians had no chance of success.

The convention system was also criticized for being closed and undemocratic, and there was no shortage of efforts to reform it. Primary elections were introduced during the Progressive era; the first was held in Wisconsin in 1901, and in 1916, primaries were held in two dozen states. Yet these brought little change—in part because many states didn’t use them, but mostly because elected delegates were not required to support the candidate who won the primary. They remained “unpledged,” free to negotiate their vote on the convention floor. Party leaders—with their control over government jobs, perks, and other benefits—were well-positioned to broker these deals, so they remained the presidency’s gatekeepers. Because primaries had no binding impact on presidential nominations, they were little more than beauty contests. Real power remained in the hands of party insiders, or what contemporaries called “organization men.” For prospective candidates, securing the backing of the organization men was the only viable road to the nomination.

The old convention system highlights the trade-offs inherent to gatekeeping. On the one hand, the system wasn’t very democratic. The organization men were hardly representative of American society. Indeed, they were the very definition of an “old boys” network. Most rank-and-file party members, especially the poor and politically unconnected, women, and minorities, were not represented in the smoke-filled rooms and were thus excluded from the presidential nomination process.

On the other hand, the convention system was an effective gatekeeper, in that it systematically filtered out dangerous candidates. Party insiders provided what political scientists called “peer review.” Mayors, senators, and congressional representatives knew the candidates personally. They had worked with them, under diverse conditions, over the years and were thus well-positioned to evaluate their character, judgment, and ability to operate under stress. Smoke-filled back rooms therefore served as a screening mechanism, helping to keep out the kind of demagogues and extremists who derailed democracy elsewhere in the world. American party gatekeeping was so effective that outsiders simply couldn’t win. As a result, most didn’t even try.

Consider Henry Ford, the founder of the Ford Motor Company. One of the richest men in the world in the early twentieth century, Ford was a modern version of the kind of extremist demagogue Hamilton had warned against. Using his Dearborn Independent as a megaphone, he railed against bankers, Jews, and Bolsheviks, publishing articles claiming that Jewish banking interests were conspiring against America. His views attracted praise from racists worldwide. He was mentioned with admiration by Adolf Hitler in Mein Kampf and described by future Nazi leader Heinrich Himmler as “one of our most valuable, important, and witty fighters.” In 1938, the Nazi government awarded him the Grand Cross of the German Eagle.

Yet Ford was also a widely admired, even beloved, figure in the United States, especially in the Midwest. A “poor farm boy who made good,” the plainspoken businessman was revered by many rural Americans as a folk hero, alongside such presidents as Washington and Lincoln.

Ford’s restless imperiousness eventually lured him into politics. He began with opposition to World War I, launching an amateurish but high-profile “peace mission” to Europe. He dipped in and out of politics after the Great War, nearly winning a Senate seat in 1918 and then flirting with the idea of running for president (as a Democrat) in 1924. The idea quickly generated enthusiasm, especially in rural parts of the country. Ford for President clubs sprang up in 1923, and the press began to write of a “Ford Craze.”

That summer, the popular magazine Collier’s began a weekly national poll of its readers, which suggested that Ford’s celebrity, reputation for business acumen, and unremitting media attention could translate into a popular presidential candidacy. As the results rolled in each week, they were accompanied by increasingly reverential headlines: “Politics in Chaos as Ford Vote Grows” and “Ford Leads in Presidential Free-for-All.” By the end of the two-month straw poll of upward of 250,000 readers, Henry Ford ran away from the competition, outpacing all twelve contenders, including President Warren Harding and future president Herbert Hoover. With these results, Collier’s editors concluded, “Henry Ford has become the issue in American politics.”

But if Ford harbored serious presidential ambitions, he was born a century too soon. What mattered far more than public opinion was the opinion of party leaders, and party leaders soundly rejected him. A week after publishing the results of its readers’ poll, in a series of articles, including one titled “The Politicians Pick a President,” Collier’s reported the results of its poll of the ultimate insiders—a group of 116 party leaders in both parties, including all members of the Republican and Democratic Party National Committees, 14 leading governors, and senators and congressmen in each party. Among these kingmakers, Ford lagged in a distant fifth position. The Collier’s editors observed that fall:

When Democratic [Party] chieftains are asked: “What about Ford?” they all shrug their shoulders. Almost without a single exception the men who constitute what is usually known as the “organization” in every State are opposed to Ford. In all the States except where there are presidential primaries these men practically hand-pick the delegates to the national conventions….Nobody denies the amount of Ford sentiment among the masses of the people—Democratic and Republican. Every Democratic leader knows his State is full of it—and he is afraid of it. He thinks, however, that because of the machinery of selection of delegates there is little likelihood that Ford will make much of a showing.

Despite popular enthusiasm for his candidacy, Ford was effectively locked out of contention. Senator James Couzens called the idea of his candidacy ridiculous. “How can a man over sixty years old, who…has no training, no experience, aspire to such an office?” he asked. “It is most ridiculous.”

It is, therefore, not surprising that when Ford was interviewed for Collier’s at the end of that long summer, his presidential ambitions were tempered:

I can’t imagine myself today accepting any nomination. Of course, I can’t say…what I will do tomorrow. There might be a war or some crisis of the sort, in which legalism and constitutionalism and all that wouldn’t figure, and the nation wanted some person who could do things and do them quick.

What Ford was saying, in effect, was that he would only consider running if the gatekeeping system blocking his path were somehow removed. So, in reality, he never stood a chance.

Huey Long didn’t live long enough to test the presidential waters, but despite his extraordinary political skills, popularity, and ambition, there is good reason to think that he, too, would have been stopped by the partisan gatekeepers. When he arrived in the Senate in 1932, Long’s norm-breaking behavior quickly isolated him from his peers. Lacking support among Democratic Party leaders, Long would have stood no chance of defeating Roosevelt at the 1936 convention. He would have had to mount an independent presidential bid, which would have been extraordinarily difficult. Polls suggested that a Long candidacy could divide the Democratic vote and throw the 1936 race to the Republicans but that Long himself had little chance of winning.

Party gatekeeping also helped confine George Wallace to the margins of politics. The segregationist governor participated in a few Democratic primaries in 1964, performing surprisingly well. Running against civil rights and under the slogan “Stand Up for America,” Wallace shocked the pundits by winning nearly a third of the vote in Wisconsin and Indiana and a stunning 43 percent in Maryland. But primaries mattered little in 1964, and Wallace soon bowed out in the face of an inevitable Lyndon Johnson candidacy. Over the next four years, however, Wallace campaigned across the country in anticipation of the 1968 presidential race. His mix of populism and white nationalism earned him strong support among some white working-class voters. By 1968, roughly 40 percent of Americans approved of him. In other words, Wallace made a Trump-like appeal in 1968, and he enjoyed Trump-like levels of public support.

But Wallace operated in a different political world. Knowing that the Democratic Party establishment would never back his candidacy, he ran as the candidate of the American Independent Party, which doomed him. Wallace’s performance—13.5 percent of the vote—was strong for a third-party candidate, but it left him far from the White House.

We can now grasp the full scale of Philip Roth’s imaginative leap in his novel The Plot Against America. The Lindbergh phenomenon was not entirely a figment of Roth’s imagination. Lindbergh—an advocate of “racial purity” who toured Nazi Germany in 1936 and was awarded a medal of honor by Hermann Göring—emerged as one of America’s most prominent isolationists in 1939 and 1940, speaking nationwide on behalf of the America First Committee. And he was extraordinarily popular. His speeches drew large crowds, and in 1939, according to Reader’s Digest editor Paul Palmer, his radio addresses generated more mail than those of any other person in America. As one historian put it, “Conventional wisdom had it that Lindbergh would eventually run for public office,” and in 1939, Idaho senator William Borah suggested that Lindbergh would make a good presidential candidate. But here is where we return to reality. The Republican Party’s 1940 convention was not even remotely like the fictionalized one described in The Plot Against America. Not only did Lindbergh not appear at the convention, but his name never even came up. Gatekeeping worked.

In the conclusion of their history of radical-right politics in the United States, The Politics of Unreason, Seymour Martin Lipset and Earl Raab described American parties as the “chief practical bulwark” against extremists. They were correct. But Lipset and Raab published their book in 1970, just as the parties were embarking on the most dramatic reform of their nomination systems in well over a century. Everything was about to change, with consequences far beyond what anyone might have imagined.

The turning point came in 1968. It was a heart-wrenching year for Americans. President Lyndon Johnson had escalated the war in Vietnam, which was now spiraling out of control—16,592 Americans died in Vietnam in 1968 alone, more than in any previous year. American families sat in their living rooms each evening watching the TV nightly news, assaulted with ever more graphic scenes of combat. In April 1968, an assassin gunned down Martin Luther King Jr. Then, in June, within hours of winning the California Democratic presidential primary, Robert F. Kennedy—whose campaign centered on opposition to Johnson’s escalating war—was felled by a second assassin’s gun. The cries of despair in Los Angeles’s Ambassador Hotel ballroom that night were given expression by novelist John Updike, who wrote that it felt as if “God might have withdrawn His blessing from America.”

Meanwhile, the Democrats grew divided between supporters of Johnson’s foreign policy and those who had embraced Robert Kennedy’s antiwar position. This split played out in a particularly disruptive manner at the Democratic convention in Chicago. With Kennedy tragically gone, the traditional party organization stepped into the breach. The party insiders who dominated on the convention floor favored Vice President Hubert Humphrey, but Humphrey was deeply unpopular among antiwar delegates because of his association with President Johnson’s Vietnam policies. Moreover, Humphrey had not run in a single primary. His campaign, as one set of analysts put it, was limited to “party leaders, union bosses, and other insiders.” Yet, with the backing of the party regulars, including the machine of powerful Chicago mayor Richard Daley, he won the nomination on the first ballot.

Humphrey was hardly the first presidential candidate to win the nomination without competing in primaries. He would, however, be the last. The events that unfolded in Chicago—displayed on television screens across America—mortally wounded the party-insider presidential selection system. Even before the convention began, the crushing blow of Robert Kennedy’s assassination, the escalating conflict over Vietnam, and the energy of the antiwar protesters in Chicago’s Grant Park sapped any remaining public faith in the old system. On August 28, the protesters set out to march on the convention: Blue-helmeted police attacked protesters and bystanders, and bloodied men, women, and children sought refuge in nearby hotels. The so-called Battle of Michigan Avenue then spilled over into the convention hall itself. Senator Abraham Ribicoff of Connecticut, in his nomination speech for antiwar candidate George McGovern, decried “the gestapo tactics” of the Chicago police, looking—on live television—directly at Mayor Daley. As confrontations exploded on the convention floor, uniformed police officers dragged several delegates from the auditorium. Watching in shock, NBC anchor Chet Huntley observed, “This surely is the first time policemen have ever entered the floor of a convention.” His coanchor, David Brinkley, wryly added, “In the United States.”

The Chicago calamity triggered far-reaching reform. Following Humphrey’s defeat in the 1968 election, the Democratic Party created the McGovern–Fraser Commission and gave it the job of rethinking the nomination system. The commission’s final report, published in 1971, cited an old adage: “The cure for the ills of democracy is more democracy.” With the legitimacy of the political system at stake, party leaders felt intense pressure to open up the presidential nomination process. As George McGovern put it, “Unless changes are made, the next convention will make the last look like a Sunday-school picnic.” If the people were not given a real say, the McGovern–Fraser report darkly warned, they would turn to “the anti-politics of the street.”

The McGovern–Fraser Commission issued a set of recommendations that the two parties adopted before the 1972 election. What emerged was a system of binding presidential primaries. Beginning in 1972, the vast majority of the delegates to both the Democratic and Republican conventions would be elected in state-level primaries and caucuses. Delegates would be preselected by the candidates themselves to ensure their loyalty. This meant that for the first time, the people who chose the parties’ presidential candidates would be neither beholden to party leaders nor free to make backroom deals at the convention; rather, they would faithfully reflect the will of their state’s primary voters. There were differences between the parties, such as the Democrats’ adoption of proportional rules in many states and mechanisms to enhance the representation of women and minorities. But in adopting binding primaries, both parties substantially loosened their leaders’ grip over the candidate selection process—opening it up to voters instead. Democratic National Committee chair Larry O’Brien called the reforms “the greatest goddamn changes since the party system.” George McGovern, who unexpectedly won the 1972 Democratic nomination, called the new primary system “the most open political process in our national history.”

McGovern was right. The path to the nomination no longer had to pass through the party establishment. For the first time, the party gatekeepers could be circumvented—and beaten.

The Democrats, whose initial primaries were volatile and divisive, backtracked somewhat in the early 1980s, stipulating that a share of national delegates would be elected officials—governors, big-city mayors, senators, and congressional representatives—appointed by state parties rather than elected in primaries. These “superdelegates,” representing between 15 and 20 percent of national delegates, would serve as a counterbalance to primary voters—and a mechanism for party leaders to fend off candidates they disapproved of. The Republicans, by contrast, were flying high under Ronald Reagan in the early 1980s. Seeing no need for superdelegates, the GOP opted, fatefully, to maintain a more democratic nomination system.

Some political scientists worried about the new system. Binding primaries were certainly more democratic. But might they be too democratic? By placing presidential nominations in the hands of voters, binding primaries weakened parties’ gatekeeping function, potentially eliminating the peer review process and opening the door to outsiders. Just before the McGovern–Fraser Commission began its work, two prominent political scientists warned that primaries could “lead to the appearance of extremist candidates and demagogues” who, unrestrained by party allegiances, “have little to lose by stirring up mass hatreds or making absurd promises.”

Initially, these fears seemed overblown. Outsiders did emerge: Civil rights leader Jesse Jackson ran for the Democratic Party nomination in 1984 and 1988, while Southern Baptist leader Pat Robertson (1988), television commentator Pat Buchanan (1992, 1996, 2000), and Forbes magazine publisher Steve Forbes (1996) ran for the Republican nomination. But they all lost.

Circumventing the party establishment was, it turned out, easier in theory than in practice. Capturing a majority of delegates required winning primaries all over the country, which, in turn, required money, favorable media coverage, and, crucially, people working on the ground in every state. Any candidate seeking to complete the grueling obstacle course of U.S. primaries needed allies among donors, newspaper editors, interest groups, activist groups, and state-level politicians such as governors, mayors, senators, and congressmen. In 1976, Arthur Hadley described this arduous process as the “invisible primary.” He claimed that this phase, which occurred before the primary season even began, was “where the winning candidate is actually selected.” Members of the party establishment—elected officials, activists, allied interest groups—were thus never fully locked out of the game. Without their support, Hadley argued, it was nearly impossible to win either party’s nomination.

For a quarter of a century, Hadley was right.