7

TOO SMART TO WIN

“If a nation expects to be ignorant and free, in a state of civilization, it expects what never was and never will be.”

—Thomas Jefferson

“Whatever it is, I’m against it.”

—Groucho Marx

During the sweltering summer of 1776, a group of brilliant men huddled in Philadelphia to declare independence from Great Britain. Many of them—like Jefferson and Franklin—were polymaths: inventors, philosophers, scientists, theorists, and writers. Historian Richard Hofstadter described the Founding Fathers as “sages, scientists, men of broad cultivation, many of them apt in classical learning, who used their wide reading in history, politics, and law to solve the exigent problems of their time.”63 In other words—horrors!—elites.

For decades, this founding generation (Washington, Adams, Jefferson, Madison, and Monroe) presided over our young nation. John Quincy Adams, son of our second president and an incredibly accomplished man in his own right, succeeded them as our sixth president in 1825. But the nation was changing. The West was expanding, and more people were becoming enfranchised.

The time was ripe for a populist like Andrew Jackson, a Democrat and a military hero. He marked a stark break from the founding generation. With a few exceptions, populism, often with an anti-intellectual flavor (think “Tippecanoe and Tyler, Too” and the “Know-Nothings”), dominated American politics of this period until the rise of Theodore Roosevelt, a hybrid candidate who—despite being old-moneyed, aristocratic, bespectacled, and Harvard-educated—somehow managed to overcome all those “disadvantages” by virtue of advocating a strenuous life, moving to the Dakota Territory and becoming a cowboy, leading the charge up San Juan Hill, and loving big game hunting. (The next time you see Ivy League–educated Ted Cruz wearing a blaze orange hunting jacket in a duck blind, you’ll know why.)

That’s not to say the presidents between Jackson and TR were all ignoramuses (many were conversant in Latin; James Abram Garfield could simultaneously write in Latin with one hand and Greek with the other). Numbered among the most important political events of our nation’s first hundred years was the series of Senate debates in 1858 between Illinois’s Democratic candidate, Stephen A. Douglas, and his Republican challenger, Abraham Lincoln, a man (like Garfield) famously born in a log cabin. Just imagine it: two men—both civic-minded, both well-read—and before them an audience willing, even eager, to hear one candidate speak for sixty minutes, followed by his opponent for ninety minutes, culminating in a thirty-minute rebuttal. Now imagine repeating that debate six more times.

Still, stylistically, populism dominated from Jackson until TR—who soon paved the way for a far less physically robust intellectual named Woodrow Wilson—whose involvement in the First World War effectively led to a backlash against intellectuals that lasted until FDR’s “Brain Trust” of advisers came to town. A post-FDR backlash endured until JFK. After JFK’s presidency and the sense that “the best and brightest” had led us into Vietnam, the pendulum again swung away from intellectuals. This third reaction lasted until the arrival of Obama. There is a historic ebb and flow between voters demanding a dumbed-down populist style and those rewarding an intellectual elite style; the last populist phase ended with Obama, yet Republicans are still trying to win with dumb.64

Too many of today’s conservatives deliberately shun erudition, academic excellence, experience, sagaciousness, and expertise in politics.65 Many of the people doing so are not as dumb as they pretend to be. But many do lack wisdom—the kind of wisdom that true conservatives of a prior generation embraced, one grounded in education, expertise, and scholarship. Today, even those rare conservatives who possess a wealth of knowledge feel obliged to act dumb. In 2015, Louisiana governor Bobby Jindal noted, “There is a tendency among the Left and some Republicans that say you can either be conservative or smart.” He was right—so why does he keep acting dumb? It turns out there’s a pretty good reason. If there’s such a thing as being too dumb to fail, then isn’t it possible a politician could be too smart to win?

Referring to a particularly vapid article penned by Jindal, in which he criticized “bedwetting” introspection (“excessive navel gazing leads to paralysis”), Slate’s Dave Weigel observed, “Jindal’s rep[utation] is as a wunderkind who was put in charge of Louisiana’s hospital system at age 28. To be competitive in the Iowa caucuses, he needs to either pretend to be a schmuck or emphasize his heretofore-concealed schmucky tendencies.”

Weigel wasn’t the only one who noticed. “Jindal isn’t talking to independents or Democrats in this op-ed. This is solely about telling Republicans what they want to hear,” wrote Ezra Klein in the Washington Post. “That’s how the GOP becomes the stupid party: Republican Party elites like Jindal convince Republican Party activists of things that aren’t true. And that’s how the GOP becomes the losing party: The activists push the Republican Party to choose candidate decisions and campaign strategies based on those untruths, and they collapse in the light of day.”

Republicans have occasionally proposed a positive, substantive agenda, such as the party’s Contract with America in 1994 or the more recent and ambitious government reform budgets of representative Paul Ryan. But the norm has been to eschew substance for poll-tested fluff like George W. Bush’s “compassionate conservatism” (which only pretended to be an ideology) and Donald Trump’s “Make America Great Again” slogan (which did not even pretend). One reason politicians are losing their intellectual sharpness is that many have no choice but to spend about 50 percent of their time raising money. Take it from me; my wife is a political fund-raiser. Part of her job is to sit with candidates and elected officials as they “dial for dollars.” The media also deserves blame for the current state of our candidates. We lament the loss of access and the days of “straight talking” politicians, yet we are quick to crucify any politician who dares say something interesting. The smart move for politicians is to talk in sound bites, to stay on message, and to be boring.

I’m reminded of this scene from the movie Bull Durham, when Kevin Costner’s character—catcher “Crash” Davis, a battle-hardened minor league veteran—schools rookie phenom pitcher Ebby Calvin “Nuke” LaLoosh (Tim Robbins) on how to talk to the press:

Crash Davis: You’re gonna have to learn your clichés. You’re gonna have to study them; you’re gonna have to know them. They’re your friends. Write this down: “We gotta play it one day at a time.”

Nuke LaLoosh: Got to play…It’s pretty boring.

Crash Davis: ’Course it’s boring. That’s the point.

Sadly, it’s not just rookie ballplayers who are taught to spout insipid lines. Our nation’s leaders learn the same script. And if they dare go unscripted—if they dare say something interesting—we destroy them. The result is a dumbed-down political discourse with no substance. Today, it is the GOP that is more or less the party of the disaffected populists. The party is thrashing about, suffering an identity crisis—stunned that the American public no longer reflexively supports them—and this leads to all sorts of negative consequences. As with an alcoholic who finally hits rock bottom, recovery works in stages—and denial is usually the first stage.

The final part of this book will focus on some ways conservatives can return to their intellectual moorings, and, indeed, the midterm elections of 2014 provide hope that the process has already begun. It’s also fair to say that just as Ronald Reagan was able to encourage both the nation and the GOP to up their games, electing the right leader can cover a multitude of sins. Should Republicans elect a transformational president, many of these problems might right themselves.

For now, however, I have identified three symptoms of our malaise, all marked by the demise of a necessary political strength: 1) the death of experience and knowledge, 2) the death of compromise, and 3) the death of political institutions.

1. The Death of Experience and Knowledge

According to a March 2015 CNN/ORC poll, nearly half of Republicans—46 percent—want a candidate who is new to politics (conversely, 77 percent of Democrats said they want an experienced one).66 Perhaps it came as no surprise, then, when Donald Trump and Ben Carson (neither of whom had ever been elected to public office) surged to the top of the polls in the late summer of 2015.67

The rejection of experience and expertise parallels the history of anti-intellectualism in America. The “anti-expert” ethos is cyclical, having begun with Old Hickory Jackson’s victory over John Quincy Adams, whose “learning and political training were charged up not as compensating virtues but as additional vices.”68 One contemporary newspaper described the contest as between “John Quincy Adams, who can write” and “Andrew Jackson, who can fight.”69 (In 2011, Herman Cain would unintentionally channel this sentiment, declaring, “We need a leader, not a reader.”) The problem was, as Richard Hofstadter notes, “the Jacksonian conviction that the duties of government were so simple that almost anyone could execute them downgraded the functions of the expert and the trained man to a degree which turned insidious when the functions of government became complex.”70 Donald Trump similarly convinced many conservatives that governing doesn’t require any sort of expertise. Solving illegal immigration would be easy, he insisted. He would just have the Mexican government pay for a fence. Health care? Trump would simply replace Obamacare with “something terrific.” Foreign policy expertise? You can get that by watching “the shows.” When conservative talk radio host Hugh Hewitt tripped him up, exposing Trump’s lack of foreign policy expertise, Sarah Palin came to his defense. “I’d rather have a president who is tough and puts America first than [a president who] can win a game of Trivial Pursuit,” she declared. Not much has changed since the time of Adams and Jackson.

But Palin wasn’t the only conservative defending Trump’s lack of knowledge and experience. After videos showing Planned Parenthood discussing the harvesting and selling of baby organs sparked calls to strip the group of taxpayer support, Trump declared on CNN’s New Day, “I would look at the good aspects of [Planned Parenthood], and I would also look because I’m sure they do some things properly and good for women.” He then continued, “I would look at that, and I would look at other aspects also, but we have to take care of women. The abortion aspect of Planned Parenthood should absolutely not be funded.” Despite the fact that Trump is a brilliant businessman (just ask him), he appeared unfamiliar with the basic concept that money is fungible. He apparently didn’t realize that taxpayer subsidies for Planned Parenthood’s legitimate health care services free up other money for abortions. More interesting than Trump’s ridiculous position, however, was the fact that his conservative apologists were still making excuses. After Trump’s Planned Parenthood comments, conservative provocateur Ann Coulter actually went on Fox News’s Hannity to defend the mogul, even if her words sounded more like an indictment of his lack of experience: “He is not a politician. He is not familiar with all the deceptive ways Planned Parenthood will sneak through funding for abortion.”

Of course, Trump is merely the next step in the trend of elevating entertainers to executive positions. It’s important to remember that Arnold Schwarzenegger and Jesse “The Body” Ventura have more in common than just starring in the movie Predator; both are also failed governors. It was only a matter of time until this trend made it to the highest office in the land.

Consider how the qualifications for president have continued to decline in the last thirty years. Yes, Ronald Reagan was an actor, too. But he didn’t simply try to trade on his celebrity to attain elective office. He proved himself in the arena of ideas before becoming governor, and accumulated actual governing experience before running for president. He was married (twice), had three71 kids (and adopted another), and had multiple careers that were punctuated by both achievement and less successful endeavors. Before being elected president in 1980, he lost two presidential bids (if you count 1968) and served two terms as California’s governor. What a stark contrast to today, when seemingly fifteen minutes as a US senator from Illinois or a half term as governor of Alaska serve as qualifications. And the bar keeps getting lowered. Before she was even sworn in after her 2014 Senate victory, Joni Ernst was touted by pundits John McLaughlin and Bill Kristol as a possible 2016 presidential candidate. As recently as 2011, she served as the auditor of Montgomery County in Iowa (population 10,424), and now she’s ready to be president?

Part of the problem is that, as a civilization, we no longer respect experience or revere elders. It used to be you paid your dues. The new guy on the job bided his time and spent that time observing those above him. Today, young people start a new job on Monday and are trying to take over by Friday. You’ve probably heard the line, “Give me six hours to chop down a tree, and I’ll spend the first four sharpening my axe.” How quaint. Jesus spent thirty years preparing for a three-year ministry, but today we can’t wait to get behind the pulpit.

The democratization of information also deserves mention. Google empowers people to feel (incorrectly) that they no longer need experts. In politics, average voters and amateur activists can now access the same raw data as elected officials, and they often trust their own interpretation over that of some designated expert. Everybody thinks he can do three things: manage a baseball team, write a book (!), and run for/hold office. Everybody isn’t always right. The “instant expert” phenomenon is choking journalism. Young people formerly worked their way up the ranks of a publication. Now, media outlets, desperate for young consumers and cheap labor, routinely make the mistake of elevating the inexperienced. In the old days, a young cub reporter might toil for years in a local market honing her craft at straight reporting before finally getting a chance to write a column. Now a decent Twitter following might land you a TV show, as it did for twenty-six-year-old Ronan Farrow on MSNBC. The network canceled his show in little more than a year.

Some of the media’s dramatic shift to youth is due to the larger trend of media decentralization and the rise of alternative outlets, which give young reporters—willing to work longer hours for less money than the middle-aged—more platforms to seize. Conservatives, who endured generations of a monopolistic liberal media filter, are understandably hesitant to criticize the emergence of alternative media, but, as the late novelist David Foster Wallace pointed out in his Atlantic essay “Host,” “The ever increasing number of ideological news outlets creates precisely the kind of relativism that cultural conservatives decry, a kind of epistemic free-for-all in which ‘the truth’ is wholly a matter of perspective and agenda.”

The trend gallops on even in America’s most elite club—the US Senate. Ted Cruz had barely been on the job for two months in 2013 when he pointedly challenged California senator Dianne Feinstein over her support of an assault weapons ban. “I’m not a sixth grader. Senator, I’ve been on this committee for twenty years,” she scolded Cruz.

Another particularly egregious example of pursuing instant gratification over paying dues came when Liz Cheney, daughter of former vice president Dick Cheney (and a former State Department official who had never held elected office), decided to challenge three-term incumbent senator Mike Enzi in Wyoming’s Republican primary. Ms. Cheney had no plausible philosophical reason for opposing Enzi. He was a reliable conservative vote from a deeply red state. Even stranger, Cheney lived in Virginia, where she could have run against an incumbent Democrat. But that meant risking a loss in a swing state. Still, if she thought her politics wouldn’t play in Virginia, she might have moved westward and gotten involved in community affairs or local government for a few years until Enzi, then serving his fourth term, was ready to retire. But Ms. Cheney didn’t think she had to pay her dues. She thought she could convince Wyoming voters to dump a solid conservative because of her name. Unfortunately for her, the audacity of challenging a sitting US senator of the same party just because you can proved unseemly. The backlash among Wyoming voters was fierce, and she dropped out seven months before the primary, citing health issues in her family.

It was an appalling episode, but who could blame her for trying? We live in a world where both Democrat Barack Obama and Republican Ted Cruz showed up in Washington and immediately began running for president. Neither man had any executive experience (Obama had been a backbench Illinois state senator, and Cruz had never held elected office before), but both adopted a “go big or go home” mentality. It’s too soon to see how this will play out for Cruz, but it’s fair to say that, heretofore, both men have largely prospered professionally from their approach (even if the nation has suffered as a result).

It’s not just that today’s youth are disrespectful to their elders. Our elders also fetishize youth. It was not always thus. “Young people are in a condition like permanent intoxication,” declared Aristotle. We shouldn’t be glamorizing the ignorance of inexperience, and the fact that we do has led us to predictable results.

The conservative movement, desperate for young people to replenish its aging ranks, has fallen prey to the cult of youth. Every once in a while, a new wunderkind emerges on the Right. Consider the case of Jonathan Krohn, best remembered for delivering a speech to the 2009 Conservative Political Action Conference (CPAC)—one of the most important annual gatherings of the conservative movement. He was thirteen years old. Krohn now regrets the speech, explaining that he was merely parroting what he heard growing up in conservative Georgia. “I think it was naive,” he told Politico in 2012. “It’s a thirteen-year-old kid saying stuff that he had heard for a long time.”72 So who’s to blame? Over at Commentary, Alana Goodman took Krohn’s parents to task, asking, “What were his parents thinking when they pushed him into the national spotlight as a ‘conservative pundit’ at just thirteen years old?” A better question: Why would a venerable institution like CPAC undermine the stature of its platform by allowing a kid to give a speech? Was it merely a novelty?

This lowering of barriers and removal of filters can be positive, but not every outcome is. Take the case of one conservative blogger who gained notoriety (or was it infamy?) during Republican senator Thad Cochran’s bitterly disputed 2014 primary reelection campaign in Mississippi.

It was a story straight out of a John Grisham novel. Here’s the abbreviated version: A young Mississippi conservative activist and blogger named Clayton Kelly sneaked into the nursing home where Senator Cochran’s bedridden, dementia-stricken, seventy-two-year-old wife, Rose, was living, in order to photograph her. The reason? There were rumors that Senator Cochran was shacking up with one of his female staffers, and this conservative activist apparently believed juxtaposing pictures of Cochran’s infirm, bedridden wife with pictures of Cochran living the Washington high life with an alleged mistress might help Cochran’s primary challenger Chris McDaniel win the race. But things fell apart when the police closed in (Mississippi law prohibits exploiting a vulnerable adult), and then things really took a turn for the worse. Mark Mayfield, a Mississippi Tea Party leader who was allegedly involved in the scheme, died of an apparent suicide. (As if this story weren’t interesting enough, after Rose Cochran died, Thad Cochran and his aide, in fact, got married.)

As a writer, I am sympathetic to how an ambitious yet naive young blogger might think it’s glamorous and even noble to go sleuthing around for campaign dirt—and how, in that moment, he might cross a serious legal and ethical line. The incident in Mississippi represents the culmination of many trends: the convergence of media and activism (with the rise of alternative media and blogs blurring the lines between activism and journalism), the ever-present problem of campaigns and journalism being stacked full of ambitious young people like Kelly, untethered by life experience or historical perspective, and, finally, the tendency of political campaigns to treat any election loss as an existential threat to their side.

And this is complicated by the fact that young people (without the benefit of a fully developed prefrontal cortex) are on the front lines. The adults are no longer in charge—a situation that holds true even in senior positions on major US Senate campaigns.

Once upon a time, only a seasoned professional would serve as a spokesman for a high-profile campaign. Consider the case of the aforementioned Reagan aide Lyn Nofziger, who lost two fingers to German shrapnel on D-day,73 two decades before going to work for Ronald Reagan. In between, he was a husband, a father, and a reporter. But times have changed. A 2014 New York Times Magazine piece on the two spokespersons working for the campaigns of Senator Mitch McConnell (Kelsey Cooper) and his Democratic opponent, Kentucky secretary of state Alison Lundergan Grimes (Charly Norton), makes that abundantly clear: “Norton and Cooper, 25 and 23 respectively, are typical of young political operatives at work today. Each speaks with a Southern accent, though neither is from the South, let alone Kentucky,” the piece tells us. “The job now requires no special education or experience, no roots to a state, and no affiliation with a candidate.”

If the lack of respect for experience, authority, protocol, and ethical standards is a problematic trend, at least it’s an informal one. Interestingly, not only are populists unconcerned about this trend, they want to enshrine it in law with term limits—a government mandate that elected officials be inexperienced. Ronald Reagan famously opposed presidential term limits, yet the 1994 Republican Revolution promised congressional ones (not surprisingly, it’s one promise they didn’t keep).

Absent elected “experts” (and their presumably experienced staffs), a permanent apparatus of unelected bureaucrats and lobbyists becomes the go-to source of expertise. Without experienced and tenured public servants, an unelected ruling class grows more and more powerful. Conservatives usually worry about such unintended consequences, but the siren song of “throwing the bums out” is too alluring to resist.

Ironically, these “outsiders” eager to replace an entrenched politician are really insiders, dumbing themselves down just to get elected. As Hofstadter observed, “Just as the most effective enemy of the educated man may be the half-educated man, so the leading anti-intellectuals are usually men deeply engaged with ideas.”74

Who’s the biggest “outsider” in the US Senate today? At or near the top of your list is probably Ted Cruz, the government-shuttering, Tea Party–backed junior senator from Texas. But Cruz attended an Ivy League school, served in the Bush administration, and is married to a Goldman Sachs executive. I can hardly think of a more elite résumé. According to GQ, “As a law student at Harvard, [Cruz] refused to study with anyone who hadn’t been an undergrad at Harvard, Princeton, or Yale. Says Damon Watson, one of Cruz’s law-school roommates: ‘He said he didn’t want anybody from “minor Ivies” like Penn or Brown.’”

Who’s the second-biggest outsider in the GOP today? Probably Senator Rand Paul—the ophthalmologist son of a doctor and US representative who himself ran for president three times. Who’s the third? I’d say Senator Mike Lee, the son of Ronald Reagan’s solicitor general, who once clerked for Samuel Alito, now a Supreme Court justice. Now add in a couple of the other “anti-establishment” outsiders who (along with Cruz and Paul—and numerous others) are running for the Republican nomination in 2016. You’ve got Donald Trump, a billionaire real estate mogul who went to Wharton, and Ben Carson, a world-class neurosurgeon with a BA from Yale. These are overdogs posing as underdogs.

This is not a new phenomenon. Politicians have long crafted careful narratives about their personal histories to help to advance their careers, which often means trying to shed their more privileged backgrounds and portray themselves as heroes of the downtrodden and disenfranchised. America falls for it most of the time.

Reinvention doesn’t always equate to spinning. FDR, a patrician, contracted polio and learned to overcome obstacles, which helped him identify with down-on-their-luck Americans. Reagan, the son of an alcoholic from Dixon, Illinois, grew up to become a Hollywood movie star before being elected president. Some observers have attributed his psychological need to be liked—to be a peacemaker—to his experience as the son of a drunk. Changing circumstances helped both FDR and Reagan connect with average Americans, even though both men were also the very definition of elite. And sometimes the opposite happens. Over time, people with authentic credentials—in the process of bettering themselves or playing the game to get ahead—lose touch with their roots and seem to forget where they came from. As David Brooks noted a few years back, “Occasionally you get a candidate, like Tim Pawlenty, who grew up working class. But he gets sucked up by the consultants, the donors, and the professional party members and he ends up sounding like every other Republican.” Was he an insider or an outsider? Sometimes it seemed like Pawlenty himself didn’t know. No wonder the rest of us were confused. Four years later, another Midwestern working-class hero, Wisconsin governor Scott Walker, tried to compete with the silver spoon crowd. He won his first governor’s race by bragging about how he packed a brown bag lunch every day and bought his shirts at Kohl’s department stores. But on the presidential debate stage, he had little to offer but hackneyed talking points. His poll numbers tanked, and he cut his losses. Both Pawlenty and Walker are cautionary tales: a middle-class backstory is no substitute for a compelling governing vision.

It’s true that William F. Buckley once famously said, “I’d rather entrust the government of the United States to the first 400 people listed in the Boston telephone directory than to the faculty of Harvard University.” But what many modern “conservatives” miss is that Buckley was attacking Harvard’s liberalism, not complimenting the wisdom of Mr. Aaberg through Mrs. Adkins. Buckley critiqued the liberal elite, not the elite conservative (and apolitical) intelligentsia whose accumulated decades of study and expertise informed their brilliance. His quip packed a punch because he was calling the liberal faculty at Harvard even worse than the uninformed and the uneducated. (That Buckley was, ahem, a Yale man might have also contributed to this assessment.)

Despite the fact that experience is sometimes a liability, the good news is that the most recent crop of newly elected Republicans in Congress boasts a plethora of experience—and it’s not solely limited to experience as “career politicians,” either. In a New York Times column titled “The Governing Party,” published right after the 2014 midterms, David Brooks took great pains to demonstrate the “deep roots” the newly elected Republicans had in “the dominant institutions of American society.” Among the candidates Brooks cited were James Lankford, the newly elected senator from Oklahoma, who holds a divinity degree, and Larry Hogan, the new governor of Maryland, who founded a real estate development firm. Brooks also noted Tom Cotton, a Harvard graduate who served in Iraq with the 101st Airborne and in Afghanistan with a Provincial Reconstruction Team. Before his 2014 election to the US Senate, Cotton had served in the House of Representatives and had worked for the respected multinational consulting firm McKinsey & Company.

2. The Death of Compromise

Conservatives rightfully revere the Founders, who faced daunting challenges in framing our Constitution. Most agreed that the Articles of Confederation were too weak, but what should any new government look like? Small states (like New Jersey) might argue that all states deserve to be treated equally, while large states (such as Virginia) might argue that, since they have larger populations, it’s only fair that their citizens be represented accordingly. The Connecticut Compromise solved this problem by providing each state with two senators but assigning representation in the lower House based on population. Of course, once it became clear that seats in the House of Representatives would be proportional, this created a dilemma for the South, which had a smaller free population. The answer was to increase the South’s population numbers by counting slaves. The North balked at this. In retrospect, it seems morally repugnant, but a compromise was reached by counting slaves as three-fifths of a person (for the purposes of representation).

Another challenge: Where to locate the nation’s capital? A compromise hatched a couple of years later, in 1790, solved that one. At the time, Secretary of the Treasury Alexander Hamilton wanted the federal government to assume the Revolutionary War debts still owed by the states—a bad deal for the southernmost, more agrarian states, which stood to gain little from the plan. Some Northern states, such as Massachusetts, owed a lot more money than some Southern states, such as Virginia—which had mostly paid off its debt. As Jefferson told it, he organized a dinner attended by Hamilton and Virginia representative James Madison.75 The deal they brokered involved Madison not opposing Hamilton’s financial plan in exchange for placing the nation’s capital (previously in New York and Philadelphia) in the “south”—a swamp where I sit writing these very words now—which came to be known as Washington, DC.

This tradition continued. The Missouri Compromise of 1820 and the Compromise of 1850 helped postpone the Civil War. Some might well argue that these compromises were morally indefensible—or that they merely postponed the inevitable. Others might see them as valiant, if ultimately doomed, attempts to avert a bloody war. Regardless, it is worth noting that the Civil War coincided with the deaths of three statesmen and compromisers from the regions in conflict. It is, perhaps, not a coincidence that Henry Clay of Kentucky, John C. Calhoun of South Carolina, and Daniel Webster of Massachusetts all died in the 1850s. With the death of compromise, the Civil War was soon to come. After the war, the Compromise of 1877 served to undermine the legacy of Lincoln and preemptively end Reconstruction, thus paving the way for Jim Crow. (Like the 2000 election, the election of 1876 between Republican Rutherford B. Hayes and Democrat Samuel J. Tilden was fiercely disputed. Democrats, who controlled the House of Representatives, broke the deadlock by assenting to Hayes becoming president; in return, Hayes pulled federal troops from the South. It’s hard to overstate the harm inflicted by this compromise.)

Still, compromise has a long (if not always proud) tradition in America. The Constitution literally wouldn’t exist without it. Ironically, the very people today most likely to don tri-cornered hats and genuflect before the Founding Fathers and Ronald Reagan (who cut deals with Tip O’Neill and Mikhail Gorbachev) are the ones who most vociferously oppose compromise. In fairness, conservatives who, beginning in the 1960s, watched in horror as America lurched leftward have valid reasons to worry that some “compromises” are just a ruse—really part of a larger, incremental plan to radically change the nation. In this instance, the side that is perpetually conceding things in the interest of comity—even seemingly small things—eventually loses. This realization was expressed well by Gun Owners of America founder and California state senator H. L. Richardson,76 whose book Confrontational Politics has become a cult classic among grassroots conservatives. Writing about compromise, Richardson observes that liberals gain ground by asking for a lot; however, once opposition arises,

[t]he Liberal then offers a compromise, a partial solution is presented. Half, instead of the whole loaf, is offered. The Left suddenly creates the aura of appearing reasonable, moderating their request. Leftist dialectics is nothing more than planned retreat, a tactic used to confuse and throw the opposition off guard. Lenin called it an important tool in accomplishing overall goals.

He used the analogy of a man driving a nail with a hammer, the backward stroke being just as important as the forward thrust of hitting the nail. Ask for much more than expected and then, when the opposition builds, give in a little, play the good guy willing to concede. Switch from bad guy to good guy, be conciliatory, be sweetness and light, offer “compromise.” Initiate the conflict then strategically back off.

This might not be a big deal, but what happens when this process is repeated over and over? It helps to visualize the political world as a football field. If each side begins in its own end zone, and the Left insists on scoring a touchdown, you might compromise and meet them at the fifty-yard line. Fair enough. But the next time they try to score a touchdown, the second compromise puts them on your twenty-five-yard line. Each round of “compromise” halves whatever ground you have left. If you wonder why conservatives sometimes appear to be digging in their heels on issues that seem somewhat trivial, or even indefensible in some cases, it’s because they believe that if you give the Left an inch, they’ll take a foot—and then outlaw rulers.

Richardson might sound like he’s channeling the paranoia of today’s “hell no” caucus, which scuttles reasonable compromise and legislative accomplishment. But he is only reporting on a playbook championed by liberals such as famed organizer Saul Alinsky. In his classic book Rules for Radicals, Alinsky notes that the term compromise is generally regarded as “ethically unsavory and ugly.” But he then adds that “to the organizer, compromise is a key and beautiful word.…It is making the deal, getting a vital breather, usually the victory.” Alinsky further advises his liberal/progressive/radical readers that “if you start with nothing” to “demand 100 percent, then compromise for 30 percent, you’re 30 percent ahead.”

“A free and open society is an ongoing conflict, interrupted periodically by compromises—which then become the start for the continuation of conflict, compromise, and on ad infinitum,” Alinsky says. Richardson less elegantly—but no less truthfully—dubs this “the ‘salami’ technique, one slice at a time until the whole loaf is consumed.”

Not all compromises are created equal. Some are salami, and some are just plain baloney. Some things—say, giving away Poland—ought to be nonnegotiable, while others—say, where we are going for dinner—aren’t worth fighting over. Some negotiators are sincere and act in good faith, while other negotiators play pernicious games. When we give in to the latter, merely in order to keep the opposition happy and return to a time of peace and civility, we become, as Winston Churchill said of appeasers, “like the man feeding the crocodile, hoping he eats him last.”

It’s important to differentiate between your enemies and your adversaries. Consider the statement offered up after the attacks of September 11, when President Bush famously said, “You’re either with us, or you are against us.” It was meant as an ultimatum to fence-sitting foreign governments who had tolerated terrorists’ presence on their soil. But it quickly became the Republican Party’s approach to domestic politics. Exhibit A was the Department of Homeland Security. It’s hard to remember, but there was a time in the not-too-distant past when that bureaucracy didn’t exist. Yet, post-9/11, President Bush decided he wanted it. And his supporters decided that if you didn’t want it, too, you were anti-American. (See the attack ad on then Georgia senator Max Cleland, a disabled Vietnam vet who was compared to Saddam Hussein and Osama bin Laden for opposing some homeland security measures.)

It’s nearly impossible to discuss how best to battle extremism when mere disagreement brands you as a traitor. This us-versus-them approach to politics made our debates pettier and shallower. It also eliminated compromise from the vocabulary of modern conservatives. In Texas, Bush was known for working across party lines, but his pledge to be a “uniter, not a divider” soon fell apart in Washington. During an interview I conducted with Reagan biographer Craig Shirley in 2015, he blamed some of this Bush-era polarization on having a “president of the United States refer to the loyal opposition as the ‘Democrat’ Party—which is something you would hear from a college Republican—instead of [saying] the ‘Democratic’ Party.” Shirley went on to argue that Reagan always called them the “Democratic” Party, because he was “always recruiting.”

The lack of compromise is a bipartisan problem in Washington, but it seems to hurt Republicans disproportionately. One example is John Boehner’s failure to reach a “Grand Bargain” on the budget with President Obama in July 2011. The fact that neither side could agree on a budget led to the sequester—which was intentionally designed to be bad enough to force both sides to make concessions. While both sides deserve blame, Republicans deserve special blame for mishandling the public relations aspect of obstruction. The government shutdown in October of 2013, for example, triggered a precipitous double-digit decline in Gallup’s favorability ratings for Republicans, arguably costing Virginia attorney general Ken Cuccinelli his gubernatorial election and delaying news coverage of the disastrous rollout of the Obamacare website by a month.

Chastened by the government shutdown, Republicans wisely avoided another one until after the 2014 midterms. A November 2014 NBC News/Wall Street Journal poll showed that 63 percent of Americans wanted the new Congress to “make compromises,” compared with 30 percent who preferred they “stick to their campaign promises.” That might not sound like a big deal, but as the Washington Post’s Aaron Blake noted, four years earlier just “47 percent wanted middle ground and 43 percent preferred lines in the sand.”

Colorado witnessed a good example of compromise in late February 2014, when Weld County district attorney Ken Buck, a Tea Party Republican seeking to unseat incumbent Democratic US senator Mark Udall, switched places with Republican representative Cory Gardner, then running for reelection to the House. Cutting a deal where a sitting congressman would run for the Senate might sound like a no-brainer, but four years earlier, when Buck was the Tea Party favorite who narrowly lost a Senate race to Democrat Michael Bennet, one suspects he would have scoffed at such a suggestion. The notion that he would cut a backroom deal, as one GOP state senator also vying for the Senate seat called it, is anathema to the grassroots ethos, which decried any sort of strategic compromise as tantamount to treason. As it turns out, the move was brilliant. Ken Buck is now in the US House of Representatives, and Cory Gardner is now a US senator. By cutting a deal, and by being humble enough to defer to a better statewide candidate, Ken Buck not only earned a seat in Congress, but he also proved that Colorado wasn’t the blue state many feared it had become. The truth is, if Udall had won reelection, a lot of people would have proclaimed that Colorado was lost. Buck, by virtue of his prudent and unselfish move, helped Republicans avoid that fate.

3. The Death of Political Institutions

The problem for conservatives—the reason there is such a thing as being “too dumb to fail”—is that even though some voters are becoming more sophisticated, the opposite feels true for the bases of both parties—neither of which wants to hear hard truths and both of which demand pandering. And so, a politician who stands up to his or her own base and attempts a “Sister Souljah moment” (after Bill Clinton’s 1992 public condemnation of comments made by hip-hop artist Sister Souljah) is more often than not punished for being courageous. Meanwhile, the base often rewards the person who tells them what they want to hear—who misrepresents reality or lies to us about what is possible. And so, we have a moral hazard. Politicians who want to win their party’s nomination for Senate or president—or any office—have little incentive to be truth tellers and almost every incentive to talk tough, boast about all the things they can do, and generally tilt at windmills. This plays into what is known as the “Tragedy of the Commons.” For those unfamiliar, it’s an economic theory that warns that individuals acting in their own self-interest can deplete shared resources and undermine the larger collective good (resulting, in the long run, in a lose-lose scenario).

Why don’t more politicians think about the broader needs of their parties? Partly because of the decline of institutions and of respect for authority—a larger societal trend that began around the time of Watergate but has accelerated in the political world thanks to changes in our campaign finance system. The decline of the political party is part of a larger cultural trend that has diminished the importance of institutions ranging from marriage to the media to the office of the president. People used to get married and stay married. They used to go to work for a company, stay there for four decades, and then retire with a gold watch. They even—and younger readers may find this puzzling—used to remain one gender. Life was more predictable. Things were more permanent. Likewise, political parties once held incredible sway. People were loyal and deferential to them. And they also wielded tremendous power. This helped keep recalcitrant party members in line. But today, with the proliferation of outside groups that fund the candidates they like (and score—and then attack—candidates who dare buck their orthodoxy) and the fact that campaign finance laws have largely neutered the political parties, the smart move for a conservative elected official might just be to “stand up to” the Republican National Committee (RNC).

Party leaders have lost most of their institutional leverage. Keep in mind that selecting a presidential nominee via primary elections is a fairly modern phenomenon. The party bosses used to hash this stuff out in smoke-filled rooms. There were a lot of problems with that, but (one suspects) the bosses occasionally performed a service in weeding out selfish, irresponsible, incompetent, or crazy candidates.

Similarly, one can’t imagine a backbench congressman in the 1950s standing up to the Speaker of the House the way backbench Republicans do now. And just imagine what might have happened to a newly elected senator who dared to mess things up for the “Master of the Senate,” Majority Leader Lyndon Johnson. Again, this sort of autocratic system was far from perfect, but the trains ran on time (or, at least, didn’t run off the rails). Today, party leaders have fewer sticks and carrots. They cannot withhold campaign funding from politicians who—thanks to outside groups, the rise of SuperPACs, direct mail fund-raising, and the Internet—no longer rely on them for money. They can’t even dole out pork in the form of earmarks to the extent they once did.

Complaining about too much democracy sounds crazy—almost as crazy as complaining about too much transparency. But a surplus of democracy is a real problem, however absurd that sounds.

If politicians are constantly pandering, maybe it’s for a good reason. In the past, we elected “representatives” to essentially serve as our trustees for a set period of time. Sure, they received constituent mail and phone calls, and lobbyists visited them. And sure, they might occasionally see some polling numbers or a tough letter to the editor. And yes, on big votes, party leaders and bosses (now mostly impotent) would twist arms. But rank-and-file members generally could vote their conscience, knowing they wouldn’t be called on the carpet until Election Day, which—depending on their particular office—was two to six years down the road.

This is not to say they weren’t accountable. It’s just that they would be held accountable at the appointed time. Until then, they could largely act without fear of snap public opinion or immediate retribution. They were free to take a relatively long view of politics—and to sometimes take an unpopular stand. Voters had time to cool down and judge the totality of a representative’s tenure. This was by design. The idea was to avoid a form of government susceptible to being swept up in the passions of the day and subverting checks and balances. The Founders wanted to avoid mob rule and the tyranny of the majority. But one senses that their concerns might be playing out as we speak.

Our system is not a direct democracy. In fact, the Founders feared it. Yet saying so sounds not only un-American but also, increasingly, un-conservative. Trends toward direct democracy represent another example of how modern conservatives have mimicked liberal tactics. Citizens do not (yet) log on to the Internet and directly cast votes on things. Some states have Progressive-era reforms like voter initiatives, referendums, and recalls (which are to blame for much political and economic dysfunction in places like California), but that’s not what I’m talking about.

We still have elected officials, and they still must stand for reelection at the appointed time. But the ceaseless stream of information and input they receive from constituents, interest groups, cable news, and the Internet makes it almost impossible for them to ignore the stimuli. Today’s politicians must feel more like American Idol contestants, who survive by constantly seeking our approval, than like statesmen or community elders empowered to take tough stances. With Twitter, e-mail, constant polling, and twenty-four-hour cable news, our leaders must forever be at the beck and call of their constituents and pundits.

While many of these trends have been decades in the making, they have culminated at the very time when conservatives are without a clear leader and are suffering a form of identity crisis. It’s easier to manage the decentralizing forces of social media and third-party political organizations when you’re winning elections and enjoying a powerful and articulate president in the White House. Conservatives have had the opposite fortune. There are no conservative leaders who have the moral authority to keep anyone in line, no William F. Buckleys with the moral sway—the “juice”—to “write” anyone out of the conservative movement. Buckley dared take on the John Birchers and the Ayn Rands of the world, but one fears that if any prominent conservative attempted such a courageous move today, it would be he or she who got written out of the movement.

In my career covering the Republican Party and the conservative movement, I’ve observed a scant few such courageous acts that could restore respect for institutions. But one notable example came from Senator Mitch McConnell. In an attempt to win Republican control of the Senate in 2014, McConnell finally started playing hardball with outside groups and vendors supporting Tea Party primary challenges against incumbent Republicans. One such group was the Senate Conservatives Fund (SCF), founded by conservative former South Carolina senator Jim DeMint. “SCF has been wandering around the country destroying the Republican Party like a drunk who tears up every bar they walk into,” Josh Holmes, then McConnell’s chief of staff, told the New York Times. Then, referencing a scene from the film A Bronx Tale, Holmes continued: “The difference this cycle is that they strolled into Mitch McConnell’s bar and he doesn’t [just] throw you out, he locks the door.”

Another courageous act came in 2013, when conservative commentator S. E. Cupp informed the New York Times, “We can’t be afraid to call out Rush Limbaugh.” This was in response to Limbaugh’s rhetoric, such as his referring to Sandra Fluke as a “slut” in 2012. But acts of courage are the exception. It’s generally much safer these days to pander to the base than to the establishment. And, by the way, everyone panders. Liberals certainly pander to their base, in some cases via the redistributive state (by giving them our money). Conservatives face the problem of pandering to a dwindling share of the future electorate. The conservative base now forces candidates to say and do things that will render them vulnerable in general elections. “I don’t know if I’d be a good candidate or a bad one,” said former Florida governor Jeb Bush in December of 2014 at a forum hosted by the Wall Street Journal. “But I kinda know how a Republican can win, whether it’s me or somebody else.” A successful candidate, he continued, “has to be much more uplifting, much more positive, much more willing to be practical” than recent nominees, and should also be willing to “lose the primary to win the general, without violating your principles.”

Most politicians prefer winning. This leads them to tell the base exactly what they want to hear, causing conservatives to sometimes say things that are wildly unpopular with the vast majority of Americans. Consider North Carolina senator Thom Tillis, for example. He was the moderate “establishment” candidate in his Republican primary. But in 2015, just months after being sworn in, in an apparent attempt to demonstrate his anti-regulatory bona fides, he randomly volunteered that he wouldn’t care if Starbucks decided to opt out of its policy requiring employees to wash hands, “as long as they post a sign that says, ‘We don’t require our employees to wash their hands after they use the restroom.’”

“The market will take care of that,” he continued.

This same urge to overcompensate led moderate Mitt Romney to declare he was “severely conservative”—and to spend a lot of time talking about the makers and the takers, and to make that damaging gaffe about the 47 percent. In Romney’s case, he was at least partly pandering to the rich donors who write big checks. But most of the pandering is to the grassroots. And, as we are about to discover, it almost inexorably leads to a political dumbing down and to outright demagoguery. That’s partly because the present-day conservative coalition increasingly harbors a hotbed of anti-intellectual fervor.