SIX

Image

American Amnesia

NO MODERN politician has had a more conflicted relationship with government than Bill Clinton. A southern Democrat who came of age amid states’-rights suspicions of the federal government, he admired JFK’s bold call for national service. A leader of the Democratic Leadership Council—the party’s centrist vanguard—he ran and won in 1992 as a tribune of the middle class, with universal health insurance a top goal. Within two years, his party in disarray and Republicans in control of Congress for the first time in forty years, he declared, “The era of big government is over.” Yet Clinton felt the vise tightening from the first days of his presidency. As his grand ambitions were whittled down to a deficit-reduction plan pressed on him by Fed chairman Alan Greenspan and Robert Rubin, the head of his National Economic Council, he vented, “I hope you’re all aware we’re all Eisenhower Republicans. We’re Eisenhower Republicans here, and we are fighting the Reagan Republicans. We stand for lower deficits and free trade and the bond market. Isn’t that great?”1

To Clinton, of course, it wasn’t great. He had argued passionately for increased social investments. He was in favor of health insurance for all. He wanted to be a New Democrat, not a moderate Republican. Still, his invocation of Eisenhower was astute, and not just because his budget was actually quite moderate. The two leaders faced a similar challenge: how to shift governance back toward their party’s priorities within boundaries created by an influential predecessor. Just as Clinton had long argued that Democrats had to make their peace with the Reagan Revolution, Eisenhower had scoffed at members of his party who believed they could “abolish social security, unemployment insurance, and eliminate labor laws and farm programs.” The Republicans who thought they could roll back the mixed economy, he had sneered to his brother Edgar, were “negligible” and “stupid.”2

But there was a revealing difference between the two men’s private assessments. Eisenhower had told his brother that if any party messed with the New Deal, “you would not hear of that party again in our political history.” It was voters, he had suggested, who stood in the way of more sweeping challenges to the reigning order. To Clinton, the problem was not the electorate. The public investments and middle-class tax cut on which he had campaigned were popular. The problem was the establishment: those holding power in government and the economy. As he complained to his advisers, “You mean to tell me that the success of the economic program and my reelection hinges on the Federal Reserve and a bunch of fucking bond traders?”3 Eisenhower had worried about the American public. Clinton worried about American elites.

In their words to the nation, the two men spoke differently as well. Eisenhower’s first State of the Union Address in 1953—essentially the same length as Clinton’s first in 1993—mentioned “government” nearly forty times, and the overwhelming majority of his references to government were positive.4 Four decades later, Clinton spoke of government only around half as often, and the majority of his references were negative. On the other hand, Clinton spoke more than Eisenhower had about the deficit and debt and much more about taxes (all of which were lower in peacetime 1993 than they had been in wartime 1953).5 And Clinton mentioned organized labor only once in passing, while the closing of Eisenhower’s speech directly connected the right of unionized workers to strike with the nation’s struggle against Communism: “Freedom expresses itself with equal eloquence in the right of workers to strike in the nearby factory, and in the yearnings and sufferings of the peoples of Eastern Europe.”

The gulf looks only greater if we consider how the agendas of these two presidents were received. The “middle way” that Eisenhower staked out gained strong bipartisan support. Forty years later, Clinton received zero votes from the opposition on his highest-priority goals. Under the direction of House GOP leader Newt Gingrich, Republicans waged a scorched-earth campaign against the president’s agenda. Within two years, they had brought down the president’s health plan, blocked his public investments, and then converted public discontent with Washington into a congressional majority. By then, Clinton probably wished he were the popular former general who left office having passed big infrastructure and education bills with broad cross-party backing.

Why did Eisenhower find support for the mixed economy, initiating massive new public investments with broad backing within both parties, while Clinton could not? Why was Clinton so concerned about elite opinion and financial markets, while Eisenhower saw the voting public as the biggest restraint? And, most puzzling of all, why did putatively progressive leaders stop talking about government in positive terms? After all, Eisenhower was a Republican who, for all his moderation, was setting out an agenda substantially to the right of his forerunners’. Clinton was a Democrat. For all his moderation, he saw himself as a counterweight to an increasingly conservative Republican Party—a champion of middle-class voters, not powerful corporate interests. If Clinton, the Democrat, saw elite opinion as so powerful, and if Clinton, the Democrat, was so reluctant to talk about government in favorable ways, something very profound must have happened. What was it?

This is the question that carries us from part 1 to part 2, from the rise of the mixed economy to its erosion and current crisis. We unpack this puzzling transformation piece by piece, deconstructing and then reassembling the interlocking changes that have compromised our vital capacity to use government to advance American prosperity.

In the current chapter, we explain the parallel rise of two grave threats to the mixed economy: a new economic elite with ideas (and earnings) starkly distinct from the American mainstream and a newly influential economic philosophy that we call “Randianism” (after the radically individualistic thinking of the midcentury novelist Ayn Rand). At the outset, each of these developments—the economic and the ideological—was partly independent of the other. Over time, however, they became more and more intertwined. The increasing dominance of Randian thinking encouraged shifts in corporate behavior and public policy that exacerbated the intellectual and economic distinctiveness of America’s new economic elite: the deregulation of finance, the slashing of top federal tax rates, the growing links between the financial and corporate sectors, the upward spiral of executive pay. Most Americans did not buy into these new assumptions, much less embrace their results. But they did share one important belief with those on the winning side of this growing economic and ideological divide: that government could not be trusted to right the balance.

Yet this is hardly the entire story of the eroding political foundations of the mixed economy. Top corporate and financial executives and others at the pinnacle of the economy did not shape politics and policy on their own; in their efforts, they worked through interest groups and political parties. In the next two chapters, therefore, we show that these changes in elite thinking and behavior have been magnified, rather than countered, by two Great Enablers: the nation’s large business associations (chapter 7) and a Republican Party that has made a dramatic move to the right (chapter 8). No business group now plays the role that the CED once did of an independent voice for the broad collective concerns of business. Instead, the biggest—the Business Roundtable, the Chamber of Commerce, and the increasingly powerful political network associated with Charles and David Koch—have moved toward stances that place priority on the narrow interests of particular industries and those occupying executive suites, while advancing increasingly antigovernment worldviews.

These well-resourced and highly organized groups are the first of the Great Enablers. They have made common cause with the second: a Republican Party that has embraced and encouraged the Randian turn of the nation’s new economic elite. In the process, the GOP has abandoned not just its prior moderate commitments but also its willingness to work constructively with other political actors to update and strengthen the mixed economy. Indeed, the GOP has learned how to win politically by fostering dysfunction, to achieve its policy goals not by brokering agreement but by breaking government. With positive conceptions of government’s essential role marginalized and demonized in political discourse—denounced by Republicans and defended feebly by Democrats—Republicans discovered the benefits of the self-fulfilling prophecy: They could simultaneously cater to narrow corporate interests and denounce “crony capitalism,” feed political dysfunction and win by railing against it, undermine the capacity of government to perform its vital functions and decry a bungling and corrupt public sector.

In the final chapters of part 2, we survey the grave damage to effective public authority—and hence prosperity—that all this has produced. Chapter 9 examines the ways in which these shifts have enabled corporate interests that profit from imposing costs on the rest of us, enriching American capitalists even as they undermine the long-term prospects for American capitalism. Chapter 10 explores the collapse of effective governance that these interwoven changes have precipitated. We shall see that the mixed economy has not become irrelevant or outmoded. It is instead being steadily undermined by the concerted resistance of its foes and the increasing indifference of so many more, by the unchecked demands of narrow private interests and the increasing barriers to a sensible updating of our policies, and by the willful forgetting of an ideal of effective governance that remains, for all these changes, the key to our long-term prosperity.

The Great Forgetting

The economic model that propelled American prosperity in the twentieth century was more than a political or economic achievement. It was also an intellectual achievement. New conceptions of the economy came to dominate older understandings. John Maynard Keynes’s insistence that public spending could soften recessions was part of this new paradigm. Yet the more basic ingredient was an elevated respect for the capacity of government to address problems that the market alone could not. As one of the nation’s most prominent economists wrote in 1948, “No longer is modern man able to believe ‘that government governs best which governs least.’ Where the complex economic conditions of life necessitate social coordination and planning, there can sensible men of good will be expected to invoke the . . . government.”6

“The Rediscovery of the Market”

The man who wrote these words was Paul Samuelson, and they appeared in an unassuming textbook that would become something close to the nation’s economic bible, Economics. For three decades—from the first edition in 1948 through the tenth in 1976—Samuelson’s introductory economics text was the nation’s bestselling textbook. (At last count, it had sold over four million copies.) Samuelson once said, “Let those who will write the nation’s laws, if I can write its textbooks.”7

Within economics, Samuelson (who died in 2009) is best known for bringing mathematical rigor to Keynesian theory, a contribution for which he became the first American economist to win the Nobel Prize. For a generation of young Americans, however, Samuelson was the muse of the mixed economy—a term he used throughout his textbook. Yes, the United States relied heavily on private markets, Samuelson argued. But it was successful because it constrained and enabled those markets using public authority. “The private economy is not unlike a machine without an effective steering wheel or governor,” Samuelson wrote in the first edition of Economics before he launched into the “modern man” passage that summed up the postwar zeitgeist.8

Yet a funny thing happened on the way from the first edition of Economics to the tenth: Samuelson gradually changed his tune. Under fire from conservative intellectuals, he started playing down the mixed economy. The first salvos came from outside the profession: William F. Buckley Jr. ripped into the “collectivist character” of Economics in his 1951 God and Man at Yale.9 But by the 1970s, it was Milton Friedman, George Stigler, and other Chicago School economists leading the charge. To them, government was a drag on the economy—animated by ill-considered causes and beholden to special interests. Samuelson had defended postwar levels of taxation: “With affluence come greater interdependence and the desire to meet social needs, along with less need to meet urgent private necessities.”10 Friedman, by contrast, was “in favor of cutting taxes under any circumstances and for any excuse, for any reason, whenever it is possible.”11 Samuelson had the ear of JFK. Friedman had the ear of Barry Goldwater, whose 1960 manifesto The Conscience of a Conservative summed up the Chicago School message concisely: “I have little interest in streamlining government or in making it more efficient, for I mean to reduce its size.”12

Whether Samuelson was responding to the critics or the conservative shift they exemplified, he progressively retreated from the “modern man” passage. In the fourth edition of Economics, published in 1958, he made his conclusion more impressionistic: “No longer is modern man able to believe ‘that government governs best which governs least’ ” became “No longer does modern man seem to act as if he believed . . .” In the 1973 edition, Samuelson cut the offending passage entirely. In subsequent editions, he retreated further. Rather than beginning with the various problems with the market that made “perfect competition” relatively rare, later editions started by outlining this microeconomic ideal. His publisher explained that the “leitmotif” of the new approach was the “rediscovery of the market.”13 As the historian Daniel Rodgers observes, “Samuelson’s mixed economy had fallen into sharply distinct parts: markets and government, rhetorically at polar opposites from each other.”14

What We Talk About When We Talk About Government

The idea that government and the market were rivals rather than partners represented a profound shift in political discourse. Even before Reagan proclaimed, “[G]overnment is not the solution to our problem; government is the problem,” leading intellectuals from the right well into the center were turning toward a vision of government and markets as zero-sum opponents: competitors rather than complements.15

We can see this shift in many places: the rhetoric of politicians, the content of legal opinions, the arguments made in leading journals of ideas. But perhaps the most convincing marker of the transformation is the language used in the medium that captures political discourse most consistently over long spans of history: the newspaper.

Using the New York Times, the economist David George has charted dramatic shifts in the portrayal of government and markets over the past forty years. In the mid-twentieth century, government was seven times more likely to be described favorably than unfavorably. Words such as efficient, competent, and creative were far more likely to precede government or public sector than words like inefficient and wasteful. Between 1980 and 2009, however, the balance shifted—from seven times more positive descriptions to roughly equal numbers.16 At the same time, references to government fell sharply overall, just as they did between Eisenhower’s and Clinton’s speeches. What we talk about when we talk about government, it seems, is not to talk about government at all.

Or at least not positively. Over the same period, stories used the term “big government” more and more often. Between 1980 and 2009, the phrase appeared roughly twice as often each year as it had between 1930 and 1979—even though, again, less was said about government overall. As the linguist Geoffrey Nunberg observes, “big government” is invariably pejorative, “implying an overweening state that is ‘doing something that the market could do better’ or ‘coercing people into doing what they ought to choose freely.’ ”17 This sort of big government is clearly what Bill Clinton had in mind when he declared its era over. But whether or not the era of big government is over, the era of decrying big government is going strong.

The Times is hardly a bastion of conservative thinking, so we can be pretty sure that these changes don’t reflect a sudden infusion of right-wingers into the paper’s inner sanctum. Instead, they likely mirror a shift in broader political discussions—one both dramatic and simple: Government was becoming the villain; the market, the hero.

Consider the changing language used to describe government activity. To say government “acts” is neutral; to say it “interferes” or “intervenes” is to endorse the zero-sum relationship between state and market that became prominent in the 1970s.18 And, in fact, the Times data show a growing shift from the neutral to the negative around this period. At the same time, unfavorable descriptions of “regulation” became much more common: From the 1930s through the 1970s, regulation was twice as likely to be described positively as negatively. From 1980 to 2000, the relationship was reversed, with negative descriptions outnumbering positive ones 2 to 1.

A similarly revealing transformation has played out in the words used to describe business. For much of the twentieth century, the most common term for someone who employed others was capitalist. Starting in the 1970s, however, another came to rival and then eclipse this traditional word: entrepreneur. The transformation can be seen not just in the Times but also in the tens of millions of English-language books tracked by Google, and it obviously carries a larger meaning. A capitalist finances production—half of the employer-employee relationship. An entrepreneur invents new products and even new ways of producing—the creative force behind prosperity. Indeed, in the Times, the word creative came to be associated increasingly with management rather than with workers. Where once workers were more often described as creative, by the 1980s, management was fifty times as likely to be cast as such.19

Not surprisingly, as corporate America’s stature waxed, organized labor’s waned. Even in the Times, unions were cast in favorable terms only a quarter of the time after 1980. This was a dramatic decline from previous decades, when positive references outnumbered negative ones.20 Here again, the Times mirrored changes in politicians’ rhetoric. In the New Deal era, according to a careful study of the Democratic Party’s platforms, “labor was viewed as the advance agent of reform.” But as organized labor declined, labor “receded entirely from the party’s presidential agenda.”21 Workers and unions, it seemed, were so mid-twentieth century. The future of American capitalism was in the hands of the capitalists.

The rhetorical apotheosis of this transformation might well be “job creator.” The phrase would not flower until the mid-2000s, at which point it would become ubiquitous. But the move toward seeing business as the visionary “creator” of jobs rather than the mere “provider” began much earlier. Starting in the 1970s, “job creation”—a phrase used rarely in prior decades—became more common than alternative wordings in the Times.

In this respect, the shift in rhetoric from Eisenhower to Clinton signaled far more than each man’s idiosyncratic background or inclinations. If you study all presidential speeches from 1947 through the mid-2000s, you see the same trends.22 On many issues, presidential positions have oscillated left and right based on the party label of the executive. But presidential rhetoric about the political system has headed in a single direction: toward more and more government bashing. The same is true of congressional rhetoric. The most common three-word phrases in the Congressional Record in the 1950s and 1960s were mostly nonpartisan and either unrelated to or favorable toward the mixed economy. In the 1970s, however, phrases such as “free enterprise system” and “lower tax brackets” became prevalent, especially on the Republican side of the aisle.23 In every river of public discourse, the mixed economy was on the rocks.

The Great Persuasion?

To many, this rhetorical transformation is the story of a paradigm shift—The Great Persuasion, as Angus Burgin calls it in his sweeping history of free-market thinkers like Friedman and Friedrich Hayek.24 Analysts in this mold are fond of quoting Keynes’s famous declaration that “the power of vested interests is vastly exaggerated compared with the gradual encroachment of ideas.”25

Others are more skeptical. They like to quote Upton Sinclair’s more cynical observation: “It is difficult to get a man to understand something, when his salary depends upon his not understanding it.”26 In this view, ideas carry limited weight unless they align with material incentives. And when ideas conflict with those incentives, it is the ideas, not the incentives, that are ignored.

In the shift of opinion against the mixed economy, however, Keynes and Sinclair were both right. Ideas were crucial, especially in the initial right turn. The emergence of “stagflation”—the combination of inflation and stagnation—strengthened the hand of economic thinkers who argued that government couldn’t manage the economy effectively. Had there not been vigorous critics of the mixed economy at the ready, had they not wielded a coherent and powerful set of arguments, these economic troubles surely would not have precipitated such a fundamental reversal.

Yet the rapid and durable repudiation of the mixed economy was hardly a “gradual encroachment of ideas.” Instead, new understandings swept the field because they intersected with and guided powerful economic interests that were becoming more and more influential within American politics. Facing meager profits and depressed stock prices, business leaders mobilized to lobby Washington as never before. Fatefully, they were also increasingly inclined to accept the diagnosis offered by the new market fundamentalists: The source of their woes was not foreign competition or deindustrialization or hostile financial players; it was government. The changing economy didn’t just change the conversation, in other words. It changed who had power and how those with power thought about their priorities.

Once the door opened to the new antigovernment stance, policy and profit-seeking reinforced one another. The free-market movement advocated financial deregulation and tax cuts, and these policies helped fuel a rapid and sweeping shift in corporate America. Companies faced intense pressure to become better integrated into an expanding global economy. Even more important, they faced intense pressure to become better integrated into an expanding financial sector. As corporate America orbited ever closer to Wall Street, it adopted Wall Street’s priorities as its own: immediate stock returns, corporate financial engineering, and extremely high executive pay closely tied to share prices. On the other side, the constraint on top management created by organized labor was rapidly weakening, as unions struggled in an increasingly hostile climate.

The result was not just enormous fortunes going to a narrower and narrower slice of executives. It was also an enormous shift in power toward a new corporate elite much more hostile to the mixed economy, much less constrained by moderates in government or by organized labor, and much more in tune with the new celebration of the “free market.” Ironically, by the time these changes had flowered fully, many of the industries that helped spark the new thinking and policies would be teetering on the brink or worse. But a new set of economic leaders was ready to carry the antigovernment message—in softer and harder forms—into both political parties.

That ’70s Show

The immediate cause of this broader transformation was the economic turmoil of the 1970s. The decade was not the economic wasteland it is often remembered as today. Income growth was actually fairly healthy, and many social indicators, such as college graduation rates, continued their rapid improvement. Still, the decade witnessed a series of destabilizing shocks, none more destabilizing than stagflation—the combination of high inflation and economic stagnation that marked the end of the long postwar boom. It was Paul Samuelson who had popularized the term but Milton Friedman who had predicted the outcome. In Friedman’s eyes—and the eyes of more and more policy makers—stagflation proved that Keynesian macromanagement was a fool’s errand that would inevitably break down in the face of market forces.27

In retrospect, the economic tumult of the 1970s looks less baffling than it did at the time. The surge of inflation reflected both singular shocks (notably, the 1973–74 oil embargo by the Organization of Petroleum Exporting Countries, or OPEC) and obvious policy mistakes (Johnson’s guns-and-butter spending and Nixon’s urging of loose monetary policy to secure his reelection).28 Meanwhile, productivity growth was slowing, as the burst of economic activity after World War II gave way to the more normal expansion of rich countries at the edge of the technological frontier.29 At the same time, the United States faced greater competition from its affluent trading partners as they recovered from wartime devastation.30

It was inflation, however, that captured the public’s attention and drove the increasingly panicked national debate. Throughout the latter half of the 1970s, large majorities of Americans told the Gallup poll that inflation—never before a major response in the survey—was the number one problem facing the nation. In 1980 a peak of 83 percent cited it. (Three decades later, the most commonly identified problem would be “government.”)31 Part of the reason was that income tax brackets were not tied to inflation, so rising wages caused many households to pay higher tax rates even when the purchasing power of their incomes wasn’t any greater. Indeed, one of Reagan’s first moves upon assuming office was to index tax brackets to inflation, cutting off the automatic (and almost invisible) hike in tax revenues that had helped fuel postwar public investment.32
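The arithmetic of bracket creep is easy to miss, so a minimal sketch may help. The brackets and rates below are invented for illustration, not drawn from the actual 1970s schedule; only the mechanism matters:

```python
# Toy illustration of "bracket creep": nominal tax brackets stay fixed
# while wages and prices both rise 10%, leaving real pre-tax income flat.

def tax_owed(income, brackets):
    """Progressive tax; brackets is an ascending list of (lower_bound, rate)."""
    owed = 0.0
    bounds = brackets + [(float("inf"), 0.0)]
    for (lo, rate), (hi, _) in zip(bounds, bounds[1:]):
        if income > lo:
            owed += (min(income, hi) - lo) * rate
    return owed

BRACKETS = [(0, 0.15), (20_000, 0.30), (40_000, 0.50)]  # hypothetical

wage_then = 20_000            # year 1 wage
wage_now = wage_then * 1.10   # a 10% raise that merely matches 10% inflation

for wage in (wage_then, wage_now):
    t = tax_owed(wage, BRACKETS)
    print(f"wage ${wage:,.0f}: tax ${t:,.0f}, average rate {t / wage:.1%}")

# Year 1: all income sits in the 15% bracket -> average rate 15.0%.
# Year 2: the raise lands in the 30% bracket -> average rate ~16.4%,
# so after-tax purchasing power falls even though the real wage is flat.
```

Indexing the brackets to inflation removes exactly this effect: both thresholds rise 10 percent, and the average rate stays put. That is what Reagan’s change accomplished, and why it mattered so much for revenues.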

Stagflation raised the profile and influence of the critics of the mixed economy, and it shifted the national discussion from fostering growth to fighting inflation. Under pressure, Carter appointed the prominent inflation hawk Paul Volcker to head the Federal Reserve, where, as expected, he raised interest rates sharply. The move triggered the worst downturn since the 1930s and probably cost Carter the 1980 election.33 It also decimated the economic reputation of the Democratic Party, paving the way for Reagan to pursue a very different vision of government’s relation to the economy.

For all this, however, the challenges of the 1970s did not have to compromise the entire edifice of the mixed economy. Getting macroeconomic policy on track and confronting heightened foreign competition did not require unwinding government’s constructive role in ensuring broad prosperity. Nor did popular pressures. Although voters headed right as inflation headed up, the conservative shift in public opinion was short-lived. The elite turn against the mixed economy, however, just kept going and going—and, in fact, intensified through subsequent decades. To understand the scope and persistence of the change, we need to look closer at the new titans coming to dominate the American economic landscape.

Financializing America

Nixon’s response to the problems of the early 1970s was shaped by a fortysomething ex-industrialist on his economic team named Peter Peterson. A son of Greek immigrants with a knack for salesmanship, Peterson (known to most as Pete) had just stepped down as CEO of film equipment manufacturer Bell & Howell and had already made a name for himself with his acute diagnoses of the nation’s changing role in the world economy. In 1972, when Maurice Stans stepped down to run the Committee for the Reelection of the President (soon at the center of the Watergate scandal), Nixon tapped Peterson to become the nation’s twentieth secretary of commerce.34

Being There

By then, Peterson had seen plenty of the old economy from the inside. His first exposure came after he was kicked out of MIT in 1944 for cheating. MIT offered to expunge the black mark from his record if he went to work in the university’s Radiation Laboratory (dubbed the Rad Lab), yet another creation of Vannevar Bush’s federal scientific office. The eighteen-year-old Peterson cleared his name by purchasing supplies for the laboratory, unaware until later that he was contributing to the military’s top secret radar work.

Peterson’s expulsion proved to be his first big break. After finishing his degree at Northwestern University (and before moving to Bell & Howell), he went to business school at the University of Chicago, where Friedman and Stigler made an enormous impression on him. “I was shaped by their basic principles from then to now,” Peterson would write in his memoir, The Education of an American Dreamer. “They have stuck with me and proven far more practical than my Northwestern courses in retail inventory control and retail sales promotion. I often think how different my life would have been had the University of Chicago Graduate School of Business not been so close to my . . . office.”35

Peterson’s second big break was also unexpected: He was dumped by Nixon in 1973 amid Watergate palace intrigues. Fortunately for Peterson, dozens of corporate boards came calling, hoping to gain the prestige and connections that his background offered. He joined the investment firm Lehman Brothers, in part because he knew and liked Lehman principal George Ball, a Democrat who had served as undersecretary of state.36 It was the first of many bipartisan ventures that would end up serving Peterson, if not the broader public, well.

Peterson was a networker who straddled the worlds of Washington and corporate America. His greatest gift, however, turned out to be his uncanny ability to move to greener pastures just as the last was about to wither. He left Bell & Howell before the bottom fell out of American manufacturing, and the Nixon administration just before it collapsed in scandal. He then went to Lehman, where he pushed the venerable institution to move toward high-risk investments. Weakened by trading losses, Lehman would be sold to Shearson/American Express—just after Peterson was pushed out. Then, in 2008, it would fail spectacularly in an inferno of toxic assets and deceptive accounting.37

The sale of Lehman made Peterson rich. He plowed the proceeds into a partnership with former Lehman associate Stephen Schwarzman. The two men pioneered a new model of high-yield investment known as private equity, calling their firm Blackstone. (The name was a play on those of the partners: Schwarz is German for “black”; Petros, Greek for “stone.”) When Blackstone went public in 2007, Peterson and Schwarzman became billionaires. Again, Peterson had gotten out while the getting was good. Within a year, Blackstone’s share price dropped 40 percent.38 By then, however, Peterson had moved on to a new passion: spending his fortune to influence public debate.

On FIRE

Peterson always seemed to ride the wave to its crest, jumping away just as it crashed. But his rising fortune reflected not just excellent timing but also larger economic tides. He entered Wall Street at the beginning of a three-decades-long increase in the share of the American economy devoted to finance—to the buying and selling of assets rather than the production or exchange of goods. This development, often referred to as “financialization,” battered workers caught in its disruptions and ushered in financial instabilities once thought solved. It also enriched and emboldened a new class of high-rolling executives—Peterson included—who were far more critical of the mixed economy than were the captains of industry they displaced.

When Peterson signed up with Lehman, manufacturing made up more than a quarter of national economic output. Finance, insurance, and real estate—FIRE, for short—was around 15 percent. By 2001, the two sectors’ relative positions had reversed. Even more striking, profits in the FIRE sector had climbed from a fifth of total corporate profits in the 1970s to nearly half in 2001.39

The United States isn’t the only country with a large financial sector. Yet American finance led the world in broadening and deepening the sector’s role in the rest of the economy, and even today it remains unique. What makes it distinctive isn’t the amount of financial leverage or the size of capital markets. It’s the relationship between the financial sector and the rest of the economy. Starting in the late 1970s, finance became not a servant of larger corporate aims but more and more the driver of them.

At first, this switch was mostly inadvertent. With inflation raging, President Carter and Democrats in Congress lifted interest rate ceilings on savings and loan institutions (S&Ls) in 1980.40 The change freed S&Ls to pay higher rates. It also sent them in search of higher returns to finance those rates—a search that would soon end in the nation’s biggest banking crisis since the 1930s. But these were baby steps compared with what would come next. By the early 1980s, free-market ideology, Reagan’s election, and the increasing sway of Wall Street had started an exuberant wave of financial deregulation that would continue for nearly three decades.

Fomenting the Shareholder Revolution

The first of these new deregulation warriors was a Wall Street conservative with a name similar to his boss’s: Donald Regan, Treasury secretary to Ronald Reagan. Fresh from a decade of running Merrill Lynch, Regan declared that his top priority was “deregulation of financial institutions . . . as quickly as possible.”41 Within two years, Reagan and Regan, working with a receptive Congress, wiped out many of the rules that had restrained finance for the previous sixty years. Gone were most of the restrictions on S&Ls. Gone were most of the rules that stabilized the mortgage market. For the first time, the federal government allowed companies to buy back their own shares to raise stock prices. For the first time, the federal government allowed the pooling of mortgages into so-called mortgage-backed securities—the risky investment vehicles that would eventually destroy Lehman. And for the first time, corporate raiders started taking on established companies, financing buyouts with massive amounts of new borrowing.

Today, when it is routine to buy and sell corporations like commodities, it is hard to convey what a fundamental shift the takeover movement represented. Since the rise of the modern corporation in the late nineteenth century, business financing had presented a public face that conflicted with private realities. On paper, corporations were beholden to shareholders. Yet because most companies were owned by a multitude of diffuse shareholders, the ability of shareholders to direct managers was limited. Moreover, established companies financed their internal investments mainly through retained earnings instead of external sources, which further insulated them from outside pressure. Managers were stewards rather than servants; their job was to look out for the long-term interests of the organization rather than the short-term entreaties of investors. Indeed, Adolf A. Berle Jr. and Gardiner C. Means, in their 1932 classic The Modern Corporation and Private Property, went so far as to suggest that corporate ownership and corporate control had become so distinct that executives were largely unaccountable to shareholders.42

That all changed in the 1980s. If you were a Fortune 500 executive in the 1970s, you had much to fret about. But you didn’t have to worry about hostile takeovers. Now you did: Between 1980 and 1990, one-third of the Fortune 500 ceased to exist. The companies still standing were battle scarred. A third had experienced hostile bids in the prior decade. Two-thirds had adopted antitakeover defenses—designed mostly with the security of top management in mind rather than companies’ long-term interests.43 With all the churning, the tenure of corporate executives was becoming shorter. The imperative of keeping up stock prices—to protect against takeovers, not to mention deliver the big rewards that more mobile CEOs demanded—was becoming greater. Indeed, CEOs were inviting the barbarians into the castle, hiring high-paid Wall Street consultants who could help firms reengineer themselves so that more hostile players wouldn’t.

The public rationale for takeovers was that companies had become complacent and needed discipline. Yet the takeover model depended on cashing out, not maintaining control. That was because the enormous returns a raid could yield rested on the enormous amounts of exotic debt that raiders took on. Just as with a home mortgage, making a small “down payment” on a company and financing the rest through borrowing transformed even small gains into astronomical returns. To make the deal sweeter still, debt was deductible as a business expense under the corporate tax code. For most companies, interest payments on debt were negligible. But for newly acquired companies loaded with debt, such payments could easily exceed earnings, making the deduction a major source of profit—and a major subsidy for takeovers.44
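The incentives here are easier to see with a stylized calculation. Every figure below is hypothetical; what matters is the mechanism:

```python
# A stylized leveraged buyout (all figures hypothetical, in $ millions).

price, equity = 100.0, 10.0     # buy a $100m firm with a $10m "down payment"
debt = price - equity           # ...and $90m of borrowed money

resale = 120.0                  # suppose the firm's value rises 20%
payout = resale - debt          # the debt is repaid first
roe = (payout - equity) / equity
print(f"20% gain in firm value -> {roe:.0%} return on equity")  # 200%

# Leverage cuts both ways: a 20% fall in value would wipe out the equity
# entirely, one reason a quick payoff was essential.

# The interest deduction sweetens the deal further.
earnings, interest_rate, tax_rate = 12.0, 0.10, 0.35
interest = debt * interest_rate                         # $9m, all deductible
tax_without_debt = earnings * tax_rate                  # $4.2m
tax_with_debt = max(earnings - interest, 0) * tax_rate  # about $1.05m
print(f"tax bill: ${tax_without_debt:.2f}m unlevered, ${tax_with_debt:.2f}m levered")
```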

Because the purchase was leveraged, returns—and losses—were magnified. A quick payoff was essential to finance the borrowing and avoid potential ruin. For this reason, raiders did not go after struggling companies. They went after companies with undervalued stock and predictable cash flow that could be used to pay off debt.45 Big conglomerates were a favorite target: They had lots of parts that could be liquidated before the companies were sold to another buyer or taken public again. Companies with “excess” cash and “overfunded” pensions were also attractive. Raiders were less willing to talk about “excess” employees. But along with exotic new securities, the industry developed ever more disingenuous euphemisms for laying off workers: restructuring, trimming fat, downsizing, delayering, and, most Orwellian of all, workforce optimization.

Here again shifts in federal policy were critical. By filling the National Labor Relations Board with appointees hostile to organized labor and then breaking the high-profile strike of air-traffic controllers in 1981, Reagan signaled what had already become clear: Labor was down, business was up, and government had left the field.46 Unions had pushed for a major rewriting of federal law in 1978 to make it easier for workers to organize as production shifted south and companies adopted more aggressive antiunion strategies. The bill had been defeated by a Senate filibuster—rare at the time—in response to unprecedented corporate political mobilization. The writing was on the wall: From the early 1980s on, workers would increasingly bear the dislocations of adjusting to new economic realities. If the model of the 1950s and 1960s had been “retain and reinvest,” the model that replaced it was “downsize and distribute.”47

The celebrated goal of the new model was “shareholder value,” a phrase now so familiar that its radical implications are often forgotten. Business leaders had always seen healthy stock returns as one of the key goals of the corporation. But it was not the only goal. At least as important were the long-term growth of the company and the standing of its workers, as well as its responsibilities to its customers and its community. The traditional view was summarized by GE’s longtime chairman Owen Young, who led the company from the 1920s through World War II: “Stockholders are confined to a maximum return equivalent to a risk premium. The remaining profit stays in the enterprise, is paid out in higher wages, or is passed on to the customer.”48 As late as the early 1980s, the nation’s largest organization representing CEOs, the Business Roundtable, noted that “balancing the shareholders’ expectation of maximum return against other priorities is one of the fundamental problems confronting corporate management.”49

By the 1990s, however, the Roundtable had dropped all references to “balancing” and embraced the constant maximization of shareholder returns. As two economists explain, the new credo signaled “a fundamental shift in the concept of the American corporation”: “from a view of it as a productive enterprise and stable institution serving the needs of a broad spectrum of stakeholders to a view of it as a bundle of assets to be bought and sold with an exclusive goal of maximizing shareholder value.”50 The simplistic division of the economy into markets versus government paralleled a view of corporations as bundles of assets, independent from their social context.

This new conception of the corporation also carried profound implications for American understandings of shared prosperity. The valorization of shareholders (even if it was often a cover for the acquisitive aims of top executives or hostile-takeover engineers) challenged the notion that wealth was a social creation that rested on the efforts of multiple stakeholders, including labor and government. Instead, it implied that enhanced prosperity was generated by investors and executives, with their entrepreneurial creativity and investment daring. The Solow residual—the increased productivity due to enhanced knowledge and technical capacity within society—was not, in this view, a collective creation. It was the product of the “job creators” and “risk takers” who were rightly enjoying more and more rewards.
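For readers who want the formal version: in the standard growth-accounting framework, the Solow residual is whatever remains of output growth after the measured contributions of capital and labor are subtracted out:

```latex
\frac{\Delta A}{A} \;=\; \frac{\Delta Y}{Y} \;-\; \alpha\,\frac{\Delta K}{K} \;-\; (1-\alpha)\,\frac{\Delta L}{L}
```

Here Y is output, K capital, L labor, and α capital’s share of income. By construction, the residual A belongs to no single input: it captures advances in knowledge, skills, and institutions that raise everyone’s productivity at once, which is precisely why it long stood as the statistical emblem of prosperity as a joint creation.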

The Rise of the New Economic Elite

This new world was the one Pete Peterson wanted to be part of when he came to Lehman. Peterson saw a lucrative future in what he called “merchant banking,” a polite way of describing leveraged buyouts. The rumpled trader who would oust him, Lew Glucksman, recalls that “Pete . . . was obsessed with the amount of money William Simon made selling Gibson. And he couldn’t stop talking about it.”51

Simon, a former Treasury secretary under Presidents Nixon and Ford, had bought and sold Gibson Greetings to make over $60 million on a $330,000 investment.52 (He would go from Wall Street to running the John M. Olin Foundation, a philanthropy now famous for funding the “counter-intelligentsia,” as Simon called it, of antigovernment thinkers.)53 No doubt Peterson thought that he, a former commerce secretary, could do at least as well. In time, he would. He would also reshape the way those around him thought about markets, government, and American prosperity.

Capitalist Tools

In 1982, the year that William Simon bought out Gibson Greetings, Forbes magazine published the first of its iconic lists of the four hundred wealthiest Americans. Topping the ranking was an octogenarian shipping magnate, Daniel Ludwig, with an estimated fortune of $2 billion (around $5 billion today, adjusted for inflation).54 Ludwig was one of only two billionaires on the list. The entire four hundred were worth around $225 billion (in current dollars). Readers seemed impressed. One federal judge, dismissing a lawsuit by a millionaire who wished to stay off the roster, declared, “Money is power, and the wealthy wield great power and influence economically, socially, and politically in this country, and the American public has a right to know who they are.”55

If money is power, however, that first four hundred were a bunch of weaklings compared with those to come. In 2014 every member of the Forbes 400 was a billionaire; indeed, 113 US billionaires were left off the list because they fell below the entry price of $1.55 billion.56 The combined net worth of the list topped $2.3 trillion—ten times its 1982 level after adjusting for inflation.57

Those who looked closely at the Forbes 400 might have noticed another change: Wall Street increasingly eclipsed Main Street as a generator of outsized fortunes. In 1982 barely one in twenty-five of the Forbes 400 had made their money in the financial sector. Thirty years later, more than one in five had. No other sector—not technology, not retail trade, not media, not energy, not real estate, not consumer goods—had as large a share. Nor had any other increased in prominence as much as finance had, not even that quintessential twenty-first-century industry, computer technology. Finance and computing enjoyed about the same number of spots on the list in 1982. Three decades further into the digital era, finance had about two-thirds more spots. High-tech might have topped the list with Bill Gates, but high finance dominated the rest.58

When Pete Peterson left Lehman in 1983, his official salary was $225,000. Year-end bonuses and other perks raised the total to an after-tax equivalent of roughly $5 million. With his 1 percent stake in any future sale of Lehman, as well as equity in the firm and severance benefits, Peterson walked away with $18 million.59

Fast-forward to 2007, when Blackstone went public. On the day of the sale, Peterson made $1.9 billion, a hundred times what he had upon his departure from Lehman. His earnings in his last year at Blackstone were $200 million. Peterson’s partner, Steve Schwarzman, walked away from the deal with nearly $10 billion—after a year in which his take-home pay was almost $400 million.60

Mining Corporate America

Schwarzman was Peterson’s alter ego. An aggressive deal maker, he was as publicly lavish as the Depression-baby Peterson was austere. A diminutive five foot six next to the six-foot-plus Peterson, he was big in ambition. As a young Lehman partner, he saw the future in Simon’s Gibson deal and the even more spectacular 1979 buyout of Houdaille Industries by the private equity firm Kohlberg Kravis Roberts & Co. (Kohlberg made out well; Houdaille, a large conglomerate with almost eight thousand workers, was dismembered, and its core business so saddled with debt that it succumbed to Japanese competition.) Years later, Schwarzman still remembered his reaction to the Houdaille deal: “I read that prospectus, looked at the capital structure, and realized the returns that could be achieved. I said to myself, ‘This is a gold mine.’ ”61

Schwarzman found many gold mines. It was he who almost single-handedly brokered the sale of Lehman to Shearson/American Express. It was he who engineered the initial public offering of Blackstone that made Peterson and him multibillionaires. (By then, Peterson was so disengaged from the company that he wasn’t made aware of the IPO until six months into its planning.)62 And it was he who would come to personify the excesses of the financial industry even as he racked up bigger and bigger victories.

On the eve of the Blackstone IPO, Schwarzman celebrated his sixtieth birthday with some friends—friends such as Barbara Walters, Colin Powell, New York mayor Michael Bloomberg, and the chief executives of all the major Wall Street banks. The musical headliner was British rock star Rod Stewart. The warm-up act consisted of soul singer Patti LaBelle backed up by not one but two Harlem choirs. Another opener, comedian Martin Short, joked that the Seventh Regiment Armory, where the event was held, was “more intimate” than Schwarzman’s $30 million penthouse on Park Avenue.63

Among the new financial wizards, Schwarzman’s astronomical earnings were nothing extraordinary. The year of Blackstone’s IPO, Institutional Investor’s blog, Alpha, released its annual list of the top twenty-five hedge fund managers. For the first time, the top three earned more than $1 billion each. Combined, the twenty-five managers pulled in $14 billion. “So much for steel, oil, railroads, and real estate—or microchips and software, for that matter,” the report announced. Hedge funds were “the best bet for making the biggest bucks the fastest in the postindustrial world.”64 Especially so because of an obscure provision of the tax code dating back to a very different Wall Street. Partners in private equity firms took most of their pay as a share of the investments they oversaw: other people’s investments. But they were allowed to treat these “carried interest” earnings as capital gains subject to a low tax rate. By 2014, Blackstone was rolling in $4.3 billion a year in profits with just 2,300 employees—more than the investment banking giant Morgan Stanley, with 55,000 workers. Thanks to the creative use of the carried-interest tax break, it paid just 4.3 percent in taxes on its profits.65
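The mechanics of the carried-interest break deserve a quick sketch. The fund gains and fee structure below are hypothetical (the classic “2 and 20” arrangement), and the tax rates are round approximations of the era’s top ordinary and capital gains rates:

```python
# A stylized carried-interest calculation (all figures hypothetical).
# "2 and 20": a 2% management fee plus 20% of investment gains ("carry").

fund_gain = 1_500.0       # $ millions of gains earned on investors' money
carry = 0.20 * fund_gain  # $300m of "carried interest" paid to the partners

ordinary_rate, cap_gains_rate = 0.40, 0.20  # rough top rates of the era

print(f"carry taxed as ordinary income: ${carry * ordinary_rate:,.0f}m")   # $120m
print(f"carry taxed as capital gains:   ${carry * cap_gains_rate:,.0f}m")  # $60m

# Treating carry as a capital gain roughly halves the tax on this pay,
# even though the capital at risk belonged to the fund's investors,
# not to the partners themselves.
```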

The year of Blackstone’s IPO marked another milestone: It was the year that the share of household income going to the richest 1 percent of households equaled that reached on the eve of the stock market crash of 1929, with nearly one in every four dollars accruing to this tiny slice of American society. The gains were even more concentrated than this staggering number suggests, since roughly half of the top 1 percent’s income went to an even smaller group: the top 0.1 percent. In 2007 the top 0.1 percent averaged more than $7 million in annual income, its share of national income having increased fourfold over the prior generation.66

Who are these fortunate 0.1 percenters? Tax records show that most of them belong to two groups: financial executives (around 20 percent) and corporate executives outside finance (around 40 percent). (By way of comparison, sports and media stars—often singled out in popular commentary on the rich—make up just 3 percent.)67 In other words, the majority of the superrich are corporate and investment managers. What’s more, nearly all of these managers enjoy close ties to Wall Street—and not just because they operate in a business environment fundamentally reshaped by financialization. At the top, most compensation takes the form of stock options and other capital gains. This group looks very different from the business elite described in 1950 by the great management theorist Peter Drucker: “Where only twenty years ago the bright graduate of the Harvard Business School aimed at a job with a New York Stock Exchange house, he now seeks employment with a steel, oil, or automobile company. . . . There is very little room in an industrial economy for international banking, international capital movements, and international commodity trading on which the power and position of the ‘capitalist’ ruling groups rested primarily.”68

Of course, the new economic elite isn’t a monolithic “ruling group.” Those at the top include Democrats as well as Republicans, liberals as well as conservatives. Nor do they operate as some kind of secret cabal, privately coordinating their assaults on the mixed economy. Still, the changing face of the economic elite matters a great deal. As we saw in chapter 5, the moderate elements of the business community were critical to the establishment and legitimation of the mixed economy. The public voices of this elite were restrained and engaged—a reflection of wartime experience, regular engagement in national and community affairs, the constraints of labor and moderate Republicans, and the relatively modest economic gap between those at the top of corporate America and rank-and-file workers.

Today’s economic elites look very different. They are tied less closely to local production and investment. They are tied more closely to Wall Street. And they have much greater resources relative to the rest of society. They are also, we shall see, much more skeptical of the mixed economy. Again, the views of those at the top are far from monolithic (and if we are to understand fully their engagement with government, we have to consider not just their individual attitudes but also the organized activities of the nation’s major business associations—the subject of the next chapter). But on core economic issues, the ideological middle of the new economic elite is far to the right of the postwar business elite, not to mention the midpoint of the contemporary American electorate. Indeed, we can see its two central tendencies in the two men who founded Blackstone: Steve Schwarzman and Pete Peterson.

Masters of the Universe

“Who is John Galt?” That question appears again and again in Ayn Rand’s Atlas Shrugged. In the book’s morality tale of capitalists overcoming government collectivism, Galt is the handsome genius who leads a “strike” by the “producers” that ultimately causes the world economy to collapse. The message of Atlas Shrugged is simple: Government destroys freedom; only creative capitalists produce wealth; everyone else is a “looter,” a “moocher,” an “incompetent,” feeding on the elite’s innovations. In one of Galt’s most memorable speeches, he declares: “The man at the top of the intellectual pyramid contributes the most to all those below him, but gets nothing except his material payment, receiving no intellectual bonus from others to add to the value of his time. The man at the bottom who, left to himself, would starve in his hopeless ineptitude, contributes nothing to those above him, but receives the bonus of all of their brains.”69

When Rand’s book appeared in 1957, it was greeted with savage rebukes by reviewers. “It is probably the worst piece of large fiction written since Miss Rand’s equally weighty ‘The Fountainhead,’ ” judged Robert Kirsch in the Los Angeles Times. “It would be hard to find such a display of grotesque eccentricity outside an asylum.”70

A half century later, however, the book luxuriated in the embrace of leading political and economic figures, from Fed chair Alan Greenspan to House GOP budget guru, 2012 VP candidate, and current House speaker Paul Ryan. In 2008, sales of the book reached record highs, and conservative commentators spoke openly about business owners “going Galt.” As one Forbes column gushed in 2012, “Galt epitomizes all that is glorious of capitalism in its purest form—innovation, self-reliance, and freedom from government interference.”71

Hard Randianism

Galt also epitomizes a way of thinking that has become much more common among the nation’s elite. As already hinted, we call it “Randianism”—or, since we’ll soon look at its softer cousin, “hard Randianism.” The distinctive core of hard Randianism isn’t laissez-faire (a very old fancy); it’s the division of the world into a persecuted minority that heroically generates prosperity and a freeloading majority that uses government to steal from this small, creative elite.

Within the shifting ideological climate of the post-1980s era, it was perhaps inevitable that the rise of the superrich would foster a new dog-eat-dog mentality. Still, the increasing openness and stridency of these views bear notice. For most of the top 0.1 percent, the incentives to make a public display of affection for libertarianism are limited. Why alienate shareholders and consumers or invite the attention of nosy and noisy activists? Better to speak softly and spend behind the scenes if necessary.

It was easier to be quiet, however, when only labor unions and the leftmost wing of the Democratic Party questioned the wisdom of financial deregulation or tax reductions for big incomes and estates. In the wake of the 2008 financial crisis—when Greenspan himself felt forced to admit his “model” of the economy was wrong—that was no longer true. Facing increased scrutiny, the investor class now had to explain why it deserved the friendly policies of prior decades.

Steve Schwarzman rose to the occasion. Speaking to a nonprofit board in 2010, he complained of a “war” between President Obama and financial firms such as his. Obama’s proposal to eliminate the carried-interest tax break was “like when Hitler invaded Poland in 1939.” Schwarzman walked back the remark, but he also became one of the top supporters of Republican presidential candidate Mitt Romney.72

Nazi analogies turned out to be popular. Hedge fund manager Leon Cooperman compared Obama’s election to the rise of the Third Reich. Billionaire investor Tom Perkins took to the Wall Street Journal editorial page to warn of a looming “Kristallnacht” if the “rising tide of hatred against the successful one percent” was not stopped. Asked later if he regretted the remarks, Perkins insisted “the parallel holds.”73

As extreme as such statements were, they suggested just how deep the sense of grievance ran. Indeed, Cooperman became something of a folk hero within the financial industry by writing President Obama an open letter condemning his and his “ ‘minions’ role in setting the tenor of the rancorous debate now roiling us that smacks of what so many have characterized as ‘class warfare.’ ” (Hedge fund manager Anthony Scaramucci told a writer for The New Yorker that Cooperman was the “pope” of the industry revolt against the president.) “The divisive, polarizing tone of your rhetoric,” Cooperman wrote, “is cleaving a widening gulf, at this point as much visceral as philosophical, between the downtrodden and those best positioned to help them.”74

Gilded Bootstraps

The complaints of the new superrich embodied two claims. The first was that those at the top fully deserved their riches. In 2012 the top earner on Institutional Investor’s Alpha list (with $3.9 billion in pay) was Ray Dalio, chief of Bridgewater Associates, the world’s largest hedge fund. Among financiers, Dalio was best known for his online business-strategy tract Principles. “Self-interest and society’s interests are generally symbiotic,” declares Principles. “Society rewards those who give it what it wants. That is why how much money people have earned is a rough measure of how much they gave society what it wanted.”75

Before the financial crisis, the head of Citigroup, Sandy Weill, was similarly self-congratulatory: “People can look at the last twenty-five years and say this is an incredibly unique period of time. We didn’t rely on somebody else to build what we built.”

Weill’s up-by-his-bootstraps assertion was obviously false. Before Citigroup became a ward of the state during the financial crisis, Weill displayed a trophy of sorts in his office: a four-foot-wide piece of wood etched with his portrait and the words “The Shatterer of Glass-Steagall.” The Glass-Steagall Act was the New Deal legislation that separated commercial and investment banks to minimize the risk of bank failures and self-dealing. Without the act’s repeal—signed happily by President Clinton in 1999 at the urging of Weill and his partners—Weill could not have built what he built: a megabank that paid him a staggering $785 million over five years.76

Nor was Weill alone in benefiting from financial deregulation and other policies friendly to the investor class. From the 1970s through the 2000s, one of the best predictors of rising pay in the financial sector was the pace of deregulation—which explained perhaps half of the difference between what workers in finance earned and what similarly skilled workers in other sectors took home.77 As we will see in chapter 9, the financial sector also enjoyed many other favorable policies. Most notably, the consolidated institutions that came to dominate Wall Street received a huge implicit subsidy because investors were willing to accept lower returns from them, assuming—rightly, it turned out in 2008—that government wouldn’t let them fail if their risky bets turned sour.78

Skyrocketing financial pay drove up CEO earnings in nonfinancial companies, too. For one thing, it raised the benchmark against which all executives’ salaries were judged. Even more important, it encouraged CEOs to demand ever-larger pay packages built around stock options. The justification was that executives prospered only when the firms they led did well. The reality, according to many studies, was different: executives profited more from luck and short-term measures to boost stock prices than from excellent long-term performance.79 CEO compensation routinely failed to include even basic safeguards against lucky profits, short-termism, and heads-I-win-tails-you-lose deals, even as it concealed from public accounting a raft of goodies (company jets, golden parachutes, lavish retirement packages) that might have provoked shareholder or public pushback.

To be sure, many CEOs were enormously talented. Yet talent alone couldn’t explain why American CEOs earned so much more than CEOs in other nations or than corporate heads had in the past, or why the tax rates on their much higher pay were at least 50 percent lower than those paid by CEOs of the immediate postwar era. To understand these outcomes required looking beyond the workings of the corporate world to the increasingly solicitous American policy environment.

Pity the Rich

The second distinct claim embodied in the complaints of the top 0.1 percent was that government and the public were parasitic on the accomplishments of the economic elite. Cooperman, in his letter to President Obama, wrote that the rich “employ many millions of taxpaying people, pay their salaries, provide them with health care coverage, start new companies, found new industries, create new products, fill store shelves at Christmas, and keep the wheels of commerce and progress (and indeed of government, by generating the income whose taxation funds it) moving.” At a public event, he elaborated: “Our problem, frankly, is as long as the president remains anti-wealth, anti-business, anti-energy, anti-private-aviation, he will never get the business community behind him. The problem and the complication is the forty or fifty percent of the country on the dole that support him.”80

Tom Perkins complained, “I don’t think people have any idea what the one percent is actually contributing to America.” He suggested the problem might be fixed with a change in American voting rules: “You don’t get to vote unless you pay a dollar in taxes . . . If you pay a million in taxes, you should get a million votes. How’s that?”81

Wall Street was not the only place that Randian thinking flourished. Even in the more progressive Silicon Valley, similar grumblings could be heard. In a 2009 essay, the young cofounder of PayPal, Peter Thiel, wrote: “In our time, the great task for libertarians is to find an escape from politics in all its forms—from the totalitarian and fundamentalist catastrophes to the unthinking demos that guides so-called ‘social democracy.’ . . . The fate of our world may depend on the effort of a single person who builds or propagates the machinery of freedom that makes the world safe for capitalism.”82 Along with Patri Friedman—grandson of Milton Friedman—Thiel founded the Seasteading Institute, a nonprofit dedicated to creating a floating city in international waters to realize the libertarian dream of a pure free market.83

In the origin myth of the top 0.1 percent, globalization had given rise to a new meritocracy with a steep pyramid of value. Executives were no longer managers. They were supertalented entrepreneurs who produced the economic growth on which an ever-expanding public sector relied. American business needed the stern discipline of constant reinvention that only financial markets could provide. And, according to even relatively moderate voices within the new economic elite, so did the American middle class.

A Kinder, Gentler Libertarianism

Randianism was a strong brew—too strong for many in corporate circles. It would be a mistake, however, to ignore the watered-down form of it that was imbibed much more widely. Peter Thiel’s blunt libertarianism was an outlier in Democratic-leaning Silicon Valley. But open disdain for government was mainstream. Google’s CEO, Larry Page, envisioned a utopian “Google Island” where technologists could innovate free of government’s heavy hand. To Chamath Palihapitiya, a tech venture capitalist (and part owner of the Golden State Warriors), “Companies are transcending power now. We are becoming the eminent vehicles for change and influence. . . . If companies shut down, the stock market would collapse. If the government shuts down, nothing happens, and we all move on, because it just doesn’t matter.”84

Never mind that if government had shut down in the decades after World War II, none of the basic components of the iPhone or Google’s search algorithm—or, for that matter, the World Wide Web—would exist. Steve Jobs famously disdained government ineptitude. Yet he never could have created his pioneering products or made billions without the enormous reservoir of public investment and publicly trained talent that nurtured the innovation hub of Silicon Valley and fed directly into every major element of the technology on which he drew.

The Penny-Pinching Plutocrat

An even better representative of what might be called “soft Randianism,” however, is Pete Peterson. No Steve Schwarzman, Peterson is a moderate by the standards of the new elite. For one, he does not flaunt his vast wealth. “I’m not one that really enjoys living large,” Peterson wrote in his 2009 memoir. “I have no desire to be a conspicuous consumer. When I see a thirtysomething hedge funder loudly revving up his red Ferrari convertible in the Hamptons, I feel much more contempt than envy.”85

Yet Peterson has proved happy to spend conspicuously on one thing: building political support for big spending cuts. In the early 1980s, he emerged as one of the most vocal deficit hawks in the nation. After becoming a billionaire, he pledged $1 billion to a new private foundation headed by him, his wife, and his son Michael. The aim of the Peter G. Peterson Foundation, in its founder’s words, was to tackle “our nation’s massive, unsustainable debts and deficits.”86

Peterson’s obsession with the debt began around the time he started at Lehman. His first high-profile article, published in the New York Times Magazine in 1982, was titled “No More Free Lunch for the Middle Class.” It advocated steep cuts in retirement and health benefits to make these programs a safety net for the needy rather than insurance protections for the middle class.87 In his memoir, Peterson complained, “I see an indulgent country living almost entirely in the moment, afflicted with an aggravated case of myopia, and a sense of entitlement, one of my least favorite words.”88

The greatest example of this “indulgence,” in Peterson’s telling, was Social Security: “a vast Ponzi scheme” that would go “bankrupt subsidizing middle-class and affluent Americans.”89 Social Security was, in fact, restructured substantially in 1983. But Peterson deemed the bipartisan commission chaired by Alan Greenspan a disappointment because it “asked no sizable sacrifice from anyone who would retire for another twenty years.”90 Over the coming quarter century, even as Social Security declined as a source of income for the aged and secure private pensions disappeared, Peterson churned out book after book about how the bipartisan support for Social Security and Medicare was bankrupting the nation.

Peterson’s books all had the same conclusion: The middle class was living large, and benefits had to be cut. He had almost nothing to say about how to restrain health costs—the fastest-rising part of the federal budget, and a distinctively American pathology—beyond cutting benefits. Nor did he see much role for government other than providing a safety net. Peterson derided those who might see “government intervention as a spur to prosperity.” He joked, “Perhaps there exists a major ‘industrial’ policy that actually accomplished its grandiose purposes without baleful side effects. If so, however, the story is a well-guarded secret.”91

Rise of the Soft Randians

Peterson was hardly a voice in the wilderness. Soft Randians could be found throughout the business world and within both parties. Indeed, as the GOP moved right, the Democratic Party became their natural home. In part, this was because of the tighter and tighter links between the Democratic economic establishment and Wall Street, which we will look at in chapter 9. Many of the key soft Randians were tied to Democratic policy making through financial-executive-turned-DC-insider Robert Rubin. The former cochair of Goldman Sachs served as Clinton’s Treasury secretary and, in that capacity, helped push financial deregulation, balanced budgets, and modest new efforts on behalf of the disadvantaged—the triple package that came to be known as “Rubinomics.”92 (Rubin would then take a role as “senior adviser” at Citigroup, the financial conglomerate that Clinton had enabled with Rubin’s support; the firm went down in flames amid the 2008 financial crisis but not before Rubin had earned $25 million.)93

Other soft Randians included Larry Summers, Rubin’s successor as Treasury secretary, an academic economist who also championed deregulation. Shortly after leaving the Treasury, Summers opined: “There is something about this epoch in history that really puts a premium on incentives, on decentralization, on allowing small economic energy to bubble up rather than a more top-down, more directed approach.”94 Summers knew something about incentives: he earned millions by working with hedge funds and other financial institutions after leaving office.95

But nobody spent like Pete Peterson. Even by the high-rolling standards of the day, his advocacy set new records. The foundation funded the Peterson Institute for International Economics, the Committee for a Responsible Federal Budget, the Concord Coalition, the Comeback America Initiative, and even a new web-based media outlet, the Fiscal Times. Peterson financed a string of youth-oriented organizations and invested in the development of a high school curriculum, “Understanding Fiscal Responsibility.” For several years, he funded a series of town hall meetings across the nation that brought together thousands of Americans to hear frightening warnings about the budget.96 (Apparently not frightening enough: When the meetings were done, 85 percent of participants backed raising Social Security taxes rather than cutting benefits.)97

Later, as the United States struggled through the deepest economic downturn since the Great Depression and most Americans understandably focused on the problem of unemployment, these and other Peterson-funded initiatives radiated across the nation. Huge events involving political luminaries such as Bill Clinton captured media attention and helped make the debt the number one topic in Washington. According to an analysis of media coverage by the National Journal, stories about the deficit eclipsed stories about unemployment beginning in early 2010—at a time when the unemployment rate was still above 9 percent and long-term unemployment remained at record levels. In early 2011 there were over two hundred stories per month on the deficit in the nation’s five biggest newspapers, compared with just sixty-five about unemployment.98

The policy impact of this impressive attention to Peterson’s cause is difficult to determine. But after the initial recovery package in 2009, overall government policy pivoted rapidly toward austerity. During the steep economic downturn of 1981, public employment rose steadily even during the darkest days. By contrast, after 2009, for the first time in modern American history, it actually fell, as state budgets plummeted and federal employment stagnated. By 2012, the gap between the historical pattern of public job growth and the post-2009 crunch—even without taking into account the expansionary effects of public spending on private employment—amounted to more than one million jobs.99

Pain Above Party

The roster of political figures involved in Peterson’s efforts came from both parties. In 2011 the major Peterson-backed groups handed out a “Fiscy” Award to Republican budget chair Paul Ryan for his “leadership on confronting our fiscal challenges.” Ryan’s “leadership” consisted of putting out a budget plan that slashed spending but never specified where those cuts would come from and included so many tax cuts that, even with all the magic asterisks, it contemplated deep deficits for years to come.100 Apparently, reducing benefits for the middle class was leadership enough.

In 2010 Peterson also funded the staff of an ostensibly federally run commission authorized to come up with a budget compromise. When the commission failed to produce a plan with sufficient support to pass on to Congress, he created a commission of his own and employed the bipartisan duo that had run the federal predecessor: Democrat Erskine Bowles (former chief of staff to President Clinton) and Republican Alan Simpson (an ex-senator from Wyoming). Bowles had spent the previous decade earning millions on corporate boards. He raised eyebrows by praising Paul Ryan, who had voted against the federal commission’s proposal because it countenanced higher taxes. But Simpson earned the greater attention when, in an email to a women’s group, he called Social Security “a milk cow with 310 million tits!”101

Despite these off-message moments, media coverage of Peterson’s initiatives was overwhelmingly favorable. Allying with Peterson was portrayed as an act of statesmanship rather than an endorsement of a specific, contestable viewpoint. When Peterson launched the Campaign to Fix the Debt, in 2012, 127 of the nation’s top CEOs signed up, with many donating $1 million of their own money to the cause. One of the enthusiastic backers of Fix the Debt, Jimmy Lee, vice chairman of JPMorgan Chase & Company, likened the nation’s challenge to that of a troubled company: “Some companies tinker, just sort of tinker around the edges. Other companies, on the other hand, really recognize the problem . . . and start thinking about the twenty-, fifty-, hundred-year future of the company and spend some time trying to fix it in a really big, material way.”102

In the corporate parlance that appealed to soft Randians, “fixing things” meant “downsizing”—downsizing government and downsizing the unrealistic expectations of the middle class that government had fostered. “America’s lifestyle expectations are far too high and need to be adjusted so we have less things and a smaller, better existence,” explained one billionaire, Jeff Greene, who had made his fortune betting against subprime mortgages. Greene spoke in Davos, Switzerland, at the World Economic Forum’s annual gathering of top public officials and corporate leaders. Greene had traveled to the forum in a private jet, accompanied by two nannies for his children. He reported he was planning a conference to be held at a fancy hotel in Palm Beach, Florida, titled “Closing the Gap.”103

Two Americas

At the same time that the complaints of the top 0.1 percent were providing a vivid image of America’s changing elite, a more systematic picture was emerging from academic research. And what it found was what F. Scott Fitzgerald had said long ago of the world of privilege he chronicled: “The rich are very different from you and me.”104

Thinking Like a Millionaire

Perhaps the simplest question tackled by this new research is also the most difficult: What do the superrich, as a group, think about politics and policy? Difficult, because the very rich rarely show up in opinion surveys. Never do more than a handful get contacted, much less deign to respond—or so it was until 2011, when a team of political scientists used an intensive interview strategy to reach a sample of wealthy citizens large enough to chart their opinions. The average wealth of the respondents was over $14 million; average annual incomes exceeded $1 million. And they turned out to be very different from you and me. As the study’s three authors—the political scientists Benjamin Page, Larry Bartels, and Jason Seawright—explained,

Our evidence indicates that the wealthy are much more concerned than other Americans about budget deficits. The wealthy are much more favorable toward cutting social welfare programs, especially Social Security and health care. They are considerably less supportive of several jobs and income programs, including an above-poverty-level minimum wage, a “decent” standard of living for the unemployed, increasing the Earned Income Tax Credit, and having the federal government “see to”—or actually provide—jobs for those who cannot find them in the private sector. . . . Wealthy Americans are much less willing than others to provide broad educational opportunities. . . . They are less willing to pay more taxes in order to provide health coverage for everyone, and they are much less supportive of tax-financed national health insurance. The wealthy tend to favor lower estate tax rates and to be less eager to increase income taxes on high-income people.105

Perhaps the most arresting finding was that the richer the respondent, the more conservative he or she generally was on economic issues. This result was more than a statistical artifact: The richest of the superrich were also more likely to be Republicans, but that didn’t explain their increased conservatism. At the same time, other differences—age, race, gender, even occupation—mattered little. Upton Sinclair would have been unsurprised. The greater the potential conflict between ideology and income, the larger self-interest loomed.

One Dollar, One Vote

Do these distinctive positions of the affluent make a difference? Apparently so: In a blockbuster study published in 2013, two political scientists, Martin Gilens and the just-quoted Benjamin Page, examined a familiar question: How many of the policies that the public supports become law? But they gave it a novel twist: They separated out the opinions of the bottom, the middle, and the top of the income distribution. (Of course, because these were typical opinion surveys, their definition of the top was pretty broad—survey respondents at the 90th percentile.) They also calculated where major interest groups such as the Chamber of Commerce stood on each of the issues covered by their survey data.106

The result? When all these groupings were considered, neither the middle class nor the poor seemed to have any influence. Instead, whether policy changed related most strongly to whether those at the top of the economic ladder supported the change. And among major interest groups, it was business groups that were the most influential.

Of course, this upper-class tilt doesn’t mean government never passed laws that poor or middle-class voters wanted. When it did, however, they were usually laws that the well-off and business groups wanted, too. On many issues, in fact, there wasn’t a huge difference in positions across the income spectrum. But when it came to economic policy, the disagreements were often sharp. (And, in all likelihood, they would have been even sharper had the surveys separated out the superrich.) The top income group was substantially more conservative—indeed, further from the middle in its views than the middle was from the bottom—and its distinctive views appeared to have the greatest influence on policy.

Furthermore, these striking findings probably understate the disconnect. Even though they encompass nearly two thousand survey questions, opinion polls don’t ask about many of the low-profile but lucrative policies that favor the top 0.1 percent—policies that are generally crafted behind closed doors, with limited public scrutiny or input. Over these two decades, Americans were never asked about the carried-interest provision that helped Steve Schwarzman pay low federal taxes, or about the various deregulatory changes that enabled Sandy Weill and others to profit from a much riskier and more lucrative financial sector. Yet even on high-profile issues that garner the attention of pollsters (and, presumably, voters), the affluent seem more than capable of holding their own. The results bring to mind the complaint of one of the superrich’s most prominent dissenters, Warren Buffett: “There’s a class war, and my class is winning.”107

Still, there was one area where the general public and the superrich were on the same page: They both hated government.

Right Turn?

These new political studies confirm what common sense suggests: Most Americans are far less conservative on economic issues than those at the top. Indeed, students of public opinion have found no evidence of a consistent rightward shift in public opinion since the 1980s. Surveys have only a limited capacity to capture public views in ways that are true to Americans’ complex and sometimes contradictory opinions. But nothing in them suggests a fundamental shift in values comparable to the elite turn against the mixed economy. For example, Americans appear increasingly inclined to believe that hard work (rather than luck or help from others) determines success in life.108 But the change is modest and seems as likely to reflect the economic strains of these decades as any change in underlying attitudes.

True, the antitax fervor of the late 1970s was a mass movement with significant public support, and Reagan’s election reflected a notable right turn in public opinion. Within a few years, however, the pendulum swung back. The political scientist James Stimson has assembled an index of the “public mood” that tracks the ideological movement of the American electorate over the past sixty years. Stimson’s measure has fluctuated over time, but it hasn’t shifted dramatically or consistently to the right (or left).109 As another political scientist sums up the voluminous data, “There is virtually no compelling evidence that more Americans have actually embraced conservatism since the 1960s.”110

This basic stability of opinion does not, however, extend to Americans’ trust in government. Here the pendulum has swung toward distrust, and stayed there. At the tail end of Eisenhower’s presidency, in 1958, over 70 percent of Americans said they trusted the federal government “to do what is right” most of the time or just about always. In 1980 only 25 percent of Americans expressed that level of trust. The rest—almost exactly the same proportion that had expressed high trust in 1958—said government could be trusted only some of the time or never. And while trust in government recovered briefly in the wake of 9/11, it was back at record low levels by the late 2000s.111

Experts still debate why trust plummeted in the United States. Nearly all big institutions have lost the public’s trust since the 1960s (though with notable exceptions, such as the military). Yet government has earned a distinctive, and distinctively deep, opprobrium. The initial loss of faith seems to have been linked to discrete events—the urban riots of the 1960s, the Vietnam War, Watergate—but the persistent flatlining remains a puzzle. The strongest explanations center on the increased insecurity of middle-class Americans and, more important, the sense that government could do nothing to remedy it. Put simply, many Americans lost confidence in the capacity of government to safeguard economic prosperity and security, and the party of government—the Democrats—paid the price.112 Once trust fell, it took little time for political entrepreneurs to realize that stoking antigovernment sentiments was a winning strategy. Politicians used to run for Washington; now they ran against it.

One thing is clear: Distrust among the public is rooted in different considerations than those animating the aggrieved 0.1 percent. A key complaint, in fact, is that those at the top have undue influence. In 1964, according to the premier academic survey of US voters, the American National Election Studies, more than two-thirds of Americans felt that government was “run for the benefit of all the people.” By 2008, only 30 percent felt that way; the other 70 percent believed that it was “pretty much run by a few big interests looking out for themselves.” While those at the top complained about politicians’ craven responsiveness to the electoral rabble, the rest of Americans complained that the rich ran the show.113

Whatever the cause, declining public faith in government pushed in the same direction as the Randian leanings of the economic elite: away from the mixed economy. At the individual level, low trust appears to weaken liberal leanings among voters. Public support for specific programs that provide direct benefits, such as Social Security, appears largely impervious to declining trust. Yet when it comes to more distant or prospective benefits—the benefits typical of the long-term investments that the mixed economy requires—low trust means little willingness to put faith in public officials or their policies. If waste and incompetence are rife, if tax dollars are thrown down bureaucratic rat holes, if politicians are in it only for themselves or those who shower money on them, why support new initiatives or put faith in existing policies where the benefits are hard to see? “Trust but verify” becomes “distrust and defeat.”

The Sound of Silence

The antigovernment wave that formed in the 1970s was a result of political mobilization and rhetorical creativity, the economic shocks of the decade, and the changes in understanding and advocacy that flowed from them. The first tremors were set off by stagflation and its ideological reverberations. But the wave gained power and speed because of the self-reinforcing economic and policy transformations that were unleashed by this initial surge—and the dramatic shifts in America’s economic elite they fostered.

These changes did not go unnoticed or occur without pushback. Yet those who sought to defend or resurrect the ideas under siege found themselves caught in what communications experts call a “spiral of silence.” In such a spiral, opinions become dominant because of acquiescence as well as acceptance. Even if individuals do not agree with an idea, their sense that it is shared broadly makes them reluctant to voice dissent. In time, this anticipation can create self-fulfilling cycles—a “spiral”—in which conflicting ideas are pushed to the periphery. When alternative understandings are no longer voiced confidently, we collectively forget their power.

Consider the fate of the label liberal. Why did Americans turn away from identifying with a political tradition that had played a vital role in defeating fascism and securing postwar prosperity? Two leading political scientists, Christopher Ellis and James Stimson, give a partial answer:

What changed . . . was that astute politicians on the left stopped using the word liberal to describe themselves. Before the change, the public saw liberal aligned with popular Democratic programs. . . . This is a curious case where what is individually rational, for individual politicians to avoid the liberal label, may be collectively nonrational, as they become subject as a class to being associated with an ever more unpopular label as it goes undefended. And as popular politicians avoid the liberal label, it provides an opportunity for their conservative opponents to fill the vacuum with unpopular personalities and causes. The asymmetrical linguistic war sets up a spiral in which liberal not only is unpopular, but becomes ever more so.114

The positive conception of government at liberalism’s core became snared in a similar spiral. When President Eisenhower delivered his first State of the Union Address, he drew on a broad reservoir of support for the mixed economy. He took for granted that government made fundamental contributions to our shared prosperity. Those within his party who thought otherwise were marginalized. Business leaders, too, recognized that they had to engage with government and labor as partners. Many genuinely accepted the partnership, but all understood that they had to accommodate it.

Forty years later, when Bill Clinton took the podium, the world looked different. The reservoir of enthusiasm for government was dry, baked away by the relentless attacks on government that politicians of both parties had found were the surest way to national office. Declining public trust eroded support for active government and created a political vacuum that powerful private interests filled. A revitalized Republican Party led the assault, as chapter 8 will make clear. Yet even the party of government—and those like Clinton who led it—found the spiral of anti-Washington sentiment hard to escape, especially as those powerful private interests became increasingly central sources of financial support.

The corporate world had changed as well. The financial restructuring that began in the 1980s reshaped the character, leadership, and culture of American business. Among those favored by these changes, older understandings of what produced prosperity gave way to new conceptions of the relationship between business and government, the process of wealth creation, and the contribution of managers versus workers—conceptions sharply at odds with those supporting the mixed economy. In the new corporate world, business leaders who praised the active role of government were harder to find. No less fateful, so were business associations that could engage with political leaders to pursue broad prosperity—with profound consequences for the mixed economy that the next chapter will explore.