Hugh Carey was governor of New York in the mid-1970s. New York City was going bankrupt. Plenty of other cities and states across what was already by then known as the frost belt were in similar shape. However, in what was also then coming to be known as the Sun Belt, it was, relatively speaking, boom time. The population of the South and Southwest would explode by 40 percent over the next two decades, twice the national rate. Meanwhile, in Carey’s words, Yankeedom was turning into “a great national museum” where tourists could visit “the great railroad stations where the trains used to run.”1
Actually, tourists weren’t interested. Abandoned railroad stations might be fetching in an eerie sort of way, but the rest of the museum was filled with the artifacts of recent ruination that were too depressing to be entertaining. It is true that a century earlier, during the first Gilded Age, the upper crust used to amuse itself by taking guided tours through the urban demimonde, thrilling to sites of exotic depravity or ethnic strangeness, traipsing around rag-pickers’ alley or the opium dens of Chinatown, or ghoulishly watching poor children salivate over store window displays of toys they could never touch.
Times have changed. Better out of sight and out of mind. Nonetheless, the national museum of industrial ecocide, a mobile collection moving from city to city, grew more grotesque from year to year, decade to decade.
For instance, Camden, New Jersey, had been for generations a robust, diversified small industrial city. But by the early 1970s, its reformist mayor, Angelo Errichetti, said it “looked like the Vietcong had bombed us to get even. The pride of Camden… was now a rat-infested skeleton of yesterday, a visible obscenity of urban decay. The years of neglect, slumlord exploitation, tenant abuse, government bungling, indecisive and short-sighted policy had transformed the city’s housing, business, and industrial stock into a ravaged, rat-infested cancer on a sick, old industrial city.” If we keep in mind that the mayor could describe things this way over forty years ago, and that we can still read stories today about Camden’s further decline into some bottomless abyss, it can help us reckon with how long it takes to shut down a whole way of life.2
Once upon a time Youngstown, Ohio, was a typical smokestack city, part of the steel belt running from Pennsylvania through Ohio and on into the heavy-industry core of the upper Midwest. As in Camden, things started turning south in the 1970s. Over a ten-year period beginning in 1977, the city lost fifty thousand jobs in steel and related industries. By the late 1980s, when it was “morning again in America,” it was midnight in Youngstown: foreclosures all over town, epidemic business bankruptcies and criminal and domestic violence, collapsing community institutions including churches and unions and families and municipal government, physical and emotional maladies left untreated. Burglaries, robberies, and assaults doubled after the plant closings. Child abuse rose by 21 percent, suicides by 70 percent in two years. There were an average of two thousand personal bankruptcies annually during the mid-1980s. One-eighth of Mahoning County went on welfare. The city filled up with the detritus of abandoned homes: work clothing, unpaid bills, scrap metal and wood shingles, shattered glass, stripped-away home siding, canning jars, swing sets, dead storefronts. Fifteen hundred people a week visited the Salvation Army’s soup line. The Wall Street Journal called Youngstown “a necropolis,” noting miles of “silent, empty steel mills” throughout the Mahoning valley and a pervasive sense of fear and loss. Bruce Springsteen would memorialize that loss in “The Ghost of Tom Joad.”3
The steel mills of Gary, Indiana, were built on a thousand acres of swampland and sand dunes by Elbert Gary, chairman of United States Steel during the first decade of the last century. His idea was to flee the growing labor pressures and municipal regulations of Chicago and Pittsburgh by putting up a “disposable city.” Just to be sure, the company rerouted the Calumet River so it might serve as a barrier between the mill and the town. Harper’s Weekly commented that “the strategic position” of the mill “indicates a premonition of trouble. The Gary steel-mills will be an open shop and the swarming hordes of Huns and Pollacks will think twice… before crossing the medieval mote [sic] to gain the industrial stronghold beyond.”
The best-laid plans go awry. In 1919 Gary exploded during that year’s steel strike. Martial law was declared and a thousand troops of the Fourth Division of the U.S. Army quashed the uprising. One reporter observed that “the magic city has become a weird nightmare.” Another nightmare, just as bleak in its own way, was a half century down the road. By 1985, the deindustrialization of the steel belt had left one-third of Gary’s population living below the poverty line. Dispossessed steelworkers in the “magic city,” who had once made $15 to $20 an hour, were lining up at Wendy’s hoping to work for $3.35.4
If you had been unfortunate enough to live in Mansfield, Ohio, for the last forty years, you would have witnessed in microcosm the dystopia of destruction that was unfolding in similar places. For a century, workshops there made stoves, tires, steel, machinery, refrigerators, and cars. Then Mansfield’s rust belt got wider and wider as one plant after another went belly-up: Dominion Electric in 1971; Mansfield Tire and Rubber in 1978; Hoover Plastics in 1980; National Seating in 1985; Tappan Stoves in 1986; a Westinghouse plant in 1990; Ohio Brass in 1990; Wickes Lumber in 1997; Crane Plumbing in 2003; Neer Manufacturing in 2007; Smurfit-Stone Container in 2009; and in 2010, GM closed its most modern and largest stamping factory in the United States and, thanks to the Great Recession, Con-way Freight, Value City, and Card Camera also shut down.
Midway through this industrial calamity, a journalist echoed Hugh Carey’s words as he watched the Campbell Works of Youngstown Sheet and Tube go dark, musing that “the dead steel mills stand as pathetic mausoleums to the decline of American industrial might that was once the envy of the world.” What’s particularly impressive about this dismal record is that it encompassed the alleged boom times of Reagan and Clinton. Through “good times” and bad, Mansfield shrank, becoming skin and bones. Its poverty rate now is 28 percent and its median income is $11,000 less than the national average. What manufacturing remains is nonunion, and $10 an hour is considered a good wage. Mansfield’s fate was repeated elsewhere in Ohio—in Akron, for example, once the rubber capital of the world, whose dismal decline was captured by the Pretenders in “My City Was Gone.”5
In the iciest part of the frost belt, a Wall Street Journal reporter noted in 1988 that “there are two Americas now, and they grow further apart each day.” She was referring to Eastport, Maine. There the bars filled up with the unemployed and underemployed. Although the town was the deepest port on the East Coast, few ships docked there anymore and abandoned sardine factories lined the shore. The journalist had seen similar scenes of a collapsing rural economy from “coast to coast, border to border”: museums displaying boarded-up sawmills and mines and storefronts, closed schools, rutted roads, and ghost airports.6
Closing up, shutting down, going out of business—last one to leave, please turn out the lights! So it was in towns and cities around the country. Public services—garbage collection, policing, fire protection, street maintenance, health care—atrophied along with the economic musculature. So too did the physical and psychic makeup of people, their body chemistry and moods and relations with others—things we don’t customarily think of as economic “assets,” but without which society grows weak and morbid. High blood pressure, cardiac and digestive problems, and mortality rates tended to rise. So too did doubt, self-blame, guilt, anxiety, and depression. The drying up of social supports among friends and workmates haunted inhabitants of these places just as much as the industrial skeletons around them.7
When Jack Welch, soon to be known as “Neutron” Jack Welch for his ruthlessness (and anointed by Business Week as “the gold standard against which other CEO’s are measured”), took over the running of General Electric in the 1980s, he set out to raise the stock price by gutting the workforce. Welch was frank: “Ideally you’d have every plant you own on a barge ready to move with currencies and changes in the economy.” Either that or shut them down (or at the very least keep amputating). During his first two years at the helm of GE, Welch laid off more than 70,000 people (20 percent of the workforce); three years later another 60,000 were gone. But imagine what it was like in places like Schenectady, which lost 22,000 jobs; in Louisville, where 13,000 fewer people were making appliances; in Evendale, Ohio, where 12,000 who used to make lighting didn’t anymore; in Pittsfield, Massachusetts, where 8,000 plastics makers got laid off; and in Erie, Pennsylvania, where 6,000 locomotive workers got pink slips.8
Life as it had been lived stopped in GE and other one-company towns. Nor did it always happen at the hands of an asset stripper like Neutron Jack. Take Solvay, a town near Syracuse. Solvay Process, a maker of soda ash, went belly-up in the mid-1980s, thanks to the inexorable economic mechanics of deindustrialization. When it did, the bars closed, the luncheonettes closed, the local trucking companies did too, and marriages ended and cars got repossessed, houses got foreclosed, and the local government all but closed as tax revenues shrank by half.9
Two traveling observers making their way through this American wasteland in 1984 depicted such places as “medieval cities of rusting iron.” It was a largely invisible landscape filling up with an army of transients, moving from place to place at any hint of work, camped out under bridges, riding freight cars, locked out of overstuffed mission houses, living in makeshift tents in fetid swamps, often armed, trusting no one, selling their blood, eating out of Dumpsters.10
Nor were the South, the rest of the Sun Belt, and the Mountain West immune to this wasting disease. At first, the region grew robust at the expense of the aging industrial heartland. Local political and business elites actively recruited companies from up north. They had a lot to offer: low taxes, municipal and state subsidies, loans, and land grants, plus no unions to speak of. The old Confederacy together with western newcomers like Phoenix or Orange County executed a historical about-face, making deindustrialization a form of regional revenge; the former “colonies” were now the headquarters for banks, mass retailers, and low- and high-tech manufacturers.
But the financial imperatives driving deindustrialization were eventually felt everywhere. Empty textile mills, themselves often runaway migrants from the north, dotted the Carolinas, Georgia, and elsewhere. Half the jobs lost due to plant closings or relocations in the 1970s happened in the Sun Belt. High-tech manufacturers also shut down. When Kemet, a maker of capacitors and other sophisticated electronics, closed its plant in Shelby, North Carolina, in 1998, the town and plant were likened to a “morgue,” “a morbid place” of resignation, and its people were sometimes accused of “rural malingering.” One such “malingerer” had this to say: “We’re up a creek [if] anybody gets sick or dies. There’s just that little feeling in the pit of your stomach like please don’t let disaster come.”11
Ten years later in Colorado Springs, one-third of the city’s streetlights were extinguished, the police helicopters were sold, the budget for watering and fertilizer in the parks was eliminated, and surrounding suburbs closed down the public bus system. In Prichard, Alabama, monthly pension checks to the town’s 150 retired workers stopped. During our Great Recession, one-industry towns like Elkhart, Indiana (“the RV capital of the world”), Dalton, Georgia (“the carpet capital of the world”), Blakely, Georgia (“the peanut capital of the world”), or Lehigh Acres, Florida (the housing boom), were closing libraries, firing the police chief, and taking other desperate survival measures. Tax revenues that used to pay for these elementary services were no longer available, thanks to the imploding of local economies and public policies that sought to appease big business by reducing or wiping out their tax liabilities.12
After the financial system tanked and budgets shriveled, Wilmington, North Carolina, could no longer provide fire protection because it couldn’t pay to replace worn-out motors for pumping water. Streets had become nearly unnavigable as they crumbled from disrepair; garbage got picked up less often; upkeep of public gardens, parks, and community centers stopped. Sidewalks rotted; recycling ended. Cops and firemen shared in the general demoralization.
Even exemplars of the Sun Belt industrial miracle were hard-pressed to sustain it. Phoenix, for example, had with great success invited in a broad range of industries, many of them at the technological cutting edge. By the turn of the new century, however, the city was well along the road to a postindustrial future. That prospective future had already become the present in such once unlikely places as San Jose, the capital of Silicon Valley, and in this case the future wasn’t working. Verging on bankruptcy, California’s third largest city was closing libraries, laying off city workers in droves, shutting down community centers, and asking residents to put up with gutted roads: “We’re Silicon Valley, we’re not Detroit. It shouldn’t be happening here. We’re not the Rust Belt,” lamented one shocked city councilman.13
MIT released a study in 2010 about America’s 150 forgotten cities, all doomed by deindustrialization. Not forgotten but down and out were such hallmarks of the country’s industrial rise and triumph as Buffalo, Cleveland, Albany, and Allentown, which topped the list of the country’s “Ten Dying Cities.”14
Detroit—no metropolis has been more emblematic of tragic decline from the outset of our era of disaccumulation right up to the present day. Once a world-class city full of architectural gems, with a population in the 1950s that had the highest median income and rate of home ownership of any major American city, the Motor City now haunts the national imagination as a ghost town, like all ghosts dead, yet alive. Once the nation’s fourth largest city, with 2 million residents, its decrepit hulk is home to 900,000. In one decade alone, from 2000 through 2010, the population hemorrhaged by 25 percent—that’s nearly a quarter of a million people, or about the size of post-Katrina New Orleans.
Vast acreage—one-third of the metropolitan land area, the size of San Francisco—is now little more than empty houses and empty factories and fields gone feral. A whole industry of demolition, cartage, waste disposal, and scrap metal companies has sprung up to tear down what once was; Plant Closing News, which began in 2003, is just one new enterprise that chronicles the Motor City’s decline (teardowns average one hundred each month). With a jobless rate of 29 percent, some of Detroit’s citizens are so poor they can’t pay for funerals, so bodies pile up at mortuaries. Bus routes have been eliminated, as have streetlights. Plans are afoot to let the grasslands and forests take over, to shrink-wrap the city, or to give it to private enterprise—or maybe all three.
Take the story of the Detroit Zoo. Once a fully populated, publicly owned home for wildlife of all sorts, it was nearly privatized in 2006. While an outcry stopped that from happening then, new management reduced it to the barest minimum of zookeepers and animals for them to care for. An associate curator once in charge of elephants and rhinos went in search of other work as his wages were disappearing along with the animals. He found a job with the city chasing down feral dogs, whose population had skyrocketed as more and more of the cityscape went back to wilderness.15
Thanks to information technology there are now websites offering “our delightful selection of de-industrial t-shirts, art, photography, and drawings,” including posters and postcards for those tourists susceptible to schadenfreude first dreamed of by Hugh Carey. Anything can be aestheticized in our postindustrial age, including dereliction and decay. In Detroit, according to the writer Paul Clemens, who spent a year “punching out” in a closing auto plant, this hypersophistication, part high-mindedness, part environmentalist chic, has become a cottage industry in its own right, converting industrial wastelands into artistically designed parks.16
In the Motor City (and in other core industrial centers like Baltimore) “death zones” have emerged, where neighborhoods verge on medical collapse.
Perhaps these stories of decline and decay merely constitute local instances and symptoms of a familiar story about capitalism’s penchant for “creative destruction.” Old ways die off, sometimes painfully, but that’s part of the story of Progress. New wonders appear where old ruins stood and people get healthier and wealthier, if not necessarily wiser.
But what if Edward Bellamy’s time traveler from 1888 woke up in Boston in the year 2000 and found to his dismay that society seemed headed back to where he’d come from? Cities were decaying, people were growing poorer and sicker, bridges and roads were crumbling, sweatshops were growing more common, more people were incarcerated than anyplace else on the planet, workers were afraid to stand up to their bosses, unions were barely managing to stay alive, the air, water, and land were filling up with poisons, schools were failing, daily life was growing riskier, debts were more onerous than most could have imagined, inequalities were starker than ever.
A recent grim statistic suggests this dystopian view may be not quite as far-fetched as it might otherwise seem. For the first time in American history the life expectancy of white men and women has dropped. Researchers have reported that the life spans of the least educated have actually fallen by about four years since 1990. The steepest declines were among white women lacking a high school diploma, who lost five years of life, while for men the drop was three years. These numbers are not only unprecedented for the United States but come close to matching the catastrophic decline suffered by Russian men in the years following the collapse of the Soviet Union.
Because these life and death statistics are linked to where one ranks in the economic/educational hierarchy, they are also evidence of a country’s ever-growing inequality—in this case, a fatal inequality. The life expectancy of African Americans has always trailed, and still does trail, far behind that of the white population (although now, because of this recent decline, black women and less educated white women are in the same boat).
But the more global import for assessing what’s been happening over the last generation or so is captured by another number: as of 2010, American women had fallen to forty-first place in the United Nations ranking of international life expectancy, from fourteenth in 1985. Among developed countries, American women now rank last. Younger Americans die earlier and live in poorer health than their counterparts in other developed countries. Indeed, deaths before the age of fifty account for about two-thirds of the life-expectancy gap between American men and their counterparts in sixteen other developed countries, and about one-third of the gap for women.17
Why? This demographic crash landing is so unanticipated it is mystifying. The obesity epidemic, rising rates of smoking among women, wider and wider abuse of prescription drugs, guns, spreading psychosocial dysfunction, millions and millions lacking health insurance, overwork, “death zones”—singly and together—are all being examined. Whatever the final explanation, this social statistic is the rawest measure of a society in retrogression, of a country in the throes of economic anorexia. At the most elementary level, it is a stark repudiation of one of the nation’s most coveted conceits: that the New World is the land of Progress ne plus ultra.
And there is one other marker of this eerie, counterintuitive story of a developed nation undergoing underdevelopment. It too is a reproach to an equally cherished national tenet. For the first time since the Great Depression, the social mobility of Americans is moving in reverse. Every decade since the 1970s, fewer people have been able to move up the income ladder than in the previous ten years. Now Americans in their thirties on average earn 12 percent less than their parents did at the same age. Danes, Norwegians, Finns, Canadians, Swedes, Germans, and the French all enjoy higher rates of upward mobility than Americans. Remarkably, 42 percent of American men raised in the bottom one-fifth income cohort remain there for life, compared with 25 percent in Denmark and 30 percent in the notoriously class-stratified land of Great Britain.18
We have become familiar with this lament of what is loosely called “the vanishing middle class.” Except for the top 10 percent, everyone else is on the down escalator. The United States now has the highest percentage of low-wage workers (earning less than two-thirds of the median wage) of any developed nation. The standard of living for most Americans has fallen over the last quarter century, so that the typical household income in 2012 was just what it was twenty-four years earlier. The comedian George Carlin once mordantly quipped about what can be described only as a lost illusion, “It’s called the American Dream because you have to be asleep to believe it.”19
During the long nineteenth century, wealth and poverty lived side by side. So they do again today. In the first instance, when industrial capitalism was being born, it came of age by ingesting the valuables (land and inorganic resources, animals and vegetables and human muscle power, tools and talents and know-how, and the ways of organizing and distributing what got produced) embedded in precapitalist forms of life and labor. Because of this economic metabolism, wealth accumulated in the new economy by extinguishing wealth in these precapitalist older ones.
Whatever the human and ecological costs, and they were immense, the hallmarks of Progress, at least in a strictly material sense, were also highly visible. America’s capacity to sustain a larger and larger population at rising levels of material well-being, education, and health was its global boast for a century and a half.
These statistics about life expectancy and social mobility suggest those days are over. Wealth, great piles of it, is still being generated, some of it displayed so ostentatiously it is impossible to overlook. Technological marvels still amaze. Prosperity exists, although those living within its charmed circle constitute a less and less numerous caste. But a new economic metabolism is at work.
Primitive accumulation at the expense of noncapitalist ways of life accounted for that earlier epoch of Progress. For the last forty years, however, prosperity, wealth, and progress have rested in part on the grotesque mechanisms of auto-cannibalism, or what has been called disaccumulation, a process of devouring our own.20
Traditional forms of primitive accumulation continue elsewhere—across the vastnesses of rural China and Southeast Asia, in the villages and mountains of Latin America, in the forests and plains of Africa. In these places, peasants, peddlers, ranchers, petty traders, scavengers, handicraftsmen, and fishermen get engorged by great capitalist enterprises that are often headquartered abroad. These hundreds of millions provide the labor power and cheap manufactures that buoy up the bottom line of global manufacturing and retailing corporations, banks, and agribusinesses.
Here in the homeland, however, the profitability and prosperity of privileged sectors of the economy—first of all, in the arena of finance—have depended instead on stripping away the meat and bone from what was built up over generations.
Once again a new world had been born. This time it depended on liquidating the assets of the old one. This liberated capital might be shipped abroad to reward speculation in “fictitious capital” (currency and commodities markets, securitized debt, and so on). Or it might be invested to restart the engines of primitive accumulation in the vast global outback. The rate of domestic investment in new plants, technology, and research and development began declining in the United States during the 1970s, and the falloff actually accelerated in the gilded ’80s. Dismantling took many forms: deliberately allowing plant and equipment to deteriorate until useless, using it as a cash cow via resale, milking profits and sending them elsewhere rather than reinvesting to update facilities, shutting down part of a plant and running it at low levels, or selling off parts to a jobber.
During the 1970s alone, between 32 and 38 million jobs were lost due to this kind of disinvestment, which was common practice in old industries (New England textile factories) and new ones alike (New England aircraft manufacturers). Manufacturing, which after the Second World War accounted for nearly 30 percent of the economy, by 2011 had dropped to a bit more than 10 percent. Since the turn of the millennium alone, 3.5 million manufacturing jobs have vanished and 42,000 manufacturing plants closed. On average between the years 2000 and 2011, seventeen American manufacturers closed each day.21
Nor are we witnessing the passing away of relics of the nineteenth century. Of the 1.2 billion cell phones sold in 2009, none were made in the United States. In 2007, a mere 8 percent of all new semiconductor plants under construction globally were located here. The share of semiconductors, steel, cars, and machine tools made in America has declined precipitously just in the last ten years or so. Even the money sunk into high-tech telecommunications has been largely speculative, so that only 1 to 2 percent of the fiber-optic cable buried under Europe and North America was ever turned on, even at the height of the dot-com boom. Today, only one American company is among the top ten in the solar power industry (photovoltaic cells), and the United States accounts for a mere 5.6 percent of world production. Only GE is among the top ten in wind energy. Much high-end engineering design and R&D work has been expatriated as well. By 2004 America trailed China in exports of high technology. The portion of printed circuit boards, which are at the heart of advanced industry, produced here at home has steadily declined and now accounts for 8 percent of global output compared with 26 percent in 2000. The cherished belief that stuff made in America is inherently of superior quality is encouraged by companies like Apple. The company cultivates that idea in its iPhone ads, noting that the product is “designed in California.” But it’s not built there. Today there are more people dealing cards in casinos than running lathes and almost three times as many security guards as machinists.22
Meanwhile, the fastest growing part of the economy has been the finance, insurance, and real estate (FIRE) sector. Between 1980 and 2005, profits in the financial sector increased by 800 percent, more than three times the growth in nonfinancial sectors. During the ten years from 1978 to 1987, profits in the financial sector averaged 13 percent. From 1998 through 2007, they came in at a stunning 30 percent.23
Creatures of finance, rare or never seen before, bred like rabbits. In the early 1990s, for example, there were a couple of hundred hedge funds; by 2007, there were ten thousand. A “shadow banking” system consisting of hedge funds, private equity firms, security brokers dealing in credit instruments, and an array of mortgage entities grew up alongside the conventional one to account for half of the whole financial industry before the crash of 2008. A whole new species of mortgage broker now roamed the land, supplanting old-style savings and loan institutions and regional banks. Fifty thousand mortgage brokerages employed four hundred thousand brokers, more than the whole U.S. textile industry. A hedge fund manager put it bluntly: “The money that’s made from manufacturing stuff is a pittance in comparison to the amount of money made from shuffling money around.” Forty-four percent of all corporate profits in the U.S. come from the financial sector compared with only 10 percent from the manufacturing sector. Lawrence Summers, Bill Clinton’s last secretary of the treasury, succinctly summarized where matters stood: “Financial markets don’t just oil the wheels of economic growth; they are the wheels.”24
Too often these two phenomena—the evisceration of industry and the supersizing of high finance—have been appreciated as parallel to but not organically tied to each other. Yet another fable, adorned with the hieroglyphics of differential calculus, tells a reassuring story. Sad it might be that, for some people, towns, cities, and regions, the end of industry meant the end. But that is (as it always has been), so the myth goes, only the unfortunate yet necessary prelude to a happier future pioneered in this latest case by “financial engineers” equipped with a new technical know-how to turn money into more money while bypassing the intermediary messiness of producing anything. And lo and behold: prosperity! This tale, however, contains a categorical flaw.
Actually, the ascendancy of high finance was premised on gutting the industrial heartland. That is to say, the FIRE sector not only supplanted industry but grew at its expense—and at the expense as well of high wages and the capital that used to flow into those arenas of productive investment.
Think back only to the days of junk bonds, leveraged buyouts, megamergers and acquisitions, and asset stripping in the 1980s and ’90s. What was getting bought, stripped, and closed up—all in order to support windfall profits in high-interest-paying junk bonds and the stupendous fees and commissions paid to “engineer” these transactions—was the flesh and bone of a century and a half of American manufacturing. Wall Street’s renewed preeminence, its own “morning again in America,” was in every way bound up with this midnight vanishing of a distinct species of American economic and social life. For a long time now, our political economy has been driven by “I” banks, hedge funds, private equity firms, and the downward mobility and exploitation of a casualized laboring population cut adrift from its more secure industrial havens.
“Deindustrialization” is antiseptic terminology for social devastation. In fact, it marked a fundamental overturning of a whole way of life, transforming the country’s economic geography and political demographics, bearing with it enormous social and cultural ramifications. Whole towns, regions, unions, churches, schools, local businesses, and community hangouts, political alliances, venerable traditions, and historic identities went down with the smokestacks. Feelings of despair, loss, and resentment filled the emptiness.
In their landmark 1982 book The Deindustrialization of America, Barry Bluestone and Bennett Harrison plotted this fateful interconnection. Deindustrialization, as they saw it, entailed the diversion of capital “from productive investment in our basic national industries into unproductive speculation, mergers and acquisitions, and foreign investment.” This did not occur by happenstance; it was because the rate of profit in American industry began a long-term decline in the 1970s in the face of heightened competition from the reconstructed, postwar economies of former enemies and allies alike.25
For the first time in nearly a century the country bought more than it sold on the international marketplace. Trade deficits became a permanent fixture of the U.S. economy. Signs of decline mounted. The U.S. share of global GDP fell from 34.3 percent in 1950 to 24.6 percent in 1976; American oil production declined from 50 percent after the war to 15 percent; steel from 50 percent to 20 percent. Japan captured the home electronics market (TV, video, numerically controlled machine tools) as well as big chunks of the car, textile, and shoe businesses. Germany did the same in metalworking. The American share of the world market in manufactured goods shrank by 23 percent. Productivity growth slowed and even shrank by the end of the 1970s. During the next decade, the Bureau of Labor Statistics estimated that between 1.5 and 2 million jobs evaporated each year as factories and ancillary businesses shut down.
National alarms went off. In 1980, Business Week published a special issue on “The Re-industrialization of America” (how naïve that sounds now, when talk about how to re-create American industry is mainly dismissed by “experts” in the age of finance as Luddite babble). The New York Times ran a five-part series on the same subject and Congress held hearings. Remedies of various kinds were applied. Some, like compelling international rivals to raise the value of their currencies relative to the dollar, helped save or enlarge American export markets temporarily. Corporate profits could also be sustained artificially by cutting taxes, by loosening the constraints of government regulation, by floating ever-larger budget deficits in part to finance a vast expansion of the arms industry. Forcing down costs, especially labor costs and the social costs of the welfare state, would lessen the pain for business by increasing it for working people. This made union busting and cuts in social programs and infrastructure maintenance and replacement a fixture of public life. Slicing away at the accumulated fat of corporate middle-level management and the rusting productive facilities they managed became part of the tactical repertoire.26
None of this could in the long term solve the underlying, intractable problem of depressed profit rates, overcapacity of production, and the paucity of lucrative outlets for new investment. Only depressions performed that kind of radical surgery, with great thoroughness and ruthlessness. By denuding the economic landscape of less remunerative enterprises, those that survived such deep downturns could start afresh, buying up bankrupted assets on the cheap and paying needier workers a fraction of what they once earned. Draconian credit tightening of the sort attempted by Federal Reserve Board chairman Paul Volcker at the end of the 1970s came too scarily close to simulating that kind of across-the-board triage. Indeed, it was too unsettling a reminder of that solution infamously suggested fifty years earlier by Secretary of the Treasury Andrew Mellon when he recommended curing the Great Depression by allowing it to run its course and “liquidate labor, liquidate stocks, liquidate the farmers, liquidate real estate.” Volcker didn’t venture quite that far, but did manage to produce the Reagan recession of the early 1980s. More dying branches of American industry were pruned away; however, the cure threatened to be worse than the disease. The recession was the severest since the 1930s.27
Short of this sort of heroic purging of the economy, however, currency manipulations, military spending, government-financed trade deficits, rejiggering the tax structure in favor of business and the wealthy, social austerity, deregulation, and an armored assault on the ossified structures of corporate America would have to do. These measures produced good-enough results during the Reagan era, creating a chimera of prosperity—an illusion, that is, unless you were attached to one of the charmed circles surrounding the petrochemical, finance, real estate, and telecommunications industries.
It was all too easy to be seduced by the flourishing of various forms of “paper entrepreneurialism,” by the escalation of corporate profits, by the magical mushrooming of venture capital (up 544 percent during the decade), by the frantic race among universities to set up “entrepreneur” course offerings, and by the atmosphere of luxe that made its debut at the Reagan inaugural ball. A thriving rentier class—those relying on various forms of investment income (interest, capital gains, rent, dividends)—made it harder to see that the percentage of national income paid out in wages and salaries fell by one-tenth. Few noticed that during the same decade the income of the bottom 10 percent of the population (25 million people) also fell by one-tenth—this was the first time since the 1930s that such a sizable number of citizens had suffered a serious decline in their standard of living.28
A rentier society is not necessarily a prosperous one. Actually, the growth in GNP was much less than it had been during the 1950s or ’60s, and even less than the lamentable ’70s. Net business investment fell and there was a great shrinkage in new capital formation, notwithstanding the tax cuts that were designed to encourage it. The fifty corporations receiving the largest tax breaks from the Reagan cuts of 1981 actually reduced their investment over the next two years.29
Asset stripping and financial deregulation in particular thus worked in tandem. Amending, paring down, and repealing a slew of financial prohibitions and supervisory agencies did not cause capital resources to move away from production into various forms of financial speculation. Instead, these measures expedited an outflow of capital already under way, thanks to the dilemma of profitability in American industry.
For example, we think of usury laws as medieval legislation. But as a matter of fact, usurious rates of interest were illegal in the United States as late as the 1970s. Then those laws were modified or abolished. So too, Federal Reserve regulations once required minimum down payments and maximum periods of repayment for housing loans, and another set of regulations required the same for credit card loans for cars, appliances, and durable goods of all sorts. Beginning during the Carter administration and accelerating during the Reagan and Clinton regimes, all of this was dismantled. Commercial banks were once restricted in the interest rates they could pay on deposits. By 1982 they weren’t. Later in the decade, they were allowed to underwrite commercial short-term paper, municipal bonds, and mortgage-backed securities. And, by the end of the Reagan years, they could even underwrite equities (that whole class of investments which, unlike bonds or most other loans, entailed ownership rights but also were much riskier), to the extent of up to 25 percent of their business.
Inflation and opening up the sluice gates of financial speculation made it difficult for savings and loan institutions to compete because they were still limited by law in what they could pay depositors and what they could invest in. So those rules were jettisoned, and S&Ls even started putting people’s savings into high-risk junk bonds, securities that would soon enough plummet in value but only after sucking dry the material wherewithal of the companies they were used to purchase. All this ended in the savings and loan debacle at the end of the Reagan administration, when 1,000 of the 3,400 S&Ls went bankrupt or lived on only as “zombie banks.” (Zombie banks are financial institutions with a net worth less than zero that continue operating thanks to government support.)
Between 1985 and 1992, more than 2,000 banks failed; only 79 had gone under in the whole of the otherwise dolorous 1970s. In the Clinton administration more prohibitions were relaxed; most notably, the Glass-Steagall Act, the New Deal law that separated commercial from investment banking, was repealed. These actions, including the freeing of the derivatives market from any government oversight, produced the asset bubbles that blew up in the stock market, dot-com, and, later, subprime mortgage debacles. New laws opened the door to the creation of financial holding companies that could do anything they pleased—including loaning, investing, and speculating—regardless of the barriers which had once separated these quite different, sometimes counterposed, activities.30
Boom times under Reagan, Clinton, and George W. Bush were premised on the securitization of everything in sight, from mortgages and student loans to credit card debt. A voluminous thesaurus of financial engineering inventions appeared: first exotic arcana like OIDS (original issue discount securities) and PIKS (payment in kind securities), then the deeply mystifying CDOs, credit default swaps, and off–balance sheet vehicles. There were bonds that paid interest in the form of other bonds. Wall Street became adept at converting even the homeliest forms of debt like car loans, boat loans, credit card bills, RV loans, and student debt into “asset-based securities.” Banks became megabanks as they competed for the riskiest, most lucrative paper. The bigger they got, the riskier they became. The whole system was so highly leveraged and precarious it was acutely vulnerable to the slightest disturbance.31
Deregulation outside the financial sector had the same withering impact. Carter initiated the deregulation of airlines, trucking, and railroads. Reagan was even more zealous, eliminating restraints on oil prices and electric and gas utilities. Telecommunications followed. Lowering costs was the rationale and sometimes that happened, sometimes not. For example, in the electrical power industry those states that deregulated in ways promoted by the energy company Enron saw their rates rise by $48 billion more than the average costs in states that retained traditional regulations.32
Long gone were the days when Wall Street did what it was purported to do: namely, raise capital to finance industry and other long-term enterprises. As one observer noted, collateralized debt obligations raised “nothing for nobody. In essence they were simply a side bet—like those in a casino—that allowed speculators to increase society’s mortgage wager without financing a single house.”33
Worse than that, however, the freeing of finance unleashed it to leech away the values accumulated over generations in American industry. Arguably the strategic goal of neoliberal policy was to liberate the powers of finance (while suppressing the social wage). The idea was to expand the mechanisms and techniques for originating, mobilizing, and marketing debt through the ingenuity of financial engineering at the expense of tangible resources while calling on the government (that is, the rest of us, free market ideology notwithstanding) to assume the moral hazard and social risk when default beckoned. Marx observed long ago that “all this paper actually represents nothing more than accumulated claims, or legal title, to future production.”34
Tithing the underlying economy in this way left it anemic and vulnerable. Under the regime of lean-and-mean, which remains with us to this day, hurdle rates for profits remained so high that the only way to meet them was to reroute capital out of long-term commitments (to research and development as well as to plant and equipment) and into portfolio investments in commercial and residential real estate, into buying up companies (often for resale), or into stock trading or currency hedging that generated short-term earnings.
All of this facilitated the merger and acquisition and leveraged buyout mania on the Street. At the same time it led to a relentless lowering of labor standards and erosion of unionization, because once deregulated, industries could no longer easily pass along some labor costs to consumers. As a result, workers in all these sectors—pilots, machinists, truck drivers—found their wages squeezed, their unions thrown on the defensive and surrendering to two-tier wage structures. New hires—that is to say, the future working class—would be compelled to move backward in time. Moreover, they would be accompanied on their journey by small-town and rural America. Deregulation of the whole transportation network penalized small towns, smaller cities, and low-income areas generally, as they lost airline and railroad connections, paid jacked-up local phone rates, and receded into invisibility.35
Not greed but rather the rigors of a survivalist competition to stay ahead of the profit curve produced a kind of economic brinksmanship. As the time horizon for realizing earnings grew shorter, capital was misallocated into financial manipulations, unproven technologies, and real estate, all of it heavily leveraged, its risks camouflaged. Little was added to the net stock of productive resources (except in the telecommunications industry, where too much was added and for the same speculative purposes). But as a consequence, a lot was added to the bank accounts of traders, brokers, bankers, and top managements who psyched out the market correctly.36
Corporations in trouble, no longer able or willing to finance themselves with retained earnings, needed to please Wall Street to keep afloat. And they did so for two decades and more by the kind of ferocious cost-cutting “Chainsaw” Al Dunlap made famous at Scott Paper and Sunbeam. Dunlap had a bulldozer voice and a personality to go with it. He was apt to crow about his working-class parentage—his father was a union shop steward at a shipyard, his mother worked at a five-and-ten—to show his sympathies as he slashed and burned. Best-selling author of Mean Business, he earned his spurs by immediately laying off one-third of the blue-collar workforce (11,000 people) and shrinking the white-collar battalions from 1,600 to 300 at Scott Paper.
Harley-Davidson, long practiced in this same art, was still at it during the Great Recession. The motorcycle maker stopped hiring in America and projected cutting loose more than one-fifth of its workforce, making up the lost capacity by ratcheting up the hours of those who remained. Practices of this kind were also common at corporations like Ford, General Electric, Alcoa, and Hasbro. Stock prices invariably soared as the mayhem descended. This strategy of deliberately wasting away was especially attractive to companies during the crisis periods of the dot-com collapse and the Great Recession.
Alternatively, a firm might look for new revenue not in manufacturing but by creating its own financial auxiliaries to speculate in everything from Eurobonds to currencies, credit cards and home mortgages, leases and insurance. Companies like General Electric and General Motors did this with a vengeance. GE Capital was responsible for 40 percent of GE’s revenue by 2008 and half its profits, dollops of which came from betting on collateralized debt obligations until they soured. Meanwhile, GE’s outlays for research and development dropped by 20 percent in the 1990s. Enron, before it became infamous, derived a similar share of its revenue not from power generation, but from lending, trading, and other financial activities, including fraud. It practiced a kind of commercial savagery. When a brush fire broke out in California, exacerbating that state’s summertime electrical energy crisis, an Enron trader, overjoyed that rates would rise, exclaimed, “The magical word of the day is ‘Burn Baby Burn.’ ” Another talked about the money they “stole from those poor grandmothers in California.”37
Investing in mergers and acquisitions was an appealing alternative only because the underlying assets of the companies being commingled or bought could then be pared down or liquidated, and pensions and health insurance obligations could be defaulted on, with the costs passed on to the public treasury. Devalued assets could in turn be picked up at bargain-basement prices and resold. Indeed, the point of most transactions was to buy in order to sell, which tended to leave long-term investment out in the cold.
It is only a slight exaggeration to say that the new corporations emerging out of this bazaar of buying and selling were in a new business: the fabrication of companies to trade back and forth. During the 1980s, nearly a third of the largest manufacturing firms were acquired or merged. It would be more apt to call all this churning rather than investing. In 1960, institutional investors held stock for an average of seven years; that was down to two years by the 1980s.38
Disaccumulation thus waged war against capitalism on behalf of capitalism. Absorbing or discarding the plant, equipment, and human resources locked up in manufacturing and other enterprises enriched the financial sector by eviscerating these other forms of capitalism. This should not be confused with what happened during the long nineteenth century. Big corporations and trusts then did drive smaller competitors into bankruptcy or take them over. But the net effect, looking at the situation strictly from the standpoint of production, was to enlarge the productive wherewithal of the nation’s capitalist economy.
Nor was the merger and acquisition mania of recent days like the conglomerate rage of the 1960s, in which companies diversified their portfolios and eluded antitrust law by acquiring, usually with their own funds, a grab bag of unrelated businesses. In that instance, the corporation as a social institution remained in place. True, the conglomerate era left behind behemoths of inefficiency and managerial ineptitude, and the lean-and-mean era would admittedly administer a purgative. But that corrective was a harsh one with immense collateral damage.
Moreover, many of the companies gobbled up, their allegedly incompetent managements evacuated, turned out to be better superintended by their old guard than by the buy-and-sell artists who momentarily replaced them. So, for example, high-octane rhetoric propelled deals like the 1998 merger of Chrysler and Daimler-Benz, which promised that costs would be saved, economies realized, and markets expanded. Two years later—once income had plummeted, market capitalization had grown anemic, share prices were halved, workers had been downsized, six plants had closed, and assets had been auctioned off—the deal was called a nightmare. Many others ended likewise, but their failures paid large dividends on Wall Street.
As one analyst has described it, the modern corporation has now diffused into a nexus of contracts in which shareholder value is served first of all and last of all by managements acutely sensitive to that constituency alone. American industry was melted down to provide a vast pool of liquid capital that could be rechanneled into purely speculative trading in paper assets—stocks, bonds, IPOs, mortgages, derivatives, credit default swaps, structured investment vehicles, collateralized debt. And that covers only the relatively humdrum. What about trading bundles of insurance contracts for the terminally ill? Securitization was the alchemist’s stone of dispossession. Tangible assets got liquefied and turned into bundles of tradable intangibles.
Most of this had negligible job-creating impact. But it offered rates of return no longer available elsewhere in the economy. It did, that is, until it didn’t. What were conceived of as ways of dampening risk (what else, after all, is a “hedge” fund supposed to do?) ended up aggravating it past the breaking point. After the dot-com crash of 2000, nearly all the thousands of IPOs of the late 1990s fell below their initial offering price. Half of those not already out of business were selling for $1 a share or less. The values incinerated in that implosion matched the value of all the homes in the United States. Their financial engineers had long since moved on.39
Just a few years later, during the global financial meltdown, the homes themselves would go extinct, washed away in an unnatural disaster. Between 2008 and 2011 three million people lost their homes. Eleven million homeowners found themselves “underwater” by 2011. Thirteen trillion dollars in houses and stock went up in smoke. By 2010, family income, long on a treadmill sloping down, was lower than it had been in the late 1990s. Investment in plant, equipment, and technology as a portion of GDP was less than in any decade since World War II. One in every seven Americans was being chased by a creditor. Some, in desperation, became reckless gamblers, loading up on even more precarious forms of debt in a futile race to stay one step ahead of the sheriff, engaged in a kind of self-sabotage.40
Yet for a generation financial engineers—and their ideological apologists—imagined themselves as crusaders. They were out to save the “exploited class” of shareholders being gulled by complacent and sometimes corrupt management. One business school professor noted that “maximizing shareholder value” was “embraced as the politically correct stance by corporate board members and top management.” It was a “mission-driven cause” that “overcame the wrongful allocation of capital and embodied the sacred identity of profit and private property.”41
Wrongful allocation indeed! What were often depicted as rescue missions turned out to be sophisticated forms of looting. Capitalism in whatever form has always encouraged such behavior, especially with regard to alien societies—and particularly those inhabited by people with the “wrong” complexions, physiognomies, and customs. Alongside the plundering, however, law-abiding enterprise grew up. But in our era of finance-driven speculation, the line between criminal looting and conventional “investment” grew hazier.
A preponderance of evidence accumulated suggesting that investment banking had become increasingly and even inherently corrupt. An avalanche of insider trading, market manipulation, misrepresentation, favoritism, kickbacks, conflicts of interest, and frauds of the most exotic variety has led one observer to describe the last twenty years as the “age of deception.” It’s unlikely that all of a sudden a whole subculture had descended into the criminal underworld (although the temptations to do so had arguably never been higher, and some obviously succumbed). More plausible and more worrying is that deception had become the new normal. Every player—“I” banks, regulators, IPO promoters, mutual fund operators, hedge funds, auditors, rating services—had been reared in a system whose whole purpose was to game the system. This was not a secret but a boast.42
America remains the world’s second-largest manufacturing center, behind China. This is true even though the U.S. economy has lost its technological lead in many sectors, despite the wholesale hemorrhaging of manufacturing jobs during the Great Recession, and notwithstanding data-rich predictions that millions more jobs will be exported abroad over the next decade or so. What remains at home, however, will survive mainly because of the austere circumstances working people have been compelled to accept if they want to work at all. The Great Recession accelerated an onslaught of wage cutting, furloughs, two-tier wage hierarchies, and doubled job assignments already under way for two decades. Moreover, deindustrialization, as devastating as it has been, hardly captures all that has been devoured by the new order of finance-driven flexible capitalism.43
Wounds appeared everywhere. Infrastructure rotted, becoming inefficient and downright dangerous. The United States now ranks twenty-fifth in the quality of its infrastructure, trailing Brazil, India, and China in its transportation network, for example. One-quarter of highway bridges were recently deemed “structurally deficient” and another third poor or mediocre by the American Society of Civil Engineers, who also gave the nation’s roadways a grade of D minus. Other elements of the infrastructure fared better, getting a grade of D plus in 2013, up from the D of four years earlier. Public transportation also improved from four years earlier—it got a D. While Europe spends 5 percent of its GDP and China 9 percent on infrastructure, the United States manages only 2.4 percent. According to investment banker Felix Rohatyn, “Three quarters of the country’s public school buildings are outdated and inadequate. It will take $11 billion annually to replace aging drinking water facilities; what was budgeted for that, before sequestering, was less than 10% of that. Unsafe dams have risen by a third and number 3500. Half the locks on more than 12,000 miles of inland waterways are functionally obsolete.”44
Neglect, however, is only half the story. Certainly letting public facilities decay helps suppress the social wage supported by tax revenues, some of which actually come from corporate America. One way to further reduce the corporate tax bill is to allow public services to deteriorate. While it may affect the daily lives of ordinary people, making them more unsafe, unhealthy, or inconvenient, in a land where the well-off insulate themselves from the public arena through access to private goods and services, this wasting away of public hardware is not so hard to bear. And not borrowing to replace worn-out bridges, tunnels, power plants, and waterworks cheapens credit for leveraging more lucrative if speculative endeavors in the world’s marketplaces: fewer demands for capital make it less costly.
Something even more retrograde has been going on at the same time. Our neoliberal infatuation with the free market and the irresistible political influence of big business have led to the rediscovery and updating of the Enclosure Acts of eighteenth-century England. What had then been “the commons”—land, water, forest, and wildlife, available to all—was by legislative fiat converted into the private domains of Britain’s landed elite. Here in the United States over the last generation, public facilities, resources, and services—waterworks, transportation, public lands, telecommunication networks, airwaves, herbs, forests, minerals, rivers, prisons, public housing, schools, health care institutions, in sum everything from zoos to war-fighting—got privatized, turning the commonweal into profit centers for the incorporated. One writer has called this the “gutting edge of accumulation by dispossession.” It also goes on abroad, inscribed in trade agreements, where, for example, all the costs of environmental regulation and protection, if they are allowed at all, are borne by the host countries.45
Over the course of the last quarter century, poverty grew, crippled lives, and wore new faces. Some reappeared as the urban wretchedness the turn-of-the-century reformer Jacob Riis would have recognized. More showed up in the countryside and even in derelict suburbs; indeed the numbers of the suburban poor are now the fastest growing. Rural poverty reached 17 percent by the 1980s, its growth fed by the decline and death of extractive and smaller industry. During the new millennium’s first decade, the numbers of people categorized as living in “extreme poverty,” especially in the suburbs and in Midwestern cities, grew by one-third. And the biggest leap in people living below the poverty line happened in the Sun Belt, in places like Coral Gables and Fresno.
Between 2004 and 2007 (that is, before the Great Recession struck), more than 30 percent of Americans experienced at least one spell of poverty lasting two months or more; by 2007 one-third of all children lived in households that had been poor for at least a year. And just as the economy capsized in 2008, 40 percent of the 40 million officially poor people were “very poor,” meaning their incomes were less than half of what the government defined as poverty level. Close to 40 million people depended on soup kitchens or food pantries.46
Many of the impoverished were and are the marginalized unemployed. Back in 1978, Senator Edward Kennedy warned of the rise of “a permanent underclass”; and the 1980 Census recorded astonishing levels of poverty—over 20 percent—in cities like Newark, Atlanta, Miami, Baltimore, and Cleveland. Viewed by the overclass as “unreachable,” victims of their own cultural derelictions, excesses, and addictions, these people were disproportionately African American. For many, that ascription served as the explanation itself, treating the victims’ supposed failings as the cause of the very exclusion that needed explaining. That the deterioration of the economy, the shift in its center of gravity toward finance, the anemia and isolation of the labor movement, and cowardly public policy might better explain their exclusion grew less and less palatable in a culture infatuated with the market.47
But millions more work. And even during the Clinton boom years a white underclass emerged along with a rainbow underclass of new immigrants. By 2010, more than 15 percent of the population (46 million people) was living in poverty, the highest proportion since 1993. As of this writing, more people rely on food stamps—44 million—than ever before. Even before the Great Recession, the Agriculture Department estimated that 35 million people were “food insecure.” Some resort to food auctions, where items are cheap because they’re often past their sell-by dates. And as it had during the first Gilded Age, poverty descended into a shadow land of the criminal: local laws outlawing panhandling proliferated; harassing the homeless became commonplace; begging, loitering, camping, or sleeping in parks could land you in jail; there were even laws to warn off vagrants and outlaw squeegee guys.
When poverty captured headlines during the first Gilded Age, it was inextricably bound up with exploitation at the workplace. When poverty got rediscovered in the early 1960s, it was, on the contrary, more often associated with exclusion from the workplace in urban ghettos, in exhausted rural regions in Appalachia, and elsewhere. Now we’ve come full circle and poverty, more often than not, is associated once again with the sweatshop labor that Jacob Riis and others made infamous. When McDonald’s held its first-ever national hiring day in 2011, it signed up 62,000 people, more than the net job creation of the whole national economy in 2009. Those 62,000 were lucky to be plucked out of a pool of 938,000 applicants, a rate of acceptance lower than the entering freshman classes at Princeton, Stanford, and Yale. What awaited these fortunate ones was the lowest average wage then on offer in the American economy, less than half of the prevailing wage.
McJobs are the signature accomplishment of a faux recovery. A new category of the “near poor” was created to account for this swelling population of those who toil yet can barely manage to get by. About 50 million people now live in such near-poor families. The largest increase in private-sector jobs during the “recovery” has come in the low-wage food services and retail sectors. While 23 percent of the jobs lost during the Great Recession were low-wage ones, 49 percent of new jobs are low wage. Many are temporary positions in the retail, waste management, food services, and health care fields.
If you’re old and lucky (which usually means you belong to a union still capable of defending its members), you might hang on to what you’ve got. But if you’re young you might well be headed down even before you get a chance to start rising. Entry-level wages for male high school graduates have dropped 19 percent since 1979; for females, by 9 percent. The median income for men in their thirties is 12 percent less than it was for their fathers’ generation. As of 2005, two-thirds of young working people with a high school degree had no health insurance, compared with one-third in 1979. Households of people from twenty-five to thirty-four years old are carrying debt loads that exceed their annual income. They can’t afford housing, college, or cars.48
And once again immigrants, legal and illegal, make up a large proportion of the sweated labor force: 27 percent of drywall workers; 24 percent of dishwashers; 22 percent of maids and housekeepers; 22 percent of meat and poultry workers; 21 percent of roofers. Through the subcontracting arrangement now preferred by lean-and-mean flexible corporations, millions of these people work for major companies.49
Convict labor, in the nineteenth century a staple of industrial and agricultural enterprise, is making a comeback. The largest incarcerated population on earth (in per capita terms, only Rwanda comes close) has become a pool of forced labor for a range of private and public enterprises, including many of the Fortune 500. The Corrections Corporation of America and G4S (formerly Wackenhut) sell inmate labor at subminimum wages to Chevron, Bank of America, AT&T, and IBM, among others. Corporations can, in most states, lease factories in prisons or lease prisoners to work on the outside making office furniture, taking hotel reservations, fabricating body armor, butchering meat, and sewing pants.50
Fear of falling, always the nightmare underside of the American dream, became a reality for millions cut adrift from once secure anchors in the economy. Even castoffs from white-collar jobs at banks and insurance companies found themselves lining up in pinstriped suits at Family Dollar stores for jobs paying $11 an hour. The country’s economic geography shifted accordingly. Thus the portion of Americans living in middle-income neighborhoods shrank; in 1970 about 65 percent of families lived in such communities, but by 2011 only 44 percent did.51
A cycle of creeping economic superfluity emerged decades ago. First furloughs, followed by brute pay cuts and the silent intimidation to work harder for less or else. Whether in unions or not, workers enjoying the modest pleasures of middle-class life were compelled to shed them. Like those at the Sub-Zero company in Wisconsin, which made freezers and ovens, where it was either a 20 percent pay cut or the company was off to Kentucky or Arizona or south of the border. Thousands of identical dilemmas presented themselves over the last generation. The outcome was invariably the same, so that anecdotal misfortunes became the open sores of a social disease. Now millions consider themselves fortunate if they can find any kind of job, often at the lowest ends of the hierarchy as home health aides or waitresses, willing to accept painful wage cuts. Over the past quarter century, a worker losing a stable, decent-paying job was likely to see his or her earnings fall by 20 percent on average.52
What once was the conventional forty-hour work week became for many a fifty- or sixty-hour week, in part because people held down two jobs and in part because the wonders of advanced electronics tethered employees to their jobs 24/7. One out of six middle managers worked more than sixty hours a week. Down at the lower end of the workplace hierarchy, security guards at G4S, for example, averaged the same. At other venues if hours didn’t go up, workloads did. If family income managed to just tread water, it was because 70 percent of women with children under the age of seventeen now work (compared with 13 percent in 1950), including 59 percent of women with children under six. Making matters worse, the United States is one of only four countries—Liberia, Swaziland, and Papua New Guinea being the others—that does not mandate paid maternity leave.53
Many of the afflicted—young, middle-aged, and even older—slid down into a provisional world of informal or temporary employment. The president of a temp agency in Toledo observed, “What I’m seeing is a large number of very talented people who are trying to land anything. Everyone is moving down.” What is sometimes called “contingent labor” has grown at an accelerating rate for decades. During the 1980s, it grew far faster than the U.S. labor force as a whole: ten million (or 25 percent) of the jobs created during the Reagan years were temporary ones. That pace picked up in the 1990s, by which time contingent workers accounted for one-fifth of the total workforce. It showed up in virtually all sectors of the economy, in fields and factories, at construction sites and warehouses, at offices and retail outlets. “Permatemps” made up more than a third of Microsoft’s workforce in the late ’90s. Temp agency revenues from placing professional workers doubled.54
“Flexible labor”—that is to say, disposable workers—met the imperatives of a declining, deregulated, and finance-driven economy. Mergers and acquisitions obeying lean-and-mean protocols, along with the flight of capital abroad, created a vast pool of idling, available labor. It sped up the process of deskilling and the disassembling of complex work into simpler tasks. Competition following deregulation generated intense pressures to lay off permanent employees. The premium placed on short-term financial results further ratcheted up those demands to compress labor costs as a way of funding leveraged buyouts. Temporary labor became the twentieth century’s fin de siècle version of Marx’s “reserve army of the unemployed,” now folded into huge, bureaucratic temp agencies and mobilized on permanent standby. All have become free agents—free, that is, of the security of tenure, retirement income, health care, vacation days, sick days, holidays, and any possibility of effectively voicing their displeasure in the workplace. Employers large and small are thereby also freed of legal obligation to pay into Social Security, Medicare, or unemployment insurance accounts, to respect wage and hour laws, or to pay workmen’s compensation.
This flotilla of free-floating working people—some call it the “precariat”—now make the hotel beds and clean the toilets, grow the food, harvest the food, serve the food, man the call centers, care for the sick and the young, dig the ditches, build the buildings, stock the warehouses, drive the trucks, landscape the villas, clear the wreckage, sell the merchandise, design the website, enter the data, and flood the temp agencies, all without any way of knowing where and how they’ll manage day to day, month to month, year to year. Even nurses, scientists, accountants, lawyers, and teachers work on “contingency.” The ingenuity of “human relations” professionals has been tapped to create a bestiary of contingent work: day laborers, independent contractors, on-call workers, freelancers, temps, and part-timers. By far the most exotic subspecies is the oxymoronic permatemp, someone who works at the same place for years but remains confined inside the gulag of temporary employment, with all of its indignities, lower wages, lack of benefits, and all-around exclusion from company perks. Microsoft is not the only notorious user of the permatemp as high-tech menial; Hewlett-Packard, Verizon, and Intel also employ them.55
FedEx, for example, is a major user of the “independent contractor” category of labor. Nominally, its drivers “own” their own trucks but are in every conceivable way controlled by FedEx. Yet because they are classified as “owners,” they are ineligible for protection under the nation’s labor law and don’t qualify for benefits, workmen’s compensation, or unemployment insurance.56
Worse yet is the lot of the migrant agricultural laborer. Our country is in a literal sense fed by degradation. Workers live and toil amid filth, crammed into barrackslike shelters not fit for animals, smuggled across borders by “coyotes” into twenty-first-century forms of indentured servitude. Like their nineteenth-century rural antecedents, they seem utterly isolated in the American outback; but they actually labor, through a “flexible” network of contractors and subcontractors, for the largest retailers in the world.57
So vulnerable, ignored, unable to exercise elementary rights, living outside the law’s protections, often foreign, sometimes “illegal,” these workers constitute a world that bears some of the features of a caste. Profits depend on its exploitation. Perhaps even more telling, so too do whole ways of life. That bazaar of high-end consumption, all its pleasures and conveniences, rests on this underworld, willed into invisibility. Our system of financial capitalism or “credit capitalism” relies to some considerable degree on the geometric multiplication of the working poor not only in the global South, where we are accustomed to finding it (and largely ignoring it), but now here at home.58
Debt, the original mechanism of primitive accumulation (both when capitalism was getting started in the West, and later when it was bringing the global South under its imperial sway), has likewise been eating away at the innards of everyday living for a long time now. It is an affliction of the old and young, low- and middle-income earners, the unskilled and overcredentialed.
Debt sometimes acts as the Dr. Jekyll and Mr. Hyde of commercial society. For some it has been a blessing, for others a curse. For some, the moral burden of carrying debt is a heavy one. And no one lets them forget it. For others, debt bears no moral baggage at all, presenting itself rather as an opportunity to advance; if it proves insupportable, it is dumped without a qualm.
It turns out that those who view debt with a smiley face, who approach it as an amoral pathway to wealth accumulation, and who tend to get forgiven the larger their defaults, come from the higher echelons of the economic hierarchy. Then there are the rest, who get scolded, foreclosed, and dispossessed, leaving scars that never go away, wounds that disable the future. This upstairs-downstairs differential class calculus might be called the politics of debt. In our modern era of disaccumulation, this social arithmetic has taken on a poignant irony when it comes to hearth and home.
Securitizing mortgages (subprime and other kinds) and turning them into collateralized debt obligations traded around the world epitomized the “paper entrepreneurialism” turn-of-the-century capitalism had become. Rewards were unimaginably lush on all the Wall Streets of the planet. Mutating homes into securities and treating them as ATMs, as millions did, fatally undermined an older cultural order, one that our financial-age capitalism could treat only as an impediment.
Facing the social upheavals of the Great Depression in the 1930s, including the anger generated by mass foreclosures, the New Deal had rushed through various housing and farm finance reforms. They expressed the deep conviction articulated by President Roosevelt that “a nation of homeowners, people who own a real share in their own land, is unconquerable.” This was a bipartisan persuasion. Republican president Coolidge, a Vermont puritan, believed “no greater contribution could be made to the stability of the Nation and the advancement of its ideals than to make it a Nation of home-owning families.” Presidents before and after Silent Cal, including FDR, felt likewise. The savings and loan industry was itself premised on the notion that “a man who has earned, saved, and paid for a home will be a better man, a better artisan or clerk, a better husband and father, and a better citizen of the republic.” The reasoning once seemed straightforward enough: “Thrift is a disciplinarian of self-denial, temperance, abstemiousness, and simple living.”
How alien that now sounds to a world in which the home reemerged as a cash cow—until it could no longer be milked. Cautiously constructed mortgages once functioned as the lubricant of social stability and appealed to empowered circles like the Roosevelt administration for just that reason. The unchaining of debt turned that world upside down. It unsettled, intimidated, and in the end ravaged the “homeland,” and not only at home.59
Household debt in 1952 amounted to 36 percent of total personal income; by 2006 it accounted for 127 percent. Between the late 1970s and the late ’90s the average monthly charge on credit cards climbed from 3.4 percent of income to 20 percent. In 1980, fewer than 1 percent of all financial institutions offered home equity loans, but by the end of that decade, 80 percent of all banks and 65 percent of savings and loans did. If you were under the age of thirty-five after the turn of the millennium, you belonged to the “debt generation,” relying on credit for education, housing, and health care as debt levels grew at twice the rate of income.
You could borrow to get a boat, a degree, a place to live, or just to stay alive. Financing poverty, for example, proved lucrative. Taking advantage of the low credit rating of poorer people and their need for cash just to pay monthly bills or to eat, some check-cashing outlets, payday lenders, tax preparers, and others levied annualized interest rates in the mid three figures. And many of these poverty creditors were tied to the largest financiers, including Citibank, Bank of America, and American Express.
Poor or not so poor, the pressure to fall deeper and deeper into debt, felt especially by the elderly, young adults, and low-income households, had less to do with uninhibited appetite than with the pervasiveness of insecure employment, the decline in state supports, and slowing economic growth. Credit promised to function as a surrogate “plastic safety net.” A sizable majority of low-income households resorted to credit not to live the high life, but to meet emergency bills and basic living expenses. But as income levels shrank, the net sagged… and then ripped.
Poor or not so poor, shouldering freight loads of debt became increasingly impossible. The portion of disposable income spent on servicing debt rose from 10 percent in 1983 to 14.5 percent in 2006. For older and younger people and low-wage earners, over half their pretax income went to servicing their debt. Nor were there any savings left to cover the deficit: the amount of personal income saved in 1979 averaged 8.9 percent; by 2007 it had fallen to nearly nothing, 0.6 percent. Personal bankruptcies have quintupled since the late 1970s; between 1980 and 2005, they leapt from about 300,000 annually to 2 million.60
Meantime, when FIRE (finance, insurance, and real estate) caught fire, it got bailed out, which is a kind of double-indemnity form of auto-cannibalism. After all, the sector first grew mighty by ingesting the surrounding economy; when the markets did a free fall, the commonwealth was again tithed to keep them from crashing. Champions of the risk society off-loaded risk, when it became too risky, onto the shoulders of everyone else—leaving the whole social fabric at risk.
Retrogression has a stark arithmetic. During the past thirty years, hours of labor have gone up (nine weeks per year longer than the work-ethic-obsessed Germans; the average middle-class American couple works 540 hours—that is, three months—more each year than the same couple did a generation ago); the number of years people work during a lifetime has gone up; the number of family members working has gone up; and the proportion of family income needed to reproduce the next generation (child care, health, education) has gone up.
Nose-to-the-grindstone calisthenics, however, have left the social organism not stronger but dehydrated. One-quarter of the workforce earns less than the poverty level for a family of four; 50 million live in poverty or near poverty, and that number increased by about 30,000 people each year during the post-dot-com boom; the number of families with children under six living in poverty has gone up; the number of impoverished children rose too, ten times faster than the increase in the total number of children; one-third of those kids live in households that can’t consistently afford to eat; and the United States has the highest infant mortality rate of any of the twenty developed countries that belong to the Organization for Economic Cooperation and Development (OECD), as well as the highest incidence of obesity, mental illness, and consumption of antidepressants.61
Alternative markers of social well-being have also tracked steadily downward. Fortune magazine predicted in 1967 that wages would rise by 150 percent between then and the year 2000. As it turned out, real income in 1990 was exactly where it had been when those prophecies of good times ahead were first enunciated. “Morning again in America” dawned only for a favored sliver. Wages have at best stagnated while productivity rose markedly. The gross disparity in the distribution of income and wealth, more skewed than ever before, has by now become a commonplace of public notice, inscribed memorably in the Occupy Wall Street rubric “We are the 99%.”62
Public health care declined and the need for private insurance rose; housing prices and interest rates became more volatile. Once productivity and wages had risen in tandem, doubling between 1947 and 1973. Since then, productivity has increased by 84 percent while the average hourly wage has stayed flat. The value of the minimum wage declined precipitously over the whole era, falling by a third in the ten years between 1979 and 1989 alone and leaving it well below the poverty threshold. Even after the latest full increase in the federal minimum wage took effect in 2009, its real value remained less than it had been a half century earlier.63
A generation ago, one-third of workers in the private sector had traditional defined-benefit pensions (including 84 percent of those working in companies employing more than 100 people); now 16 percent do, as corporations shifted into 401(k) plans, froze pensions, or just got rid of them. No matter: according to the Pollyannas of the new order, people would “e-tire” (meaning they had better join the ranks of the precariat or suffer the consequences) instead of retire, invited to practice the arts of “self-actualization” in their sunset years. Three-quarters of low-wage workers had no employer-paid health insurance, and three-quarters no sick days.64
Fifty years ago, half of those who lost their jobs received unemployment insurance. Before the Great Recession, that proportion had declined to a third (and to less than one-fifth for low-wage workers). That was itself a symptom of the precarious downward trajectory set in motion by capitalism’s vaunted new flexibility. Millions now compelled to work part-time or irregularly, on board when needed, overboard when not, could no longer meet the minimum monthly earning requirements to qualify for unemployment benefits.
Alongside those lucky enough to have regained some means of livelihood, a demoralized band of the long-term unemployed, numbering at least 1.5 million people, has gathered, having exhausted all their unemployment benefits after 99 weeks. The “99ers” come from all walks of life, including the college-educated, middle-management castoff who found himself not only out of work but evicted, down to his last couple of hundred dollars, and about to move into his car. Shape-ups forming outside home-improvement stores and plant nurseries in Las Vegas included jobless immigrants alongside Anglos. On average, people now stay unemployed for six months, a figure without precedent since such statistics were first recorded in 1948. During the Great Recession, an astounding 19.4 percent of all men in their prime earning years (twenty-five through fifty-four) were jobless—this was also a record. The “99ers” make up a minisociety of the superannuated.65
All sorts of public provisioning and protection, not just unemployment insurance, grew similarly more scarce. “Welfare as we have known it” was abolished by the Clinton administration. Federal training programs for the technologically unemployed shrank drastically. Federal housing subsidies have dropped by two-thirds since the 1970s. During the Reagan years, spending on food stamps declined by 17 percent, school lunch programs by a third. Today, preschool enrollments are among the lowest in the developed world. Pell grants for low-income college students, which once covered 84 percent of tuition, covered only 32 percent by the first decade of the new century.
Auto-cannibalism acquired its own special vocabulary. What we once called firing has been euphemized as “business process engineering,” “slimming,” “right-sizing,” “rationalizing,” “focused reduction,” “reinvesting,” “outsourcing,” “release of resources,” “redundancies,” or, more cheerfully, “career change opportunities.” Only rarely did someone speak more plainly, as did Labor Secretary Robert Reich when he acknowledged in 1996 that “the job security many workers experienced in the three decades after World War II is probably gone forever.” On the other side of a widening class chasm, the language of predatory triumph trumpeted the new order—for example, best sellers bore such chest-thumping titles as Barbarians at the Gate and The Disposable American. Intel CEO Andy Grove chose Only the Paranoid Survive as his book’s title to capture the pathology that had become the new normal.
Conservative pundit George Will noticed which way the wind was blowing when the shift first began in the early 1980s and was amazed: “This represents a transfer of wealth from labor to capital unprecedented in American history. Tax revenues are being collected from average Americans… and given to the buyers of U.S. government bonds—buyers in Beverly Hills, Lake Forest, Shaker Heights, and Grosse Pointe and Tokyo and Riyadh. If a Democrat can’t make something of that, what are the Democrats for?”66
Indeed! It was reasonable to suppose, as many did over the decades to follow, that this drift of events would run aground on the shoals of resistance to such gross inequality, injustice, and decline. That did not happen, not among Democrats certainly, but also more portentously perhaps, not among those on the front lines whose lives were being reengineered and sometimes dismembered by the new capitalism. Embedded deep within the political and cultural life of the new order were the mechanisms of acquiescence, the means of persuasion and coercion, the promise and fear, that kept it aloft.