As an economic organism the labor movement barely registers a pulse. Politically its heartbeat is more detectable but just how or whether those compressions oxygenate its other vital organs is hard to tell. A historic low point was reached in 2012, when the percentage of unionized workers in the private sector declined to a level not seen in a century—an anemic 6.6 percent. This was the woeful anticlimax of a long generation of decline.
How unlikely that seemed as the tumultuous decade of the 1960s ended. Trade unions represented more than a third of the workforce; teachers and other public sector workers all over the country had achieved the right to organize and bargain collectively; nonunion companies felt the pressure to meet union standards of wages and benefits; Great Society programs enlarging the welfare state, especially Medicare and Medicaid, were law thanks in large measure to the labor movement’s political clout; key industrial unions like the United Automobile Workers could claim genuine credit for helping bring down Jim Crow; and while the leadership of the AFL-CIO saluted the Cold War and the hot one in Vietnam, some national unions and their affiliates joined the antiwar resistance that helped effectively depose a sitting president.
Depending on which end of the looking glass we peer through, this might be judged a Panglossian view of the way things really were. After all, the head of the AFL-CIO, George Meany, refused to attend the 1963 March on Washington (or allow his group to endorse it) because he along with J. Edgar Hoover believed Martin Luther King to be a closet Communist or fellow traveler. The federation’s foreign operations did yeoman work undermining labor and political movements abroad that didn’t subscribe to the dogmas of the free world. Compared with earlier times, little dissent was tolerated within most major unions when it came to major issues of foreign or domestic policy, not to mention on matters of union economic strategy. Construction unions (and not only them) fiercely resisted efforts to dismantle the racial protocols in place for generations that kept jobs there lily-white. (Indeed thirty years earlier, the AFL had lobbied hard to exclude a nondiscrimination clause from the Wagner Act so as to protect its segregated locals.) Concrete efforts to organize the open-shop South were scarce and, except when bound up with the civil rights movement, avoided. And the rest of the unorganized parts of the labor force were left that way.1
Many people today look back at the entire midcentury period with great fondness, wishing for its revival. So it is useful to note that what was once called labor-liberalism included in its makeup much of what now seems deplorable. Yet it is at the same time striking how robust the institutional and political capacity of working people to stand up to the “interests” was, as compared to the sorry state of the union now.
At no time was that muscularity more on display than during the early 1970s, when an avalanche of strikes marked the end of the long 1960s. The first years of the new decade witnessed the largest strike wave since the great postwar shutdowns of 1945–46, when the labor movement last took to the streets to determine not only its own fate but the future of the country.
Nearly every province of the economy was shaken: mailmen and teachers, coal miners and steelworkers, farmhands and auto assemblers, garment makers and long-haul truckers, meatpackers, longshoremen, construction and subway employees, telephone operators, and railroad and social workers all walked out, some more than once and some, as in the case of the postal and subway workers, defying the law. Two hundred thousand postal employees ignored a Taft-Hartley injunction and their own frightened leaders, making theirs the largest wildcat strike in American history. In New York City, drawbridge workers took home fuses, electrical components, and keys so the bridges remained upright. Long-haul truckers stopped trucks moving goods into cities like Cleveland. The governor of Ohio ordered 4,100 National Guardsmen into the streets.2
If there was a proletarian moment during the insurgent sixties, this was it, even if it crossed time zones into the seventies. It was the nature of these uprisings as much as their timing that made them so. Many were initiated by rank-and-file workers (often these were wildcat strikes, undertaken against or without the approval of the union bureaucracy). In steel, coal, and elsewhere, they meant to overthrow an entrenched and often corrupt leadership. These shop-floor militants were just as frequently young men and women, some Vietnam veterans, deeply influenced by the counterculture, the civil rights and black power movements, feminism, and the antiwar movement. Their beards, beads, long hair, and Afros, as well as their insouciant disregard of established institutions on the shop floor and union hall, marked them as kindred spirits to the sixties’ antic antiauthoritarianism. The president of the UAW Local in Lordstown, Ohio, called the strike at the General Motors factory there “the Woodstock of the working man.”3
So too did their demands break the mold of conventional collective bargaining and echo a broader impulse to throw a wrench into the machinery of postwar liberal society. Most famously, at the newly opened GM plant in Lordstown (running the fastest assembly line in the world), strikers were worked up about the intense speed of the assembly line, the overbearing discipline imposed by management, the numbing routinizing of their jobs, the demeaning hierarchies of race and gender that permeated the workplace, and the dangers to their health and their souls. More pay, once the universal solvent of these antagonisms, was no longer a foolproof lubricant.
For a brief few years, “your money or your life” became a trickier question to answer. Even the government felt compelled to pay attention. A 1973 report, “Work in America,” described an epidemic of what it called “blue collar blues” sweeping through the best-rewarded precincts of industrial America.4
Yet the tide was turning. One reason for ratcheting up the pace of work at Lordstown and elsewhere was to fend off the competition from European and Japanese industries flooding the market with lower-cost goods produced with more modern technologies. Huge chunks of the world market in household electrical appliances and metalworking were lost to Japan and Germany respectively. So too in steel and in America’s signature industry, cars, where 22 percent of the domestic market belonged to Japan by 1980. This was the beginning of what became the long decline of American industry still under way today.
More immediately, the rest of the 1970s was marked by economic stagnation in the West, particularly in the United States. Here inflationary and speculative pressures on the dollar and the abandonment of the Bretton Woods fixed exchange rates exacerbated the impulse to drive down labor costs. Trying to maintain forward momentum when the economy was headed in reverse was a task even a more unified movement would have found difficult.5
But however spirited, the upsurge remained a splintered one. Despite its breadth and energy, and notwithstanding its occasional victories over major corporations, the government, and its own union hierarchs, the strike wave of these years did not actually measure up to the one after the war or the ones that made the thirties so memorable. It is not only that the triumphs were few and often not lasting. Unlike their predecessors, these were often provincial affairs, local to one union or one company or town. They rarely cross-fertilized and never established an organizational presence as a coherent movement. While activists expressed cultural and ideological affinities with the era’s zeitgeist of resistance and liberation, these remained largely implicit, matters of lifestyle and attitude, not the elements of a programmatic alternative to the crisis confronting Cold War labor-liberalism.
To be sure, there were exceptions to all these “defects.” The postal employees, communications workers, and coal miners confronted whole industries and even the federal government. Especially in the auto industry but elsewhere as well, efforts to conjoin the struggles for racial and class liberation took root. Against the prevailing winds in the private sector, the decade witnessed great union gains among public employees, especially among women and African Americans. The general restiveness breathed new life—if only momentarily—into what remained of the more reform-minded circles of the New Deal inside the Democratic Party. Nonetheless, the last proletarian rising did indeed turn out to be the last and least of them. At the outset of the 1970s, 2.5 million people participated in strikes involving at least 1,000 workers; come 1980 that number had shrunk to 300,000. Industrial unions began a massive hemorrhaging. The consequences were grave not only for the life expectancy of the labor movement but for all those outside its ranks.6
In the gravest offense to the amour propre of the American credo, the arrow of history pointed backward. Once industrial unions were firmly entrenched after the war, the gap between this primary labor force (those unionized or by economic osmosis enjoying the unions’ benefits without belonging) and secondary workers living outside that charmed circle, usually in nonindustrial sectors, had lamentably widened for a generation. Now it began to narrow. But that wasn’t good news. What it signaled instead was that the deindustrialization and de-unionization of primary workers brought them ever closer to the working and living conditions of the swelling ranks of the secondary labor force. In meatpacking, steel, aircraft, flat glass, clothing, petroleum refining, or wherever one looked, earnings fell precipitously, on average by 43 percent. Between 1979 and 2012, the productivity of the average American worker, after inflation, rose by 85 percent, but the average inflation-adjusted wage increased by only 6 percent, and wages in manufacturing declined by 21 percent.
You might be picking crops, answering the telephone, packing fish, stocking warehouses, emptying bedpans, entering data, serving up food, cleaning hotels, or minding the megastores of America’s domestic third-world economy. Or you might still be toiling on the shrinking reservations of core heavy industry, surviving the latest round of lean-and-mean downsizing, clinging to your unions, skills, and middle-class aspirations. No matter, as all increasingly felt the sting of hyperefficiency drives and social stagnation.7
Invisibles working in the shadow lands of the secondary sectors had always lacked the security, dignity, material compensation, and rights that had defined civilized capitalism. Now their once envied brothers and sisters felt these afflictions as well. “Brothers and sisters”? Even that language felt archaic.
One reason for the decline was that a sclerotic union bureaucracy—lacking in imagination and political courage, cognizant perhaps that the world they’d known was turning against them but hoping the old formulas would still compute, anxious to hang on to what they had (access, influence, treasuries)—stood in the way. The hierarchs of the AFL-CIO were by this time light-years removed from the more audacious imaginings of their predecessors. A century earlier, even the AFL, its allergy to anticapitalist radicalism notwithstanding, talked about “the class struggle” and held the state in suspicion. Now management prerogatives had become sacrosanct as George Meany explained in 1976: “Who are you if you are a labor man on a board of directors? Who do you represent? Labor does not want to run the shop. In the United States participation is absolutely and completely out.”8
It was hard enough to face up to the assaults of powerful and determined steel, coal, and auto companies, and to neutralize the racial demagoguery that had stifled organizing in the South for generations or the psychological and cultural phobias that left women workers out in the cold. How much harder to attempt all that when, in as many cases as not, the bureaucracy left behind as the institutional legacy of an older era’s democratic rebellion now disowned its own inheritance.
Betrayal, however, no matter how often it may have happened, is too thin an explanation for the historic undoing of a movement that had changed the face of twentieth-century America. Plenty of fresh blood was in any event pulsing through the arteries of the workplace, ready to revitalize its fighting capacity. The rebellions were evidence of precisely that. But they faced more formidable enemies than an encrusted labor bureaucracy.
Economic slowdowns, not to mention actual retrogression, are hard to weather under any circumstances. There was nearly a decade of such regress, beginning in the mid-1970s and continuing through the early Reagan years. In the heartlands of American industry, the homelands of the labor movement, this slide never ended, even while other segments of the economy picked up. Even before the financial meltdown of 2008, machine tools, consumer electronics, auto parts, appliances, furniture manufacturing, and telecommunications equipment had become anemic imitations of what they once were. We still haven’t touched bottom.
High levels of unemployment—over 10 percent during the Reagan recession of 1981–1983—alongside unprecedented inflation in the high teens made collective bargaining into at best a series of defensive maneuvers. This was in part the point of a deliberately induced severe contraction, otherwise known as the “Volcker recession,” perpetrated by the Federal Reserve chairman to squeeze the last drop of cost-driven inflation out of the economy. Unemployment and bank failures reached levels not seen since 1940 and records were set for bankruptcies and farm foreclosures. It had become an axiom of mainstream economics that there existed a “natural rate of unemployment”; a rate lower than that would produce inflationary pressures on labor costs. As Reagan adviser David Stockman explained, unemployment “was part of the cure, not the problem.” Paul Volcker would later candidly observe that “the most important single action of the Administration helping the anti-inflation fight was defeating the air traffic controllers’ strike.” He was driving home the point that labor unions were responsible for American industry’s lack of competitiveness. Destroying the air traffic controllers’ union was a high-visibility way of demonstrating the government’s determination to break the power of unions more generally.
As one company after another shifted its operations either into the right-to-work (i.e., nonunion) South and Sun Belt or abroad, the labor movement’s leverage further diminished. What new manufacturing jobs got created here were mainly sited in these “growth” regions. By 1972, over a quarter of the U.S. manufacturing workforce lived in the South. Outsourcing abroad was an irresistible strategy. The large-scale export of jobs first emerged as the economy entered the downward spiral of the seventies. Even at that early juncture, one-quarter of all new capital investment by U.S. companies in electrical and nonelectrical machinery, transportation equipment, chemicals, and rubber manufacturing was happening abroad, mainly in Europe. By the 1980s, investment in the third world (notably Brazil, Taiwan, Mexico, and South Korea) accelerated, as did the flow of finished goods back into the American market.
Saving money was not the only reason to relocate work offshore. “Off-shoring strategies are more about shifting relations of power than gaining efficiency,” observed one management scholar. Well before the new system was branded “flexible,” management had recognized the wisdom of reversing old practices by decentralizing production, thus weakening the leverage of unions. After the Flint sit-down strike of 1936–37, for example, GM never again wanted to be held hostage by worker actions in a single facility so vital it could shut down its whole far-flung enterprise. The company soon enough began breaking apart and relocating plants around the country, especially in the South, to weaken the sinews of solidarity.
As a global phenomenon, flexible accumulation made the obstacles to union growth that much more formidable. It allowed for quick changes in location and the divvying up of production units into smaller and smaller sites, which tended to discourage joint action. Differences in labor law from one country to the next, not to mention the sometimes sharply opposed outlooks of workers in less developed regions hungry for work at any price, made joint union action all the more difficult.9
Workplaces are mini-political as well as economic organisms and what may seem to obey a strictly economic logic usually carries a hidden agenda about who’s in charge. Under the new regime of flexible production, companies off-loaded functions once performed internally to outside contractors and subcontractors. Many of these operated as nonunion facilities. In part that was precisely the point, its effects showing up right away on the bottom line. Disaggregating the production process in this way also eroded the social ties that often underlay union sentiment in more centralized work environments.10
In some cases, like trucking, what flexibility turned out to mean was transforming employees into independent contractors—independent therefore of most labor protections and union representation, and free to absorb the costs of equipment and fuel that were once the burden of the trucking companies.
Flexibility on the shop floor sometimes meant installing “quality circles” and other forms of faux worker participation in planning and executing work routines. This studied overture to making work life more democratic and creative turned out to be a kind of participatory-management discipline. To some very considerable degree, work discipline in a capitalist economy must always be internalized if its system of free wage labor is to function smoothly. Under flexible arrangements this became even more emphatically so.
Moreover, in a society saturated in the democratic atmosphere given off by the civil rights and women’s liberation movements, it was inevitable that this zeitgeist would eventually show up on the shop floor. Management initiatives that purported to invite greater degrees of shop-floor democracy, often abetted by compliant unions, were the ghostly remains of the liberationist impulses set loose by the earlier rebellion against work, now translated into the sterilized if fashionable argot of labor-relations experts. And quality circles and other managerial experiments inviting worker input did indeed sometimes improve efficiency and productivity. They were a bottom-line pick-me-up, but not much else.11
Tread water if you can. That became the unspoken rule of thumb within the shrinking ranks of organized labor. Businesses drove the point home, insisting on and, in most cases, winning major concessions. Wage increases went from meager to none to less than none as one round of concessionary bargaining followed another, decade after decade. In a finance-driven economy infatuated with sophisticated mathematics, differential calculus, and Gaussian copulas, simpler arithmetic told the good news and the bad: layoffs = high stock price. For example, at Sears, when 56,000 people lost their jobs, the company’s share price rose 4 percent; Xerox shares increased by 7 percent when 10,000 workers were handed pink slips. And this was happening in the boom years of the 1990s. During the Great Recession, what began as furloughs in both the private and public sectors quickly devolved into layoffs and, for those who went hat in hand begging to remain on the payroll, brutal wage cuts of 20 percent and more.12
More than wage givebacks were demanded by employers who threatened—even though it was illegal to do so—to pack up and move unless they got what they wanted. Fringe benefits—vacation days, medical insurance, pensions, holidays, job security, and more—were sacrificed in an atmosphere of fear and intimidation. Cowering, even if there was no other practical alternative, did little good: de-unionization or fatally weakened unions spread plaguelike throughout the industrial heartland, including at major corporations like Phelps Dodge, Eastern Airlines, Greyhound, Boise Cascade, and International Paper. In 1983, more than half of telecommunications workers were union members, but fewer than 30 percent would be by the turn of the millennium. Teamster union members accounted for 60 percent of the drivers in 1975 but only 25 percent by 2000. Wages in trucking fell by 30 percent during that stretch, by 17 percent in steel, and by 40 percent in meatpacking.13
Soon enough, even the future was offered up in appeasement. Beginning in the 1980s and accelerating rapidly after that, unions began agreeing to two-tier wage arrangements in which new hires—young men and women in the main—would come on board at wages (and benefits if they kept any) half or even less than half of what existing employees were making—and with no real hope of ever catching up (new contracts now contain a lifetime cap on pay). A newly hired autoworker today brings home about what his grandfather did in 1948. In an economy of diminishing expectations, all of this was met with barely more than a whimper of discontent.
Once rare, two-tier arrangements now cover 20 percent of all union contracts, and not just in heavy industry but also in retail, nursing, supermarkets, and among public workers. And these tiers (a caste system of veterans and “B-scalers”) define health and pension benefits as well. That this further corrodes the basis of an already fragile solidarity is a painful reality, one likely to grow harsher.14
Alongside the carcass of the old economy a new one was being born. Not the glamorous one in the Silicon Valley and Alley on the two coasts, but the grittier, shabbier, under-the-radar one found in mass retail outlets, fast-food franchises, Inland Empire warehouses, assembly-line laundries, hotel kitchens and bathrooms, day-care centers and hospital wards, landscapes of the elect, garment sweatshops, casinos, fly-by-night construction camps, data-processing and call centers, tomato fields and almond groves, packing plants, fish factories, and taxicabs.
Here is where the work goes on that makes high-end lifestyles possible, the underbelly of “new class” prosperity and its global city. Here, given its distinctive darker complexion, is the new caste serving the new class. Here the mechanisms of the free labor market operate inexorably not only to depress standards of living but to discourage and demoralize the will to resist. Here in this vast twilight zone firings are instantaneous, severance unheard of. Here malnourishment, bad housing, homelessness, drugs, and bad health have everyone living on the precipice, with no margin for error.
In 2007 a quarter of those classified as poor were employed full-time, all year. Most lived in an isolating, fearful climate, subject to chronic police surveillance, new vagrancy laws, and stop-and-frisk profiling; if they were unfortunate enough to have to turn to welfare, they had to first pass a battery of drug tests, paternity tests, criminal background checks, and school attendance tests: a recipe for docility if ever there was one. Too much fear, too much desperation, too much was at stake, especially for those birds of passage from abroad, whether here legally or not, or for those refugees from deindustrialization just hanging on.15
Technological marvels, heralded as signposts of a new and liberated economic era, carried with them a nightmarish underside. Electronics of all sorts and the microprocessor in particular allowed for constant surveillance of the workforce and measurement of its output. These systems can count every keystroke made by data-entry and data-processing clerks. “Telephone tapping” at call centers allows for real-time oversight. Active or magic badges track an employee’s movements and locations. Video surveillance techniques now help in performance evaluations, achieving a minuteness of oversight that Frederick Taylor (the godfather of scientific management) could only have imagined. While electronic surveillance initially focused on clerical workers in finance, insurance, and telecommunications, it is now common among waiters, nurses, hotel maids, and others. All sorts of workplaces, from Verizon phone centers (where the pay is good) to Walmart store aisles (where the pay is lousy), subject their employees to a minute and threatening scrutiny and a dense rain forest of penalties for every infraction, everything from unlicensed bathroom breaks to unauthorized schmoozing. Airlines, for example, run a performance evaluation system for flight reservation and sales agents. At one airline, agents are expected to handle 275 incoming calls a day with a 90 percent booking ratio. If they fall below that standard, they get disciplined; so too if a call lasts more than 215 seconds or if an agent takes more than twelve minutes away from the computer to use the bathroom. American Airlines installed remote-controlled surveillance software to supplement its listening devices at its Dallas–Fort Worth operation.
All of this seeps into the psyche of workers, so that soon enough they become their own disciplinary police. And that is exactly how the originators of prison surveillance schemes like Jeremy Bentham’s “panopticon” expected them to work. Titles used by temp workers for their online zines capture the ire of many: “McJob,” “Working for the Man,” “Contingency Crier,” “Guinea Pig Zero,” and others constitute a literature of anger, resentment, and on occasion revenge and sabotage. Speeded-up work and spiraling levels of stress became the norm for nurses in assembly-line surgeries in Boston, Guatemalan fish packers in New Bedford, and every subspecies of labor in between.
Lean-and-mean meant doubling up on work assignments without complaint and keeping your head down to avoid the next round of employee triage; after all, someone enlisted in the army of temporary workers could fill your spot on a moment’s notice.16
To be caught in this web of flexible accumulation (or what might more aptly be called “flexible dispossession”), pressured from all sides and in precarious circumstances wherever you turn, breeds a distinctive emotional chemistry: part anxiety, part depression, part stupefaction, hopelessness, dread, an immobilized passivity. And all the while the tormented worker is supposed to be wearing a smiley face, adding emotional effort to an experience that conspires against such feigned happiness.
Fear inside the black box of the workplace is arguably the common state of life in all capitalist economies. Power is vested in the holder of the property title, whose writ is law. People work there at will—that is to say, at the will and on the terms dictated by property, however that property may be embodied. Only two forces can restrain that prerogative. Unions may. Or the government may. Unions in these newer sectors of the economy are largely unheard of. But what about the government?17
Government too has seemed blind to and sometimes complicit in what was going on. Lately “wage theft” has become a public scandal. Employers all over the country think nothing of violating labor laws covering minimum wages, overtime pay, hours of work, and safety regulations—all the basics of civilized capitalism. Beating the system is the system. They know no one is watching. Heading back to the future in this way has become the norm for millions of working people. It started decades ago.
Official indifference or open political hostility to the rights and material necessities of labor picked up steam soon after the rebellions of the early 1970s died away. Just months after he assumed office, Ronald Reagan fired all the striking air traffic controllers (their union had actually supported his election) and permanently replaced them (at first with the army). This was widely perceived by the business community as a signal that times had changed. Unions could be confronted more aggressively, their rights challenged or abrogated and their striking members replaced permanently with others who once upon a time had been but no longer were called scabs.18
Even before the Reagan years, the Supreme Court had ruled that workers could not strike for the duration of the union contract, no matter how routinely the contract was circumvented or violated. So began a backward march into the nineteenth century, when the injunctive power of the government was regularly deployed to break unions and their strikes. (Indeed, by far the most common use of the Sherman Anti-Trust Act from its passage in 1890 until the New Deal was not, as one might suppose, to break up monopolies, but as the legal justification for issuing injunctions against union strikes, considered illegal combinations in restraint of trade.) This proved a notably effective weapon against some of the wildcat uprisings of the seventies as well. The exception was the miners’ strike of 1977–78, which lasted 110 days. That walkout occurred in defiance of President Jimmy Carter, union president Arnold Miller, and a Taft-Hartley injunction ordering the men back to work. Victory or not, the government was making it plain which side it was on. While organizing drives had once culminated in union victories two-thirds of the time, by the end of the seventies the labor movement was losing in a majority of cases, and matters only grew worse after that. The Wagner Act, once called labor’s emancipation proclamation, became itself a bureaucratic maze ingeniously deployed by management legal teams to delay the resolution of grievances and organizing drives until the stamina and resources of their labor adversaries were exhausted. Much of this happened under Democratic as well as Republican administrations.
Funds and personnel to enforce labor laws were whittled away over the decades to follow. A serial violator of health and safety laws was picked by President Reagan to run OSHA. It took a court decision to force the agency to enforce its own mandates. The National Labor Relations Board was increasingly staffed by people friendly to business; Reagan even appointed Donald Dotson, who had devised anti-union strategies for Wheeling-Pittsburgh Steel, to run it.
Rulings made organizing increasingly difficult. Management practices once considered “unfair” under the Wagner Act of 1935 became commonplace. For example, captive-audience meetings, in which employees were compelled to listen to management’s antiunion propaganda, were normalized. Threats to move or shut down a plant if its workers organized, also a legally “unfair” way to cow the workforce, were largely ignored. Reprisals for organizing, including firings, all of which were presumably outlawed, became instead customary. Elections to decertify unions, once rare, no longer were.19
Political mobilization by the business community and by legislative satellites like the American Legislative Exchange Council (an organization of conservative politicians and policy wonks founded in the 1970s, heavily funded by major corporations like Exxon Mobil and Koch Industries, which formulates prospective legislation across a broad range of issues) further eroded government labor protections in states around the country. Class consciousness in these quarters rose markedly as the Keynesian synthesis disintegrated. In 1971, Lewis Powell, later a Nixon appointee to the Supreme Court, wrote a memo to the United States Chamber of Commerce. It did not mince words: “The American economy is under broad attack.… The overriding first need is for businessmen to recognize that the ultimate issue may be survival—survival of what we call the free enterprise system, and all this means for the strength and prosperity of America and the freedom of our people.”20
Among the first to answer that call to arms was the Business Roundtable. An organization run by two hundred top CEOs—representing the core of American capital-intensive manufacturing, including Ford, GM, DuPont, Alcoa, GE, and AT&T—it was created in 1973 to counter hostility to the corporation then still percolating in public life. It originated specifically in an effort to defang the labor movement and its campaign to repeal crippling clauses of the Taft-Hartley Act that outlawed the union shop and helped make the South impregnable for union organizers.
Peak corporations mobilized to repress rising labor costs, in construction in particular but in manufacturing more broadly. The National Right to Work Committee, founded decades earlier by former Congressman Fred Hartley, achieved a great victory when President Gerald Ford vetoed a law that would have legalized “common situs” picketing—meaning the right of different construction crafts to support one another at the same site. Another business confection, the Public Service Research Council (colloquially known as Americans Against Union Control of Government), helped force President Carter to back away from labor reform. Business-sponsored PACs grew at the rate of one per day in the 1970s.21
When, in 2011, Scott Walker, the newly elected governor of Wisconsin, moved to deprive the state’s public workers of their rights to engage in collective bargaining, mass demonstrations and even an occupation of the Capitol in Madison ended in defeat, as did the subsequent effort to recall the governor. Indeed, public employees, especially teachers and their unions, became the scapegoats of choice all over the country. By 2012, the momentum established decades earlier had reached its climax. In Michigan, where the heart and soul of industrial unionism was born, the legislature passed a right-to-work bill that had been unthinkable just months before, and did so on the heels of the Democratic Party’s national victory. And Michigan wasn’t the only state in the union heartland to do that and more. Child labor laws were chipped away at in states like Utah, Minnesota, Maine, Ohio, and Missouri, allowing young people (as young as fourteen) to work in formerly prohibited industries, eliminating the need for work permits, or relaxing rules so that youth could work until ten at night on school days. Newt Gingrich suggested putting poor children to work in “the cafeteria, in the school library, in the front office,” so they could have the same chance as middle-class kids “to pursue happiness.” Meantime, a federal study found that young workers (between ages fifteen and seventeen) suffer higher rates of work-related injuries than those twenty-five and older.22
Assaults on the social welfare state further undermined the wherewithal to stand up to the new order. Unemployment insurance covered a declining segment of the workforce and a declining percentage of income. On the eve of the financial meltdown in 2007 one-third of the unemployed received payments compared with one-half in the 1950s; less than one-fifth of low-wage workers were eligible, blocked by new barriers requiring minimum monthly earnings to qualify. Those still clinging to benefits could bring home on average only 35 percent of their weekly wage. Federal training programs for the technologically displaced shrank from $17 billion in 1980 to $5 billion in 2005. The federal minimum wage, already at the poverty level in 1980, fell another 30 percent below that in the next decade. After it was finally raised in 2007, its real value in 2009 would still be less than it had been a half century earlier.
Federal housing subsidies diminished by two-thirds after the seventies, even as rents and prices rose. Family provisions essential to the modern work experience—day care, paid maternity leave, health care, living wages, and paid sick leave—were provided in niggardly amounts, cut back, or never given a serious legislative chance. Federal government metrics kept the poverty line out-of-date and way below the level needed to maintain a decent way of life: a nifty way of keeping down the numbers of those eligible for aid. Privatization of what everybody once took for granted as public goods—water, education, public housing, transportation—further depressed the social wage.
Abolishing “welfare as we have known it,” as the Clinton administration managed to do, was a way of enlarging the pool of vulnerable, low-wage workers with no other option but to become, if they were able, employees at will, no matter the terms and conditions of their work. What had been maligned as “welfare dependency” was traded in for a new dependency on businesses rooting in soil fertilized by need. And so the business community vigorously supported welfare “reform,” especially its provisions allowing for flexible wage scales as an end run around the minimum wage and its tax subsidies and training grants. It made for an attractive package indeed, in particular in a tightening labor market. All in all, welfare reform added a million low-wage workers to the labor pool by 2002.
Defunding college education at an accelerating rate, at a time when it was seen as the only pathway to upward mobility, reproduced a demoralized population of working-class and eventually middle-class youth, adding still more pressure to the households they came from. In 1980 state governments contributed nearly 80 percent of the cost of undergraduate education; now students bear more than one-half. Pell grants, which once accounted for over 80 percent of tuition, were paying 32 percent by 2005. Student debt now runs about $1 trillion; more than 40 percent of twenty-five-year-olds in 2013 carried student debt, and 60 percent of them owed more than $10,000.23
Where, one might reasonably ask, was the Democratic Party while its core social constituencies were suffering? The party of the New Deal last acted like it was one in the late 1970s. Back then it mustered support for labor reforms to overcome some of the crippling aftereffects of the Taft-Hartley Act and other obstacles to union organizing. Onetime vice president Hubert Humphrey, along with prominent elements of the business community (Ford, Brown Brothers Harriman, the Bendix Corporation, and other major manufacturers hoping to recoup their position in an increasingly competitive global marketplace through government subsidies for research and development and new technologies), tried as well to address the initial phase of deindustrialization by resurrecting old New Deal proposals for a national development bank. Humphrey and his allies called for capital controls to stop the outflow of manufacturing investment to Mexico, Brazil, and East Asia. And they tried resuscitating the full-employment legislation that had been defanged just after World War II. However, these propositions died in Congress and had no real friend in the White House, signaling that the party of “labor’s emancipation” was already on its way elsewhere.24
It went in search of new followings in the emerging middle classes among socially liberal professionals, technocrats, and other white-collar commuters from the suburbs. African Americans remained loyal Democrats thanks to the civil rights legislation of the 1960s. But in part for that same reason other blue-collar constituents began migrating away to various forms of racial populism championed by third-party demagogues like George Wallace or even into the arms of the Republican Party. The GOP embraced the silent majority’s resentment of limousine liberals and their countercultural and multicultural allies. Demography, racial politics, and cultural emancipation converged to shift the Democratic center of gravity in the direction of the new class.
George McGovern’s 1972 presidential campaign first signaled which way the wind had started blowing. The McGovern coalition excluded the labor movement as a conservative element. In turn, the labor movement excluded the McGovern campaign, which it found obnoxious to its cultural and patriotic instincts. (Labor abstention no doubt contributed to McGovern’s crushing defeat.) The embryonic New Democrats found a fresh comfort zone in the liberated individualism of identity politics, its moral concerns trumping the economic ones of the old party.
That political migration from economic to cultural politics caused no discomfort in corporate boardrooms, which were ready to view these new identities as so many lucrative niche markets. By 1990 one-half of the Fortune 500 companies employed a full-time staff to manage diversity. But the social liberalism of identity politics also set in motion a logic of fragmentation that could chisel away at the fragile solidarity of an earlier era, especially as that solidarity had itself always carried within it latent fissures and inequities of race and gender and work hierarchies. And after all, the immersion in identity politics was by definition a recognition and a celebration of difference, a huddling together of the same in contradistinction to the not-same, a solidarity premised on division.25
President Jimmy Carter was an alien inside the contracting political universe of New Deal labor-liberalism. Carter the technocrat was aptly described by Stuart Eizenstat, his chief domestic adviser, as “the first neo-liberal Democratic president, fiscally moderate, socially progressive, and liberal on foreign policy issues.” Despite Carter’s subsequent reputation as a global humanitarian, the techno-liberalism he articulated was ethically challenged. It was the culmination of a century of cultural evolution that sought to reduce politics to a matter of expertise and the manipulation of public sentiment to support ends already agreed upon in circles of the knowing. The idea was to tamp down social abrasions and political confrontations and to resolve them by means of political psychotherapy and social engineering.
This was a worldview compatible with the automatic mechanisms of the self-regulating market. It appealed to the new class of suburban professionals and technocrats captivated by the mirage of cost-benefit analysis as an end run around the bloody-mindedness of politics. Its ardor for government economic activism cooled while it warmed up for the corporation. What links still existed between the social liberalism of the new class and the economic liberalism of the working class were wearing out. More than growing apart, these became armed camps with middle-class social liberals facing off against conservative populist “hard hats.” A new political chemistry, allying the neoliberalism of the business class and the social liberalism of the new class, became electorally feasible.
After Ronald Reagan’s election, what remnants there were of New Deal populism and class consciousness were shuttered away in some attic of the Democratic Party. Legions of working people, whether unionized or like the thirty million or so unorganized working poor, could expect little help anymore from that quarter. They had been abandoned not only by government but by the political machinery their forebears had created to help them cope.26
Carter was in effect a precursor of the Democratic Leadership Council’s neoliberal champion, Bill Clinton. The council formally made its debut after Ronald Reagan’s overwhelming re-election. It recognized the supremacy of the business model for running the country. This included reducing taxes on corporations and the wealthy. And it applauded deregulation, already well under way during the Carter years, in trucking, the airline industry, natural gas, electrical power, and, most tellingly, in finance—a fatal deregulation in the case of the savings and loan business. It was even endorsed by liberal heroes like Edward Kennedy and Ralph Nader and by figures from the other shore like Milton Friedman.
The marriage of Bill Clinton to Robert Rubin, who came from Goldman Sachs to run the Treasury Department, would consummate this union of Democrats and Wall Street. As the Clinton era ended, a journalist summed up this historic makeover: “The precinct of money, traditionally rock-ribbed Republican, has become one in which Democrats are more comfortable. In many ways, the democratization of money has led to the Democratization of money. As the 1990s wore on, the Clinton Administration grew not only to tolerate and appreciate the markets but even to love and embrace them.”
A long parenthesis in the history of American political culture had closed. It amounted to a belated recognition that the whole medicine chest of countercyclical remedies on offer from Keynesian economics and the political chemistry that had made them possible had failed to cure the structural crisis of American capitalism—or at least that the political party where they had been incubated was no longer prepared to defend them.27
Poverty can crush the will and amputate the future; everything becomes a matter of getting from one day to the next. Resignation sets in or, worse, depression. As the twentieth century drew to a close, one-quarter of the workforce—that is, 36 million people—earned less than the official poverty level for a family of four; three-quarters of them had no company health insurance, eight of nine no pension, three-quarters no paid sick days. Matters only grew tougher when governments pared away at what programs there were left to alleviate the hardship. What was to be done? And who was to do it?28
Mass incarceration was one “solution” pursued with a vengeance for decades. It freed up the streets and put a lid on unemployment, and while doing that exported the ominous shadow of the prison into the barrios. Rise up against that if you dare!
Or, if you were one of the millions of illegal immigrants keeping whole sectors of the economy afloat, you might call out your sweatshop employer. But this rated as the highest-risk behavior. The feds might be lurking nearby; ICE (Immigration and Customs Enforcement) is well known to be raiding and deporting people like you, sometimes thanks to tips from employers who want to shed recalcitrant workers. Keeping your mouth shut and your hideouts hidden might seem far wiser.29
Betrayed and abandoned, cut adrift or superannuated, coerced or manipulated, speeded up, cheated, living in the shadows—this is a recipe for acquiescence. Yet conditions of life and labor as bad as or even far worse than these once were instigators to social upheaval. Alongside the massing of enemies on the outside—employers, insulated and self-protective union leaders, government policy makers, the globalized sweatshop, and the globalized megabank—something in the tissue of working-class life had proved profoundly disempowering and also accounted for the silence.
Work itself had lost its cultural gravitas. What in part qualified the American Revolution as a legitimate overturning of an ancien régime was its political emancipation of labor. Until that time, work was considered a disqualifying disability for participating in public life. It entailed a degree of deference to patrons and a narrow-minded preoccupation with day-to-day affairs that undermined the possibility of disinterested public service. By opening up the possibility of democracy, the Revolution removed, in theory, that crippling impairment and erased an immemorial chasm between those who worked and those who didn’t need to. And by inference this bestowed honor on laboring mankind, a recognition that was to infuse American political culture for generations.
But in our new era, the nature of work, the abuse of work, exploitation at work, and all the prophecies and jeremiads, the condemnations and glorifications embedded in laboring humanity no longer occupied center stage in the theater of public life. The eclipse of the work ethic as a spiritual justification for labor may be liberating. But the spiritless work regimen left behind carries with it no higher justification. This disenchantment is also a disempowerment. The modern work ethic becomes, to cite one trenchant observation, “an ideology propagated by the middle class for the working classes with enough plausibility and truth to make it credible.”30
Moreover, the marketplace is not the workplace. A society preoccupied with exchange, with the world of the market, loses sight of the place where real value is created, wealth distributed, and power deployed. The democracy of consumption, while having a lot to recommend it, is not an incubator of the inspiring self-sacrifice on which social movements live. The nineteenth-century critic John Ruskin observed, “It is not that men are ill fed but that they have no pleasure in the work by which they make their bread and therefore look to wealth as the only means of pleasure.” However, if that workaday world is remembered at all today, it is naturalized, turned into an inevitability, and made into an ontological category so as to be beyond interrogation.31
Poverty did sometimes manage to occupy center stage. But this was not poverty arising out of exploitation at work. It was first of all the “invisible poverty” detected in the early 1960s, which on the contrary originated in the exclusion from work. Alternatively, there was the specter of poverty, again racially inflected and highly visible, but as a cultural not economic phenomenon, as a failure of the will—the will to work. Significantly, there were no more “Work in America” reports, at least none commanding nationwide attention like the one issued in 1973. Why worry about the “blue collar blues” in an economy where many are so cowed and anxious that any job will do? And something even deeper was tunneling away at the self-respect and public stature of the workingman.32
The shift from industrial to finance-driven capitalism was accompanied by a cultural phase change whose impact on the self-esteem of working people and on their public regard was disarming and devastating. Escape from proletarian life and social status has long been an American promissory note. Yet a contrary impulse to make a place in the sun for the working classes has coexisted for generations alongside those feelings of shame, failure, or denial.
No longer is this the case. Even the labor movement wants to depict itself as middle class, in a studied aversion to using a social category—the working class—that fits it well but is now so stigmatized that it is better left buried. As one writer notes, even those pop culture figures who, by virtue of everything they do at work, should be seen as working class are instead portrayed as middle class, be it the Simpsons or the characters on Friends or Gilmore Girls. Should it crop up, this invisible class is treated as an exotic species at best or at worst as a failure, as throwaway trash. Manual labor is disrespected in favor of what is depicted as “real” creative work, often abstract, done in an office, numerical, image-laden, and paper-bound. A world once highly visible, wretched, and inspirational all at the same time has dropped beneath the horizon of our common consciousness.
In this postindustrial world not only is the labor question no longer asked, not only is proletarian revolution passé, but the proletariat itself seems passé. And the invisibles who nonetheless do indeed live there have internalized their nonexistence, grown demoralized, resentful, and hopeless; if they are noticed at all, it is as objects of public disdain. What were once called “blue-collar aristocrats”—skilled workers in the construction trades, for example—have long felt the contempt of the whole white-collar world. For these people, already skeptical about who runs things and to what end, and who are now undergoing their own eviction from the middle class, skepticism sours into a passive cynicism. Or it rears up in a kind of vengeful chauvinism directed at alien others at home and abroad, emotional compensation for the wounds that come with social decline.33
At lower levels of the working-class hierarchy, it is stunning how many still living in poverty blame themselves for their predicament, for making “bad choices.” Then again, it’s really not so surprising in a society that has largely erased their presence except to judge them delinquent. Who can’t sense the collective frown that derides working-class dress codes, eating habits, what they drive and what they smoke, their feckless social and moral attitude about work, childbearing and child-rearing practices, early marriages, leisure pursuits—in sum, their whole unstylish lifestyle. And the connection to race is organic, a way of erasing and replacing class with race. All these improprieties are associated in the public mind with blacks and Latinos even while felt to be characteristic of a whole class.
Once upon a time the exposure of urban misery and squalor aroused middle-class sympathy, even outrage. Now the reaction is more likely one of fatalistic resignation. Thus when Jacob Riis put together a book of words and pictures about life in the slums of late-nineteenth-century Gilded Age New York, it shocked and nauseated middle-class readers. It helped ignite a national crusade to do something about what was considered a social scandal. A century later the television drama The Wire might have been our own era’s version of Riis’s How the Other Half Lives, detonating a similar explosion of outrage. It was after all a scorching indictment of the way our modern breed of pitiless capitalism simply writes off vast tracts of humanity, in this case both the black and white working and impoverished classes of Baltimore.
Yet “indictment” does not quite capture the mood and vantage point of the TV serial. It was rather a kind of visual requiem pervaded by a mournful hopelessness about the scene it paints, about the abyssal cynicism and corruption of the city’s political class, its police hierarchs, its business leaders, its school bureaucrats, its media lords, its ghetto spokesmen, and even those remnants of the labor and civil rights movements mounting a last-gasp, enfeebled resistance. The last man with heart left in terminal Baltimore turns out to be a drug-dealing killer.
Tragic fatalism of the kind conveyed by The Wire is a version of our general condition that accepts as natural and inevitable what might have alarmed and agitated past generations. What comes to seem normal is assumed to be not only real but right, or is not noticed at all, or is so frequently noticed its impact becomes negligible.
Vanishing is disempowering in the extreme, especially if you’re still right there. In a society infatuated with business titans and Olympian managers, commanding presences because they are in positions of command, the managed shrink in self-estimation. Deprecating and denigrating this underworld is a reflex, one that becomes part of the collective unconscious. As Senator Phil Gramm observed, “No poor man ever gave me a job.” Dissing these “losers” became a nasty commonplace, not beneath even President George W. Bush’s press secretary Tony Snow, who remarked: “Upper classes have always pulled societies forward economically—and their conspicuous prosperity has always aroused the jealousies of the lower classes. The envious set out to strip the rich of their lucre believing mistakenly that by redistributing income they could make everybody affluent.” Snow’s was a kind of social cowardice that runs away from confronting the structure of power and wealth actually responsible.
This may always have been true, but it becomes more emphatically so once wage slavery is accepted as destiny, as history’s end point, when all that came before it is gone. Or as Margaret Thatcher put it: “There is no alternative.”34
Naturally, the brutal process leading up to this monumental disappearance caused internal bleeding. In a society that was giving itself over to the uninhibited pursuit of self-interest, unions were increasingly treated as the most piggish institutions out there, strictly out for themselves—especially their bureaucrats, who were likened to gangsters of the first order. While corruption is chronic in all sorts of institutions, especially in government, politicians never tire of raking unions over the coals for behavior they turn a blind eye to when it crops up among more powerful transgressors.
Where unions managed to achieve and hang on to some semblance of power, security, and dignity, they were greeted by a sour turn in the public mood. That mood swing was caused by a general decline in all those precious things that make life supportable for those not lucky enough to be in unions. Resentment instead became the emotion du jour and unions got chastised for not having to suffer quite as much as everybody else—everybody, that is, except those privileged circles who didn’t have to suffer at all.
Inflation was blamed on the unions. Stagflation was blamed on the unions. Union members were coddled, too immune to the discipline of the market, too well paid. If low wages had once been the problem, now they were the solution. Why not pick on teachers’ unions for the sorry state of public education and the even sorrier prospects of the kids in attendance? Teachers and their unions, after all, don’t carry the heft of hedge funds looking to privatize schools and willing to make insupportable claims about the wonders of a charter school regimen.35
Racial resentment has always cut a broad swath through working-class life, making enemies where there might have been brothers. The rights revolution of the 1960s, while supported by many trade unions, was opposed by others—most conspicuously, in the construction trades. Then, as pressures of economic decline began to set in, resentment about affirmative action, school busing, tax-supported poverty programs (however meagerly funded), and other needs-based, tax-supported social welfare initiatives boiled over.
Actually, this resentment was less evident among union workers than among white working-class men and women outside the ranks of organized labor. Wherever it cropped up, it wedged apart groups that might have been, and in some large measure had been, living approximations of “solidarity forever,” of that subversive social emotion, sympathy, which the Taft-Hartley Act went out of its way to criminalize. It was an irony of tragic proportions: people once standing shoulder to shoulder in the struggle for economic justice were now divided into hostile camps by that same resentment.36
Nor was this debilitating only in a practical sense. What lent the labor movement of the 1930s and the various anticapitalist currents before then such a large presence was the instinct, expressed only episodically, to identify its cause and objectives with the lowliest. Now, jealousy about the gains, however limited, of those beneath them, those still living on the mudsills of modern America, corroded precisely that socially more embracing instinct.
A better-off white working class bought into a notion originally concocted by the haute bourgeoisie of the first Gilded Age: there were two working classes, one respectable, hardworking, manly, orderly, cleanly turned out, and a second one that was dirty, dangerous, debauched, and unmanly. This demobilizing fantasy was already the price paid in the immediate postwar years for losing the battle in the public sphere to extend social welfare and resorting instead to the far more restricted private realm of collective bargaining.
More crippling even than that, this growing parochialism, embedded in the politics of identity no matter what victimized population was hoisting its flag, changed the way people viewed the world. Making sense of what’s out there is never a matter of individuals apprehending it directly. Rather knowledge, especially social knowledge, is mediated by all those relations and connections in which everybody is entwined—from the intimate immediacy of the family to the remote nation-state. Breaking down the Berlin walls that balkanize social life into sovereign territories—family, kin, neighborhood, ethnic and religious tribes, primordial hierarchies of race and gender, manual and mental labor—is rare. When it does occur, however imperfectly and briefly, people caught up in this overturning, in this act of organizational artistry, may reconceive the world and their own place in it. The mutuality, the underlying interdependence, that accounts for the existence and the identity of every modern individual becomes palpable. You might call that the epistemology of revolutionary change. In plainer language it is what animated the mass strikes of the Gilded Age or what the Flint sit-down striker recalled after that victory when he said, “It was the CIO speaking in me.”
What is therefore most pernicious about the recent ascendancy of free market thinking is perhaps not so much the triumph of its public policies. Rather, it is how its spirit of self-seeking has exiled forms of communal consciousness, rendered them foolish, naïve, woolly-headed, or, on the contrary, sinful and seditious. A cultural atmosphere so saturated with these suspicions is a hard one in which to maintain or create movements or institutions built on oppositional foundations.
The postwar “grand bargain,” which traded in anticapitalist aspirations for the American standard of living and the welfare state, was already a first step in that direction. The siege mentality of the Reagan era further entrenched this behavior. Worse, that atmosphere became the polluted air breathed by everyone. Even those still tied to the remnants of what once were assemblies of resistance, zones where the world might be reimagined, were weighed down by the torpor of a regnant self-interest. Hunkered down in their bunkers, unions began to behave as they had been caricatured by their foes: wheelers and dealers, wise to the ways of the world, bailing out as the boats flooded. To envision an escape from that fate seemed like proposing to abolish the law of gravity.
Weakness and defeat, the exhaustion of old remedies, living as cultural and social excommunicants, all this could not help but demoralize. It also made other forms of consolation and reassertion inviting. Political seductions aimed at hard hats, at silent majorities of ordinary, hardworking people, at the upholders of family values, and at those who still got dirty when they worked with their hands while those in suits and ties shuffled papers were persuasive to many. The counterreaction to the racial politics of the 1960s fertilized the soil in which hostility to government could take root.
Richard Nixon’s approach to shoring up his social support rested on this démarche in cultural politics aimed at the white working class, something he borrowed from George Wallace’s assault on the liberal establishment. Meanwhile, his vice president, Spiro Agnew, trafficked in alliterative homilies designed to arouse anxieties about manly men and their despoilment by the “effete” and the “effeminate” establishment. While this seduction merchandized a warehouse full of issues, from crime and drugs to abortion and subversion, at its most primitive it indicted the liberal elite for taking money from the people who worked to give it to those who wouldn’t.
For example, Louise Day Hicks, a leader of the fiery protest movement in Boston against busing and affirmative action, saw the contest as one between “rich people in the suburbs, and the working man and woman, the rent payer, the home-owner, the law-abiding, tax-paying, decent-living, hard-working, forgotten American.” Looking through the other end of the telescope, Charles Murray saw in his best seller Losing Ground a state-created caste of nomadic outsiders, poverty program addicts, immoralists, poverty-stricken pariahs, and parasites.37
Nixon sought to capture the sense of resentment and alienation that had been vividly depicted in “The Revolt of the White Lower Middle Class,” an article by Pete Hamill. The president loved that article, which was mined in turn by his secretary of labor, George Shultz, for a report called “The Problem of the Blue Collar Worker,” which sympathized with the worker’s sense of being forgotten and passed over. Nixon’s team went to great lengths to woo the labor officialdom, including heads of the construction trades, the longshoremen, and the teamsters, as well as police and firefighters. So too the Nixon administration seized on the latent identity politics of the white working class, appealing to thirty-three separate ethnic groups from Bulgarians to Syrians, transmuting indigestible class issues into the comfort food of cultural genuflection.38
What Nixon inaugurated became Republican orthodoxy for a generation. And since it worked, why not? Many working-class Americans saw little relief and less respect headed their way from the desiccated ranks of the New Deal. The Democratic Party had become the captive of a milieu that repudiated much of what they cherished: marriage, the paterfamilias, and patriotism; sexual orthodoxy and social conventions; and also class consciousness. The trick that produced Reagan Democrats from blue-collar precincts in some numbers was not to erase a sense of class identity but to neuter it politically, to make it turn inward—to privatize it, so to speak.
As the distinguished literary critic Terry Eagleton points out, for a long time issues of gender, race, ethnicity, identity, and culture were inseparable from “state power, material inequality, the exploitation of labor, imperial plunder, mass politics, and resistance and revolutionary transformation.” Then the links got severed. We have grown accustomed to speaking about the “Southernization” of American politics. Alongside it emerged the Southernization of the American working class, its retreat into make-believe. Perversely, a form of cultural flattery invited white male workers especially to take pride in their own degradation as rednecks, in the stereotype of them as stupefied marginalia. One ingredient of this cultural curative buoyed up a deflated masculinity. A world less and less respectful of the working-class male breadwinner, one chipping away at his material wherewithal, one where women in the workplace seemed trespassers and transgressors outside their proper place, could be seen as a threat to his primordial status as well as an implicit indictment of his own failure to provide. It incited a kind of angry, vengeful, homophobic muscle flexing and altered the chemistry of working-class life. Together with racial fears and resentments, this was a cocktail of class consciousness mixed by political and business elites whose every motive and policy initiative was meant to subvert the class consciousness that still survived from earlier days and that stood in their way.39
Popular entertainment and the graphic fantasies of commercial art made this cultural persuasion the spiritual sustenance of everyday life, as Jefferson Cowie has described in his book Stayin’ Alive. Even while legions rose against the prevailing industrial order in the early 1970s, Archie Bunker, with his amusing dumbness and bigotry, showed up as the media’s favorite image of the white working-class male. In other depictions this simulacrum of the genuine blue-collar article might be less benighted, or simply pathological, as in Easy Rider. He might be hapless and marginal—“history’s loser,” say, like John Travolta in Saturday Night Fever—or, on the contrary, patronized as the bearer of authenticity, hence the suddenly fashionable fascination with country-western music and mores, epitomized by Merle Haggard’s “Okie from Muskogee.”
Heroes like Rambo, Dirty Harry, or Charles Bronson in Death Wish, and antiheroes like Al Pacino in Dog Day Afternoon or Robert De Niro in Taxi Driver, and men in commercials wearing shirtsleeves and bandannas and driving pickup trucks through rough country all became icons of a stylish new working class. What distinguished it was its hypermasculinity, its warriorlike penchant for revenge, its contempt for conventional institutions of authority, and its fealty to the family. Here was a mythic figure, full of latent (or not so latent) fury, a narcissistic, infantile rage, often mirroring the enemy’s. It was a masquerade of control, needing no one but concealing a social hopelessness. This working-class hero might rush in to protect the underdog, but only to lapse into the age’s all-enveloping cynicism.
Debasement masquerading as empowerment, or just plain debasement without psychic compensation, became de rigueur. A recombinant bouillabaisse of bluegrass, gospel, hillbilly, and cowboy music turned the epithet “redneck” into a boast. One fundamentalist preacher/singer captured the inversion in a song:
No, we don’t fit in with the white collar crowd
We’re a little too cowboy and a little too loud.
But there is no place I’d rather be than right here,
With my red neck, white socks, and Blue Ribbon Beer.
The literary critic Roger Shattuck itemized the elements of this “savage ideal,” which “combines shiftlessness and energy, yeoman stock and degeneracy, hedonism and paternalism.” When this sensibility was still capable of shouting out, its cri de coeur voiced personal rather than social protest: Johnny Paycheck’s song “Take This Job and Shove It,” for example. The roar echoing from the punk rock scene was just as inward, cramped in civic ambition, minimalist in its vision, reinforcing as a point d’honneur its own marginality.
If public life can suffer a metaphysical blow, the death of the labor question was that blow. For millions of working people, it amputated the will to resist.40