10

THE BETRAYAL

In January 1998, the federal court of appeals in Cincinnati handed down a ruling in Sprague v. General Motors—the climax of nine years of litigation so labyrinthine that the case file contained more than 1,000 exhibits and a trial transcript that filled nineteen thick volumes. For Sprague’s lawyer, though, the whole thing could be summed up very simply: “The court decided that American industry no longer needed to keep its promises.” When it was pointed out that this was a rather curt way to recap a matter of such great importance, the attorney retorted: “It doesn’t take a lot of words to explain that you got fucked.”

Robert Sprague had joined GM in 1943 when he was just sixteen. His father also worked at the company, a union man on the crankshaft line who had taken part in the 1937 sit-down strike at the Chevy plant along the Flint River. The junior Sprague, who worked at the same factory complex as his dad—“Chevy in the Hole,” it was called—took after the old man in many ways. He was as strong as an ox and as tough as one, too. “He wouldn’t take crap from anybody,” said his son, Robert Jr. In the late 1950s, Sprague entered the lower rungs of management, becoming a maintenance supervisor. His work ethic bordered on the pathological. He routinely put in seven days a week at Chevy’s metal-fabrication division, all while growing soybeans, corn, and wheat on his family’s forty-acre farm. Somehow, he found time for fun, as well, hunting pheasants, fishing, and kicking back with his buddies and a splash or two of Black Label on Michigan’s Upper Peninsula.

In spite of the pace, Sprague never missed a day at GM. Well, that’s not exactly true. He once had appendix surgery on a Friday and was back to work by Monday. On another occasion, when he was really sick, he got sent home after half a day. But that was it—a day and a half of work missed (aside from vacations) through thirty-eight years on the job.

In March 1982, Sprague’s boss, Sam Rodnick, approached him about taking early retirement. The offer came as part of a series of cost-cutting initiatives that GM had been undertaking for several years, though at this early stage, the belt-tightening was hardly noticeable. As late as 1979, GM was still the biggest company in America, whether ranked by revenue (more than $63 billion) or employment (nearly 620,000 US workers). Its profit stood at $3.5 billion—also tops in the nation—and many GM executives felt as if the company were unassailable. When Ron Hartwig, a public-relations manager for GM on the West Coast, started to see a profusion of Japanese-made cars on the road, he sent a note back to Detroit, alerting his boss. “Clearly their popularity here means they are going to be popular across America,” Hartwig told him. The reply that Hartwig received was pure brass: “I just looked out my window in the GM Building, and I don’t see any Japanese cars.” Of late, however, the pressure on GM to trim its payroll had increased considerably.

The back-to-back recessions of the early eighties were killing America’s Big Three automakers, as Chrysler and Ford recorded huge losses. In 1980, GM posted its own net loss—its first annual deficit since 1921. The following year, the company eked out a profit, but only by whacking more expenses; sales were still slumping. The company also saw its prized AAA credit rating evaporate. And GM’s attempts to take on Toyota and other Japanese producers with its new subcompact, the J-car, were going nowhere. Things were so bad that even the United Auto Workers agreed to “givebacks” in early 1982: a two-and-a-half-year wage freeze, a delay in cost-of-living adjustments, and the elimination of certain paid holidays. (The union seethed when it discovered that GM had simultaneously, under a new compensation scheme, made it easier for its executives to collect big bonuses. Chairman Roger Smith quickly backtracked.)

At the same time, as Sprague would tell it, the deal that GM offered him and other white-collar managers to retire before the normal age of sixty-five sounded awfully sweet. Rodnick “told me I would get all benefits I was presently receiving, for life, with no strings attached,” Sprague recalled. Plus, “he said I could get another job and make as much money as I could without any penalties.” A plant personnel manager, Sprague added, subsequently confirmed those details about taking an “early out.” To Sprague, it felt like the right moment to jump. His youngest son, Bill, had gone to work at the Chevy plant in Flint in 1977, and so the family’s proud connection to GM would continue. Sprague himself had just turned fifty-five and was beginning to feel worn down by hypertension and arthritis. He was also conscious of the heart disease that ran in his family. And so he agreed to hang up his wrench.

Over the next several years, GM struggled more. The big reorganization that Roger Smith had instituted in 1984—the one that included the dismantling of Fisher Body—was a fiasco. The new org chart drove GM “into a corporate nervous breakdown,” the journalists Paul Ingrassia and Joseph White have written. “Disoriented and fearful for their jobs, many lower-level managers simply ducked. They waited to be told what to do, bucked critical decisions up the line, and suppressed problems in numbing rounds of interdivisional meetings.… Smith had hoped the reorganization would allow for shrinking GM’s army of paper-pushers. Instead, the bureaucracy got more complex.” The company was in such a state of confusion that the running joke was about a manager who walked out of his office and told his secretary, “If my boss calls, please get his name.” Said one GM executive: “It is tough to operate when the structure isn’t right. It just stops you cold.”

The cars stopped cold, too. Filled with malfunctioning parts, Buicks and Oldsmobiles and other makes began coming to a standstill without warning. Between late 1985 and 1991, GM would be hit with more than 2.5 million warranty claims for stalling cars. The company’s Pontiac Fiero, for its part, had the unfortunate tendency to catch on fire. And its GM-10 cars, a family of midsized vehicles designed to vie with the Ford Taurus, Honda Accord, and Toyota Camry, were so inefficiently built that the company lost thousands of dollars on every one it manufactured. MIT researcher James Womack would call the GM-10 program “the biggest catastrophe in American industrial history.” Incongruously, GM was building many of the auto industry’s most abysmal products just as Smith was investing in all kinds of new technology. But this movement to modernize became a farce, with automated systems mistakenly sticking Buick bumpers onto Cadillacs and robots spray-painting each other instead of cars. GM’s market share plunged, from 45 percent in 1984 to 37 percent in 1987. “We have vastly underestimated how deeply ingrained are the organizational and cultural rigidities that hamper our ability to execute,” Elmer Johnson, GM’s executive vice president, told his senior colleagues in a January 1988 memo.

For all of his inventiveness, Smith at some point concluded that, in order to regain control of his sinking enterprise, he had no choice but to slash jobs. In late 1986, GM announced that it would shut down eleven factories and get rid of 29,000 positions, including 23,000 hourly employees. Even though that represented more than 5 percent of GM’s workforce, officials tried to play down the weight of the move, saying that the company had built six new assembly plants since the late 1970s, and so it was only natural that these older facilities would be closed now. Yet others read the cutbacks as a tacit admission by GM that it couldn’t imagine a way to regain the business that it had lost to the Japanese as well as to Ford and Chrysler, which had pinched their production and were showing signs of financial and operational improvement.

Besides, even if GM did rebound, it didn’t mean that those 29,000 spots were ever coming back. Neither were the 35,000 salaried jobs that the company earlier in the year had said it was phasing out, mostly by offering early retirement incentives similar to what Robert Sprague had opted for. The imperative to do more with less because of global competition—and the ability to do more with less because of technological progress—presaged a steadily contracting workforce. “There’s no longer a direct link between prosperity in the marketplace and jobs,” labor economist Harley Shaiken said after GM spelled out its plan to fold up the eleven factories. “No matter how the market performs over the next five years, General Motors will end up employing fewer workers. This announcement is not the end of the layoff process for General Motors. It’s the beginning.”

Actually, for some it seemed more like the last straw, given that GM had been shedding jobs since 1979, leading to the disintegration of communities like Flint, Michigan. Nearly 30,000 Flint residents had departed the city between 1982 and 1987, many for the Sunbelt, where their prospects for employment were brighter. Stores in Flint even began stocking newspapers from Houston and Dallas so that locals could peruse the classifieds. Michael Moore, a Flint native whose dad worked on a GM assembly line making AC spark plugs, watched with grief as his hometown decayed. Now the editor of an alternative weekly called the Michigan Voice, Moore wrote in late 1984:

Two more friends moved away from Flint today, two more in a long string of friends, family, and coworkers who have left Flint in the last five years. Either for lack of a job or lack of any signs of a quality life, they’ve moved downstate, down south, out west, or out of the country.

They’re the lucky ones. The rest of us here are either unemployed, expecting to be unemployed, or living amongst the two, sort of like Father Damien in the leper colony of Molokai; sooner or later you feel like you’ll end up, job or no job, as one of the walking dead.

Two years later, as Moore heard about the eleven factories slated for closure—an action that threatened to eat into Flint’s total employment by another 15 percent—his heartache gave way to anger. “You’re a fucking terrorist,” Moore remembered thinking of Roger Smith when the news was broadcast. “I can’t take this anymore. I have to do something. I’m going to make a movie.” The result was Roger & Me, an acid film that ended with Smith’s sappy Christmas greetings to GM employees (“We smell the pine needles on the trees and the turkey on the table”) interspersed with footage of a laid-off GM worker’s family being evicted from their home in Flint because they couldn’t make the rent. Although some critics skewered Moore for taking liberties with the facts, there was no denying that he had captured a larger truth about the devastation pouring down upon tens of thousands of people. “The hardships associated with plant closings are both pervasive and persistent,” a team studying the situation from the Michigan Health and Social Security Research Institute found. “Workers whose plants close suffer financial losses, marital strains, and symptoms of poor mental health.”

It was no wonder that those left jobless were scarred so severely. “People feel as if they are being thrown away,” said Richard Price, a University of Michigan psychologist. To many, it felt as if their entire heritage was being thrown away, too. “A manufacturing plant can become a moral bedrock, an institution that anchors a town’s special character, weaving the fortunes of many generations together,” the anthropologist Katherine Newman wrote in her 1988 book Falling from Grace: The Experience of Downward Mobility in the American Middle Class. “When something so fundamental to a community’s sense of self disappears, the consequences are more than economic—they call into question deeper commitments of loyalty, stability, and tradition.… Thus, in the midst of coping with the practical, personal consequences of economic dislocation (the familiar litany of occupational degradation, income reduction, economic insecurity), blue-collar workers also face the question of where (or whether) they belong in postindustrial America.”

Even amid so much distress, those working in autos had it much better than their peers in many other industries, thanks to the historic strength of their union. Hourly employees who were let go by GM could still count on the supplemental unemployment benefits that the UAW’s Walter Reuther had fought so hard for in the mid-1950s, and that had been enlarged in the late 1960s. Depending on how long laid-off employees had been at the company, they could now collect 95 percent of their take-home pay for anywhere from a few months to two years under the SUB. There was also a “guaranteed income stream,” which kicked in after the SUB ended. For those with ten or more years of seniority, this provided 50 to 75 percent of their pay until age sixty-two—so long as the idled worker hadn’t turned down an open job at GM or through the local unemployment office.

What’s more, GM and the UAW had agreed in 1984 to create a Jobs Bank to help dull the effects of automation. Under the system, which the rest of the Big Three would also adopt, up to 5,000 or so GM workers who had been displaced by machines or other efficiencies were to be retrained or allowed to take part in some other company-approved activity, all while receiving 95 percent of their wages. As the Jobs Bank expanded in future years, it would come to be seen as a gross perversion of the social contract—the corporate equivalent of the mythological “welfare mother who drives a Cadillac,” Harley Shaiken said—after it was reported that those in the program were being paid to do crossword puzzles, watch cartoons, and take naps.

But the fact was that none of the job-security measures won by the union could offer any real security in the end. Shaiken was right: the American auto industry—and especially General Motors—was getting smaller and smaller. And nothing could reverse the trend.

In April 1988, GM executives finally conceded as much, telling securities analysts that the company would pare its production capacity over the next five years so that its American factories would operate at 100 percent, two shifts a day, five days a week—something that it hadn’t been able to do since 1984. The Wall Street Journal called it “a strategic retreat that will significantly shrink the largest industrial concern in the world for the first time in its history.” GM president Robert Stempel, the company’s top executive behind Smith, vowed that the consolidation would be “handled in an orderly manner” and that the carmaker would cooperate with the UAW on a plan. The union, though, would hear none of it. “They sure as hell aren’t working hand in hand with me on eliminating plants,” said Don Ephlin, a UAW vice president.

Meanwhile, the company looked to bring down costs in every other way it could. Its greatest expense was medical care, with GM’s spending to cover its hourly and salaried employees and retirees climbing by an average of 13 percent a year from the mid-1960s through the 1980s. Its health bill totaled about $3 billion a year by decade’s end. One effect of this, said Gregory Lau, an assistant treasurer at the company, was that it put GM “at a serious disadvantage” relative to the Japanese companies that had opened factories on American soil because they had much younger workforces and very few retirees. “The spiraling increase in health-care costs places a special burden on mature industries like ours,” Lau said. Ever richer retirement benefits, which the UAW had continued to wangle from the automakers through the late 1970s, also deepened the Big Three’s competitive disadvantage. The Japanese could now manufacture and ship a car to the United States for $1,600 less than it cost the Americans to make one at home. It wouldn’t be long, as the writer Roger Lowenstein has expressed it, before GM would be seen as little more than “a pension firm on wheels… an HMO with a showroom.”

And so GM desperately tried to get out from under, at least on the health-care front. In January 1988, following the lead of others in the industry, GM made its salaried workers responsible for meeting certain copays and deductibles—up to $750 in annual out-of-pocket costs per family. (By the early 1990s, that number would rise to $2,600.) This wasn’t the first time that GM had altered its health-care plans. But there was a major difference: practically every modification that GM had implemented in the 1940s, ’50s, ’60s, and ’70s—often through collective bargaining with the UAW, which then set the course for what would happen with the company’s salaried staff—broadened workers’ benefits. Now, it was cutting them. “The changes made in ’88 were inconsistent with past practice, past policy… and to an extent violated the trust that employees had granted to General Motors over the years,” admitted Doug Eavenson, a GM benefits manager. One hundred and ten thousand white-collar employees were affected, along with 84,000 salaried retirees.

Robert Sprague was among those blindsided by GM’s new direction. “Had I known that General Motors was going to change my health-care benefits,” he said, “I would not have taken the special early retirement.” Sprague said he’d relied on what his boss and other GM officials had told him: that “I would have these benefits for the rest of my life.” Like other GM managers, Sprague had also been given a stack of benefit booklets, which appeared to summarize the company’s commitments. “Your basic health care coverage will be provided at GM expense for your lifetime,” one said. The others conveyed the same thing—a pledge that GM had been giving its workers since the Golden Age of the 1960s, when it had first expanded medical coverage for salaried retirees and their spouses.

In the summer of 1989, 114 early retirees, including Sprague, filed suit against GM for denying them no-cost health-care benefits. Eventually, the court certified the case as a class action, ruling that 50,000 other early retirees could join the suit.

GM’s defense came down to this: whatever the retirees were told in person or in writing, the company had always made clear that nothing was set in stone. “The corporation retained the right to modify, terminate, or suspend the programs,” said Richard O’Brien, a GM vice president. And, indeed, many of the summary booklets—as well as the company’s health-care plans themselves—said as much, even if this proviso wasn’t put front and center. Many of the retirees said they never really noticed any such admonition, focusing instead on the “for your lifetime” language. As for the possibility of out-of-pocket costs being imposed, “I never gave it a thought,” said Sprague.

When Sprague brought his lawsuit, his son, Bill, was still working for General Motors. He was now in human resources. “That was somewhat uncomfortable,” Bill said. “People would ask, ‘Do you know this Sprague guy?’” But Bill never spoke up on his dad’s behalf. “Sometimes, being in HR, there’s a path that you have to walk,” he said. And he genuinely appreciated the cost constraints that the company was under. “When you looked at the financial problems we were having, something had to be done,” he remarked. Yet Bill also felt for his father. It wasn’t that he couldn’t afford the medical expenses suddenly thrust upon him. “He was luckier than others in that way,” Bill said. It was the principle of the thing. “I could understand what my dad was standing up for,” he said. “A promise is a promise.”

In December 1991, as the Sprague case worked its way through the courts, General Motors announced that it would need to cut costs much deeper. Over the next few years, the company said, it would close 21 of its 125 North American factories, wiping away 74,000 jobs in the United States and Canada, about 18 percent of the total. Twenty-four thousand of those positions would be gone within a year. “GM,” said Robert Stempel, who had become chairman after Roger Smith retired, “will become a much different corporation.”

That was undeniable—but not everyone viewed such a gigantic scaling back as being for the better. Sen. Donald Riegle, a Michigan Democrat, called it “an economic Pearl Harbor.” Owen Bieber, the UAW president, said that for his members, “once again the reward is anxiety and dislocation at best or unemployment at worst.” Others had a more favorable take. “Management is finally rolling up its sleeves and attacking its problems aggressively,” said John Casesa, an analyst with the investment firm Wertheim Schroder. Even billionaire Ross Perot was impressed. GM had bought his company, Electronic Data Systems, in 1984, giving him a seat on the automaker’s board. Within a couple of years, he was making no secret of how little he thought of Roger Smith and his tolerance for limitless layers of red tape. At EDS, the first person “who sees a snake kills it,” Perot said. “At GM, the first thing you do is organize a committee on snakes. Then you bring in a consultant about snakes. Third thing you do is talk about it for a year.” But this time seemed different, and Perot praised Stempel for “facing reality.”

Whatever one’s perspective, Stempel had not come to this place easily. He had started at GM in 1958 as a junior engineer and got his big break in the early 1970s when he helped to oversee development of the catalytic converter, a device that curbs pollution by chemically converting the harmful gases in a car’s exhaust. When he was named GM’s chairman in August 1990, Stempel looked to be distinct from his predecessor in every way. Smith was short in stature with a high, squeaky voice. Stempel had a full-throated baritone and was built like the college football player he had once been. Most notably, he was the first “car guy” to inhabit the chairman’s office in more than thirty years, following a steady run of executives whose backgrounds were all in finance. As such, Stempel seemed to identify with those workers who actually got their hands greasy, and he immediately established a good relationship with the UAW. He liked to speak in terms of “we” more than “I.”

In September 1990, during contract talks with the union, the company agreed to lay off its hourly employees for thirty-six weeks at most; if there was no possible way to call them back to work because there wasn’t anything for them to do, they’d still receive unemployment benefits equal to nearly full pay for up to three years. Stempel said he hoped that the accord would “provide the basis for a new level of trust” between the company and its blue-collar workers.

As optimistic as Stempel could sound, he didn’t sugarcoat GM’s need to be smaller—a lot smaller. But more than many of the executives who had led GM previously, he was sensitive to easing the misery for the men and women on the line. Before, “this was a company run by financial guys who said people are expendable,” asserted Joseph Phillippi, the auto industry analyst at Lehman Brothers. Under Stempel, he surmised, GM would be more inclined to “take care of the people.” Even after Stempel announced seven plant closures in November 1990, the union was forgiving, thanks to the safety net that had been constructed. When Smith used to walk into a GM factory, the workers would boo him; for Stempel they cheered.

But the comity wouldn’t last. After Stempel dropped the bombshell in late 1991 about closing twenty-one additional plants, there seemed to be no way for employees to evade real pain. The simple fact was that GM’s workforce was on track to be half the size in 1995 that it had been ten years earlier. And the magnitude of the retrenchment triggered fears that the funds GM had set aside to ensure that its laid-off workers would receive virtually full pay—some $3.5 billion—would now be drained prematurely.

Still, for Stempel, there was no other option. On August 2, 1990, just one day after he had assumed GM’s chairmanship, Iraqi dictator Saddam Hussein invaded Kuwait, further weakening an American economy that was already sliding into recession. Oil prices spiked and consumer confidence plunged, dragging down car sales. GM would lose $2 billion in 1990, followed by an astonishing loss of nearly $4.5 billion in 1991—the biggest pool of red ink in the history of corporate America. Some on GM’s finance staff worried that the company might go bankrupt. Of course, the economic downturn only compounded a more basic problem confronting GM. As Sean McAlinden, an automotive researcher at the University of Michigan, stated: “They made a lot of cars that people don’t want to buy.” Ford and even Chrysler were, relatively speaking, doing better with those shopping for autos.

In the meantime, GM’s outside directors—the members of its board who were not part of management—were leaning on Stempel to act more swiftly and boldly. The board itself was feeling the heat from large institutional investors who had begun to exercise their growing power in this new era of shareholder capitalism. For companies that were “nonperformers,” said Richard Koppes, the general counsel of the California Public Employees Retirement System, which held a large block of GM stock, “the status quo doesn’t do.” Yet rather than pacify the outside directors, who were led by former Procter & Gamble CEO John Smale, Stempel’s plan to downsize GM only made them more perturbed. Some of this was due to Stempel’s style: he hadn’t bothered to consult with the board before he announced the twenty-one factory shutdowns. And some was more substantive: instead of specifying which factories were going to be cut, Stempel decided to punt those difficult calls until later, adding to the impression that he was still dithering and grasping for answers, not putting in place a thoroughly thought-out strategy.

For his part, Stempel believed he was being considerate by not naming names right before the holidays. But for GM’s workers, the lack of specificity about which plants would be closed tossed them into a horribly stressful guessing game, as tens of thousands of lives were now in limbo. “It’s like putting six people in a room and saying, ‘Well, one of you has AIDS,’ but not telling them right away,” Michael Moore commented in his inimitable way. “Why would you do that to somebody?” The UAW accused the company of pitting union locals against each other, with those willing to make concessions presumably enhancing the chances of seeing their factories survive. Stempel insisted that he wasn’t engaging in any such “whipsawing.” But two months later, when GM disclosed a dozen of the locations where it was going to halt production, few missed that its plant in Arlington, Texas, was spared while its Willow Run factory in Ypsilanti, Michigan, wound up on the company’s hit list. Both places made larger cars. But the union in Texas had been open to amending certain work rules, allowing GM to run the facility around the clock and thereby boost efficiency. The UAW at Willow Run had been slower in consenting to such changes.

Still, the 4,000 employees at the Michigan site were stunned by the news, having believed that Willow Run’s place in history—it built B-24 bombers during World War II—and proximity to GM’s main supplier base would save the plant from being targeted. Workers had even bought hot dogs and sodas in anticipation of celebrating; instead, many were left weeping. “It’s like they took a knife and stabbed me in the back,” said Homer Wiley, a twenty-eight-year GM veteran.

In Texas, by contrast, workers stood up and applauded when they heard that their jobs had been preserved. But they quieted down fast. “It got reverent because we got something somebody else had to lose,” said Lloyd Parker, a union official in Arlington. “There are a lot of sad people up north.”

They were reeling as well in Tarrytown, New York, where another GM plant was tagged for closure.

“It’s tough and aggravating and hard to accept,” said Tim Shore, one of the nearly 3,500 workers at the factory, which had started building Chevys in 1915. He and his two kids had settled a couple of years earlier in Tarrytown, which lies about an hour up the Hudson River from New York City, after the Fiero plant in Michigan where he had been working ceased operations. In relocating across the country, Shore had joined a group known as “GM Gypsies”—hourly employees forced to move from city to city, chasing diminishing amounts of work. But now, Shore couldn’t help but think that his chances of finding another gig within GM may have run out. “With all the other plants shutting down around the country,” he said, “there’s not going to be a lot of places to go.”

The curtain coming down in Tarrytown was particularly hard to take because the plant had become the embodiment of so many things that GM was doing right, after years of having done so many things wrong. Back in the late 1960s, Tarrytown was one of the company’s worst factories. Efficiency and productivity were crummy, and labor was not so much led as lorded over. “I am the goddamn boss, and you’ll do as I say!” was a customary response whenever a supervisor perceived that he was being challenged in any way. Management “thought we were scum,” said Bill Marmo, who came to work the assembly line in Tarrytown in 1952. The UAW didn’t think any better of management, and the antagonism between the two sides was constant. “My job… was to respond to any complaint or grievance regardless of the merits, and just fight the company,” one union committeeman recounted. “I was expected to jump up and down and scream.” Said Bill Slachta, the plant manager: “The whole pattern was destructive to the best interests of both parties and absolutely provided no middle ground.”

By 1971, those in charge of Tarrytown figured that they had to try something different—or the plant might well be put out of business. Over the next few years, top managers and their local union counterparts unveiled a program that was meant to raise the “quality of work life” throughout the factory. Under QWL, as it was known, workers were trained to learn more about what managers actually did. And managers were trained to learn more about the functions of the union. Hourly employees and their supervisors then sat down together to generate solutions to problems, as well as to make suggestions for heading off other problems before they even cropped up.

Initially, at least some employees were skeptical of QWL. If you give a worker “the opportunity to share with you the responsibilities of his job, he’ll meet you more than half way,” Ray Calore, the president of UAW Local 664 in Tarrytown, told a group of GM managers. “But you’ve got to come that half way first because he’s a suspicious person because of what’s happened to him for so many years.” Really, who could blame him? “When you hired an autoworker off the street, you bought his hands, his feet, and his back—and you cut him off at the head,” Calore said. “Shame on you because you lost an awful lot. Because that same person, when he goes to church, he’s asked to share in the responsibilities of running the church. He belongs to the Elks or the Eagles, and he serves on committees, and he belongs to the volunteer fire department and the community chest. So for sixteen hours a day, he’s a parent. He’s a person in the community who’s respected. And people are using him for his mind. And you’re not. Shame on you.”

Such fist shaking notwithstanding, Calore was sold completely on QWL. So was Slachta, the plant manager, who was willing to acknowledge that, for the most part, “management must be the bearer of the olive branch” to the rank-and-file. In time, most of the Tarrytown workforce embraced QWL. By the end of 1978, more than 3,000 employees were participating, working more closely with management than ever before. “We were a team,” said John Inzar, an hourly employee who’d come to Tarrytown in 1965. The impact was enormous: Gauges of product quality rose substantially. Absenteeism declined by more than 50 percent. Worker grievances were down to a few dozen; seven years earlier, some 2,000 such complaints were in the hopper. “More generally,” said Robert Cole, a management professor who examined what was transpiring in Tarrytown, “workers now felt that they were being treated like human beings, and that someone was listening to them; there was a newfound dignity.”

In the coming years, GM would extend QWL to other factories all over the country. The company’s Fleetwood plant in Detroit, for instance, was a lot like Tarrytown had been: churning out second-rate products at a high cost while nobody seemed to get along. “Our job was to screw the union; their job was to screw management,” said one supervisor. “It was a way of life.” But in the late 1970s, a new plant manager introduced QWL, and within a few years, things had totally turned around. Now, managers actively solicited their workers’ input. Tensions decreased and performance increased, both greatly.

Not everyone was in favor of such togetherness. In some factories, managers were reluctant to yield their authority. Workers would be trained to have more involvement in planning. But then—in the words of a manager at the GM factory in Linden, New Jersey, where the adversarial atmosphere lingered—“they’d go back down to the floor, and… they’d get a superintendent who said, ‘That stuff you learned up there in that ivory tower, leave it up there, because this is the real world down here.’” Some union officials were also resistant to QWL, especially the hard-liners who thought it was a ploy intended to undermine organized labor. Others simply dismissed it as a bunch of malarkey. “Management says, ‘joint, joint, joint,’ but if they want to do something, they just go ahead and do it,” said Fred Myers, president of UAW Local 599 in Flint. Despite all the talk about blue- and white-collar employees finally seeing eye to eye, GM through the 1980s maintained the ultimate symbol of separateness at many of its factories: A “salaried men’s rest room” sat right next to the “hourly men’s rest room.”

Nevertheless, GM and the union continued to promote QWL, hopeful that a flowering of trust and collaboration between management and labor would enable every person in the plant to contribute to his or her fullest. Before long, other vaunted company programs would advance the same philosophy. One of them was a joint UAW-GM Quality Network, launched in 1987. An identical spirit also animated New United Motor Manufacturing Inc., or NUMMI, a venture that GM opened in 1984 in Fremont, California, with Toyota. “The key to NUMMI’s early success,” Newsweek reported in 1986, “is Japanese-style ‘participatory management.’” And Saturn Corporation, which GM formed as a stand-alone subsidiary in 1985, was built on such values as well. Saturn was able to thrive, said the UAW’s Don Ephlin, a trailblazer for employer-employee partnership, because GM had agreed “to push the decision-making power down to the lowest possible level.”

Surely, few, if any, of those behind these efforts knew that “Engine Charlie” Wilson had in the late 1940s put forward a similar notion—that blue-collar workers should have a bona fide voice in GM’s affairs—only to be rejected by both Alfred Sloan and Walter Reuther, each of whom believed the plan would dangerously distort the rightful duties of management and labor. That Wilson’s vision was now coming to pass (at least to some degree) said a lot about how the relationship between employer and employee had evolved over the ensuing thirty to forty years. Some companies, such as those with Scanlon Plans, had long sought their workers’ insights, but they were very much in the minority. By the 1980s and into the ’90s, with executives willing to try new things to make their operations more productive and competitive, that began to change.

More than a third of businesses, according to surveys, had backed away from using strictly command-and-control models and moved toward more flexible arrangements, like quality circles and self-governing teams. All sorts of corporations, beyond those in the auto industry, took up the cause: Corning Glass, Xerox, J.C. Penney, Texas Instruments, and thousands more. General Electric had Work-Out, while parts of the Kodak and Coca-Cola empires set up Total Quality Management systems in which frontline employees were given more of a direct say in how work should get done. Among America’s best companies, it had become common wisdom that they should not “foster we/they labor attitudes” and instead see every worker “as a source of ideas,” Tom Peters and Bob Waterman wrote in their runaway bestseller from 1982, In Search of Excellence.

But if this was a valuable upgrade to the corporate social contract—and programs like QWL certainly did provide many workers with more fulfillment and meaning in their jobs—it was impossible to ignore how much other, more concrete aspects of the employer-employee compact had fallen apart. Yes, “many workers have been empowered to play a greater role in designing their jobs,” MIT’s Paul Osterman submitted, but “wage levels have stagnated… and employment security is eroding,” often “at the same companies that are trying to restructure their workplaces” by forging these innovative alliances between management and labor. Said Harley Shaiken: “There’s going to be a problem on how these fit together.”

At Tarrytown, they didn’t fit at all. On February 24, 1992, the workers there were told to take a break from building Chevy Lumina APVs, Pontiac Trans Sports, and Oldsmobile Silhouettes and to gather around to watch a special telecast. That’s when they learned that the factory was going to be closed. “It was the worst thing in my life,” said Bill Marmo. “When I think about it, I get tears in my eyes.” The New York Daily News blared a two-word headline on the front page, “GRIM MONDAY,” over a picture of the Tarrytown plant. The very next day, a Washington think tank—apparently unaware of the death sentence that GM had just issued—released a report exalting the QWL program at the factory. “Labor and management were able to establish mutual trust where before it had never existed,” the Employment Policy Foundation declared. “Both sides became convinced of the other’s sincerity and commitment.” If words ever seemed empty, it was now. Compared with other plants, GM said, the cost of reconfiguring Tarrytown to produce the company’s newer-model minivans was simply too high to justify carrying on.

In 1996, the last vehicle at Tarrytown rolled off the line—number 11,889,266. Two years later, the factory was demolished. All that was left behind, besides the memories, was a ninety-seven-acre, weed-strewn, apocalyptic landscape. Down the street at the old UAW hall, a sign read: “Parking Only For Union Made American Cars.” The lot sat empty.

Bob Stempel didn’t last even that long. In April 1992, upset that the CEO was still not displaying enough alarm over GM’s dire condition, the company’s directors dislodged him as head of the board’s executive committee—a clear signal of their loss of faith in him as a leader. The directors also demoted Stempel’s right-hand man, GM president Lloyd Reuss, who epitomized the company’s lethargic, head-in-the-sand ways. Through the summer, as Reuss’s replacement, Jack Smith (no relation to Roger), resolutely pursued a turnaround plan called “Fundamental Change,” Stempel found himself isolated. Rumors swirled that he was being pushed out.

By late October, Stempel had had enough. He resigned, and Jack Smith became CEO. The board’s coup was now complete. Stempel would go down in the opinion of some as one of the most wretched CEOs in the history of the auto industry. Others would say that it was Roger Smith who deserved such opprobrium; Stempel was just a victim of the wreck that he’d inherited. Either way, Stempel’s departure had a galvanizing effect on other boards: going forward, chief executives who didn’t cut it were more likely to be shown the door. Shareholders demanded nothing less. By some estimates, average CEO tenure would fall to seven years in the 1990s from ten in the 1980s. It was hard, however, to feel too sorry for most of those who got the axe. Their severances were unquestionably a lot better than most workers’. Stempel, for one, was awarded an $800,000 annual pension and a two-year consulting deal worth $1 million.

In March 1993, General Motors was back in the news: Time magazine informed its readers that, with its workforce down to 367,000, GM wasn’t the biggest employer in America anymore. The real story, though, was that it wasn’t General Electric or Exxon or some other industrial giant that now claimed the top spot. As Time tallied it (and as would be repeated again and again), the crown belonged to Manpower, the temp agency, which had 560,000 workers.

Truth be told, that number was misleading. More than half a million was accurate only if you counted everyone who worked for the agency at some point over the course of a year. On any given day, Manpower had about 110,000 people working for it. That’s because they would typically spend just a few weeks in the company’s employ before moving on—which, really, was the whole point. “The US is increasingly becoming a nation of part-timers and freelancers, of temps and independent contractors,” Time said. “This ‘disposable’ workforce is the most important trend in business today, and it is fundamentally changing the relationship between Americans and their jobs. For companies large and small, the phenomenon provides a way to remain globally competitive while avoiding the vagaries of market cycles and the growing burdens imposed by employment rules, antidiscrimination laws, health-care costs, and pension plans. But for workers, it can mean an end to the security and sense of significance that came from being a loyal employee. One by one, the tangible and intangible bonds that once defined work in America are giving way.”

This unwinding was not by chance. Well before Manpower was said to have one-upped GM, corporate America was being sold on the advantages of using temp labor—and not just the “Girl in the White Gloves” variety that had become so recognizable through the mid-1960s. By late that decade and into the ’70s, the temp industry pivoted to include men in its marketing, and the message now had less to do with giving women a way to balance work inside and outside the home, and more with giving businesses a way to pump up their bottom lines. “You May Have a Severe Swelling of the Payroll,” Olsten Temporary Services proclaimed in a 1968 ad, which showed a sick-looking fellow lying in bed, a thermometer protruding from his mouth.

For workers, this wasn’t necessarily all bad. Some people were pleased to be employed as temps because they were in school, taking care of a child or an elderly parent, or looking to cut back on their hours but not fully retire. Others wanted to check out a job before committing to it. “The peripheral worker in our society provides the economy with a very important part of the flexibility which it must have if it is to be efficient and dynamic,” Columbia University’s Dean Morse wrote in 1969, in one of the earliest assessments of how traditional employment was starting to fracture.

More than 80 percent of independent contractors were glad to be their own boss and not work for somebody else. Others argued that because those in “alternative work arrangements”—freelancers, temps, on-call employees, and contract-firm workers—were dispensable by definition, they gave companies a ready means to shave costs during slow times, helping to insulate their longtime employees from being laid off. “A major function of the contingent workforce is to buffer core workers from the vagaries of the market,” said Cornell law professor Stewart Schwab. “It is one thing for a temporary worker to lose a job early when job loss was foreseeable from the outset. It is quite another for someone on a career track for fifteen years, for example, to lose a job.”

Yet as more and more Americans found themselves in “alternative” or “contingent” positions, other experts became concerned. Only a fairly small portion of the labor force fell into these categories—somewhere between 10 and 30 percent, depending on how you sliced it. But the ranks of temp-agency workers were rising especially fast, and these employees were by and large unhappy. There were exceptions, including those white-collar workers who enjoyed being able “to go to the Caribbean when you want,” as one laid-off General Electric marketing executive turned temp put it. However, some two-thirds of temps wished they had a regular job—and for good reason. They tended to make less per hour than did permanent employees for the same work. They usually received no sick pay or vacation pay and had no insurance or retirement plan. All too many failed to be given proper safety training before being told to take on hazardous tasks. And some were made pawns as companies battled unions, with temps deployed to help beat back organizing drives or endure strikes. Temps could also take a toll on a company’s other workers by serving “as a subtle reminder… of the precarious nature of their own job security,” as one study has characterized it.

Troubling, too, was that since about 1970, nearly all of the growth of part-time work in America was taken up by those who preferred to have a full-time job instead—a reversal of earlier years when women and others streaming into the workplace grabbed up part-time positions because they were actually seeking such a schedule.

“These contingent workers… quickly are becoming a second class of workers within many large US firms,” warned Kathleen Christensen of the Sloan Foundation. Their subpar status earned them their own publication, Temp Slave!, with biting articles such as “Working Poor,” “Hello, My Name is Temp 378,” and “Ignorant Dumb Shitwork.” The inspiration for the 1999 cult classic Office Space was filmmaker Mike Judge’s stint as a temp, when he alphabetized purchase orders. The job was “soul-sucking torture,” he said, making plain that the temp had replaced Charlie Chaplin’s assembly-line worker as the serf of modern times. Many stuck around so long at the same employer that they became known as “permatemps.” Pushing back, employment agencies defended themselves as responding to the needs of a new America. “We are not exploiting people,” said Manpower’s chairman, Mitchell Fromstein. “We are not setting the fees. The market is. We are matching people with demands. What would our workers be doing without us? Unemployment lines? Welfare? Suicide?”

Through the 1990s, companies didn’t just bring more temps in to take pink-, blue-, and white-collar jobs; they also farmed more work out. Businesses had always depended on various contractors and suppliers. But now they were starting to off-load jobs that, in prior periods, they had handled exclusively themselves. One of the pioneers in this new age of outsourcing was Kodak, which in 1989 decided to turn over the running of its data centers to IBM. The move saved Kodak money—5 to 15 percent of its information-technology budget. But its primary aim was to better focus on its film business. “We were a photographic company, not a computer company,” said Kathy Hudson, who was then Kodak’s chief information officer. “We were not thinking about it in a cost-cutting way but in a strategic way.”

The plan caused a stir nonetheless. Hudson assured the hundreds of Kodak employees who would soon see their jobs sent away that they’d have every opportunity to move right along with the work; IBM was going to hire them all. But “no matter what you do or how employee-friendly you try to be, there’s not a lot of trust between management and companies on something like this,” Hudson said. At one point, she went out to dinner with some key members of her group, along with their husbands and wives. “If you thought the employees were nervous,” said Hudson, “that paled in comparison with their spouses. They were all asking, ‘What are you doing?’”

If anybody was well suited to answer and inject a measure of calm, it was Hudson. Not only had she been with Kodak since 1970; she had been brought up in its protective bubble. Her father, Edward, had joined the company in the early 1930s, running errands and cleaning inkwells. He went on to become an information-services manager, sitting in the very same office where his daughter later kept her desk. “In Rochester, you did not go to work for Kodak,” Hudson said. “The phrase was ‘you got in’—because you got in for life.” The world was changing now, and changing fast, but Hudson did all she could to uphold this legacy. The reason she selected IBM to operate Kodak’s mainframes, instead of going with a technically impressive bid from Electronic Data Systems, was that Big Blue still had a defined-benefit pension plan and would let her employees transfer over their years of service. “We were moving our people from a paternalistic Kodak culture into an equally paternalistic IBM culture,” she said.

As executives at other companies began to copy what Hudson had done—in those early days, the outsourcing of IT was known as “doing the Kodak”—the transitioning of work wasn’t always so gentle. Most of the time that jobs were absorbed by outside contractors, employees were let go without a second thought. The cafeteria crew, the maintenance staff, customer-service agents, security guards, and janitors were among those most at risk. The UAW tried to fight outsourcing in the auto industry, but it was now too weak to win, and the practice spread rapidly. Manufacturing a host of parts once made in-house by the car companies, US auto suppliers added more than 100,000 jobs from 1987 to 1996, while employment at GM, Ford, and Chrysler dropped sharply.

For many of those in the employ of these outsiders, their circumstances were less than ideal. Most auto contractors were nonunion shops, and they paid 30 to 40 percent less than the Big Three. Janitors and guards who worked for contractors were likewise paid relatively little compared with those who were employed directly by the companies where they scrubbed the floors and checked the locks. The same was true for food-service workers and those at call centers. In some cases—including at many warehouses—the contractors, in turn, subcontracted with temp agencies. At each step, as they got more distant from the big corporations whose customers they were serving, employees became more vulnerable to violations of law and other abuses.

The outsourcing wave didn’t stop at the nation’s borders, either.

Among the fiercest proponents of searching out the most cost-efficient suppliers—wherever they happened to be, almost anywhere across the globe—was a General Motors executive named Jose Ignacio Lopez de Arriortua.

Lopez had first caught the eye of Jack Smith, now GM’s CEO, when Smith was running the company’s European division. On a visit to GM’s factory in Zaragoza, Spain, Smith found the engineer in his office, with components piled everywhere: on the desk, the chair, every other stick of furniture, even the floor. Lopez had disassembled a subcompact car, the Corsa, looking for all of the innards that could be removed or substituted without compromising quality. Over the next few hours, Lopez went through every single piece, showing Smith how he could carve out a sizable percentage of the costs. Smith would never forget Lopez’s brilliance or his bravado.

Lopez—“Inaki” to his friends—was soon promoted to GM’s main European outpost in Rüsselsheim, Germany, where he was asked to take apart something much bigger: the entire purchasing operation. Lopez did whatever he needed to obtain the best prices from those manufacturing parts for GM in Europe, even if it required him to break long-standing relationships with local contractors in Germany by sending the work to Spain, Italy, Turkey, or some other country that could do it cheaper. This concept of “worldwide purchasing” was revolutionary. Suppliers in Germany began calling him “Lopez the Terrible,” “The Rüsselsheim Strangler,” and “The Spaniard Who Makes the Germans Tremble.”

Now, Smith wanted Lopez to bring his methods to the heart of GM: the company’s North American operation, which spent $35 billion a year on 10,000 different parts for its trucks and cars. Ever the swashbuckler, Lopez referred to those on his purchasing staff as “warriors,” and the man who liked to work fifteen-hour days pressed them to adhere to his “Warrior’s Diet” of fruits and whole grains and finicky food combinations because, he said, “there was a direct correlation between nutrition and professional efficiency, between health and the warrior spirit.” The real diet, though, would come for GM’s suppliers, from whom Lopez extracted billions of dollars in costs by subjecting them to as many as six rounds of bidding for a single job.

At first, Lopez framed his actions in patriotic terms. “We are fighting to save the auto industry and our lives,” he said. “If we lose the battle… our sons and daughters will become second-class citizens, and the US will have a second-class economy.” He swore that he’d work with the UAW, as well. “I love the union,” he said. But such statements became harder to believe as Lopez whipped up his warriors to find the best deals wherever they could, regardless of whether the supplier was unionized or even located within the United States.

In addition to Lopez’s quest for the lowest-priced components, GM also opened up more of its own factories in Mexico to supply parts to its assembly plants in the United States. By 1998 it owned more than fifty facilities south of the border, employing 72,000 Mexican workers. The migration didn’t go unnoticed at home. “There’s hardly anybody at this plant who hasn’t seen machinery moving out in a crate with an address on it that says ‘Mexico,’” Larry Mathews, a UAW official in Flint, related.

The pull to go elsewhere was strong. “By the early 1990s, the Big Three paid around eighteen dollars an hour—plus generous benefits such as fully paid health care—to unionized workers at parts plants,” the automotive writer Micheline Maynard has noted. “By contrast, workers at nonunion plants averaged about eight dollars to nine dollars an hour, and few had anything approaching UAW-style benefits. And plants based in Mexico or Asia paid workers just a fraction of nonunion US wages. Workers there took home in a week what UAW members earned in a day.” Given this gap, it was no surprise what happened next. In 1990, automakers in the United States imported about $30 billion worth of components from abroad; ten years later, that had soared to more than $50 billion in foreign-made parts.

Other industries similarly built global production networks, often over decades, helped along by advances in technology and logistics. US multinational corporations had been around for a long time—General Motors, General Electric, Kodak, and Coca-Cola among them. But the speed and scope of what was happening now was unprecedented. “What is new,” said the political economist Robert Reich, “is that American-owned multinationals are beginning to employ large numbers of foreigners relative to their American workforces, are beginning to rely on foreign facilities to do many of their most technologically complex activities, and are beginning to export from their foreign facilities—including bringing products back to the United States.”

For those making computer hardware and consumer electronics, Taiwan became a crucial base of supply. For many other American brands—whether in appliances, toys, cameras, shoes, or apparel—Chinese factories emerged as a main source of production through the nineties. American companies directly employed 11 percent of Ireland’s industrial workers by the late 1980s; they manufactured a whole range of products, many of which were shipped back across the Atlantic to be sold in US stores.

From the mid-1980s through the late ’90s, General Electric chopped its US workforce by half (to about 160,000) while nearly doubling its foreign workforce (to 130,000). In some instances, this was because the company wanted to be nearer to its overseas customers. The hottest market for GE’s power generation business, for example, was in Asia, and so in 1992 Jack Welch reassigned the lead sales office for the unit to Hong Kong from Schenectady, New York. “Psychologically,” said Welch, getting “away from ‘Mother Schenectady’… shocked the system. Suddenly we heard people say: ‘They really mean it. Globalization is for real.’” In other instances, GE changed countries just to pay less for labor. “Ideally,” said Welch, “you have every plant you own on a barge.” The company also prodded its suppliers to drive down costs—Welch called it “squeezing the lemon”—by picking up and reopening in Mexico.

All of this put blue-collar laborers in the United States, especially those in certain sectors, in particular jeopardy, whereas those with more education and skills seemed better poised to make a go of it. “Some Americans, whose contributions to the global economy are more highly valued in world markets, will succeed, while others, whose contributions are deemed less valuable, fail,” Reich wrote, shortly before he became labor secretary in the new Clinton administration.

But no employee was immune from new competition, Reich hastened to add. By 1990, Texas Instruments wasn’t just doing most of its manufacturing in East Asia, but also its research, development, and design. W.R. Grace, Du Pont, Merck, Procter & Gamble, Upjohn, and Kodak had all opened R&D labs in Japan. And Hewlett-Packard was tapping talent in West Germany, Australia, and Singapore for breakthroughs in fiber optics, computer-aided engineering software, and laser printers.

For those trying to save money on IT staff, accountants, and software programmers, India was becoming the place to turn. One of the first to spot the opportunity there was GE, which first consolidated its own “back-office” operations in India in 1997, and then offered these services to other companies interested in hiring “legions of English-speaking, college-educated workers… on the cheap,” as the New America Foundation’s Barry Lynn has described it. Even Kathy Hudson, who had taken so much care to ensure that her Kodak employees didn’t lose their IT jobs, couldn’t fault companies that now sent work far away, across the ocean. “When I can bring on an engineer in Bangalore for what it costs just for health care for a US employee, who am I going to hire?” she asked. “It’s the way it goes.”

As unnerving as globalization started to become for American workers in the 1980s and ’90s, many economists and policymakers contended that the subject was prone to a fair share of sophistry. The number of jobs being outsourced, they said, was pretty tiny all in all, accounting for perhaps a few percentage points of lost employment in the United States over the years. Automation, some indicated, posed a much graver danger. Also largely overlooked was that since at least the late 1990s, many American companies opened overseas factories and offices not to ship low-cost goods to the United States, but to serve local markets; the foreigners employed in those cases didn’t swipe jobs from US workers. Nor did “offshoring” happen only in one direction; foreign-owned companies operating in the United States during the nineties increased their hiring of Americans as well. Increased trade gave US companies new access to export markets that they could sell into, and a panoply of goods (many of them inexpensive) were now available for US consumers to buy.

Looked at in the right light, then, the greater interconnectedness of the world’s economies could seem a big net plus. Some even maintained that for every person an American multinational employed abroad in the nineties, this growth in business spurred it to hire two people in the United States—a “win-win” formula if ever there was one.

Such was President Clinton’s mindset when he advocated passage of the North American Free Trade Agreement, which was to ease the flow of products and services among the United States, Mexico, and Canada. “We seek a new and more open global trading system not for its own sake but for our own sake,” he said. “Good jobs, rewarding careers, broadened horizons for… middle-class Americans can only be secured by expanding exports and global growth.”

But while Clinton enthusiastically endorsed NAFTA and pooh-poohed Ross Perot’s prediction that there would be a “giant sucking sound” because of all the US jobs flooding to Mexico, he was always careful to acknowledge the darker side of a world with more open borders. “For a high-wage country like ours, the blessings of more trade can be offset, at least in part, by the loss of income and jobs as more and more multinational corporations take advantage of their ability to move money, management, and production away from a high-wage country to a low-wage country,” Clinton had said when still a candidate for the White House. “We can also lose incomes because those companies that stay at home can use the threat of moving to depress wages, as many do today.”

As he signed NAFTA into law in December 1993, putting him in conflict with fellow Democrats and organized labor, Clinton again recognized that there would be losers as well as winners—and the losers were likely to be those who already had lost a lot. “We have an obligation to protect those workers who do bear the brunt of competition by giving them a chance to be retrained and to go on to a new and different and, ultimately, more secure and more rewarding way of work,” the president said. “In recent years, this social contract has been sundered. It cannot continue.”

Nobody was really sure, however, how to stop the contract from ripping further.

Officially, the one recession to sting America during the 1990s didn’t last too long, having ended in March 1991, just eight months after it had begun. But the economic expansion to follow was different from any upturn that had come before it, at least since the end of World War II.

“Even though business profits are up, output is growing, and the economy is recovering,” Michael Mandel, the chief economist at BusinessWeek, said in 1993, “help-wanted ads are still scarce, and private job growth is plodding along at a measly 75,000 per month—with many of these temporary or part-time positions.” Corporate America, he added, “has developed a deep, and perhaps abiding, reluctance to hire.”

After the eight previous recessions, it had taken only ten months on average to recoup the number of jobs that had been lost during the downturn. It would now take twenty-three months. The “jobless recovery” was born.

No one knew for certain why the labor market was lagging, but in many industries, especially manufacturing, employers were not behaving as they had historically. Companies in the past would hold on to more workers than necessary during slack periods—a practice termed “labor hoarding”—so that they didn’t have to retrain new people once business picked up. Now they were letting people go without any intention of ever filling those positions again.

Around the same time that GM had announced it was abolishing 74,000 jobs, many others said they were terminating multitudes of workers as well: Unisys, Xerox, McDonnell Douglas, Sears, Tenneco, Westinghouse, TRW, Chemical Bank, Manufacturers Hanover, and more. “Do not confuse cyclical fluctuations with powerful structural forces that are now affecting the very fabric of our social order,” Morgan Stanley economist Stephen Roach told a congressional panel. The Federal Reserve Bank of New York said that, unlike before, a recession for many wasn’t “an event to be weathered,” but “an opportunity—or even a mandate—to reorganize production permanently, close less efficient facilities, and cull staff.”

For Kodak, the culling commenced in 1993, in a scene that played out very similarly to the one at GM with Bob Stempel. Like Stempel, Kodak CEO Kay Whitmore seemed overly cautious just when the company needed an assertive leader—in its case, to keep Fuji and private-label film manufacturers in check and to make the leap to digital photography. It didn’t help that Whitmore, a chemical engineer who had been at the company since 1957, would fall asleep in meetings; once, he even dozed off during a session with Microsoft cofounder Bill Gates.

Kodak did have one big advantage over GM: it was still making money; there was no real crisis in Rochester—not yet anyway. But institutional investors and the company’s outside directors were frustrated that Whitmore wasn’t delivering much higher profits. Kodak’s stock price was as listless as Whitmore himself. “Kay may be the right guy to be the pilot of a glacier,” said Robert Monks, whose firm had bought millions of dollars in Kodak shares and was now calling for change. “The trouble is, the water has gotten hot.”

Although Kodak had lowered its head count some in the 1980s, when Whitmore had been the company’s president, and he had recently eliminated another 2,000 jobs, the board wanted him to cut harder and faster. Whitmore said the right things: “There’s a shared sense of urgency at Kodak.” In January 1993, he had even recruited a chief financial officer, Christopher Steffen, who’d earned a reputation at Chrysler and Honeywell as an unapologetic cost-cutter. But Steffen abruptly quit Kodak after less than four months, irked that Whitmore and other senior executives kept trying to slow him down.

Whitmore was now sitting right in the crosshairs. This was, as Steffen had labeled it, a “post-Stempel world.” In August, Kodak’s board voted to oust Whitmore. The value of the company’s stock went up by a billion dollars that day.

Even as they booted him, the directors said they still expected Whitmore to keep working on a plan to bolster Kodak’s finances. Two weeks later, he announced that by 1995 the company would cut 10,000 jobs, or about 10 percent of its workforce. Wall Street, hoping for 20,000 positions to melt away, was disappointed. Employees, however, were shaken. The paper in Rochester ran a cartoon of Whitmore, his head being served up on a platter, that said: “Only 9,999 to go.”

It would be left to the new CEO, George Fisher, to preside over the layoffs. Fisher, who had been lured away from Motorola, which he’d led to great acclaim, had a little Jack Welch in him. He wasn’t as prickly as Welch. But like the GE chief, he was intent on smashing the snail-paced bureaucracy that had built up over many decades at Kodak, making it so that, as one business partner complained, the company would “have 1,000 people in the room and couldn’t make a decision.” Breaking this stultifying system, thought Fisher, was the only way to make real headway on the strategy he was honing: capturing more foreign markets and winning in digital imaging.

From the first day, Fisher regularly strolled Kodak’s offices and factories—even the employee cafeteria—asking questions, welcoming opposing views, seeking ideas from everyone about how to increase productivity, improve quality, and expunge waste. David Swift, who had been Whitmore’s executive assistant and was now Fisher’s, never forgot the first senior staff meeting he attended with the new boss. Swift entered and took his seat at the edge of the room, the way subordinates in the Kodak hierarchy had always done. Fisher stared at him, befuddled. “Why are you sitting over there? If you’re part of my team, you need to sit at the table.” (If Fisher ever surrendered the aura of the Everyman, it was with his pay package, which included a $5 million signing bonus and the potential to make more than $100 million in compensation if Kodak’s stock price climbed high enough.)

Also like Welch, Fisher was insistent on holding employees accountable for achieving results. Just showing up wasn’t good enough anymore. He even overhauled Kodak’s wage dividend policy so that it would now be tied to specific corporate financial objectives—return on net assets, to be precise; no longer was a payout all but guaranteed. “When I joined Eastman Kodak out of high school” in 1964, said Mike Morley, the company’s director of human resources, “it was a real… entitlement mentality. We are trying to change that culture to be much more performance-based.”

And although the new CEO was determined for Kodak to grow its way to bigger profits—and not hack its way there—he reined in benefits where he felt it was essential. Much to the displeasure of Kodak’s employees, for example, Fisher pulled back on pensions and retiree health care. He also went forward with the 10,000 layoffs that Whitmore had set in motion, though he tried to do it in a way that would complement his strategy, directing a small team of executives to make pinpoint reductions rather than cutting across the board. Swift was part of the group that huddled in a room on the sixteenth floor of the Kodak building, choosing who would stay and who would go. “It was unbelievably gut-wrenching,” he said. “It was one of the hardest things I’ve ever had to do.” Fisher, too, hated having to dismiss so many people. “You don’t like to do it, but you’ve got to do it,” he said. “You agonize about the impact, the personal impact on families.”

Al Waugh, an engineer who received one of the 10,000 pink slips, had worked at Kodak for twenty years. “One of my beliefs has been dashed to pieces,” he said. “People worked hard, got a decent living, and made a good enough wage to raise a family. The whole thing came tumbling down.” For Waugh and his wife, Mary, it was tough to know what to do now. Kodak was everything to them. Generations of their families had worked at the company. Waugh’s grandmother, Bessie, had even been a secretary to George Eastman. “It’s interesting to think that she met him,” Al said. Mary had a different reaction: “He’s probably turning over in his grave a hundred times a day.”

Perhaps. But this was now George Fisher’s Kodak, not George Eastman’s. And Fisher didn’t hide that the company could not—and would not—give its workers all that it had given them in the past. He even put in place what he called, explicitly, a “new social contract,” which echoed Jack Welch’s promise of “lifetime employability” as opposed to “lifetime employment.” At Kodak, employees were to receive forty hours of training and career development per year. That way, if another round of layoffs were to come, workers would at least be “more marketable,” as Fisher said, raising the odds that they’d be able to land another job somewhere else. The hope, too, was that this kind of investment would lead to a fresh relationship between management and Kodak’s workforce, one in which trust would replace the jitters and contempt that all the cuts had provoked. But overcoming the embitterment wasn’t going to be easy. “The majority of employees still craved the security blanket that used to come with a Kodak job,” Alecia Swasy, a Wall Street Journal correspondent who covered the ’93 layoffs, observed. Only 30 percent of employees, internal surveys showed, now thought that Kodak had a “sincere interest” in their future.

By 1997, harboring such doubts seemed more sensible than cynical. Kodak announced another series of layoffs that year: 3,000, then 6,000, then 10,000—a Chinese water torture of sackings. “Everyone there is again saying, when… will it be me; when are the job cuts going to end?” said Eugene Fram, a marketing professor at the Rochester Institute of Technology.

As disturbing as it was for many to witness Kodak’s cultural conversion, perhaps even more unsettling were the job cuts that IBM initiated in 1993. The company had been shrinking its payroll since the mid-1980s, going from more than 400,000 workers to about 250,000, but this had been done through attrition and voluntary retirement (though the company, exhibiting a sharper edge, had also stepped up its firings for poor performance). In principle, IBM’s no-layoff policy had remained intact—and, more than that, it was still looked upon as “one of the factors that give the company its soul,” Harvard’s D. Quinn Mills wrote in 1988.

But once IBM lost money—nearly $8 billion in 1991 and ’92—its soul was lost, too. A new CEO, Louis Gerstner, had been brought in from RJR Nabisco to do what George Fisher at Kodak had been brought in to do: transform a sluggish company into a supple one. “If IBM is as bureaucratic as people say, let’s eliminate bureaucracy fast,” Gerstner told his most senior managers during his first meeting with them. “Let’s decentralize decision making wherever possible.… If we have too many people, let’s right-size fast.”

That he did. Four months into his job at IBM, Gerstner revealed plans for 35,000 layoffs, which were to come on top of another 25,000 early retirements already announced.

Within a year, IBM would return to profitability and soon thereafter it would begin a seismic, and exceedingly successful, shift from the mainframe business into computer services. Gerstner would go on to become one of the more venerated leaders in business history—one who could, to use his famous metaphor, make an elephant dance. But the “paternalistic IBM culture” into which Kathy Hudson had felt so good about placing her workers was no more. In the years ahead, as IBM took over the IT operations of many other companies, including those of J.P. Morgan Chase and AT&T, many of the workers it acquired were eventually laid off as their old departments were made more efficient. Those who were kept on the IBM payroll often faced cuts in their income and benefits. “It sometimes looks as though IBM is hired to be the hatchet man,” said Michael Smith, a Washington attorney retained by a group of high-tech workers who had lost their jobs. IBM also started to utilize another tactic that became an employer favorite during the 1980s and ’90s: firing a person one day and then bringing him back as an independent contractor, at lower compensation, the next.

One person who kept an eye on the job cuts at IBM was Jack Welch at General Electric. He had never liked IBM’s holier-than-thou advertising campaign in the 1980s: “Jobs may come and go. But people shouldn’t.” Now, he couldn’t help himself. When he’d run into IBM executives at some business or charity event, he’d smile and ask, “Why isn’t he ‘Neutron Lou’?”

Less amused were people like Michael Cunningham, whom IBM fired in 1993 after he’d been with the company for sixteen years. Everything about it bugged him, including the way that a group of security guards—unfamiliar faces all—kept watch as his supervisor escorted him out of the Poughkeepsie, New York, lab where he’d worked. He also felt demeaned by the language in IBM’s dismissal letter: “You have been designated a ‘surplus employee’ effective immediately.” Cunningham was one of nearly 8,000 IBM employees in New York’s Mid-Hudson Valley who’d been “surplused,” as the job cuts came to be known. “That word was the most disgusting word I ever heard,” he said. “It told me I was excess baggage, junk, garbage.”

“Surplused” was far from the only weasel word that employers used during the 1990s. There was also Gerstner’s expression, “right-sizing,” as well as “de-hiring,” “slivering,” “retooling,” “downshifting,” and “reengineering,” among many others. That last term grew out of Reengineering the Corporation, a blockbuster bestseller by former MIT professor Michael Hammer and consultant James Champy, which was published in 1993. It would long be cited as one of the most influential business books of all time—and not because it had been good for workers. “Hammer and Champy’s ‘manifesto’ obliterated the implicit social contract between employers and employees,” said Inc. magazine. “Gone were the notions of lifetime employment and corporate loyalty, replaced by an endless regimen of downsizing, rightsizing, outsourcing, and offshoring.”

In all fairness, that was a terrible misreading of the book, which called for reorganizing business by breaking down activities into small steps and then finding new efficiencies by reassembling these processes with a “clean sheet” approach. Although the authors did anticipate that some jobs would be combined through reengineering, mass layoffs were never part of their prescription. That, however, didn’t stop executives and consultants from using the book as intellectual cover for wholesale downsizing. “It is astonishing to me the extent to which the term reengineering has been hijacked, misappropriated, and misunderstood,” said Hammer. Added Champy: “At least half of the work that was going on out there under the name of ‘reengineering’ was really just shedding bodies.”

The 1990s wasn’t only about losing jobs. It was also about adding them—in record quantities, actually—which can make the period seem more than a bit bipolar. Some have called it “The Downsizing Decade,” others “The Roaring Nineties.”

Pointing to his long list of accomplishments, you could make a persuasive argument that Bill Clinton’s stewardship of the economy was the best of any president who has ever served. During his eight years in office, the United States netted nearly 23 million new jobs. Unemployment reached a thirty-year low, falling below 4 percent. Productivity jumped, and real wages rose at their fastest rate since 1972. Income gains were widespread, including for African American and Hispanic families. Poverty declined. Inflation was stable. “America,” Clinton said, “again has the confidence to dream big dreams.”

But for lots of workers, it was hard to dream big when they were feeling so fitful, and understandably so. Most concerning, perhaps, was that during the nineties job opportunities for Americans became increasingly split, reinforcing a trend that had started in the late 1970s. Employment was rising in high-education professional, technical, and managerial occupations, as well as in low-end service work: food preparers, health-care aides, security guards, and so on. But both blue-collar and white-collar jobs in between—the work of factory hands, sales assistants, clerks, and low-ranking administrators who could once build a middle-class life, even with little formal education—were now vanishing. David Autor, an MIT economist who has documented this hollowing out of “middle-skill” jobs, has fingered the automation of routine work and globalization as prime causes.

For many, this meant that when they did find a new job after being laid off, it paid less than the one they’d had before. For millions of others, especially men without a college degree, this absence of demand for what they had to offer was so dispiriting that they stopped looking for work. As they dropped out of the labor force, with a good many now collecting government disability checks to get by, Washington didn’t count them anymore in its official tally of the unemployed. “Overall,” wrote David Leonhardt, the Pulitzer Prize–winning columnist for the New York Times, “the rise in the number of missing workers calls into question the great achievement of the 1990s economy: the best job market since 1970.”

Even for those who were working in “good jobs,” the nineties could be disquieting—a lot like the 1980s, when the social contract really began to pull apart, only more so. Through the first few years of the Clinton recovery, companies forced workers out of their jobs at a greater rate than during the brutal recession of the early 1980s. “It is difficult to imagine more compelling evidence that the nature of the employment relationship has changed than this,” the University of Pennsylvania’s Peter Cappelli has written.

Most corporations made no bones about it. A 1997 survey from the Conference Board, a business group, found that two-thirds of employers acknowledged that while they once had an “employee compact” that gave their workers job security, this wasn’t the case any longer. “Hidden behind America’s low official unemployment rate,” Yale University’s Jacob Hacker has said, was the “specter of workplace insecurity,” beckoned by a “growing recognition that no worker, no matter how educated, no matter how well trained, is free of the risk of sudden and large economic losses—when the economy is racing along as well as when it is struggling.”

Eager to become ever more efficient, many companies were now purging people not only when they were doing poorly but also when they were financially sound. And as organizations became flatter, white-collar employees were often the first ones to be let go. During the 1990s, in fact, managers were more likely to be taken out in a large-scale layoff than were lower-level workers—a switch from earlier periods. For those coworkers still employed after a big layoff—the “survivors”—it was not unusual to experience low morale and high stress as they endeavored to cope with shock, grief, fear, a heavier workload, and a loss of institutional knowledge. Jill Andresky Fraser, an editor at Inc., announced the birth of the “white-collar sweatshop.”

It is easy to turn the portrait of downsizing in America into a caricature and miss the tremendous complications inherent in the issue. Many times, for example, companies would lay off large numbers and then follow up with a burst of new hiring within a few years, leaving them with more employees than they’d originally had. Others would announce a downsizing and continue to hire new employees at the same time. Moreover, while companies talked about “delayering” and set out to dismiss many of their managers, the percentage of the American workforce in managerial positions grew through the nineties. Outside of manufacturing, retailers and service-sector businesses were mostly “upsizing,” not downsizing. And even among manufacturers, while big companies lost jobs, smaller producers added them. “With big firms growing smaller and small firms growing larger, the script is clearly not one of universal decline, as the most obvious interpretation of downsizing might have us believe,” economists William Baumol, Alan Blinder, and Edward Wolff have explained.

But even with all of these qualifications, there was no escaping it: throughout the Clinton boom—an economic expansion that would run longer than either Kennedy’s or Reagan’s—the dissonance of downsizing could be heard constantly in the background. As the “jobless recovery” bowed to vigorous job growth, there was, by historical standards, extraordinary churn in the workforce. “Downsizing has taken on a logic of its own—has lost its connection to takeovers or to financial problems or even to genuine business need,” G. J. Meyer, who’d been laid off from executive jobs at aerospace manufacturer McDonnell Douglas and heavy-equipment maker J.I. Case, wrote in 1995 in his melancholy memoir about being jobless. In 1998, despite the zooming economy, there were nearly 680,000 job cuts in America—10 percent more than five years earlier.

Employees felt the social contract continuing to deteriorate in other ways as well. Wages grew solidly in the late 1990s, as the labor market tightened. But this upswing would turn out to be brief, and across the whole of the decade, hourly earnings for the typical worker would rise only about half a percent a year on average—a quarter of what was seen during the Golden Age. One thing that kept wages from moving higher was downsizing. Apprehensive about losing their own jobs, some employees were evidently reluctant to ask for bigger raises.

Companies continued to put less money into pensions, too—nearly 30 percent less, on average, than they had in the late seventies. By the close of the nineties, even IBM had given up guaranteeing retirement security, saying that the defined-benefit model was appropriate when “thirty-year careers were the norm,” but not anymore. Thousands of IBMers, scared that the changes the company was making would destroy a third or more of the value of their pensions, weren’t as ready to brush off the past. “These are employees who, throughout their careers, rejected job offers from other companies because of their loyalty to IBM,” said Bernie Sanders, a then-congressman from Vermont who counted many of the workers among his constituents. “And these are the same employees who woke up one day, not so long ago, to discover that all of the promises IBM made to them were not worth the paper that they had been written on.”

Health care was also slipping. President Clinton rode into office resolving to tame medical inflation while covering the 37 million Americans who didn’t have health insurance. He also wanted to end “job lock”: people wary of leaving one employer for another for fear of losing their insurance. The 1,300-plus-page piece of legislation that his administration produced was M. C. Escher-like in its complexity and New Deal-esque in its ambition—“the Social Security Act of this generation,” as Hillary Clinton, then the president’s wife, who was placed in charge of the plan, called it.

At first, it looked like big business, concerned about escalating health costs, might rally behind the proposal. But gradually, one lobby after another—the National Association of Manufacturers, the Chamber of Commerce, and the Business Roundtable—peeled off and opposed the president. They just couldn’t accept the amount of government regulation in the Clinton bill, and even if they could have been persuaded on the fine points, they never truly trusted the administration. In some eyes, they also lacked a concern for the common good that their corporate forebears had possessed. “In the past, elites within the business community had intervened to prevent the most venal interests from dominating Congress,” the political commentator John Judis has written. “In 1946, the Committee for Economic Development, acting not as another business group but as an elite organization committed to the national interest, rescued the Employment Act. There were, however, no comparable organizations and no comparable leadership that could have rescued health-care reform from oblivion.”

With the demise of the Clinton measure, businesses tried to hold down medical costs on their own. Companies directed their employees away from fee-for-service plans and into HMOs, PPOs, and other forms of managed care, where expenses could be more tightly controlled but where patients had less choice over their doctors. And many became stingier with their benefits. By 1998, only 28 percent of workers with employer-provided health coverage had the full cost of their insurance premiums paid by their companies, down from 44 percent in the early 1980s. When it came to their retirees, many companies yanked health coverage altogether—or, as with Robert Sprague and tens of thousands of others who’d worked for GM, they began to make them cough up more of the costs.

The majority of the appeals court in the Sprague case didn’t dispute that GM had told its white-collar retirees, time after time, that the company would continue to provide them with the same medical benefits “for your lifetime.” But because GM also apprised them that it could possibly revise those terms in the future, it hadn’t violated the law—regardless of whether this note of caution was essentially buried in the fine print. “GM’s failure, if it may properly be called such, amounted to this: the company did not tell the early retirees at every possible opportunity that which it had told them many times before—namely, that the terms of the plan were subject to change,” Judge David Nelson ruled in his 1998 opinion. “There is, in our view, a world of difference between the employer’s deliberate misleading of employees… and GM’s failure to begin every communication to plan participants with a caveat.”

But for Boyce Martin, one of three dissenting judges, the company had betrayed its employees, plain and simple. “When General Motors was flush with cash and health-care costs were low, it was easy to promise employees and retirees lifetime health care,” Martin wrote. Decades later, “rather than pay off those perhaps ill-considered promises, it is easier for the current regime to say those promises never were made. There is the tricky little matter of the paper trail of written assurances of lifetime health care, but General Motors… has managed to escape the ramifications of its now-regretted largesse.”

In the aftermath of the decision, corporations all over America cited the Sprague case as they defended their own cutting of retiree health benefits from lawsuits. More often than not, the companies won. GM, which had been so instrumental in building up the social contract, was now pivotal in pulling it down.