11

THE NEW FACE OF CAPITALISM

As America slipped toward, and then settled into, the early 2000s, it was only natural to look back upon the preceding one hundred years and take stock of those people and institutions that had exerted the most influence on the country and the world. Time magazine named Albert Einstein its Person of the Century. National Public Radio came out with its list of the hundred greatest American musical works of that span—a catalog whose eclectic nature was reflected in a sequence of tunes found in the middle of the Ss: “Singin’ in the Rain,” “Sittin’ on the Dock of the Bay,” “Smells Like Teen Spirit,” and “Stand by Your Man.” Popular Mechanics proclaimed the car the “defining device” of the era. And in the business world, many saw one company as the paradigm of the period (or a good chunk of it, anyway): General Motors.

But as edifying—or at least as entertaining—as it was to look back, it also made sense to look ahead. And as the new century began, commentators were eager to identify the company that was now positioned to be, in the words of one group of scholars, “the face of twenty-first-century capitalism.” Their choice, and the choice of many others, was Wal-Mart.

In terms of scale, Wal-Mart was the obvious successor to GM. In 2002, the retailer became the biggest corporation in America, topping the Fortune 500 list for the first time. By 2005, its revenues exceeded $285 billion, and profits surpassed $10 billion. It was the nation’s leading seller of toys, furniture, jewelry, dog food, and scores of other products. It was also the largest grocer in the United States through its so-called Supercenter stores. To peddle all this merchandise, the company employed more than 1.3 million US workers. But there was a massive difference between GM and Wal-Mart. While the former had put most of its workers on a secure route to the middle class, the latter was placing many on a path to impoverishment.

For decades, Wal-Mart had enjoyed a reputation as the most wholesome kind of company, steeped in American values and a down-home culture captured in the cheer that Wal-Mart employees would perform, gathering in a circle at every shift change: “Give me a W! Give me an A! Give me an L! Give me a squiggly! (followed by a shaking of the butt) Give me an M! Give me an A! Give me an R! Give me a T! What’s that spell? Wal-Mart! What’s that spell? Wal-Mart! Who’s number one? The customer! Always!” Wal-Mart, said David Glass, who’d joined the company in 1976 and had served as its CEO from 1988 until 2000, was “sort of like motherhood and apple pie.”

Nobody did more to enhance this image than Sam Walton, better known inside the company as “Mr. Sam,” who liked to swing his muddy boots into the cab of an old pickup truck and drive around Bentonville, Arkansas, where in the early 1950s he’d started Walton’s 5&10 on the town square. The first actual Wal-Mart was opened in 1962 in Rogers, about eight miles to the east. From the beginning, whenever Walton would deal personally with his workers—“associates,” in Wal-Mart’s communal-sounding parlance—he went out of his way to be as nice as nice could be. “This is a man who… treated his associates well as persons, not just as clerks and salespeople,” said Walter Loeb, a retailing consultant. Signifying this trait, Walton renamed the personnel department the “People Division” and put forth a management philosophy that was all about motivating each and every employee to give his or her best. Wal-Mart “believes management’s responsibility is to provide leadership that serves the associate,” the company asserted in the 1991 edition of Sam’s Associate Handbook. “Managers must support, encourage, and provide opportunities for associates to be successful. Mr. Sam calls this ‘Servant Leadership.’”

But there was another aspect of Walton’s belief system—an extreme tightfistedness—that rubbed up against this creed of caring. The idea was that by doing absolutely everything possible to minimize costs, Wal-Mart would be able to deliver consistently low prices to its customers, giving hardworking people more money to spend on other things. For the company, serving up bargains was more than just a business strategy; it was practically a religious calling. “If we work together,” said Walton, “we’ll lower the cost of living for everyone. We’ll give the world an opportunity to see what it’s like to save and to have a better life.” Consumers ate it up. By 2005, more than 90 percent of American households could be counted on to shop at Wal-Mart at least once a year; about a third of the nation shopped there every week.

For Wal-Mart’s folksy founder, his CEO successors, and their executive teams, the company’s parsimony translated into having tiny offices, emptying their own trash, flying coach, even sharing budget hotel rooms with colleagues when traveling on business. For Walton, whose family fortune would eventually exceed $100 billion, making the Waltons the richest family in the world, these were minor gestures. For Wal-Mart’s hourly workers, though, the company’s determination to pinch every penny was anything but minor: it left many of them scrambling to get by.

In the 1960s, Mr. Sam set up his stores as separate corporate structures—all linked back to a single financial partnership that he and his family controlled—so that revenues would come in at less than $1 million apiece, a threshold that permitted each location under government rules to pay less than the minimum wage. (A federal court eventually found this arrangement, while undeniably clever, to be improper.)

As the decades passed, the pressure to extract more from the workforce increased. Wal-Mart managers around the country forced employees to work off the clock and skip breaks; broke child-labor laws; and used illegal immigrants to clean stores. The company also locked workers inside overnight. Wal-Mart maintained that this was to keep employees safe in high-crime neighborhoods, but former store managers said that the real purpose was to discourage workers from stealing anything or sneaking out for a smoke. Steven Greenhouse, the New York Times reporter who documented these and other such practices, has characterized them as “downright Dickensian.”

Some Wal-Mart managers tried to make conditions insufferable for their highest-paid employees so that they’d quit and could be replaced by others earning less. Others made sure that their workers never logged quite enough hours to reach full-time status since, as part-timers, they wouldn’t qualify for certain benefits. “I knew what I had to do,” said Melissa Jerkins, who oversaw a staff of 120 at a Wal-Mart in Decatur, Indiana. “I had to meet the bottom line or my ass was in trouble. They didn’t care how I got to that bottom line… so long as I got there.”

Even for those employees who weren’t victims of manipulated schedules or outright abuse, it could be tough to scratch out a living. In 2005, most Wal-Mart workers took home less than ten dollars an hour, compared with more than thirty dollars for a nonskilled assembly line worker at General Motors. This added up to a yearly income at Wal-Mart of about $18,000—below the poverty line for a family of four. Wal-Mart liked to talk up its profit-sharing plan. But because of high turnover, typical of most retailers, relatively few employees ever got to take much advantage of it. And even Wal-Mart acknowledged that a full-time worker might not be able to support a family on a company paycheck. The health benefits that Wal-Mart offered were also lacking. As a result, many of its employees had to turn to public relief: food stamps, Medicaid, and subsidized housing.

In Nickel and Dimed, her 2001 modern classic about what it was like to take up a series of low-wage jobs across America, Barbara Ehrenreich recounted how, when she went to work at a Wal-Mart in Minneapolis making seven dollars an hour, she soon found herself about to run out of money. She wound up seeking help through a human-services agency called the Community Emergency Assistance Program, which recommended that she move into a homeless shelter until she could save enough for a deposit and first month’s rent on an apartment.

One reason that Wal-Mart workers have always had difficulty improving their lot is that they’ve never been able to form a union. At its core, Wal-Mart’s rationale for being against organized labor was not unlike that of Kodak, say, or General Electric under Lem Boulware: management had an open door policy, by which any worker could ostensibly walk in and discuss anything. Therefore, as Wal-Mart laid out in its “Manager’s Toolbox to Remaining Union Free,” “we do not believe there is a need for third-party representation. It is our position every associate can speak for him/herself without having to pay his/her hard-earned money to a union in order to be listened to and have issues resolved.”

Yet unlike Kodak, which tried to frustrate union organizers by keeping its workers happy with good wages and princely benefits, Wal-Mart has been reliably ungenerous. And unlike GE, which fought tooth and nail against the International Union of Electrical Workers but ultimately honored its right to exist, Wal-Mart has never allowed so much as a single one of its stores to be organized. Indeed, ever since Mr. Sam’s time, the company has done everything it can to crush the unions, painting them in the most Manichean terms. They are “nothing but blood-sucking parasites living off the productive labor of people who work for a living!” said attorney John Tate, an iron-willed right-winger whom Walton had hired to help beat back the Retail Clerks in the early 1970s and who then stuck around at Wal-Mart, where he developed an array of antiunion techniques.

In the early 1980s, when the Teamsters attempted to represent employees at two Wal-Mart distribution centers in Arkansas, Walton himself showed up with a warning—US labor law be damned. Recalled one worker: “He told us that if the union got in, the warehouse would be closed.… He said people could vote any way they wanted, but he’d close her right up.” The Teamsters lost the election.

But it wasn’t just top executives who were expected to resist being organized. “Staying union free is a full-time commitment,” read a manual given out at a Wal-Mart distribution center in Indiana in 1991, the year before Walton died. “Unless union prevention is a goal equal to other objectives within an organization, the goal will usually not be attained. The commitment to stay union free must exist at all levels of management—from the chairperson of the ‘board’ down to the front-line manager. Therefore, no one in management is immune from carrying his or her ‘own weight’ in the union prevention effort. The entire management staff should fully comprehend and appreciate exactly what is expected of their individual efforts to meet the union free objective. The union organizer is a ‘potential opponent’ for our center.”

The main thing that a Wal-Mart manager was supposed to do when he or she caught even a hint of union activity was to contact corporate headquarters via a special hotline. Immediately, a “labor team” would be sent from Bentonville to the store where union organizers might be gaining even the slightest toehold. This squad from HQ would then take over the running of the place, putting workers on a steady diet of antiunion propaganda videos; weighing whether local managers were too timid or sympathetic to labor and should be ousted; and keeping a close eye on any employees agitating for a union or simply disposed to having one. “As soon as they determine you’re prounion, they go after you,” said Jon Lehman, who was a Wal-Mart manager in Kentucky for seventeen years. “It’s almost like a neurosurgeon going after a brain tumor: we got to get that thing out before it infects the rest of the store, the rest of the body.”

Much of this strategy evoked the way that GM and other companies deployed Pinkertons to bully the unions way back in the 1930s. “I had so many bosses around me, I couldn’t believe it,” said Larry Adams, who worked in the tire and lube express department at a Wal-Mart in Kingman, Arizona, which grabbed the labor team’s attention in the summer of 2000. With temperatures soaring well above a hundred degrees, Adams and some of his fellow automotive technicians got angry when their boss wouldn’t spend the $200 needed to fix a broken air conditioner. So they reached out to the United Food and Commercial Workers union. Within forty-eight hours, twenty outside managers were crawling all over the store. “It was very intimidating,” said Adams. The UFCW’s organizing push in Kingman would end in defeat; within a year or so, nearly every one of the union supporters would be fired or compelled to quit.

At a Wal-Mart Supercenter in Jacksonville, Texas, a group of meat-cutters had better luck at organizing—though not for long. A week or two after they voted to join the UFCW in February 2000, the company announced that it would cease cutting meat and switch instead to selling prepackaged beef and pork at all of its stores. Wal-Mart said that the butchers’ seven-to-three vote in favor of the UFCW had nothing whatsoever to do with its decision to stock “case-ready” meat. Yet it was all but impossible to miss that the move essentially eviscerated the union’s first-ever victory at the company, as Wal-Mart successfully argued that the changes to the meat department meant collective bargaining was no longer appropriate. The UFCW challenged the company’s stance before the National Labor Relations Board, but after eight years of rulings by the agency and the courts, the union could claim only a partial win; Wal-Mart would escape being unionized.

The dust-up in Texas was far from unusual. In all, unions filed 288 unfair-labor-practice charges against Wal-Mart between 1998 and 2003. Most alleged that the company had engaged in improper firings, threatened employees if they tried to organize, carried out surveillance, or illegally interrogated workers to determine their views on labor-related matters. Of these charges, the NLRB found ninety-four substantive enough to issue a formal complaint against the company. Still, none of it was enough to shake Wal-Mart’s conviction that organized labor needed to be stopped at all costs. “I’ve never seen a company that will go to the lengths that Wal-Mart goes to, to avoid a union,” said consultant Martin Levitt, who helped the company hone its attack before writing a book titled Confessions of a Union Buster. “They have zero tolerance.”

But if Wal-Mart was particularly strong-minded in its opposition to organized labor, most other companies weren’t too far behind. For example, at a Coca-Cola bottling plant in Yuma, Arizona, a manager told employees in 2002 that they’d lose their 401(k)s if they voted to be represented by the United Industrial, Service, Transportation, Professional, and Government Workers of North America. The union lost the election by a vote of eleven to ten. At a General Electric subsidiary in Muskegon, Michigan, which hadn’t been organized, a machine operator named Michael Crane began to hand out literature promoting the Electrical Workers and wear a union T-shirt on the job, only to find his manager repeatedly asking him, “Don’t you have a better shirt to wear?” If this was overly subtle, the company’s subsequent action wasn’t: it fired Crane on a bogus accusation of substandard workmanship.

It wasn’t always like this. In the 1960s, when some 30 percent of the private-sector workforce in America was still unionized, employers were generally cautious about how far they’d go in trying to repel organized labor. “They were as nervous as whores in church,” said one corporate adviser. “The posture of major company managers was, ‘Let’s not make the union mad at us during the organizing drive or they’ll take it out at the bargaining table.’” By the 1980s, a whole industry had sprung up to assist business: “union avoidance” consultants, lawyers, psychologists, and strike-management firms. The more enfeebled the unions became, the more forcefully employers acted. By the 1990s, even companies that had once accepted unions as a fact of life were now totally defiant.

“The most intense and aggressive antiunion campaign strategies, the kind previously found only at employers like Wal-Mart, are no longer reserved for a select coterie of extreme antiunion employers,” Cornell University’s Kate Bronfenbrenner wrote in an analysis of how business conducted itself during representation elections from 1999 through 2003. Specifically, she discovered that companies threatened to close the facilities where employees were trying to organize in 57 percent of elections, raised the possibility of cuts to wages and benefits in nearly half, and went so far as to actually discharge workers a third of the time. Over the years, employers also became adept at delaying union elections so that they had more time to coerce workers to see things their way.

All of this proved highly effective. Even if the NLRB ruled that a company had illegally dissuaded employees from organizing, the price to pay was typically small—about $200,000 in penalties—compared with the many millions of dollars that could be saved by ensuring that union negotiators never got a shot to bolster workers’ earnings, retirement plans, and health-care benefits.

Unions hurt themselves as well. Apparently carried away by the power that they were able to maintain through the 1950s and ’60s, labor leaders by the 1980s were dedicating less than 5 percent of their budgets to recruiting new members. By the time they woke up and realized that it was crucial to expand their rolls, the composition of the economy had been recast. During the 1930s, when the Auto Workers were first signing up members inside the nation’s car factories, it was possible to pull in large pools at once because each plant averaged between 2,000 and 3,000 blue-collar employees. By contrast, organizers were likely to find a tenth that many workers under the roof of a big-box retail outlet and far fewer at a fast-food joint, making it much more expensive and daunting to pick off sizable slices of the service sector. The growing use of temps and other contingent workers also complicated union organizing.

Labor did notch some significant triumphs through the 1990s, such as the Service Employees International Union’s Justice for Janitors campaign, which propelled more than 100,000 building workers in various cities into collective bargaining, and the election that saw 74,000 home-care aides in Los Angeles County join the SEIU. But such advances were extremely rare. By 2000, the portion of private-sector workers across the United States who were union members stood at just 9 percent—less than half of what it had been in 1980. By 2010, this figure would dip below 7 percent, even though surveys showed that a majority of nonunion workers desired representation.

The drop was rough on employees across the economy. While the ascension of unions helped to lift pay and benefits for tens of millions of workers in the first few decades after World War II—not only at those companies that had been organized, but at many others that felt the need to keep up—the fall of organized labor was now dragging down people’s compensation. Researchers have found a connection between the decline of unions and the rise of income inequality in America; without the added clout that comes with collective bargaining, a good many employees have seen their compensation stagnate or even fall. “Deunionization,” Lawrence Mishel of the Economic Policy Institute has written, “has strengthened the hands of employers and undercut the ability of low- and middle-wage workers to have good jobs and economic security.”

For many people, this reality has had a perverse effect: it has incentivized them to do their shopping at one destination above all others—Wal-Mart. It is tough to fault them. In the early 2000s, the prices on groceries at a Wal-Mart Supercenter were running 8 to 27 percent below those of a traditional supermarket. All in all, the company’s prices have been so low—and its reach so vast—that some economists have credited it with singlehandedly helping to hold down inflation for the entire country.

Even union members, who are keenly aware of the company’s two-fisted tactics against organized labor, have been tempted by the discounts. “Ten bucks,” said Glenn Miraflor, an ironworker and a father of four, as he lowered a twenty-inch box fan into his cart at a Wal-Mart in Las Vegas in 2003. “You can’t beat that.” Eyeing a new PC, he added: “Where else are you going to find a computer for $498? Everyone I work with shops here.”

The clamp that Wal-Mart put on workers didn’t stop with those on its own payroll.

Most every time it opened a new store, other retailers that charged their customers more—but also paid their employees better—would find that they couldn’t compete. For many of them, the only way to survive was to imitate the Wal-Mart model, cutting back on their own workers’ wages and health care. “Socially,” said Craig Cole, the chief executive of Brown & Cole Stores, a supermarket chain in the Pacific Northwest, who was wary of Wal-Mart coming into his area, “we’re engaged in a race to the bottom.”

In October 2003, members of the United Food and Commercial Workers struck Vons and Pavilions markets in Southern California. In a show of solidarity, Ralphs and Albertsons locked out their union members the next day. In all, 59,000 workers were idled. Although the grocery chains and the union were the only two sides actually facing off at the negotiating table, another presence hovered over the talks: Wal-Mart, which was on the verge of opening dozens of Supercenters in the region. “They are the third party now that comes to every bargaining situation,” said Mike Leonard, director of strategic programs for the UFCW.

As the supermarkets depicted it, the union would have to make concessions—or Wal-Mart would use its low-price formula to snatch away a substantial amount of their business, causing them to close many of their stores and costing many thousands of jobs. After all, what else could anyone expect to happen when Wal-Mart paid its grocery workers about nine dollars an hour in wages and benefits, compared with nineteen at the big supermarkets? Already, from 1992 through 2003, Wal-Mart’s leap into groceries had led to the closure of 13,000 supermarkets in other parts of the country and the bankruptcy of at least twenty-five regional chains. Yet for the UFCW and its allies, this was precisely the time to hold the line and not let Wal-Mart dictate that the entire sector would subject its workers to the lowest common denominator. “If we don’t as labor officials address this issue now, the future for our membership is dismal, very dismal,” said one California union representative.

For nearly five months, the UFCW rank-and-file walked the picket lines from San Diego to Bakersfield. Finally, a settlement was reached—and it marked a big step backward for the union and its members. Central to the accord was the introduction of a two-tier system under which the supermarkets would pay new hires much less in wages and benefits than veteran employees. The impact, said Los Angeles Times columnist Michael Hiltzik, was that UFCW “membership will evolve from a group of decently paid workers into a group of poorly paid ones. Its resources to fight the employers will dwindle, and its prospects for growth will vanish.… The damage done by this dispute to the principle of providing a living wage and adequate health-care coverage for employees will be felt by workers—union and nonunion—around the state and across the country.”

Some, including Hiltzik, maintained that Wal-Mart had been a convenient bugaboo for the supermarkets and that they were unlikely to pass along much savings from the new labor agreement to their customers; instead, they would reward their shareholders and senior executives. At the same time, others welcomed Wal-Mart, especially in the inner-city neighborhoods that the company had set its sights on. “I’d rather have a person on somebody’s payroll—even if it isn’t at the highest wage—than on the unemployment roll,” said John Mack, president of the Los Angeles Urban League. Boosters predicted other positives as well. “Money that people save on groceries will be redirected to other items, including housing, savings, health, entertainment, and transportation,” said a report from the Los Angeles County Economic Development Corporation. “This new spending will, in turn, create jobs outside the grocery industry.”

But whatever pluses Wal-Mart was bringing, this much was beyond debate: another set of workers was now part of an industry that couldn’t guarantee them a solid living. Under the old contract, a grocery clerk in Southern California could earn more than seventeen dollars an hour; under the new one, workers started at less than nine dollars and could make, at most, about fifteen. Overall, the average wage for a supermarket worker was poised to decrease between 18 and 28 percent. And anywhere from a quarter to nearly half of the unionized workforce would no longer be eligible for medical insurance. Employer pension contributions were also slashed 35 percent for current employees and 65 percent for new workers. “You’re not going to be able to make a career out of it anymore,” said Kerry Renaud, a produce worker at a Vons supermarket in Hollywood. Barbara Harrison, a bakery worker at Ralphs, put it this way: “With treatment like this from the company… the middle class is just becoming the lower class.”

In 2007, the UFCW would win back some of what it had lost. Most important, the union was able to reverse the two-tier setup. With low employee morale plaguing the supermarkets, they apparently came to the realization that dividing the workforce wasn’t so smart. The pact also included the first pay raises that workers had received in five years. But this wouldn’t be a tale of sweet redemption. As the years went on, the social contract throughout the industry would continue to weaken. Wal-Mart and other nonunion grocers acquired more and more market share. And at the unionized chains, rule changes and scheduling tricks made workers wait an increasingly long time before they could move up the ladder and command the highest wages. By 2016, the average grocery worker in L.A. would be making less than $29,000 a year, down from more than $31,000 in 2005, when adjusted for inflation. “Grocery store jobs look much more like fast-food jobs than they used to,” said Chris Tilly, director of the UCLA Institute for Research on Labor and Employment. “Lower pay, fewer benefits, more people part-time.”

Meanwhile, Wal-Mart’s sway extended to another batch of workers: those of its suppliers. “Your price is going to be whittled down like you never thought possible,” said Carl Krauss, the owner of a Chicago company called Lakewood Engineering & Manufacturing. It produced the fan that Las Vegas ironworker Glenn Miraflor purchased at Wal-Mart for ten dollars. A decade earlier, it cost shoppers twice as much. But Wal-Mart, whose bigness gave it tremendous leverage over the companies whose wares filled its aisles, insisted that Krauss figure out how to make the appliance for less. “You give them your price,” Krauss said. “If they don’t like it, they give you theirs.” To lower expenses, Krauss automated his factory: where it once required twenty-two people to assemble a product, it now took only seven. Krauss also leaned on his own suppliers to reduce prices for their components. Still, that wasn’t enough. And so in 2000, Krauss opened a factory in Shenzhen, China, where workers earned twenty-five cents an hour, compared with thirteen dollars in Chicago. By 2003, about 40 percent of Lakewood’s products were being made in China, including most heaters and desktop fans. For the box fan that Glenn Miraflor bought, the electronic innards were imported.

Many of Wal-Mart’s suppliers found themselves in the same boat—or, more accurately, filling an endless string of container ships crossing the Pacific Ocean. In the mid-1980s, Wal-Mart had tried to score public-relations points by sourcing more products domestically. Under its “Buy American” initiative, patrons of its stores were greeted with red-white-and-blue banners and smaller signs that read, “This item, formerly imported, is now being purchased by Wal-Mart in the USA and is creating—or retaining—jobs for Americans!” By the mid-1990s, however, “Buy American” didn’t pencil out for the company anymore. In 2002, Wal-Mart brought in $12 billion worth of Chinese-made goods, double the sum from five years earlier. By 2004, that was up to $18 billion, and by 2006 it had reached $27 billion, accounting for nearly a tenth of total US imports from China. “People say, ‘How can it be bad for things to come into the United States cheaply? How can it be bad to have a bargain at Wal-Mart?’” said Steve Dobbins, president of Carolina Mills, a supplier of thread and yarn to the textile industry, which has watched employment in American apparel factories disappear at a rate of more than 11 percent a year since the late nineties. “Sure, it’s held inflation down, it’s great to have bargains. But you can’t buy anything if you’re not employed. We are shopping ourselves out of jobs.”

By one estimate, Chinese imports to Wal-Mart alone were responsible for wiping out more than 300,000 US manufacturing jobs between 2001 and 2013—a manifestation of what had suddenly become a very serious predicament across America. “Until about a decade ago, the effects of globalization on the distribution of wealth and jobs were largely benign,” Michael Spence, a Nobel Prize–winning economist, wrote in 2011. “But employment in the United States has been affected… by the fact that many manufacturing activities, principally their lower-value-added components, have been moving to emerging economies.” Through the 1990s, the United States ran a trade deficit, but it averaged only about 1 percent of the nation’s economic output—a bearable volume. To be sure, the North American Free Trade Agreement caused some pockets of the nation to shed jobs, but the negative repercussions were quite small on the whole, and the losses may well have been eclipsed by new US employment that NAFTA helped to generate.

China was very different, however. After the country joined the World Trade Organization in 2001, giving it more access to markets across the globe, imports into the United States surged. The US trade deficit exploded, averaging about 5 percent of economic output in the 2000s. A consensus of economists and policymakers had long contended that free trade easily made for far more winners than losers. But, as the Wall Street Journal has pointed out, “China upended many of those assumptions. No other country came close to its combination of a vast working-age population, super-low wages, government support, cheap currency, and productivity gains.” Within four years of the WTO admitting China, imports from there as a percentage of US economic output doubled. It took Mexico twelve years to do the same thing following the passage of NAFTA.

“China’s low-cost imports swept the entire US,” the Journal reported, “squeezing producers of electronics in San Jose, California; sporting goods in Orange County, California; jewelry in Providence, Rhode Island; shoes in West Plains, Missouri; toys in Murray, Kentucky; and lounge chairs in Tupelo, Mississippi, among many other industries and communities.” Total manufacturing employment in the United States, which had held fairly steady through the 1990s, plunged after 2001. The harm inflicted would lead many Americans to regard globalization as more bad than good—an attitude that would help shape the contours of the 2016 US presidential race. “The Chinese export onslaught… left a scar on the American working class that has not healed,” Eduardo Porter of the New York Times has written.

How much of this can or should be blamed on any single company—even a colossus like Wal-Mart—is certainly arguable. Not surprisingly, Wal-Mart itself has always made the case for total absolution. “Some well-meaning critics believe that Wal-Mart, because of our size, should play the role that General Motors played after World War II, and that is to establish the post-world-war middle class that the country is so proud of,” said H. Lee Scott Jr., who took over as the company’s CEO in 2000. “The facts are that retailing doesn’t perform that role in the economy as GM does or did.”

While Scott wasn’t wrong, that was surely of little solace to millions of workers who’d seen the economy change, leaving them with fewer and fewer ways to get ahead. When GM and other manufacturers employed a quarter or more of the American workforce through the early 1980s, even those without much education could land a factory job and do quite well. But what were the two-thirds of Americans without a four-year college degree supposed to do now? Manufacturing, with its high wages and good benefits, employed just 10 percent of Americans by 2010. And even many of these industrial jobs and other blue-collar positions now demanded technical instruction beyond high school—something that far too few people had.

“The person who unloads the truck now has to have some training in logistics and inventory systems,” said Anthony Carnevale, director of Georgetown University’s Center on Education and the Workforce. “You don’t become an auto mechanic any longer by getting a dirty rag and hanging out with your uncle.” It wasn’t that more of the nation hadn’t grasped the value of being in a classroom. Between 1967 and 2012, the ranks of US adults with at least a four-year college degree rose from 13 percent to 32 percent, “a remarkable upgrading in the skills of America’s workers,” as a study by Carnevale has described it. Yet even when you added up all of those with four-year college diplomas, two-year associate degrees, and postsecondary vocational certificates, it still came out to less than half of the working-age population. Making things worse, many big companies—even while complaining about a “skills gap”—had stopped trying to develop their people. While both white- and blue-collar workers received on-the-job training as a matter of course in the 1950s and ’60s, surveys now indicated that anywhere from about 50 percent to 80 percent of American employees got nothing of the sort.

For most of those without the right credentials, the only option was to try to make it amid a decidedly low-wage landscape, one now dominated by poorly paying service providers. In 1960, eleven of the fifteen biggest employers in the country made things, led by GM, with nearly 600,000 workers. By 2010, only four manufacturers were in the top fifteen; the rest were in services, including not only Wal-Mart but also McDonald’s, Yum Brands (the parent of Taco Bell, KFC, and Pizza Hut), Target, and CVS. These occupations “are crucial to the support and growth of major industries across the country,” officials at the San Francisco Federal Reserve said, “but many of these workers do not earn enough to adequately support their families, even at a subsistence level.”

Yet for Lee Scott, paying more and offering larger benefits would have undercut Wal-Mart’s reason for being. If “we raised prices substantially to fund above-market wages,” he said, “we’d betray our commitment to tens of millions of customers, many of whom struggle to make ends meet.”

Others, though, didn’t buy it. “Being able to purchase groceries 20 percent cheaper at Wal-Mart,” said Paul Samuelson, who is regarded as the father of modern economics, “does not necessarily make up for the wage losses.” Besides, this wasn’t the only way to do business. Some retailers had an alternate approach—most notably Costco, which paid an average of seventeen dollars an hour in 2005, more than 40 percent higher than the average wage at Wal-Mart’s Sam’s Club, its closest rival. What’s more, a much bigger proportion of Costco workers had medical coverage and retirement plans than did their counterparts at Wal-Mart, and these benefits were far superior. After four and a half years, a full-time Costco worker would earn more than $46,000 a year; a full-time Sam’s Club worker made $27,000. “We’re not the Little Sisters of the Poor,” said Jim Sinegal, Costco’s longtime CEO. “This is good business.”

Sinegal knew that investing in his employees meant they were bound to give excellent service to the company’s customers, which helped to drive satisfaction and sales. “It starts with a living wage and affordable quality health benefits,” said Richard Galanti, the chief financial officer at Costco, whose stock price would greatly outperform Wal-Mart’s through most of the 2000s and beyond. “That’s the initial basis for engagement.”

Across most of the American economy, however, Costco has very much been the exception; Wal-Mart, the rule.

Economically, the new century began with a whimper, as two recessions pummeled America within the first ten years.

The first slump, which began in March 2001, was a surprise to many. Even with all of the downsizing that had taken place during the 1990s, the Clinton expansion had gone on for so long that it convinced some in Washington that they had—by virtue of their astute handling of fiscal and monetary policy—rendered the nation forever safe from any more broad-based economic declines. Theirs was a cocksureness akin to what President Kennedy’s aides had exhibited in the early 1960s, though this latest exuberance was heightened by all of the dot-coms that had sprouted up during the mid-to-late nineties. The Internet seemed to be fundamentally remaking the way that one industry after another went about its work, spurring productivity that would result in widely shared prosperity. “The New Economy represented… a shift from the production of goods to the production of ideas, entailing the processing of information, not of people or inventories,” Joseph Stiglitz, the chairman of President Clinton’s Council of Economic Advisers, has written. “The New Economy also promised the end of the business cycle, the ups and downs that had, until now, always been part of capitalism, as new information technologies allowed businesses to better control their inventories.”

But the bubble burst, as it inevitably does. A host of high-tech companies, many of which had been built largely on puffery, went under. Businesses, which had overinvested in IT, cut back. The stock market, which had become overinflated, tanked. As recessions go, this one was reasonably short (only eight months long) and, according to some indicators, shallow. But it took the labor market an awfully long while to rebound completely—more than three years. Considering how slowly employment also had bounced back after the previous recession, a decade earlier, it was looking like “jobless recoveries” might have become the new normal.

The next economic upturn would last six years, but it wasn’t the type of expansion that had nurtured and sustained the middle class in the decades after World War II. Corporate profits were strong, but total output was weak. Employment grew at less than a 1 percent annual rate, compared with the postwar average of 2.5 percent. Wages and salaries moved up at a rate just under 2 percent, half the postwar average. Few people, however, seemed to notice—or, if they did, to care—because one thing was going up and up and up: the value of their houses.

From 2000 through 2006, home prices increased across the country by more than 90 percent. In some places—Las Vegas, Phoenix, Miami—values more than doubled. It was the biggest housing boom in American history, and a cocktail of low interest rates and a growing populace had persuaded many in the industry that it might never end. For homebuyers, it was almost as easy to order up a mortgage as it was a Starbucks cappuccino, and many purchased properties far beyond what they could afford. It seems incontrovertible that lending money to people who don’t have the means to pay it back is a stupid thing to do, but as Michael Lewis has detailed in his book The Big Short, those selling subprime mortgages operated by a different criterion: “You can keep on making these loans, just don’t keep them on your books. Make the loans, then sell them off to the fixed income departments of big Wall Street investment banks, which will… package them into bonds and sell them to investors.”

By the middle of 2007, the whole thing was toppling, sparking what would be the longest and most severe downturn since the Depression of the 1930s. The Great Recession began officially in December 2007; it ended in June 2009. But the worst was far from over. Nationwide, housing prices didn’t bottom out until 2012, and by then they’d lost a third of their value from their pinnacle in 2006. In the interim, more than 4 million homes had been foreclosed upon.

The labor market was hit especially hard. The unemployment rate, which peaked at 10 percent in October 2009, was still sitting above 7 percent four years later. It would take six and a half years before all of the jobs that had been lost in the recession were regained. This was the most jobless of all the jobless recoveries yet. Of particular concern was the historically high number of Americans who were unemployed for six months or more, causing both their skills and feelings of self-worth to atrophy. Millions more “missing workers” left the labor force altogether. Growth in wages was modest, and five years into the recovery many workers—retail salespersons, waiters and waitresses, food preparers, janitors, maids, and more—had actually seen their pay go down from where it had been when the economy had supposedly troughed. Many companies pulled back on their health care and retirement benefits, and it would take years to restore them to prerecession levels, if they ever got there.

The Great Recession was the proximate cause of much of this pain for workers, but in many ways the downturn had merely underlined deeper developments that had long been underway. “Arguably the most important economic trend in the United States over the past couple of generations has been the ever-more-distinct sorting of Americans into winners and losers, and the slow hollowing of the middle class,” the journalist Don Peck has written. “For most of the aughts, that sorting was masked by the housing bubble, which allowed working-class and middle-class families to raise their standard of living despite income stagnation or downward job mobility. But the crash blew away that fig leaf. And the recession has pressed down hard on the vast class of Americans with moderate education and moderate skills.”

It was difficult to find a company that the Great Recession hadn’t touched. As most of the global economy tipped into crisis, Coca-Cola felt the effects of lower consumer spending; practically every region in the world was sick to some degree—be it the equivalent of “bad bronchitis” or a “mild cold,” said CEO Muhtar Kent. In response, Coke cut costs. But it also faced challenges beyond the ailing economy.

After Roberto Goizueta died in 1997, Coke’s momentum faltered, and nobody seemed to have inherited his Midas touch; the company burned through two CEOs in just seven years—first Doug Ivester and then Douglas Daft. The latter had tried to right things in 2000 by implementing a restructuring, the likes of which Coke had never experienced before. Outside of its bottling network, the parent company had been able to preserve a familial spirit a lot longer than had most other American corporations. More concretely, Coke’s employees had been spared from the mass layoffs that had punished their peers at General Electric, General Motors, Kodak, and so many other businesses during the 1980s and ’90s. “At Coke,” said Neville Isdell, a top executive who had joined the company in 1966, “a job had almost always meant a job for life.” Not any more. Even at Coca-Cola, a social contract based on loyalty between employer and employee was now a thing of the past.

Daft’s overhaul called for getting rid of more than 5,000 jobs out of a global workforce of 30,000. About half the pink slips were handed out at Coke’s Atlanta headquarters. Entire departments, including payroll and building and grounds, were outsourced. Some employees, based abroad, were fired by voice-mail message. Many middle managers lost their stock options. “What that says to the remaining employees is, ‘It doesn’t matter what kind of job I do. We’re all susceptible,’” remarked one executive who was let go. “It was cutting to the quick.” People started calling the CEO “Daft the Knife.”

The company, in the meantime, had its own harsh label to contend with: racist. In 1999, four current and former employees sued Coke, alleging disparities between whites and blacks in their promotions, performance evaluations, dismissals, and pay. The median annual salary for African Americans at the company was about $36,000 compared with $65,000 for whites. Some 2,000 employees soon became part of the class-action case. “In 114 years, you’ve only had one black senior vice president,” Larry Jones, who had been a benefits manager at Coke, told Daft in April 2000 at the corporate annual meeting, where he led a delegation of protestors. “In 114 years, you only found one of us qualified? How long do we wait?”

In November, the company settled the suit for $192 million—about $40,000 per plaintiff—the most ever in a discrimination case. Even more extraordinary was that Coke agreed to have a panel of outside experts regularly review its pay and promotion practices, thereby ensuring that the company was making the necessary changes to truly become a place of equal opportunity. Managers’ compensation was now tied to diversity goals. “The internal cultures of companies have been built on patterns of exclusion based on gender and race,” said the Reverend Jesse Jackson, who had first condemned Coke’s civil-rights record back in the early 1980s and had again voiced his disapproval of the soda giant after the latest lawsuit was filed. “This is a step in the right direction.”

Most other corporations remained two steps behind. In 2014, minorities would make up 37 percent of workers at larger companies—but only 13 percent of executives and senior managers. African Americans accounted for about 15 percent of the workforce at these companies, but a mere 3 percent were executives and only 7 percent had reached middle management.

In the five decades since Walter Reuther had marched with Martin Luther King Jr. and riots had convulsed Rochester and other cities around America, you could make the argument that the nation had bent slowly toward racial justice. The black middle class had expanded fivefold since the early 1960s, and the share of African Americans with a college degree had more than tripled, to 22 percent. In 2008, an African American was elected president of the United States for the first time, and in 2012 Barack Obama would be reelected. But many other measures were troubling: blacks still earned about 25 percent less than whites on average and were twice as likely to be unemployed. In an economy now split between good-paying knowledge work and poor-paying service work, African Americans were clustered disproportionately at the low end. Blacks who grew up in middle-class families were apt to slide down the socioeconomic spectrum far more easily than were whites. Despite the educational strides that the community had made, black college graduates had a much harder time finding work than whites did, and when well-educated African Americans were employed, they were compensated badly. Hispanics, for their part, had to overcome comparable discrimination in the workplace.

Women were up against their own barriers. They had continued to pour into the labor force, so that by 2015 they would make up nearly half of it, a sharp increase from less than 30 percent in 1950. They were now earning about 80 percent of what men did, versus 60 percent fifty years earlier. But the pace of change had slowed. By 2000, the growth of women in the workforce had leveled off, and the pay gap stopped shrinking much. For many the glass ceiling never cracked, much less shattered. In 2014, less than 30 percent of the executives and senior managers at bigger companies were women. Of the 500 largest companies in the country, just twenty-one had a female CEO.

There were all kinds of theories for why women were still being held back, most of which could be condensed to a single word: sexism. One sociologist, for instance, found that women’s pay got cut when they had children (because motherhood made them seem less committed to the job), while men were rewarded (because fatherhood made them seem more dependable and deserving). A woman’s earnings, which were now often the same or even higher than those of her male colleagues when entering the workplace, tended to plateau by the time she was in her midthirties to midforties; men’s salaries continued to climb. Another investigation showed that whenever a lot of women entered a field, the pay went down—for the very same work that men were doing in greater numbers before. Others pointed to surveys proving that men were just clueless; two-thirds thought that their female coworkers had equal opportunities on the job, making prejudice difficult to combat because it wasn’t even recognized.

As for Coke, it would do much better than most employers, thanks to the goad of litigation. In 2000, just 8 percent of corporate officers had been racial and ethnic minorities, even though they constituted about a third of the company. By 2006, that was up to 22 percent. The share of women officers at Coke rose from 16 percent to 27 percent. The company’s compensation system was also revamped to remove inequities.

The layoffs left their own mark. When Neville Isdell succeeded Daft as CEO, he found an “atmosphere of fear and disaffection” throughout Coca-Cola. He was able to revive employees’ spirits before stepping down in 2008, however. And that cleared the way for Muhtar Kent, the next CEO, to map out a plan to double the size of Coke’s business by 2020. To help meet his target—during a time when consumers were showing less and less appetite for sugary drinks, no less—Kent launched one of the most exhaustive leadership development programs in all of American industry. Still, even with that cultivation of talent, it wasn’t as if Coke had resumed its old custom of giving a job for life. In 2015, as part of a multibillion-dollar cost-cutting bid, the company said it was severing about 1,700 white-collar positions, including 500 at headquarters. It was the largest reduction since the days of Daft.

The Great Recession also bruised Kodak. In late 2008, the company cited the “unprecedented amount of uncertainty surrounding the economic environment” before suspending its wage dividend for the first time since the Great Depression. But the truth was, Kodak had been trapped in a downward spiral for decades before the economy went sour. George Fisher, who bowed out as CEO in 2000, had done all he could to try to speed the company’s transition to the world of digital photography. So had his two successors. But none of them got nearly far enough fast enough. In 2007, the company moved into the printer business, excited that its scientists had invented a pigment-based ink that didn’t clog the nozzles of printing heads. It was an impressive innovation but didn’t turn things around. Kodak lost money in 2008, 2009, 2010, and 2011. In 2012, it filed for bankruptcy. The judge who was in charge of Kodak’s reorganization under Chapter 11 called the company’s collapse “a tragedy of American economic life.”

It was also a good reminder that the social contract between employer and employee can only function when a company is doing well. An unprofitable business isn’t a stable place to work. By the time it emerged from bankruptcy in 2013, with a focus on commercial printing, Kodak employed fewer than 9,000 people worldwide, down from more than 50,000 in 2005 and 128,000 in 1980. In Rochester, just a couple of thousand folks were now left working for Kodak, barely a blip when compared with the 60,000-plus it employed there thirty years earlier. The company also ended health-care coverage for its 56,000 retirees and their dependents. Many lost their pensions as well.

Rochester’s civic leaders expressed hope that the city, with its world-class universities and castoffs from Kodak, would become a breeding ground for a new crop of technology companies. And to some extent, their vision has been fulfilled, with young companies producing advanced manufacturing materials, chemicals, and more. But with Kodak in such a diminished state, others have bemoaned that the bountiful blue-collar work that so many locals once thrived upon—and which ranked Rochester near the top of American cities with the highest-wage jobs—was now gone and never coming back. “Good jobs,” said Jennifer Leonard, president of the Rochester Area Community Foundation, have largely been overtaken by “service jobs that don’t have benefits and don’t pay well.”

Of all the companies in America, General Motors was among those that the Great Recession rocked the most. The outcome was all the more heartbreaking because, for a time, it seemed like GM might finally have clawed past the calamities that the company had suffered during the 1980s and early ’90s. In 1994, Fortune praised Jack Smith for having rescued GM and said that the company’s “best days lie ahead.” And for a while, few could have taken issue with the magazine’s assessment. The company remained profitable through the late 1990s, and even after the terrorist attacks of September 11, 2001.

Smith’s successor, Rick Wagoner, displayed a willingness to tackle some of GM’s long-unattended-to troubles, including closing its failing Oldsmobile division. At a company notorious for moving too slowly, Wagoner put in place a program called “Go Fast” to unclog snarls in the system. Under him, GM also showed another attribute that it was not known for in the past: humility. “Ten years ago we had a choice,” the company said in a 2003 advertising blitz. “We could keep looking in the rearview mirror, or out at the road ahead. It was the easiest decision we ever made. The hard part meant breaking out of our own bureaucratic gridlock. Learning some humbling lessons from our competitors.”

Before long, however, GM under Wagoner reverted to its old habits: proceeding tentatively, becoming too insular, not picking up swiftly enough on what car buyers wanted—more hybrids, fewer Hummers. The good years had fooled many into thinking that GM was in fine shape, but it never really had been. “If you’re bleeding from the capillaries it’s hard to notice,” said Paul O’Neill, the former US treasury secretary, who served on the GM board during the mid-1990s. By 2005, GM was bleeding from everywhere. It lost more than $10 billion that year, as it fought a losing battle to contain costs while Toyota and other foreign automakers siphoned market share. The company reacted by announcing a new series of cuts, including the shutdown of all or parts of twelve factories in North America and the elimination of 30,000 more jobs.

By 2007, things were so bleak that the United Auto Workers agreed to historic compromises. In September, the union undertook its first nationwide strike against GM since 1970, but with no real muscle to flex, it ended the walkout just two days later. The UAW then consented to the establishment of a two-tier wage structure under which many new hourly hires would earn half as much as current workers. These second-class employees would find themselves living in a way that autoworkers never had before—paycheck to paycheck. Their pension and health benefits also were curtailed. The Jobs Bank was trimmed back. And, in the most consequential part of the agreement, the union accepted that GM would no longer be responsible for blue-collar retirees’ medical care. Instead, the company would put a capped amount into a union-controlled trust, at which point GM would be off the hook for any further obligations. If people’s future health needs couldn’t be met, that wasn’t the company’s worry anymore; it was the trust’s, the UAW’s, and the retirees’.

At first, the UAW was skeptical that GM’s finances were as bad as the company claimed. But GM was happy to open its books—the very thing it had refused to do in the mid-1940s, when Walter Reuther was trying to draw attention to the company’s amazing wealth—so as to make transparent how incredibly strapped it was now. “We showed these guys that the goose that laid the golden egg was about to die,” remembered Kent Kresa, a GM board member. Yet even if the UAW had no alternative, it didn’t lessen the magnitude of the moment. By acquiescing to the company’s transfer of tens of billions of dollars in health-care liabilities, it revealed just how tattered the old employer-employee compact had become. The headline in the progressive publication In These Times said it all: “Treaty of Detroit Repealed.”

As the recession deepened, things got worse for GM, which not only wasn’t selling enough cars but also had exposure through its financing arm to the sinking home-mortgage market. In late 2008, as the Bush administration extended more than $13 billion in financing to GM as part of its bailout of the auto industry, Wagoner pledged to repair the company “once and for all.” But he wouldn’t get the chance. In March 2009, the Obama administration pushed him out as CEO. In June of that year, GM filed for bankruptcy.

The company would exit bankruptcy, lightning fast, a little more than a month later. It was now in a form that the Obama auto-industry task force liked to call “Shiny New GM.” The company’s debt had been sliced by more than 70 percent, and it had far fewer factories, brands, dealers, and employees to manage. In the United States, only 69,000 workers remained at GM—down about 90 percent from the late 1970s.

Even with a lower-wage tier, a GM job was still attractive in a service-heavy economy, and many felt lucky to get hired there. Debbie Werner earned about seventeen dollars an hour installing parts on Chevys and Buicks—a lot better than the nine she was making as a nursing-home worker. David Ramirez pulled down more than eighteen bucks an hour on a GM transmission line, more than double the eight that Wal-Mart had been paying him. Over the next few years, GM would only get stronger. A new CEO, Mary Barra, who had been with GM for more than three decades and was the daughter of a Pontiac die maker, won high marks for restraining costs, making the company’s products more appealing, and holding employees more accountable. Her strategy was refreshingly simple: “No more crappy cars.” By 2015, GM was doing well enough that it would end the two-tier wage and benefit system and give its more experienced workers their first raises since 2007. Profit-sharing bonuses were becoming routine.

Nevertheless, none of this suggested a return to the age of “Generous Motors,” as GM had once been called. It would now take eight years for an hourly worker to reach top pay, instead of the three it used to take. The average wage for a factory worker at the company would be lower, in real terms, in 2019 than it had been in the 1990s or 2000s. And it was up to the union trust, not GM, to deal with the multibillion-dollar shortfall projected for blue-collar retirees’ health care. The company also halted medical coverage for about 100,000 white-collar retirees—a big step beyond the cutbacks seen in the Sprague case—and it tried to buy out many of their pensions as well, in an attempt to save tens of billions of dollars more. “Everybody is trying to run leaner now,” said David Cole, president emeritus of the Center for Automotive Research. “GM is not unique.”

To many observers, such changes were long overdue, and if GM executives had only had the courage to make them sooner the company might never have gone bust. Others had a different take. “That argument,” said Rick Wagoner, “ignores the fact that American automakers and other traditional manufacturing companies created a social contract with the government and labor that raised America’s standard of living and provided much of the economic growth of the twentieth century. American manufacturers were once held up as good corporate citizens for providing these benefits. Today, we are maligned for our poor judgment in ‘giving away’ such benefits forty years ago.”

In early 2011, President Obama named Jeff Immelt, the CEO of General Electric, as the chairman of his new Council on Jobs and Competitiveness. At first blush, it seemed perfectly reasonable for the White House to seek guidance on stimulating employment from a group of businesspeople headed by the leader of one of America’s marquee companies. With the labor market still so anemic, what could be bad about getting advice from somebody with 285,000 workers of his own?

But to many on both sides of the ideological divide, Immelt was the wrong guy to be the nation’s “jobs czar.” Whether it was liberal Sen. Russ Feingold and Progressives United or conservative talk-show host Bill O’Reilly and the Tea Party, their reproof was the same: Immelt’s company was mostly interested in doing business abroad.

In some ways, the criticism was uncalled for. While the majority of GE’s employees were now based outside the United States—150,000 international workers compared with 133,000 domestic ones—that was hardly out of the ordinary. Through the 2000s, US multinational corporations added a total of nearly 3 million workers in foreign locations while chopping their payrolls at home by more than 860,000. And while many companies (including GE) continued to shift some production to other countries so as to lighten their cost of labor, globetrotters like Immelt were going overseas mostly to make money, not just to save money; GE’s fastest-growing markets were no longer in the United States. “We go to Brazil, we go to China, we go to India,” said Immelt, who’d replaced Jack Welch in 2001, “because that’s where the customers are.”

If anything, Immelt had done more than most to bring new manufacturing jobs to America, as he reoriented GE back toward its roots of industrial engineering and away from financial engineering. This born-again transformation would culminate in 2015 as the company set out to sell most of GE Capital, its finance unit—a divestiture that, in the view of many, signaled that Immelt was finally stepping out from Welch’s long shadow. Along the way, GE reinvested in factories in Louisville, Schenectady, and other long-beleaguered towns. It was enough for some to submit that the United States was experiencing a full-blown manufacturing resurgence, especially with China’s own labor costs increasing fast.

But this was a pipe dream, at least as far as creating lots of new jobs was concerned. Because of automation and the mix of goods being produced, US manufacturers had no use for anywhere near as many workers as they did from the 1950s through the 1970s—even though the nation’s industrial output was actually much higher now. What once took 1,000 people to make could be cranked out these days by fewer than 200. “We need to get real about the so-called renaissance, which has in reality been a trickle of jobs,” said Steven Rattner, who ran President Obama’s auto-industry team.

To knock Immelt for any of this—for not having enough employees in the United States, for having too many across the sea—was to imply that the economy now behaved just as it had decades earlier. But this wasn’t Ralph Cordiner’s time or Reg Jones’s, or even Jack Welch’s. When Immelt came to the company in 1982, GE derived 80 percent of its revenue from within the United States. Now, 60 percent came from elsewhere. By 2016, 70 percent would. “I run an American company,” Immelt said. “But in order for GE to be successful in the coming years, I’ve gotta sell my products in every corner of the world.” If not, he added, “we’d have tens of thousands fewer employees in Pennsylvania, Ohio, Massachusetts, Texas. I’m never going to apologize for that, ever, ever.”

And yet, as much as one might be willing to abide Immelt’s embrace of becoming a “complete globalist,” as he termed it, there were still plenty of other reasons to question his appointment as jobs czar. They might start with how GE was pushing workers at some of its factories to accept a two-tier wage structure, with new hires earning as little as $25,000 a year, while veterans took a pay freeze. Or how the company had closed its guaranteed pension plan to new white-collar employees—even though it was fully funded and could meet its obligations—giving them only the option of a 401(k). Or how it had changed its health coverage for salaried personnel to make them more responsible for the cost of their own care. Or how it no longer would provide supplemental medical insurance for its retirees older than sixty-five. Or how, while wearing down workers in all of these ways, it had done its best to take care of investors by buying back tens of billions of dollars’ worth of its own stock so as to attempt to raise the value of its shares.

On the other hand, all of the above was standard stuff for American business at this point: crimping workers’ pay, reining in medical coverage, gutting retirement benefits. If President Obama had tried to find a jobs czar from a leading corporation that didn’t do most of these things, he might have been hard-pressed to come up with a candidate. All of the trends that had started to sap workers’ sense of security in the 1970s and ’80s, and that had picked up in the ’90s, were continuing apace through the 2000s. By now, the social contract for most employees was basically kaput—and it didn’t matter what kind of company you were talking about, be it one with a history of troubles, like GM or Kodak, or one that had remained strong over a very long period of time, like Coca-Cola or GE. The story was pretty much the same everywhere.

For Immelt, the right way to think about people now seemed to lie out in Silicon Valley, where they practiced what he called “the essence of modern talent development.” These were the words that Immelt used in endorsing a book called The Alliance, written by three California entrepreneurs, including Reid Hoffman, one of the founders of LinkedIn, the online social network for professionals.

“The old model of employment was a good fit for an era of stability,” Hoffman and his coauthors, Ben Casnocha and Chris Yeh, declared. “In that era, careers were considered nearly as permanent as marriage.” But such a construct, they said, was “too rigid” for today’s highly dynamic world.

What the authors urged, instead, was that employees sign up for a “tour of duty,” where it was spelled out what they were supposed to contribute over a prescribed period and what they’d gain in return, including skills and relationships. Toward the end of the tour, the employer and employee would discuss what should happen next—whether it was a new round at the company or a new job somewhere else. “Acknowledging that the employee might leave,” they wrote, “is actually the best way to build trust.”

This concept wasn’t new. For high-tech companies, “taking time off between jobs, moving from one company to another and then back again, and job-hopping in general are a way of life,” Paul Leinberger and Bruce Tucker noted in 1991 in The New Individualists: The Generation After the Organization Man. Still, it wasn’t hard to see why tech was so intriguing, especially for an executive like Immelt, who was betting big on turning GE into a digital powerhouse in its own right. He wanted his employees to be more comfortable taking risks and making mistakes—as long as they learned from them.

Silicon Valley seemed to be the one place where workers could find “purpose, freedom, and creativity,” as Google’s Laszlo Bock summed up the goals for his company. Depending on the employer, there could be more or less stress involved. Either way, there were invariably great perquisites—a twenty-first-century version of welfare capitalism, with climbing walls, free life coaching, and complimentary sushi in the cafeteria.

Yet as cool as companies like Google, Apple, Amazon, and Facebook were as employers—and as much joy as they’d given us as knowledge seekers and music fans, shoppers, and social animals—they weren’t job engines in the way that the big manufacturers once were. All combined, those four companies would employ about 300,000 people in the United States in 2015, less than half of what General Motors alone did in the late 1970s.

Software really was “eating the world,” as venture capitalist Marc Andreessen said. But software developers weren’t. The Labor Department counted a little more than 1 million Americans with such jobs in 2015—and they were terrific if you had the skills to get one, paying more than $100,000 a year on average. By comparison, there were more than 4.5 million people in retail sales, 3.5 million cashiers, 3 million food-service workers, 2.5 million waiters and waitresses, and 2 million secretaries and janitors. All of those jobs paid less than $27,000 a year.

“When George Eastman had a fantastic idea for photography, he got quite rich, and the city of Rochester became a flourishing city for generations, supporting thousands of middle-class workers,” said Larry Summers, the former treasury secretary. “When Steve Jobs had remarkable ideas, he and his colleagues made a very large fortune, but there was much less left over—there was no flourishing middle class that followed in their wake.”

Not everyone was ready to let the old social contract perish without a fight. Dennis Rocheleau, who was GE’s labor relations chief under Jack Welch and then under Immelt before retiring in 2004, thought he might be able to prevail upon the company to reconsider its cutting of retiree medical benefits. Rocheleau wasn’t concerned that he himself couldn’t come up with the extra cash he’d now need to augment his Medicare coverage. As a top GE executive, he had made good money during his career—enough to fly back and forth between two residences: one in Connecticut and another in his home state of Wisconsin, a lakeside house filled with so many magnificent modern prints and crafts that it could have passed for a small museum. In addition, Rocheleau was still in good enough health in his early seventies that he would only be out about $1,000 a year because of the changes that GE was initiating.

But he worried that others among GE’s 50,000-plus salaried retirees and their spouses, who were sicker and more reliant on this supplemental insurance, would be hurt much worse. In time, GE would institute the same cuts for its hourly retirees. “The people they’re taking it from are the ones who can afford it the least,” Rocheleau said. Beyond that, there was something else gnawing at him: GE had gone back on a promise. And that, he’d been taught, was never okay.

Rocheleau grew up in the Wisconsin town of Two Rivers, where his dad worked in a wood-products factory. He went on to Northwestern University on a full scholarship, and then Harvard Law. Long fascinated by industrial relations, he joined GE as a junior union negotiator a couple of years before the ultimate flashpoint: the 1969 strike. When it was finally over and Lem Boulware’s formulation for beating down organized labor had been done away with, Rocheleau was pleased. GE’s insistence that the unions “take it or leave it” had never sat well with Rocheleau’s blue-collar pedigree. “I was schooled in Boulwarism,” he said, “but graduated to a better place: the land of the negotiated settlement.”

Over the next thirty-five years, as he moved up the organization, Rocheleau showed himself to be a hard bargainer. He was GE’s top labor-relations executive when the unions walked out in 2003—the first national work stoppage at the company since ’69. The key issue at the time: GE’s attempt to have its hourly workers assume more of their health-care costs. “We are not advocates for mushy labor-management cooperation,” Rocheleau once told an industry group. “We believe that there are conflicting interests within our industrial society and that the use of adversarial representation to accommodate those interests has been effective.”

But once a contract was hammered out, Rocheleau did everything he could to ensure that the company made good on its word. As time went on, GE’s unions weren’t nearly the force that they once were. When Rocheleau started at the company, more than 140,000 employees were covered by collective bargaining agreements. When he retired, that was all the way down to 24,000, on its way to just 16,000 in 2015. But Rocheleau respected the union leaders that he jousted with, and he wanted them to respect him. “I always maintained that at GE, promises made were promises kept,” he said. Rocheleau also took pride in another thing: speaking his mind. “I could have ass-kissed my way to success,” he said. He chose not to and became well known among GE executives for his frankness.

Initially, Rocheleau had hoped that he would be able to remedy the retiree medical situation quietly. He wrote to Immelt in late 2012, telling him that “my faith in GE’s expressed values has been seriously diminished,” while also assuring him that he would contest the company “in a responsibly measured manner.” “This isn’t just another instance of ‘Rocheleau going rogue,’” he said. He counseled restraint in others as well. When a retiree spokesman wanted to go to the company’s annual meeting and publicly embarrass Immelt by asking him what the health-care cuts would mean for the CEO’s father, who himself had worked at GE for forty years, Rocheleau got him to back off.

But Immelt never replied to Rocheleau’s letter. In 2013 and again in 2014, Rocheleau spoke out at the annual meeting, saying that the company had “turned its back on its traditions” and that “morality and integrity must outweigh legality and ability if you are going to make a long-term, mutually beneficial relationship.” Six months after the second meeting, Rocheleau sued GE—a case that, as of late 2016, was still pending.

As a senior manager at the company for so many years, Rocheleau knew full well why this was happening to its retirees: financial considerations were now preeminent, as GE made plain when it booked more than $4.5 billion in savings by dropping its post-sixty-five health-care obligations. But this was gravy. GE wasn’t in terrible straits. The enterprise was profitable. It was nice to be able to tidy up the balance sheet like this. “But look,” said Rocheleau, “a deal is a deal.”

As he made that comment, he stood beside a beautiful metal grate that had come from an old GE administration office in Schenectady. Along with all of his art, Rocheleau had integrated into his house a bunch of this industrial flotsam, which he had bought from the company as it tore down different buildings. “They just destroy stuff,” he said. “They have very little reverence for history.”

Can the American corporate social contract ever be reconstructed? The short answer is no—at least not in the way it was before. The Golden Age was sui generis, and too much has changed since then.

Some unions and other labor groups are using online technology to organize employees and give them new tools to stand up for their rights, while helping to finance valuable campaigns for low-wage workers such as the Fight for $15. Still, we’ll never again see private-sector unionization rates—and the potency that comes from widespread collective bargaining—like we had in the 1950s or ’60s. Our economy isn’t built for that anymore.

We must also accept that if artificial intelligence displaces masses of workers—and it well may, particularly those with few skills—we shouldn’t try to stand in its way. If the economy is going to continue to grow, we must have technological progress. We shouldn’t try to stop international commerce, either. It, too, is essential to our long-term economic growth, and trade restrictions won’t be effective anyway. A third of the components used in products made in America are imported from overseas. Plus, the flow of goods is flattening while the flow of data, zipping invisibly across borders, is swelling.

What’s required now is a new social contract that takes into account these realities.

The government has an enormous part to play. We need a higher federal minimum wage and, in our most expensive cities, a true living wage. Although organized labor won’t recapture its former prominence, Washington needs to make it easier for workers to unionize. We need paid family leave. We need stricter enforcement of labor standards to help workers fend off wage theft—the denial of pay and benefits rightly owed them—and other violations, especially at a time when so much of the nation’s employment is being generated among temps, freelancers, and others in contingent jobs. We need to expand the Earned Income Tax Credit.

We need portable benefits for Uber drivers and other “gig workers”—a still small but increasingly important segment of the labor market. We need the government to better support technological innovation and create an environment that fosters entrepreneurship, which, despite the radiance from Silicon Valley, has been at a low ebb in America for the past thirty years. We need the Federal Reserve to make full employment a priority.

And although it is easy to dismiss as a bromide, we really do need to revitalize public education from top to bottom.

We also need changes to our tax laws and accounting standards so that executives are induced to look beyond their company’s quarterly earnings and daily share price. We need institutional investors to exhort corporations to keep sight of the long term, and not just pay attention to results in the short term.

But more than anything, we need to recognize that the social contract between employer and employee won’t be strong unless our business leaders want it to be so.

“Companies once felt an obligation to support American workers, even when it wasn’t the best financial choice,” said Betsey Stevenson, who was chief economist at the Labor Department from 2010 to 2011. “That’s disappeared. Profits and efficiency have trumped generosity.”

We can’t mandate that executives think the way they used to. We can’t demand that they alter the norms by which they operate and—like the generation of executives who’d come through the crucible of the Great Depression and World War II—suddenly make it more about “we” and less about “I.” We can’t force them to invest in their workers through more training and higher compensation and stronger job security.

We—as employees and as consumers—can only encourage them to realize that corporate America’s continuing fixation with putting shareholders above everyone else is ultimately bad for their companies. “A healthy business,” wrote Peter Drucker, “cannot exist in a sick society.” Restoring more balance between worker and investor would be a welcome balm.

A few may be starting to get it. Wal-Mart, of all companies, made Fortune magazine’s Change the World list for 2016. It was so honored for raising its workers’ pay to an average of $13.38 an hour and enrolling half a million employees in a curriculum “designed to teach job skills that could help them climb the income ladder.”

And yet there is much, much more that Wal-Mart and many other corporations can do, and must do, if we are to come to grips with the most pressing issue of our time: distributing the nation’s economic gains more broadly.

“The American people have continued up to now to tolerate abuses that have developed in our economic system because… they believe it can be made to work for the good of all,” one business leader avowed. “They see in it a chance for a better life for themselves. Almost unanimously, they want it for their sons and daughters.”

“Most Americans,” he added, “would agree on the economic goals for America: a community permanently rich in opportunity and security.”

More than seventy years after William Benton, vice chairman of the Committee for Economic Development, made that statement, it would be difficult to argue that our goals are any different today. What has corroded, sadly, is the underlying belief that the system can be made to work for the good of all.