Evaluation

How did I do as a low-wage worker? If I may begin with a brief round of applause: I didn’t do half bad at the work itself, and I claim this as a considerable achievement. You might think that unskilled jobs would be a snap for someone who holds a Ph.D. and whose normal line of work requires learning entirely new things every couple of weeks. Not so. The first thing I discovered is that no job, no matter how lowly, is truly “unskilled.” Every one of the six jobs I entered into in the course of this project required concentration, and most demanded that I master new terms, new tools, and new skills—from placing orders on restaurant computers to wielding the backpack vacuum cleaner. None of these things came as easily to me as I would have liked; no one ever said, “Wow, you’re fast!” or “Can you believe she just started?” Whatever my accomplishments in the rest of my life, in the low-wage work world I was a person of average ability—capable of learning the job and also capable of screwing up.

I did have my moments of glory. There were days at The Maids when I got my own tasks finished fast enough that I was able to lighten the load on others, and I feel good about that. There was my breakthrough at Wal-Mart, where I truly believe that, if I’d been able to keep my mouth shut, I would have progressed in a year or two to a wage of $7.50 or more an hour. And I’ll bask for the rest of my life in the memory of that day at the Woodcrest when I fed the locked Alzheimer’s ward all by myself, cleaned up afterward, and even managed to extract a few smiles from the vacant faces of my charges in the process.

It’s not just the work that has to be learned in each situation. Each job presents a self-contained social world, with its own personalities, hierarchy, customs, and standards. Sometimes I was given scraps of sociological data to work with, such as “Watch out for so-and-so, he’s a real asshole.” More commonly it was left to me to figure out such essentials as who was in charge, who was good to work with, who could take a joke. Here years of travel probably stood me in good stead, although in my normal life I usually enter new situations in some respected, even attention-getting role like “guest lecturer” or “workshop leader.” It’s a lot harder, I found, to sort out a human microsystem when you’re looking up at it from the bottom, and, of course, a lot more necessary to do so.

Standards are another tricky issue. To be “good to work with” yourself, you need to be fast and thorough, but not so fast and thorough that you end up making things tougher for everyone else. There was seldom any danger of my raising the bar, but at the Hearthside Annette once upbraided me for freshening up the display desserts: “They’ll expect us all to start doing that!” So I desisted, just as I would have slowed down to an arthritic pace in any job, in the event that a manager showed up to do a time-and-motion study. Similarly, at Wal-Mart, a coworker once advised me that, although I had a lot to learn, it was also important not to “know too much,” or at least never to reveal one’s full abilities to management, because “the more they think you can do, the more they’ll use you and abuse you.” My mentors in these matters were not lazy; they just understood that there are few or no rewards for heroic performance. The trick lies in figuring out how to budget your energy so there’ll be some left over for the next day.

And all of these jobs were physically demanding, some of them even damaging if performed month after month. Now, I am an unusually fit person, with years of weight lifting and aerobics behind me, but I learned something that no one ever mentioned in the gym: that a lot of what we experience as strength comes from knowing what to do with weakness. You feel it coming on halfway through a shift or later, and you can interpret it the normal way as a symptom of a kind of low-level illness, curable with immediate rest. Or you can interpret it another way, as a reminder of the hard work you’ve done so far and hence as evidence of how much you are still capable of doing—in which case the exhaustion becomes a kind of splint, holding you up. Obviously there are limits to this form of self-delusion, and I would have reached mine quickly enough if I’d had to go home from my various jobs to chase toddlers and pick up after a family, as so many women do. But the fact that I survived physically, that at an age well into my fifties I never collapsed or needed time off to recuperate, is something I am inordinately proud of.

Furthermore, I displayed, or usually displayed, all those traits deemed essential to job readiness: punctuality, cleanliness, cheerfulness, obedience. These are the qualities that welfare-to-work job-training programs often seek to inculcate, though I suspect that most welfare recipients already possess them, or would if their child care and transportation problems were solved. I was simply following the rules I had laid down for myself at the beginning of the project and doing the best I could to hold each job. Don’t take my word for it: supervisors sometimes told me I was doing well—“fine” or even “great.” So all in all, with some demerits for screwups and gold stars for effort, I think it’s fair to say that as a worker, a jobholder, I deserve a B or maybe B+.

But the real question is not how well I did at work but how well I did at life in general, which includes eating and having a place to stay. The fact that these are two separate questions needs to be underscored right away. In the rhetorical buildup to welfare reform, it was uniformly assumed that a job was the ticket out of poverty and that the only thing holding back welfare recipients was their reluctance to get out and get one. I got one and sometimes more than one, but my track record in the survival department is far less admirable than my performance as a jobholder. On small things I was thrifty enough; no expenditures on “carousing,” flashy clothes, or any of the other indulgences that are often smugly believed to undermine the budgets of the poor. True, the $30 slacks in Key West and the $20 belt in Minneapolis were extravagances; I now know I could have done better at the Salvation Army or even at Wal-Mart. Food, though, I pretty much got down to a science: lots of chopped meat, beans, cheese, and noodles when I had a kitchen to cook in; otherwise, fast food, which I was able to keep down to about $9 a day. But let’s look at the record.

In Key West, I earned $1,039 in one month and spent $517 on food, gas, toiletries, laundry, phone, and utilities. Rent was the deal breaker. If I had remained in my $500 efficiency, I would have been able to pay the rent and have $22 left over (which is still $78 less than the cash I had in my pocket at the start of the month). This in itself would have been a dicey situation if I had attempted to continue for a few more months, because sooner or later I would have had to spend something on medical and dental care or drugs other than ibuprofen. But my move to the trailer park—for the purpose, you will recall, of taking a second job—made me responsible for $625 a month in rent alone, utilities not included. Here I might have economized by giving up the car and buying a used bike (for about $50) or walking to work. Still, two jobs, or at least a job and a half, would be a necessity, and I had learned that I could not do two physically demanding jobs in the same day, at least not at any acceptable standard of performance.

In Portland, Maine, I came closest to achieving a decent fit between income and expenses, but only because I worked seven days a week. Between my two jobs, I was earning approximately $300 a week after taxes and paying $480 a month in rent, or a manageable 40 percent of my earnings. It helped, too, that gas and electricity were included in my rent and that I got two or three free meals each weekend at the nursing home. But I was there at the beginning of the off-season. If I had stayed until June 2000 I would have faced the Blue Haven’s summer rent of $390 a week, which would of course have been out of the question. So to survive year-round, I would have had to save enough, in the months between August 1999 and May 2000, to accumulate the first month’s rent and deposit on an actual apartment. I think I could have done this—saved $800 to $1,000—at least if no car trouble or illness interfered with my budget. I am not sure, however, that I could have maintained the seven-day-a-week regimen month after month or eluded the kinds of injuries that afflicted my fellow workers in the housecleaning business.

In Minneapolis—well, here we are left with a lot of speculation. If I had been able to find an apartment for $400 a month or less, my pay at Wal-Mart—$1,120 a month before taxes—might have been sufficient, although the cost of living in a motel while I searched for such an apartment might have made it impossible for me to save enough for the first month’s rent and deposit. A weekend job, such as the one I almost landed at a supermarket for about $7.75 an hour, would have helped, but I had no guarantee that I could arrange my schedule at Wal-Mart to reliably exclude weekends. If I had taken the job at Menards and the pay was in fact $10 an hour for eleven hours a day, I would have made about $440 a week after taxes—enough to pay for a motel room and still have something left over to save up for the initial costs of an apartment. But were they really offering $10 an hour? And could I have stayed on my feet eleven hours a day, five days a week? So yes, with some different choices, I probably could have survived in Minneapolis. But I’m not going back for a rematch.

All right, I made mistakes, especially in Minneapolis, and these mistakes were at the time an occasion for feelings of failure and shame. I should have pulled myself together and taken the better-paying job; I should have moved into the dormitory I finally found (although at $19 a night, even a dorm bed would have been a luxury on Wal-Mart wages). But it must be said in my defense that plenty of other people were making the same mistakes: working at Wal-Mart rather than at one of the better-paying jobs available (often, I assume, because of transportation problems); living in residential motels at $200 to $300 a week. So the problem goes beyond my personal failings and miscalculations. Something is wrong, very wrong, when a single person in good health, a person who in addition possesses a working car, can barely support herself by the sweat of her brow. You don’t need a degree in economics to see that wages are too low and rents too high.

 

The problem of rents is easy for a noneconomist, even a sparsely educated low-wage worker, to grasp: it’s the market, stupid. When the rich and the poor compete for housing on the open market, the poor don’t stand a chance. The rich can always outbid them, buy up their tenements or trailer parks, and replace them with condos, McMansions, golf courses, or whatever they like. Since the rich have become more numerous, thanks largely to rising stock prices and executive salaries, the poor have necessarily been forced into housing that is more expensive, more dilapidated, or more distant from their places of work. Recall that in Key West, the trailer park convenient to hotel jobs was charging $625 a month for a half-size trailer, forcing low-wage workers to search for housing farther and farther away in less fashionable keys. But rents were also skyrocketing in the touristically challenged city of Minneapolis, where the last bits of near-affordable housing lie deep in the city, while job growth has occurred on the city’s periphery, next to distinctly unaffordable suburbs. Insofar as the poor have to work near the dwellings of the rich—as in the case of so many service and retail jobs—they are stuck with lengthy commutes or dauntingly expensive housing.

If there seems to be general complacency about the low-income housing crisis, this is partly because it is in no way reflected in the official poverty rate, which has remained for the past several years at a soothingly low 13 percent or so. The reason for the disconnect between the actual housing nightmare of the poor and “poverty,” as officially defined, is simple: the official poverty level is still calculated by the archaic method of taking the bare-bones cost of food for a family of a given size and multiplying this number by three. Yet food is relatively inflation-proof, at least compared with rent. In the early 1960s, when this method of calculating poverty was devised, food accounted for 24 percent of the average family budget (not 33 percent even then, it should be noted) and housing 29 percent. In 1999, food took up only 16 percent of the family budget, while housing had soared to 37 percent.1 So the choice of food as the basis for calculating family budgets seems fairly arbitrary today; we might as well abolish poverty altogether, at least on paper, by defining a subsistence budget as some multiple of average expenditures on comic books or dental floss.

When the market fails to distribute some vital commodity, such as housing, to all who require it, the usual liberal-to-moderate expectation is that the government will step in and help. We accept this principle—at least in a halfhearted and faltering way—in the case of health care, where government offers Medicare to the elderly, Medicaid to the desperately poor, and various state programs to the children of the merely very poor. But in the case of housing, the extreme upward skewing of the market has been accompanied by a cowardly public sector retreat from responsibility. Expenditures on public housing have fallen since the 1980s, and the expansion of public rental subsidies came to a halt in the mid-1990s. At the same time, housing subsidies for home owners—who tend to be far more affluent than renters—have remained at their usual munificent levels. It did not escape my attention, as a temporarily low-income person, that the housing subsidy I normally receive in my real life—over $20,000 a year in the form of a mortgage-interest deduction—would have allowed a truly low-income family to live in relative splendor. Had this amount been available to me in monthly installments in Minneapolis, I could have moved into one of those “executive” condos with sauna, health club, and pool.

But if rents are exquisitely sensitive to market forces, wages clearly are not. Every city where I worked in the course of this project was experiencing what local businesspeople defined as a “labor shortage”—commented on in the local press and revealed by the ubiquitous signs saying “Now Hiring” or, more imperiously, “We Are Now Accepting Applications.” Yet wages for people near the bottom of the labor market remain fairly flat, even “stagnant.” “Certainly,” the New York Times reported in March 2000, “inflationary wage gains are not evident in national wage statistics.”2 Federal Reserve chief Alan Greenspan, who spends much of his time anxiously scanning the horizon for the slightest hint of such “inflationary” gains, was pleased to inform Congress in July 2000 that the forecast seemed largely trouble-free. He went so far as to suggest that the economic laws linking low unemployment to wage increases may no longer be operative, which is a little like saying that the law of supply and demand has been repealed.3 Some economists argue that the apparent paradox rests on an illusion: there is no real “labor shortage,” only a shortage of people willing to work at the wages currently being offered.4 You might as well talk about a “Lexus shortage”—which there is, in a sense, for anyone unwilling to pay $40,000 for a car.

In fact, wages have risen, or did rise, anyway, between 1996 and 1999. When I called around to various economists in the summer of 2000 and complained about the inadequacy of the wages available to entry-level workers, this was their first response: “But wages are going up!” According to the Economic Policy Institute, the poorest 10 percent of American workers saw their wages rise from $5.49 an hour (in 1999 dollars) in 1996 to $6.05 in 1999. Moving up the socioeconomic ladder, the next 10 percent–sized slice of Americans—which is roughly where I found myself as a low-wage worker—went from $6.80 an hour in 1996 to $7.35 in 1999.5

Obviously we have one of those debates over whether the glass is half empty or half full; the increases that seem to have mollified many economists do not seem so impressive to me. To put the wage gains of the past four years in somewhat dismal perspective: they have not been sufficient to bring low-wage workers up to the amounts they were earning twenty-seven years ago, in 1973. In the first quarter of 2000, the poorest 10 percent of workers were earning only 91 percent of what they earned in the distant era of Watergate and disco music. Furthermore, of all workers, the poorest have made the least progress back to their 1973 wage levels. Relatively well-off workers in the eighth decile, or 10 percent–sized slice, where earnings are about $20 an hour, are now making 106.6 percent of what they earned in 1973. When I persisted in my carping to the economists, they generally backed down a bit, conceding that while wages at the bottom are going up, they’re not going up very briskly. Lawrence Mishel at the Economic Policy Institute, who had at the beginning of our conversation taken the half-full perspective, heightened the mystery when he observed that productivity—to which wages are theoretically tied—has been rising at such a healthy clip that “workers should be getting much more.”6

The most obvious reason why they’re not is that employers resist wage increases with every trick they can think of and every ounce of strength they can summon. I had an opportunity to query one of my own employers on this subject in Maine. You may remember the time when Ted, my boss at The Maids, drove me about forty minutes to a house where I was needed to reinforce a shorthanded team. In the course of complaining about his hard lot in life, he avowed that he could double his business overnight if only he could find enough reliable workers. As politely as possible, I asked him why he didn’t just raise the pay. The question seemed to slide right off him. We offer “mothers’ hours,” he told me, meaning that the workday was supposedly over at three—as if to say, “With a benefit like that, how could anybody complain about wages?”

In fact, I suspect that the free breakfast he provided us represented the only concession to the labor shortage that he was prepared to make. Similarly, the Wal-Mart where I worked was offering free doughnuts once a week to any employees who could arrange to take their breaks while the supply lasted. As Louis Uchitelle has reported in the New York Times, many employers will offer almost anything—free meals, subsidized transportation, store discounts—rather than raise wages. The reason for this, in the words of one employer, is that such extras “can be shed more easily” than wage increases when changes in the market seem to make them unnecessary.7 In the same spirit, automobile manufacturers would rather offer their customers cash rebates than reduced prices; the advantage of the rebate is that it seems like a gift and can be withdrawn without explanation.

But the resistance of employers only raises a second and ultimately more intractable question: Why isn’t this resistance met by more effective counterpressure from the workers themselves? In evading and warding off wage increases, employers are of course behaving in an economically rational fashion; their business isn’t to make their employees more comfortable and secure but to maximize the bottom line. So why don’t employees behave in an equally rational fashion, demanding higher wages of their employers or seeking out better-paying jobs? The assumption behind the law of supply and demand, as it applies to labor, is that workers will sort themselves out as effectively as marbles on an inclined plane—gravitating to the better-paying jobs and either leaving the recalcitrant employers behind or forcing them to up the pay. “Economic man,” that great abstraction of economic science, is supposed to do whatever it takes, within certain limits, to maximize his economic advantage.

I was baffled, initially, by what seemed like a certain lack of get-up-and-go on the part of my fellow workers. Why didn’t they just leave for a better-paying job, as I did when I moved from the Hearthside to Jerry’s? Part of the answer is that actual humans experience a little more “friction” than marbles do, and the poorer they are, the more constrained their mobility usually is. Low-wage people who don’t have cars are often dependent on a relative who is willing to drop them off and pick them up again each day, sometimes on a route that includes the babysitter’s house or the child care center. Change your place of work and you may be confronted with an impossible topographical problem to solve, or at least a reluctant driver to persuade. Some of my coworkers, in Minneapolis as well as Key West, rode bikes to work, and this clearly limited their geographical range. For those who do possess cars, there is still the problem of gas prices, not to mention the general hassle, which is of course far more onerous for the carless, of getting around to fill out applications, to be interviewed, to take drug tests. I have mentioned, too, the general reluctance to exchange the devil you know for one that you don’t know, even when the latter is tempting you with a better wage-benefit package. At each new job, you have to start all over, clueless and friendless.

There is another way that low-income workers differ from “economic man.” For the laws of economics to work, the “players” need to be well informed about their options. The ideal case—and I’ve read that the technology for this is just around the corner—would be the consumer whose Palm Pilot displays the menu and prices for every restaurant or store he or she passes. Even without such technological assistance, affluent job hunters expect to study the salary-benefit packages offered by their potential employers, watch the financial news to find out if these packages are in line with those being offered in other regions or fields, and probably do a little bargaining before taking a job.

But there are no Palm Pilots, cable channels, or Web sites to advise the low-wage job seeker. She has only the help-wanted signs and the want ads to go on, and most of these coyly refrain from mentioning numbers. So information about who earns what and where has to travel by word of mouth, and for inexplicable cultural reasons, this is a very slow and unreliable route. Twin Cities job market analyst Kristine Jacobs pinpoints what she calls the “money taboo” as a major factor preventing workers from optimizing their earnings. “There’s a code of silence surrounding issues related to individuals’ earnings,” she told me. “We confess everything else in our society—sex, crime, illness. But no one wants to reveal what they earn or how they got it. The money taboo is the one thing that employers can always count on.”8 I suspect that this “taboo” operates most effectively among the lowest-paid people, because, in a society that endlessly celebrates its dot-com billionaires and centimillionaire athletes, $7 or even $10 an hour can feel like a mark of innate inferiority. So you may or may not find out that, say, the Target down the road is paying better than Wal-Mart, even if you have a sister-in-law working there.

Employers, of course, do little to encourage the economic literacy of their workers. They may exhort potential customers to “Compare Our Prices!” but they’re not eager to have workers do the same with wages. I have mentioned the way the hiring process seems designed, in some cases, to prevent any discussion or even disclosure of wages—whisking the applicant from interview to orientation before the crass subject of money can be raised. Some employers go further; instead of relying on the informal “money taboo” to keep workers from discussing and comparing wages, they specifically enjoin workers from doing so. The New York Times recently reported on several lawsuits brought by employees who had allegedly been fired for breaking this rule—a woman, for example, who asked for higher pay after learning from her male coworkers that she was being paid considerably less than they were for the very same work. The National Labor Relations Act of 1935 makes it illegal to punish people for revealing their wages to one another, but the practice is likely to persist until rooted out by lawsuits, company by company.9

 

But if it’s hard for workers to obey The Laws of Economics by examining their options and moving on to better jobs, why don’t more of them take a stand where they are—demanding better wages and work conditions, either individually or as a group? This is a huge question, probably the subject of many a dissertation in the field of industrial psychology, and here I can only comment on the things I observed. One of these was the co-optative power of management, illustrated by such euphemisms as associate and team member. At The Maids, the boss—who, as the only male in our midst, exerted a creepy, paternalistic kind of power—had managed to convince some of my coworkers that he was struggling against difficult odds and deserving of their unstinting forbearance. Wal-Mart has a number of more impersonal and probably more effective ways of getting its workers to feel like “associates.” There was the profit-sharing plan, with Wal-Mart’s stock price posted daily in a prominent spot near the break room. There was the company’s much-heralded patriotism, evidenced in the banners over the shopping floor urging workers and customers to contribute to the construction of a World War II veterans’ memorial (Sam Walton having been one of them). There were “associate” meetings that served as pep rallies, complete with the Wal-Mart cheer: “Gimme a ‘W,’” etc.

The chance to identify with a powerful and wealthy entity—the company or the boss—is only the carrot. There is also a stick. What surprised and offended me most about the low-wage workplace (and yes, here all my middle-class privilege is on full display) was the extent to which one is required to surrender one’s basic civil rights and—what boils down to the same thing—self-respect. I learned this at the very beginning of my stint as a waitress, when I was warned that my purse could be searched by management at any time. I wasn’t carrying stolen salt shakers or anything else of a compromising nature, but still, there’s something about the prospect of a purse search that makes a woman feel a few buttons short of fully dressed. After work, I called around and found that this practice is entirely legal: if the purse is on the boss’s property—which of course it was—the boss has the right to examine its contents.

Drug testing is another routine indignity. Civil libertarians see it as a violation of our Fourth Amendment freedom from “unreasonable search”; most jobholders and applicants find it simply embarrassing. In some testing protocols, the employee has to strip to her underwear and pee into a cup in the presence of an aide or technician. Mercifully, I got to keep my clothes on and shut the toilet stall door behind me, but even so, urination is a private act and it is degrading to have to perform it at the command of some powerful other. I would add pre-employment personality tests to the list of demeaning intrusions, or at least much of their usual content. Maybe the hypothetical types of questions can be justified—whether you would steal if an opportunity arose or turn in a thieving coworker and so on—but not questions about your “moods of self-pity,” whether you are a loner or believe you are usually misunderstood. It is unsettling, at the very least, to give a stranger access to things, like your self-doubts and your urine, that are otherwise shared only in medical or therapeutic situations.

There are other, more direct ways of keeping low-wage employees in their place. Rules against “gossip,” or even “talking,” make it hard to air your grievances to peers or—should you be so daring—to enlist other workers in a group effort to bring about change, through a union organizing drive, for example. Those who do step out of line often face little unexplained punishments, such as having their schedules or their work assignments unilaterally changed. Or you may be fired; those low-wage workers who work without union contracts, which is the great majority of them, work “at will,” meaning at the will of the employer, and are subject to dismissal without explanation. The AFL-CIO estimates that ten thousand workers a year are fired for participating in union organizing drives, and since it is illegal to fire people for union activity, I suspect that these firings are usually justified in terms of unrelated minor infractions. Wal-Mart employees who have bucked the company—by getting involved in a unionization drive or by suing the company for failing to pay overtime—have been fired for breaking the company rule against using profanity.10

So if low-wage workers do not always behave in an economically rational way, that is, as free agents within a capitalist democracy, it is because they dwell in a place that is neither free nor in any way democratic. When you enter the low-wage workplace—and many of the medium-wage workplaces as well—you check your civil liberties at the door, leave America and all it supposedly stands for behind, and learn to zip your lips for the duration of the shift. The consequences of this routine surrender go beyond the issues of wages and poverty. We can hardly pride ourselves on being the world’s preeminent democracy, after all, if large numbers of citizens spend half their waking hours in what amounts, in plain terms, to a dictatorship.

Any dictatorship takes a psychological toll on its subjects. If you are treated as an untrustworthy person—a potential slacker, drug addict, or thief—you may begin to feel less trustworthy yourself. If you are constantly reminded of your lowly position in the social hierarchy, whether by individual managers or by a plethora of impersonal rules, you begin to accept that unfortunate status. To draw for a moment from an entirely different corner of my life, that part of me still attached to the biological sciences, there is ample evidence that animals—rats and monkeys, for example—that are forced into a subordinate status within their social systems adapt their brain chemistry accordingly, becoming “depressed” in humanlike ways. Their behavior is anxious and withdrawn; the level of serotonin (the neurotransmitter boosted by some antidepressants) declines in their brains. And—what is especially relevant here—they avoid fighting even in self-defense.11

Humans are, of course, vastly more complicated; even in situations of extreme subordination, we can pump up our self-esteem with thoughts of our families, our religion, our hopes for the future. But as much as any other social animal, and more so than many, we depend for our self-image on the humans immediately around us—to the point of altering our perceptions of the world so as to fit in with theirs.12 My guess is that the indignities imposed on so many low-wage workers—the drug tests, the constant surveillance, being “reamed out” by managers—are part of what keeps wages low. If you’re made to feel unworthy enough, you may come to think that what you’re paid is what you are actually worth.

It is hard to imagine any other function for workplace authoritarianism. Managers may truly believe that, without their unremitting efforts, all work would quickly grind to a halt. That is not my impression. While I encountered some cynics and plenty of people who had learned to budget their energy, I never met an actual slacker or, for that matter, a drug addict or thief. On the contrary, I was amazed and sometimes saddened by the pride people took in jobs that rewarded them so meagerly, either in wages or in recognition. Often, in fact, these people experienced management as an obstacle to getting the job done as it should be done. Waitresses chafed at managers’ stinginess toward the customers; housecleaners resented the time constraints that sometimes made them cut corners; retail workers wanted the floor to be beautiful, not cluttered with excess stock as management required. Left to themselves, they devised systems of cooperation and work sharing; when there was a crisis, they rose to it. In fact, it was often hard to see what the function of management was, other than to exact obeisance.

There seems to be a vicious cycle at work here, making ours not just an economy but a culture of extreme inequality. Corporate decision makers, and even some two-bit entrepreneurs like my boss at The Maids, occupy an economic position miles above that of the underpaid people whose labor they depend on. For reasons that have more to do with class—and often racial—prejudice than with actual experience, they tend to fear and distrust the category of people from which they recruit their workers. Hence the perceived need for repressive management and intrusive measures like drug and personality testing. But these things cost money—$20,000 or more a year for a manager, $100 a pop for a drug test, and so on—and the high cost of repression results in ever more pressure to hold wages down. The larger society seems to be caught up in a similar cycle: cutting public services for the poor, which are sometimes referred to collectively as the “social wage,” while investing ever more heavily in prisons and cops. And in the larger society, too, the cost of repression becomes another factor weighing against the expansion or restoration of needed services. It is a tragic cycle, condemning us to ever deeper inequality, and in the long run, almost no one benefits but the agents of repression themselves.

But whatever keeps wages low—and I’m sure my comments have barely scratched the surface—the result is that many people earn far less than they need to live on. How much is that? The Economic Policy Institute recently reviewed dozens of studies of what constitutes a “living wage” and came up with an average figure of $30,000 a year for a family of one adult and two children, which amounts to a wage of $14 an hour. This is not the very minimum such a family could live on; the budget includes health insurance, a telephone, and child care at a licensed center, for example, which are well beyond the reach of millions. But it does not include restaurant meals, video rentals, Internet access, wine and liquor, cigarettes and lottery tickets, or even very much meat. The shocking thing is that the majority of American workers, about 60 percent, earn less than $14 an hour. Many of them get by by teaming up with another wage earner, a spouse or grown child. Some draw on government help in the form of food stamps, housing vouchers, the earned income tax credit, or—for those coming off welfare in relatively generous states—subsidized child care. But others—single mothers for example—have nothing but their own wages to live on, no matter how many mouths there are to feed.

Employers will look at that $30,000 figure, which is over twice what they currently pay entry-level workers, and see nothing but bankruptcy ahead. Indeed, it is probably impossible for the private sector to provide everyone with an adequate standard of living through wages, or even wages plus benefits, alone: too much of what we need, such as reliable child care, is just too expensive, even for middle-class families. Most civilized nations compensate for the inadequacy of wages by providing relatively generous public services such as health insurance, free or subsidized child care, subsidized housing, and effective public transportation. But the United States, for all its wealth, leaves its citizens to fend for themselves—facing market-based rents, for example, on their wages alone. For millions of Americans, that $10—or even $8 or $6—hourly wage is all there is.

It is common, among the nonpoor, to think of poverty as a sustainable condition—austere, perhaps, but they get by somehow, don’t they? They are “always with us.” What is harder for the nonpoor to see is poverty as acute distress: The lunch that consists of Doritos or hot dog rolls, leading to faintness before the end of the shift. The “home” that is also a car or a van. The illness or injury that must be “worked through,” with gritted teeth, because there’s no sick pay or health insurance and the loss of one day’s pay will mean no groceries for the next. These experiences are not part of a sustainable lifestyle, even a lifestyle of chronic deprivation and relentless low-level punishment. They are, by almost any standard of subsistence, emergency situations. And that is how we should see the poverty of so many millions of low-wage Americans—as a state of emergency.

 

In the summer of 2000 I returned—permanently, I have every reason to hope—to my customary place in the socioeconomic spectrum. I go to restaurants, often far finer ones than the places where I worked, and sit down at a table. I sleep in hotel rooms that someone else has cleaned and shop in stores that others will tidy when I leave. To go from the bottom 20 percent to the top 20 percent is to enter a magical world where needs are met, problems are solved, almost without any intermediate effort. If you want to get somewhere fast, you hail a cab. If your aged parents have grown tiresome or incontinent, you put them away where others will deal with their dirty diapers and dementia. If you are part of the upper-middle-class majority that employs a maid or maid service, you return from work to find the house miraculously restored to order—the toilet bowls shit-free and gleaming, the socks that you left on the floor levitated back to their normal dwelling place. Here, sweat is a metaphor for hard work, but seldom its consequence. Hundreds of little things get done, reliably and routinely every day, without anyone’s seeming to do them.

The top 20 percent routinely exercises other, far more consequential forms of power in the world. This stratum, which contains what I have termed in an earlier book the “professional-managerial class,” is the home of our decision makers, opinion shapers, culture creators—our professors, lawyers, executives, entertainers, politicians, judges, writers, producers, and editors.13 When they speak, they are listened to. When they complain, someone usually scurries to correct the problem and apologize for it. If they complain often enough, someone far below them in wealth and influence may be chastised or even fired. Political power, too, is concentrated within the top 20 percent, since its members are far more likely than the poor—or even the middle class—to discern the all-too-tiny distinctions between candidates that can make it seem worthwhile to contribute, participate, and vote. In all these ways, the affluent exert inordinate power over the lives of the less affluent, and especially over the lives of the poor, determining what public services will be available, if any, what minimum wage, what laws governing the treatment of labor.

So it is alarming, upon returning to the upper middle class from a sojourn, however artificial and temporary, among the poor, to find the rabbit hole close so suddenly and completely behind me. You were where, doing what? Some odd optical property of our highly polarized and unequal society makes the poor almost invisible to their economic superiors. The poor can see the affluent easily enough—on television, for example, or on the covers of magazines. But the affluent rarely see the poor or, if they do catch sight of them in some public space, rarely know what they’re seeing, since—thanks to consignment stores and, yes, Wal-Mart—the poor are usually able to disguise themselves as members of the more comfortable classes. Forty years ago the hot journalistic topic was the “discovery of the poor” in their inner-city and Appalachian “pockets of poverty.” Today you are more likely to find commentary on their “disappearance,” either as a supposed demographic reality or as a shortcoming of the middle-class imagination.

In a 2000 article on the “disappearing poor,” journalist James Fallows reports that, from the vantage point of the Internet’s nouveaux riches, it is “hard to understand people for whom a million dollars would be a fortune . . . not to mention those for whom $246 is a full week’s earnings.”14 Among the reasons he and others have cited for the blindness of the affluent is the fact that they are less and less likely to share spaces and services with the poor. As public schools and other public services deteriorate, those who can afford to do so send their children to private schools and spend their off-hours in private spaces—health clubs, for example, instead of the local park. They don’t ride on public buses and subways. They withdraw from mixed neighborhoods into distant suburbs, gated communities, or guarded apartment towers; they shop in stores that, in line with the prevailing “market segmentation,” are designed to appeal to the affluent alone. Even the affluent young are increasingly unlikely to spend their summers learning how the “other half” lives, as lifeguards, waitresses, or housekeepers at resort hotels. The New York Times reports that they now prefer career-relevant activities like summer school or interning in an appropriate professional setting to the “sweaty, low-paid and mind-numbing slots that have long been their lot.”15

Then, too, the particular political moment favors what almost looks like a “conspiracy of silence” on the subject of poverty and the poor. The Democrats are not eager to find flaws in the period of “unprecedented prosperity” they take credit for; the Republicans have lost interest in the poor now that “welfare-as-we-know-it” has ended. Welfare reform itself is a factor weighing against any close investigation of the conditions of the poor. Both parties heartily endorsed it, and to acknowledge that low-wage work doesn’t lift people out of poverty would be to admit that it may have been, in human terms, a catastrophic mistake. In fact, very little is known about the fate of former welfare recipients because the 1996 welfare reform legislation blithely failed to include any provision for monitoring their postwelfare economic condition. Media accounts persistently bright-side the situation, highlighting the occasional success stories and downplaying the acknowledged increase in hunger.16 And sometimes there seems to be almost deliberate deception. In June 2000, the press rushed to hail a study supposedly showing that Minnesota’s welfare-to-work program had sharply reduced poverty and was, as Time magazine put it, a “winner.”17 Overlooked in these reports was the fact that the program in question was a pilot project that offered far more generous child care and other subsidies than Minnesota’s actual welfare reform program. Perhaps the error can be forgiven—the pilot project, which ended in 1997, had the same name, Minnesota Family Investment Program, as Minnesota’s much larger, ongoing welfare reform program.18

You would have to read a great many newspapers very carefully, cover to cover, to see the signs of distress. You would find, for example, that in 1999 Massachusetts food pantries reported a 72 percent increase in the demand for their services over the previous year, that Texas food banks were “scrounging” for food, despite donations at or above 1998 levels, as were those in Atlanta.19 You might learn that in San Diego the Catholic Church could no longer, as of January 2000, accept homeless families at its shelter, which happens to be the city’s largest, because it was already operating at twice its normal capacity.20 You would come across news of a study showing that the percentage of Wisconsin food-stamp families in “extreme poverty”—defined as less than 50 percent of the federal poverty line—has tripled in the last decade to more than 30 percent.21 You might discover that, nationwide, America’s food banks are experiencing “a torrent of need which [they] cannot meet” and that, according to a survey conducted by the U.S. Conference of Mayors, 67 percent of the adults requesting emergency food aid are people with jobs.22

One reason nobody bothers to pull all these stories together and announce a widespread state of emergency may be that Americans of the newspaper-reading professional middle class are used to thinking of poverty as a consequence of unemployment. During the heyday of downsizing in the Reagan years, it very often was, and it still is for many inner-city residents who have no way of getting to the proliferating entry-level jobs on urban peripheries. When unemployment causes poverty, we know how to state the problem—typically, “the economy isn’t growing fast enough”—and we know what the traditional liberal solution is—“full employment.” But when we have full or nearly full employment, when jobs are available to any job seeker who can get to them, then the problem goes deeper and begins to cut into that web of expectations that make up the “social contract.” According to a recent poll conducted by Jobs for the Future, a Boston-based employment research firm, 94 percent of Americans agree that “people who work full-time should be able to earn enough to keep their families out of poverty.”23 I grew up hearing over and over, to the point of tedium, that “hard work” was the secret of success: “Work hard and you’ll get ahead” or “It’s hard work that got us where we are.” No one ever said that you could work hard—harder even than you ever thought possible—and still find yourself sinking ever deeper into poverty and debt.

When poor single mothers had the option of remaining out of the labor force on welfare, the middle and upper middle class tended to view them with a certain impatience, if not disgust. The welfare poor were excoriated for their laziness, their persistence in reproducing in unfavorable circumstances, their presumed addictions, and above all for their “dependency.” Here they were, content to live off “government handouts” instead of seeking “self-sufficiency,” like everyone else, through a job. They needed to get their act together, learn how to wind an alarm clock, get out there and get to work. But now that government has largely withdrawn its “handouts,” now that the overwhelming majority of the poor are out there toiling in Wal-Mart or Wendy’s—well, what are we to think of them? Disapproval and condescension no longer apply, so what outlook makes sense?

Guilt, you may be thinking warily. Isn’t that what we’re supposed to feel? But guilt doesn’t go anywhere near far enough; the appropriate emotion is shame—shame at our own dependency, in this case, on the underpaid labor of others. When someone works for less pay than she can live on—when, for example, she goes hungry so that you can eat more cheaply and conveniently—then she has made a great sacrifice for you, she has made you a gift of some part of her abilities, her health, and her life. The “working poor,” as they are approvingly termed, are in fact the major philanthropists of our society. They neglect their own children so that the children of others will be cared for; they live in substandard housing so that other homes will be shiny and perfect; they endure privation so that inflation will be low and stock prices high. To be a member of the working poor is to be an anonymous donor, a nameless benefactor, to everyone else. As Gail, one of my restaurant coworkers, put it, “you give and you give.”

Someday, of course—and I will make no predictions as to exactly when—they are bound to tire of getting so little in return and to demand to be paid what they’re worth. There’ll be a lot of anger when that day comes, and strikes and disruption. But the sky will not fall, and we will all be better off for it in the end.

 

 

 

1 Jared Bernstein, Chauna Brocht, and Maggie Spade-Aguilar, “How Much Is Enough? Basic Family Budgets for Working Families,” Economic Policy Institute, Washington, D.C., 2000, p. 14.

2 “Companies Try Dipping Deeper into Labor Pool,” New York Times, March 26, 2000.

3 “An Epitaph for a Rule That Just Won’t Die,” New York Times, July 30, 2000.

4 “Fact or Fallacy: Labor Shortage May Really Be Wage Stagnation,” Chicago Tribune, July 2, 2000; “It’s a Wage Shortage, Not a Labor Shortage,” Minneapolis Star Tribune, March 25, 2000.

5 I thank John Schmitt at the Economic Policy Institute in Washington, D.C., for preparing the wage data for me.

6 Interview, July 18, 2000.

7 “Companies Try Dipping Deeper into Labor Pool,” New York Times, March 26, 2000.

8 Personal communication, July 24, 2000.

9 “The Biggest Company Secret: Workers Challenge Employer Practices on Pay Confidentiality,” New York Times, July 28, 2000.

10 Bob Ortega, In Sam We Trust, p. 356; “Former Wal-Mart Workers File Overtime Suit in Harrison County,” Charleston Gazette, January 24, 1999.

11 See, for example, C. A. Shively, K. Laber-Laird, and R. F. Anton, “Behavior and Physiology of Social Stress and Depression in Female Cynomolgus Monkeys,” Biological Psychiatry 41:8 (1997), pp. 871–82, and D. C. Blanchard et al., “Visible Burrow System as a Model of Chronic Social Stress: Behavioral and Neuroendocrine Correlates,” Psychoneuroendocrinology 20:2 (1995), pp. 117–34.

12 See, for example, chapter 7, “Conformity,” in David G. Myers, Social Psychology (McGraw-Hill, 1987).

13 Fear of Falling: The Inner Life of the Middle Class (Pantheon, 1989).

14 “The Invisible Poor,” New York Times Magazine, March 19, 2000.

15 “Summer Work Is Out of Favor with the Young,” New York Times, June 18, 2000.

16 The National Journal reports that the “good news” is that almost six million people have left the welfare rolls since 1996, while the “rest of the story” includes the problem that “these people sometimes don’t have enough to eat” (“Welfare Reform, Act 2,” June 24, 2000, pp. 1,978–93).

17 “Minnesota’s Welfare Reform Proves a Winner,” Time, June 12, 2000.

18 Center for Law and Social Policy, “Update,” Washington, D.C., June 2000.

19 “Study: More Go Hungry since Welfare Reform,” Boston Herald, January 21, 2000; “Charity Can’t Feed All while Welfare Reforms Implemented,” Houston Chronicle, January 10, 2000; “Hunger Grows as Food Banks Try to Keep Pace,” Atlanta Journal and Constitution, November 26, 1999.

20 “Rise in Homeless Families Strains San Diego Aid,” Los Angeles Times, January 24, 2000.

21 “Hunger Problems Said to Be Getting Worse,” Milwaukee Journal Sentinel, December 15, 1999.

22 Deborah Leff, the president and CEO of the hunger-relief organization America’s Second Harvest, quoted in the National Journal, op. cit.; “Hunger Persists in U.S. despite the Good Times,” Detroit News, June 15, 2000.

23 “A National Survey of American Attitudes toward Low-Wage Workers and Welfare Reform,” Jobs for the Future, Boston, May 24, 2000.