6

How Work Stays So Shitty

“In 2007, I broke my lease, packed my stuff into a van, and couch surfed in a brain fog. But then I faked my way into a startup as a designer in 2009, and life got better for me real fast. I was still in a codependent, abusive relationship, but suddenly I had the money to solve problems that had wrecked my life a few years before. All I had to do was work 60 hour weeks, so I did. It took two and a half years to figure out I was in a toxic work environment, underpaid for the work I was doing and literally the only person at the company who hadn’t been offered stock. I’ve been working on unlearning the idea that putting in more work and being first in, last out is the only thing that makes me useful in the workplace. I dare myself to work 35 hours a week, but it just doesn’t take.”

—NINA, SOFTWARE DESIGNER, SAN FRANCISCO

 


 

“I was finally getting momentum as a writer and I wanted to chase it since I knew it was what I really wanted to be doing. But loneliness is a big one for me. There are days on end when I don’t leave the house. I tend to get depressed. I don’t see my friends as often as I’d like. I am always chasing checks, which is spirit breaking. And I have no health insurance.”

—CATE, FREELANCE FILM CRITIC, LOS ANGELES

You can talk about the fissured workplace in an abstract way, moving workers from one company to a subcompany like figurines in a tabletop game. But fissuring affects workers on a practical level, with effects that can be loosely divided into the rise and glorification of overwork, the spread and normalization of workplace surveillance, and the fetishization of freelance flexibility. Each of these trends contributes to burnout in its own noxious way. But the end result is the same: They make the everyday experience of work, across the income spectrum, undeniably and unceasingly shitty.

The Rise of Overwork

The American overwork ethic has become so standardized that there’s no feeling of before or after: It’s just how it is, how it always will be. But like every ideology, it has a source—it shouldn’t be surprising that many of the same people responsible for the fissured workplace were also responsible for the fetishization of overwork. Chief amongst them: the consultant.

Elite consulting firms prided themselves on hiring the best and brightest from Ivy League universities—or, if they had to, from the most prestigious schools in a particular region. But their strategy was, and remains, perverse: They take the best students, work them into the ground, and then fire anyone who can’t deal with spending their workweek away from friends and family, or with creating business plans that often require gutting devoted long-term employees.

The consultants who made the cut had to do more than establish themselves as workhorses. As Louis Hyman explains in Temp, drawing directly from McKinsey’s own internal publications, the consultant was judged on whether “he ha[d] real promise for long-term success with the Firm based on his performance and his character.”1 Put differently: Did he dedicate his whole self, and his whole life, to his work? The initial culling usually took place a few years post-college. Oftentimes it was self-imposed: workers stuck around long enough to get their MBAs paid for, then willingly left the firm.

In the 1960s, researchers found that consultants had “more emotional instability,” and “less motivation to exercise power over others” than their peers who worked in stable corporations.2 They’d downsized so many people that they were frightened of being downsized themselves, afflicted by the same omnipresent anxiety that their work had imposed on others.

But the people who quit or got fired from McKinsey didn’t then start local boutiques, or go back and get their teaching degrees, or launch nonprofits. The consultancy cycle was so commonplace that leaving a firm wasn’t a mark against you. Instead, former consultants quickly found new jobs, often with the very companies they once advised. After all, it’s much cheaper to hire someone with McKinsey knowledge than to actually hire McKinsey. As more and more ex-consultants spread throughout corporate America, the employee-sloughing, core-competency-preserving, short-term-profits-at-all-costs ideology became commonplace. “The instability and high pay of the consulting world fed on itself, as the people who believed in this model of management cut the staffs of corporations, and when that was done, joined the staffs,” Hyman explains. “It worked for them. Why would it not work for the rest of America?”3 The same mindset extended to consultants’ standards of overwork: It was an effective sorting mechanism for their business. Why shouldn’t it be applied to every business?

Consultants, scattered to the corners of the American corporate world, helped create a new paradigm of work: what a “good” worker did, how much of their lives they devoted to the company, and the level of stability they could expect in return (read: very little). But for all of their ubiquity, even consultants couldn’t singlehandedly shift the culture of work in America. And in the rarefied air of the investment bank, a similar attitude had already become accepted gospel.

Over the last twenty years, the office with the good snacks and free lunch has become a cultural punchline: a way to highlight the absurdity of startup culture, or just the ridiculous perks millennials demand. But free food isn’t just a benefit. It’s a strategy to incentivize overwork, and the practice, along with so many other tenets of overwork, came directly from the culture of Wall Street.

That culture is what the anthropologist Karen Ho set out to study in the years leading up to and in the immediate aftermath of the Great Recession. In 1996, she took a sabbatical from her PhD in order to work at an investment bank, which she was able to pull off, despite a lack of finance training, because she was a grad student at Princeton—one of the handful of schools that investment banks conceive of as elite enough to produce investment banker material.

For her research, Ho interviewed dozens of current and former bankers, gaining a textured understanding of day-to-day life on Wall Street as well as its overarching economic logic. Amongst her findings was that “organizational perks,” standard across investment banks, acted to incentivize and perpetuate extremely long hours. Specifically, the free dinner and the free ride home. If an investment banker worked past seven p.m., they could order takeout on the company; because so many workers worked so late, they never had time to get groceries, let alone the energy to make dinner. The cycle perpetuated itself. If a banker stayed until seven p.m., they might as well stay until nine p.m.—when they could take a black car home, again on the company. For the bank, footing the bill for such perks was a small price to pay for the additional work hours.

Ho found that investment banks, especially the top tier, also clung to the notion that constant work is a signifier of eliteness, their version of “smartness.” This logic was built on the fact that the banks hire their entry-level analysts almost exclusively out of the Ivy Leagues, and the Ivy Leagues only accept “the best of the best,” which suggests the people at investment banks are also “the best of the best.” It follows, then, that whatever work schedule they cultivate is superior—even if that work meant eighteen-hour days, nearly seven days a week, up to and past one’s breaking point. “If you’re single, and your family lives far away, like California, the better analyst you will be,” a vice president at a major finance bank told Ho. Analysts often start work with a significant other but, as that same vice president explained, “all of a sudden, after a few months, everyone starts finding out that they are single.”4 “The point is to create a post-college atmosphere where within days of beginning work, analysts and associates begin to ‘live’ there,” Ho argues, “comparing notes about who is staying the latest and ‘getting slammed’ the most, not to mention participating in the makeshift Nerf football game at 1 a.m.”5

Some first-year analysts experienced a brief period of shock once initiated into this lifestyle. But Ho found that they quickly internalized the ethic of overwork the same way they had back in high school, and then in college: as a badge of honor, and proof of their own excellence. As one Harvard editorial put it, writing about investment banks’ interest in the school’s graduates: “They know that four years ago, we wanted the absolute best. We did not settle for number three or four on the college rankings. They prey on our desire to find the ‘Harvard’ of everything: activities, summer jobs, relationships, and now careers.”6 In other words, those high school students who refused to “settle” for anything other than Harvard lifted the bar on what constituted “hard work” for everyone else.

And for most, that overwork actually was worth it. As Ho points out, elite Wall Street bankers are among the very few in the American economy who “still experience a link between hard work and monetary rewards and upward mobility.” Overwork, in their case, meant massive bonuses. Historically, most middle-class Americans experienced some version of this scenario: If their company was extremely productive and profitable, those profits trickled down to workers in the form of salary, benefits, and even bonuses (although never as big as on Wall Street). Now, after the great risk shift, those profits go to shareholders and CEOs—and to the bankers who recommend and enact the trades of those profitable companies.

Because investment bankers still benefit from the link between overwork and compensation, many also internalize the idea that if someone’s not making much money, it’s because the rest of the world, off Wall Street, lacks work ethic. An associate at Goldman Sachs gave Ho an extensive rundown of the way he’d come to see the world, which is worth reading in full:

 

If you go to the outside world and you start working with people, people just are not motivated in the same way. It is just a pain in the ass to get anything done in the real world. People leave work at five, six p.m. People take one-hour lunch breaks, and people do this and that and whatever. Believe me, it makes a big deal, because if you are working with people who all work real hard to do whatever it takes to get things done, it just makes things so much easier. And doing things is what makes people feel good about their life and makes them feel important. This is the whole self-worth thing—to complete and do things. In a big corporation or in the academy, it is hard to get things done. [On Wall Street], you work with so many people where anyone you talk to is so responsive and pretty bright and really motivated, it just makes for a pretty good environment. I think in the old days, back in the fifties or sixties, people kind of just had a set pattern of life. They went to work, climbed up the ladder slowly, and did whatever they were told. I think now that people are so seduced by the capabilities that you can jump ahead and how much of a difference you can make, how important you can feel or whatever it is that gets you off. . . . It feels like now, you can get a lot done, be really productive, and it is seductive. And that is why people who have more than enough money . . . more than enough respect, still are involved in this at the expense of their families because they need to feel needed. And there is nothing better than to complete things on a regular basis.

 

I’ve read this account more than a dozen times, and the line that sticks out the most is the one that gets to the motivating engine of burnout culture: “There is nothing better than to complete things on a regular basis.” Anything that gets in the way of “completing things” (and by “things,” here, the associate means “work”) is understood as a lack of devotion, or work ethic, or, it’s strongly implied, a lack of intelligence. And the effects of this mindset go far beyond mere elitism. It affirms the righteousness of downsizing, layoffs, and outsourcing: Those people in the “real world” were lazy anyway. In fact, if anything, Wall Street is doing them a favor: “We’ve made everyone smarter,” an associate at Salomon Smith Barney told Ho. “Before, in the 1970s, corporations were so sloppy; now they are advanced. We’re the grease that makes things turn more efficiently.” Which is to say, they’re the grease that’s made everyone else’s work lives as miserable as their own, and with far less compensation.

It doesn’t help that beginning in the 1990s, corporations began hiring MBAs and ex–investment bankers directly from Wall Street instead of promoting leadership from within—as had been customary for decades.7 Once in a leadership role, these ex-bankers could explicitly and implicitly reproduce the understanding of “hard work” they internalized during their time on Wall Street. (It’s worth noting that Jeff Bezos, who has fashioned a “bruising” workplace culture at Amazon, worked at the same firm as Ho.)8 The phenomenon is similar to the spread of consulting “alums” across the corporate sector: Barring a significant, psychology-altering intervention, once someone equates “good” work with overwork, that conception will stay with them—and anyone under their power—for the rest of their lives.

We tell ourselves all sorts of stories to justify our overwork. Some, like Wall Street bankers, have decided that it’s the best way of working, even though many readily admit that they spend a lot of their time inefficiently: bullshitting, spellchecking, or just waiting for edits on a presentation. Wall Street work isn’t necessarily better or more productive work. In truth, it’s just more work. But that doesn’t mean it hasn’t accumulated outsize power and influence over the way that other Americans work.

When I’m stressed by work, I find myself resenting the amount of sleep I need. Even though I know that sleep actually increases productivity, what I understand is that it decreases available working hours. All I want is to wake up and start, as that Goldman Sachs associate put it so bluntly, “completing things on a regular basis.” Sometimes I read about physically and psychologically anomalous “short sleepers,” like the dozens of CEOs who survive and thrive on just a few hours of sleep a day—and feel deep pangs of jealousy. All of those people are talented, but their talent is amplified by their ability to let their work feed on even more parts of their lives.

You know who doesn’t need sleep? Robots. We might say we hate the idea of turning into them, but for many millennials, we robotize ourselves willingly in hopes of gaining that elusive stability we so desperately crave. That means increasingly ignoring our own needs, including biological ones. As theorist Jonathan Crary points out, even our “sleep” is increasingly a version of machines in “sleep mode” that’s not rest so much as “a deferred or diminished condition of operation and access.”9 In sleep mode, you’re never actually off; you’re just waiting to be turned back on again.

This sounds dystopic, but so are accounts of people stringing together two or three all-nighters to distinguish themselves, either in school or at work; or the lived reality of those in the precariat who work an eight-hour shift as a nurse’s aide, grab a few hours of sleep, and go out to spend the night driving Uber before dropping their kids at school and heading back to their day job. We’ve conditioned ourselves to ignore every signal from the body saying This is too much, and we call that conditioning “grit” or “hustle.”

This mindset was crystallized in a 2017 ad for Fiverr—an app through which “lean entrepreneurs” can pitch their services, starting at five dollars—that for a brief period of time was inescapable on the New York subway system. In the ad, a close-up of a harried, gaunt, yet miraculously still attractive woman is overlaid with the text YOU EAT A COFFEE FOR LUNCH. YOU FOLLOW THROUGH ON YOUR FOLLOW THROUGH. SLEEP DEPRIVATION IS YOUR DRUG OF CHOICE. YOU MIGHT BE A DOER.

“Doers”—the only type of person fit to survive the gig economy—have effectively silenced their body’s warning system. After all, it’s far easier to take some 5-Hour Energy than to look straight in the brutal face of our current economic system and call it what it is. As Jia Tolentino pointed out in The New Yorker, “At the root of this is the American obsession with self-reliance, which makes it more acceptable to applaud an individual for working himself to death than to argue that an individual working himself to death is evidence of a flawed economic system.”10

The ideology of overwork has become so pernicious, so pervasive, that we attribute its conditions to our own failures, our own ignorance of the right life hack that will suddenly make everything easier. That’s why books like Grit and Unf*ck Yourself and other titles with asterisks to blunt the profanity and the frustration have become such massive bestsellers: They suggest that the fix is right there, within our grasp. Because the problem, these books suggest, isn’t the current economic system, or the companies that exploit and profit from it. It’s us.

Surveillance Culture

I hope it’s clear at this point just how misguided that assertion is: No amount of hustle or sleeplessness can permanently bend a broken system to your benefit. Your value as a worker is always unstable. What’s deeply messed up, then, is that whatever value we do have is subject to continued optimization. And that optimization is achieved through ever-more noxious forms of employee surveillance.

Take the “open office,” which doubles as both a cost-cutting method and a way for everyone in the office to know what everyone else is doing at any given moment. Unlike the private offices that were once de rigueur, open offices make actually completing work incredibly difficult for most: You’re subject to constant interruptions or, if you put on headphones, suggestions that you’re a cold bitch—not much of a team player.

Stevie, who works as a copyeditor in an open office, told me she’d been told to make sure to look like she’s “doing serious work ALL THE TIME in case the big boss walks by.” Similarly, in the open office at BuzzFeed, the editor-in-chief periodically walks around, starting small talk, seeing what everyone’s up to. There’s very little you could be doing or watching on your computer at BuzzFeed to get you into trouble (save porn, which could still theoretically be rationalized). But even when my editor was nowhere to be seen, the visibility of my computer made me feel like I should always be typing or looking at something important. In a more traditional workplace, where, say, spending three hours on Reddit threads about furries would be frowned upon, the open office makes it stressful to do anything, even respond to an email from your kid’s school, that could be construed as “off-task.”

The goal of surveillance might be productivity, or quality control—but the psychological effects on workers are substantial. I spoke to a woman named Bri who worked for two years as a photo editor at an international photography agency, editing sets of images for various clients from movie premieres, award shows, breaking news events, etc. The company used proprietary editing software that allowed managers to track every click and action. The actions weren’t reviewed until a month later, but then they were scrutinized closely. “It was very difficult and degrading to have a conversation with a manager over a set of images I barely had any memory of,” she explained.

“There was always this cloud of distrust that hung around our office. No one at my level felt like they were doing good work, or could do anything right,” Bri continued. “Morale plummeted, and I began having imposter syndrome, even though I’ve worked in my field for over seven years—my every move was being monitored, and the only feedback I ever received from management was negative.”

At Microsoft, managers can access data on employees’ chats, emails, and calendar appointments to measure “employee productivity, management efficacy, and work-life balance.” A growing number of companies are enlisting “tonal analysis” services that monitor meetings, calls, and Slack.11 Sabrina, who identifies as white Hispanic, lives in an urban area, has a bachelor’s degree, and makes around $30,000 a year. She was thrilled to be hired into a “research position” at a small startup before she discovered that it was in fact hours of rote data entry. Every day, she was asked to document on a Google Sheet, down to the minute, how long it took to complete each task; the sheet was then shared with her boss, who would tell her if she was going too slow. She had to track not only how many minutes she spent inputting each segment of data, but also how many minutes she spent sending emails, or looking up how to do things, even when something took just a single minute.

“Having to track every single second of my productivity made me nervous to even use the bathroom,” Sabrina explained. “Do I literally write ‘bathroom’ on my time sheet? So I started to use the bathroom while I sent emails so as to not mess with my data totals and earn myself a reprimand. But then I was scared that if I entered six minutes for sending email into my time sheet, that would seem like too long of a time to be sending emails. This circular thinking, and the looming, unknown consequences, made me miserable.”

Like so many heavily surveilled employees, Sabrina dreaded going to work every day. The tasks were mind-numbing. Her forearms and hands ached from typing so fast for so long without breaks. But she stayed with it because her boss, who was a mini-celebrity in her field, promised that “hard work” could bring a chance to “prove” yourself: “To get what, exactly, I’m not sure,” she said. “The prestige of associating with him? But in the moment, those promises made it difficult to protest anything, and made me eager to please and accept his surveillance.”

This sort of monitoring is often soft-sold in the name of efficiency or happens so incrementally that employees have few avenues for resistance. “Your employer controls your livelihood,” Ben Waber, an MIT scientist who’s studied workplace surveillance, explains. “And if they say ‘give me this data,’ it’s very hard to say no.”12 When there are so few options for stable employment, you don’t get to decide whether or not you want to be surveilled. You just figure out how to manage the suffering it creates.

There’s significant evidence that the more surveilled—and less trusted—you feel, the less productive you are. In The Job: Work and Its Future in a Time of Radical Change, the organizational psychologist Amy Wrzesniewski tells Ellen Ruppel Shell that close monitoring by supervisors “makes it difficult for us to think independently and act proactively” and “nearly impossible for us to make meaning of our work.”13

Ruppel Shell points to the example of a nanny: Until recently, most nannies had near-total control over what they did during the day with their charges. They fed them and put them down for naps at certain times, but otherwise the day was theirs to shape, and that autonomy helped make their experiences bearable, even enjoyable.

When I was nannying, that autonomy—paired with a living wage—did indeed make the job fun. My two-year-old charge and I rode the bus all over the city. We explored a new park every day of the week. We went to museums and street fairs and sometimes, when the rain wouldn’t stop for the fifth day in a row, we watched a movie in the theater together. And while I had a cell phone for emergencies, we did all of this totally untracked, in and outside the home. The year before, I had been nannying for an infant on the swanky Eastside of Seattle when, unexpectedly, his grandmother came to stay in the house for several months. Every move I made, every word I used, every cry the child made: I felt watched and reported on. I hated the commute, which is the reason I gave when I quit the job. But I hated the surveillance more.

Today, surveillance of childcare workers is increasingly normalized—whether in the form of hidden nanny cams, crib cameras (viewable from the parent’s phone) that show the exact moment when the child goes to sleep and wakes up, or constant text updates. When I was nannying, I’d write a brief note detailing what the toddler ate and what we did at the end of each day. Now I’d be entering it into an app, which would allow my employers to approve every decision in real time.

And then there are the trackers. In order to decrease health insurance premiums, more and more companies are instituting programs that offer free Fitbits and calorie counters to workers. The deal is straightforward: Get in your 10,000 steps a day, or lose weight, and we all win! In practice, though, it’s one more incursion of the workplace into the personal, and a normalization of a deeply dystopian idea: that a good worker is a worker who permits their company to monitor their movements.

In September 2017, Amazon won two patents for wristband technology that tracks warehouse workers’ movements and provides “haptic feedback” (i.e., light buzzes) when a worker reaches toward the right (or the wrong) item for delivery. The disclosure of the patents raised concern that Amazon would be treating its workers like robots—but in truth, they already are: “After a year working on the floor, I felt like I had become a version of the robots I was working with,” one former Amazon warehouse worker told the New York Times.14 “They want to turn people into machines. The robotic technology isn’t up to scratch yet, so until it is, they will use human robots.”

Or consider the Spire Stone: a small, beautifully designed tracker meant to be worn near the skin. When, through a series of different sensors, the Spire thinks that the worker is stressed, it guides them through a brief meditation. Theoretically, Spire is a tool to alleviate stress at work—and thereby optimize the worker for, well, more work. But a surefire way to increase your level of stress is to be stressed out all the time over whether or not the weird pulsing rock on your skin is telling your manager that you’re stressed.

Some of these tactics feel limited to a certain echelon of worker, working for a certain sort of “paradigm-shifting” company. But technological surveillance, intended to “optimize” the worker and increase profits, has become standard within the fast food and retail industries. In Vox, Emily Guendelsberger describes how the particular stresses of the fast food workplace create a scenario similar to what one neuroscientist, in his attempts to create conditions that trigger depression in rats, called “the pit of despair.”

Employees are constantly supervised, and not just by annoying managers. “Everything is timed and monitored digitally, second by second,” Guendelsberger explains. “If you’re not keeping up, the system will notify a manager, and you will hear about it.”15 The pit of despair isn’t just what it feels like, on the job, working the cash register or the grill. It’s the whole suite of anxieties that accumulate around the minimum-wage worker.

To start, there’s the digital time clock, which penalizes workers for checking in even a minute after a shift begins, and the general stress of the schedule itself, generated by an algorithm that uses past data to determine exactly when the store needs more or fewer employees. In practice, this means ever-changing, totally unstable schedules, generally distributed to employees just two days ahead of time. (Except in selected cities like New York, San Francisco, and Seattle, where labor laws mandate that schedules be distributed two weeks in advance.) One longtime hotel front desk manager told me that before 2015, all the hotels she worked for posted schedules at least two weeks in advance. After 2015, that became impossible: The algorithms produced last-minute variations that meant schedules were often available just a day ahead of time. At the same time, staffing budgets were tightened—forcing her and her coworkers to work sixty- to seventy-hour weeks. She usually had just one day off a week, which she dedicated to sleeping.

At one big-name fashion retailer, a worker told me that the algorithm was based on sales from the year before—with no accounting for holidays, weather, or anything else. Some companies now schedule “clopen” shifts, in which an employee comes in for a few hours to close, goes home for a few hours of sleep, and then returns to the store for an early open. Brooke, who works as a server at a high-end fast casual restaurant, is regularly assigned such shifts: “It makes getting consistent sleep very difficult,” she says. The same algorithmic logic produces “understaffing,” in which the bare minimum number of workers is scheduled for any given moment in the day.
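The scheduling logic these workers describe is simple enough to sketch. Here is a minimal, hypothetical version in Python; the sales figures, the revenue-per-worker ratio, and the function itself are illustrative assumptions, not any retailer’s actual system:

```python
# A toy demand-driven scheduler: staffing is derived from last year's
# sales alone, so anything the model never saw (a holiday, a street
# fair, a heat wave) produces a shortfall that managers discover only
# once the rush is already underway.

LAST_YEARS_HOURLY_SALES = {9: 300, 12: 900, 15: 450, 18: 700}  # dollars per hour
SALES_PER_WORKER = 250  # assumed revenue one worker can handle per hour

def schedule(hourly_sales: dict) -> dict:
    """Staff each hour with the bare minimum the forecast allows."""
    return {
        hour: max(1, round(sales / SALES_PER_WORKER))
        for hour, sales in hourly_sales.items()
    }

print(schedule(LAST_YEARS_HOURLY_SALES))
# {9: 1, 12: 4, 15: 2, 18: 3}
# If noon traffic doubles this year, the noon shift is staffed for
# $900/hour of business while absorbing $1,800/hour of customers.
```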

When there’s a sudden rush, unanticipated by the algorithm, everyone starts yelling for backup, creating, in Guendelsberger’s words, “maximized misery for workers and customers.” Sure, it’s inhumane. But it’s profitable.

The work schedule for Holly, who recently started a job as a front desk agent at a hotel, is based on the projected number of arrivals and departures on any given day. More senior staff get more consistent scheduling with regular days off; those who are newer to the company, like her, are scheduled “all over the place.” In addition to “clopen” shifts, there’s no guarantee that time-off requests will be honored, “which means a lot of canceling plans on the fly, and coping with disappointment/irate family and friends because you’re unable to commit to anything except for the job.” There’s no guarantee that she’ll get forty hours a week, but her schedule’s not consistent enough to find another job. “Trying to draw up a budget,” she says, “is a big scribble nightmare.”

When you’re barely making enough money to survive, or supporting a child, as a quarter of fast food workers do, the options for stress “relief” or amelioration dwindle. You might have an hour for the gym, but not enough money to pay for it. You have less money and less wherewithal to try to buy or make healthier food. Your body begins to bear the physical signs of your labor: in burns, as reported by 79 percent of fast food workers in 2015, or flat-out exhaustion.16 You’re paid so little, and certainly not enough to save, and are so exhausted by the work you do that it’s often hard to see a way out.

Holly told me that her job has resurfaced her “long-seemingly-neutralized, painstakingly managed” panic disorder. She’s tried telling her managers that erratic scheduling makes it incredibly difficult to manage her anxiety; they respond, “That’s just the way it is.” The only option to manage her health is to quit the job—but she can’t do that until she has something lined up, and in the midst of an anxious episode, job hunting feels impossible. “Thankfully I have some solid friends to keep me from slipping into the dark place,” she says. “But for the people without strong social/familial scaffolding, it could be devastating.”

Stress is not just something you experience while trying to fulfill an order, or make it into work fifteen minutes early because you can’t trust public transportation to get you there on time. Stress disintegrates the body, and can make it unsuitable for any other type of work. A stressful job isn’t just a route to burnout. It also traps you, creating a situation in which you can see no option other than to keep doing it.

The same goes for all sorts of contingent labor: An undocumented worker, whether in the fields or as a nanny, has no legal standing, no means of reporting exploitation, no recourse when wages are withheld. “Off the books” laborers, as domestic workers often are, don’t have to be paid overtime. That’s what happens when you don’t have options: You have no negotiating power, or power of any sort, at least when it comes to the workplace. Which is why freelance work, with the “options” that accompany it, has become so alluring: The structure of formal work, whether in a fast food restaurant or a law firm, has become so stressful that going freelance, either within one’s field or working in the gig economy, seems like a perfect solution.

The Fetishization of Freelance Labor

Over the course of the Great Recession, more than 8.8 million jobs were eliminated in the United States alone. Americans lost jobs in construction, at colleges, at nonprofits, at law firms, and at big-box stores going out of business. They lost jobs in recreation, at newspapers, at public radio stations, at car factories and startups, in finance, in advertising, and in publishing. In the past, recessions have busted the job market, but then recovery has rebuilt it: The jobs disappeared as companies tightened their belts, then reappeared as they felt confident expanding.

That’s not what happened this time—which is one of the main reasons why millennials, many of whom were struggling to find their first job, any job, during this era, have had such a negative experience of work. To be clear, it’s not that jobs weren’t created. In fact, strong job creation numbers were touted every day—first by Obama, then by Trump. It’s just that they weren’t the same sort of jobs as before. A “job” can be a temp position given to a freelancer, a seasonal gig, even a part-time job. According to one study, nearly all of the jobs “added” to the economy between 2005 and 2015 were “contingent” or “alternative” in some way.17

But for those desperate for work, especially millennials graduating into the post-recession market, these jobs nonetheless provided a much-needed paycheck, however meager—and the freelance and gig economy exploded. The willingness of workers to settle for these job conditions helped foster an even deeper fissuring of the workplace: first, by normalizing the low standards of the freelance economy; second, by “redefining” what it meant to be “employed.”

The general logic behind freelancing goes something like this: You have a marketable skill, maybe in graphic design, photography, writing, digital editing, or web design. Various companies are in need of that skill. In the past, medium- and large-size companies would’ve hired full-time employees with that skill. But in the fissured workplace, those same companies are reluctant to hire any more full-time employees than absolutely necessary. So they hire multiple freelancers to do the work of a full-time staffer, which gives the company high-quality work without the responsibility of shouldering freelancers’ health benefits or ensuring fair working conditions.

From the outside, freelancing seems like a dream: You work when you want to work; you’re ostensibly in control of your own destiny. But if you’re a freelancer, you’re familiar with the dark side of these “benefits.” The “freedom to set your own hours” also means the “freedom to pay for your own healthcare.” The passage of the Affordable Care Act made it easier to purchase an individual plan on the marketplace. But given the concerted attempts to undercut the ACA, obtaining affordable healthcare as a freelancer has once again become increasingly untenable.

In California, one person told me that the cheapest insurance they could find—for one person, with very little coverage and a high deductible—goes for $330 a month. I talked to a dog walker in Seattle who pays $675—without dental coverage. Another person reported that their bargain basement plan in Minnesota costs $250 a month. In Dallas, $378 a month for a catastrophic plan with a $10,000 deductible. And that’s if there’s just one of you: A freelance writer told me she’d had breast cancer, and her husband, a freelance photographer and photo editor, is an insulin-dependent Type 2 diabetic. They live in suburban New York, and currently pay $1,484 a month for coverage. Many freelancers told me their deductibles were so high that they avoided going to the doctor if at all possible, which frequently ended with even higher bills when they were finally forced to seek care—and, because they were freelancers, there was no such thing as paid time off to recover.

Freelancing also means no employer-facilitated 401(k), no employer match, and no subsidized or concerted means, other than the portion of your freelance checks that goes to Social Security every month, to save for retirement. It often means hiring an accountant to deal with labyrinthine tax structures, and getting paid a flat fee for the end product or service, regardless of how many hours you put into it. It means complete independence, which in the current capitalist marketplace is another way of saying complete insecurity.

“I get no general or consistent feedback on my skills,” Alex, who works as a freelance designer and illustrator, told me. “I accept pay less than my worth just to get a job. There’s consistent price undercutting. And there’s the anxiety over the lack of control over my own life.” “Clients,” after all, owe you nothing. When the supply of freelancers with a given skill or service is greater than the demand, wages cannot be negotiated. You adjust your rate to whatever a client is willing to pay.

Take the example of journalism: Every writer used to dream about the freedom of the freelance lifestyle. Pitch only the stories you want to write; write only for the publications you want to write for. And back when magazine publishing was healthy, you could make bank: two dollars a word (on the moderate side of things) for a 5,000-word feature meant $10,000 for a few months’ work.

But when the journalism market bottomed out with the Great Recession, everything reset. Laid-off journalists flooded the market, desperate for freelance gigs. The glut of competition drove rates down, to about what most struggling outlets could afford to pay anyway. And then there were people like me: non-journalists who’d honed their voices online, on LiveJournal and WordPress, for free. In 2010, I started reading the Hairpin, a website that had sprung from the ashes of the recession.

The business model, like a lot of business models at that time, was contingent on publishing anything good by anyone who was willing to write for free. I began writing pieces, rooted in my academic research, on the history of celebrity gossip and classic Hollywood scandal. Like a typical millennial, I was chuffed that they’d even publish them. I wanted an audience for my passion far more than I wanted to be paid. This model made it possible for hundreds of people to break into writing. You can trace the careers of many prominent contemporary writers back to the Hairpin, its sister site the Awl, and its cousin site the Toast. Same for dozens of sportswriters, blogging for free on sites like Bleacher Report. We “made it” because writing wasn’t our main gig, which allowed us to write for nothing or, as the sites gained traction and the recession faded, for what my grandmother would’ve called “pin money”: extra, surplus, gravy.

But because we were all writing as a side gig—which is why we could afford to do it for free—we also helped to drive rates way, way down. Why pay a freelance writer their established rate, the rate that would help keep them paying rent, when you could pay a graduate student in art history zero dollars for their insight?

That’s the sort of desperation that actual companies—far more than esoteric little websites—took advantage of. And no one took more advantage of it than the newly ascendant gig employers: Uber, Handy, DoorDash, and dozens of others. When we look back on the period following the Great Recession, it will be remembered not as a time of great innovation, but of great exploitation, when tech companies reached “unicorn” status (valued at over $1 billion) on the backs of workers they refused even to label, let alone respect, as employees.


The dynamics and overarching philosophy of Silicon Valley create the perfect conditions for fissured workplaces. Silicon Valley thinks the “old” way of work is broken. It loves overwork. Its ideology of “disruption”—to “move fast and break things,” as Mark Zuckerberg famously put it—is contingent on a willingness to destroy any semblance of a stable workplace. In the startup world, the ultimate goal is “going public”: creating a high enough stock valuation, and, afterward, unmitigated growth, no matter the human cost. That’s how these companies pay back the venture capital firms that invested in them—and that’s how they make their founders, boards, and early employees very rich.

Talking about Silicon Valley and shifting concepts of work means talking about Uber. You might be as sick of talking about Uber as I am, but its impact is widespread and undeniable. “Under our noses, the company has ushered in a wave of changes touching most aspects of society, be it family life or childcare arrangements, worker conditions or management practices, commuting patterns or urban planning, or racial equality campaigns and labor rights initiatives,” Alex Rosenblat argues in Uberland. It “confuses categories such as innovation and lawlessness, work and consumption, algorithms and managers, neutrality and control, sharing and employment.”18 The number of Americans who’ve actually driven for Uber is proportionally small. But the changes it set in motion are slowly infiltrating the rest of the economy and our everyday lives—especially the lives of those who, in any capacity, rely on the gig economy.

Like so many other startup companies of the post-recession era, Uber was founded on the premise of disruption: taking an old industry, oftentimes one that was a bit clunky and analog but that paid its workers a living wage, and using digital technologies to change it into something sleeker, easier, and cheaper that would funnel money to the disrupting company. Uber, along with Lyft, Juno, and a handful of other ride-hailing companies, disrupted what has traditionally been known as the “livery” business: picking people up and taking them places. Their popularity launched an entire cottage industry of services reconceptualizing quotidian tasks: Rover disrupted pet care. Airbnb disrupted lodging. Handy disrupted handymen. Postmates and Seamless and DoorDash disrupted takeout. And while these apps have made vacationing and ordering in and getting from one place to another easier for consumers, they also created a massive swath of bad jobs—bad jobs that workers, still desperate from the fallout of the recession, were (at least temporarily) thrilled to take.

For a short period of time, companies like Uber were viewed as economic saviors. They sold themselves as a means of using and distributing resources—cars, drivers, cleaners, bedrooms—with far more efficiency than the old systems, all while creating the jobs that the clawing middle class was desperate to land. The secret of these jobs, though, was that they weren’t even technically jobs, and certainly not the sort of jobs that could mend the broken class ladder. Instead, they have created what the tech columnist Farhad Manjoo calls “a permanent digital underclass,” both in the United States and around the world, “who will toil permanently without decent protections.”19

That’s because, at least at Uber, the tens of thousands of people who drove for the company weren’t even considered employees. In external messaging, Uber’s posture toward these men and women remained steady: The drivers were, in fact, a sort of customer. The app merely connected one set of customers, in need of rides, with another set of customers, willing to provide them. As Sarah Kessler, author of Gigged, points out, “Uber merely took a trend among corporations—employing as few people as possible—and adapted it for the smartphone era.”20

After all, actually hiring employees, even if you’re just paying minimum wage, is “expensive”—and requires the company to take on all sorts of responsibilities. When you’re a startup burning through millions in venture capital, the goal is growth, always growth, and responsibility is an impediment to growth. Uber solved the problem by calling their employees “customers” and by officially designating them as “independent contractors.”

“Independence” meant those who drove for Uber could make their own schedules, had no real boss, and worked for themselves. But it also meant these pseudo-employees had no right to unionize, and Uber had no responsibility to train them or provide benefits. Gig companies lured workers with the promise of that independence—with work that could actually bend to fit our lives, our children’s schedules, our other responsibilities. This work was framed as particularly suitable for supposedly self-centered, picky, self-righteous millennials; as the gig economy grew in visibility, Forbes declared, “The 9 to 5 job may soon be a relic of the past, if millennials have their way.”21

But that’s not how it worked out. Not for Handy cleaners, or TaskRabbits, or laborers on Amazon’s Mechanical Turk, who bid to complete menial online tasks (clicking on every photo of a bird, for example, in order to assist with AI recognition) for pennies. Not for Door Dashers: until a massive online backlash, DoorDash was using tips to cover its independent contractors’ base pay—meaning that if a Dasher was guaranteed $6.85 per delivery and received a $3 tip, they still received just $6.85; users were essentially tipping DoorDash itself. And despite Uber’s past (and thoroughly debunked) claims that an Uber driver could make $90,000 a year, the majority of people driving or cleaning or renting their spare bedroom or clicking relentlessly on a mouse in the gig economy are doing it as a second or third job—a shitty job to supplement a different shitty job.22 The gig economy isn’t replacing the traditional economy. It’s propping it up in a way that convinces people it’s not broken.
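To make the DoorDash arithmetic concrete, here is a minimal sketch of that old pay model in Python. The function name, the one-dollar base pay, and the numbers are illustrative assumptions, not DoorDash’s actual code:

```python
def dasher_payout(guarantee: float, tip: float, base_pay: float = 1.00) -> float:
    """Old DoorDash-style model: base pay plus tip gets topped up to the
    guaranteed amount, so the customer's tip mostly displaces money the
    company would otherwise have had to pay the worker."""
    company_top_up = max(guarantee - base_pay - tip, 0)
    return base_pay + company_top_up + tip

print(round(dasher_payout(guarantee=6.85, tip=3.00), 2))  # 6.85 -- the $3 tip saved DoorDash $3
print(round(dasher_payout(guarantee=6.85, tip=0.00), 2))  # 6.85 -- no tip, identical take-home
```

Under this model, a tip raises the worker’s pay only once it exceeds the guarantee minus the base; every tipped dollar below that threshold is a dollar the company keeps.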

Freelancing and gigging don’t make drudgery or anxiety disappear. Instead, they exacerbate them. Any time you do take off is tinged with regret or anxiousness that you could be working. That hour at a birthday party could be thirty dollars from Uber. That hour on a run could be spent pitching to new clients. That hour reading a book could be used to seek out another writing assignment. In today’s economy, going freelance means internalizing the fact that you could and should always be working more. Nick, who does freelance stats analysis through Upwork, described the internalized pressure to be “working eternally and at all times”; Jane, a freelance writer, explains that “there is such a sense in freelancing that you are never doing enough—that you should be doing more, making more, hustling more—and that every failure you have (real or perceived) is entirely your fault. In an office job, you’re still getting paid for those five minutes it takes to make a cup of tea; when you’re freelancing, every minute you’re not working, you’re losing money.”

In practice, freelancing often means developing the mindset that “everything bad is good, everything good is bad”—a mantra I threw around with my friends during grad school to describe the perverse alchemy of overwork, in which drudgery feels “great,” and actually pleasurable activities become indelibly laced with guilt. As Kessler reports in Gigged, Uber directly exploits this mindset: When a driver attempts to close the app and refuse future calls, it responds with a variation on “Are you sure you want to go offline? Demand is very high in your area. Make more money. Don’t stop now!”23

Your ability to work is never as “free” as the word freelance suggests. If your car has to be repaired, if you’re sick for a long period of time, or if you simply don’t want to drive, Uber makes it difficult to start working again. You’re repeatedly subjected to the whims of drunk passengers who give a single star for fun. And as Guy Standing points out, “The person who works for himself works for a tyrant—you are only as good as your last job and your performance. You are constantly being evaluated and graded. Having to worry so much about where the next bit of bread is coming from means people losing control over their lives.”24 Or, as one Uber driver told Rosenblat, “you don’t have a boss over your head—you have a phone over your head.”25

Freelancing is exhausting and anxiety-inducing enough on its own. But that’s compounded by the widespread refusal to see what you do as work. Just as the work of teachers or mothers is devalued (or unvalued), jobs within the sharing economy aren’t figured as jobs at all—they’re attempts to monetize your hobby, to have fun conversations while driving around the city, to invite people into your home. Even calling these jobs “gigs,” with all the inherent connotations of brevity and enjoyability, elides their status as labor. It’s not the gig economy, after all; it’s the always-frantically-seeking-the-next-gig economy.


“We’ve idealized the idea of portable work, promoting the notion of people roaming about with a portfolio of skills they can sell at a price they set themselves,” Standing argues. “Some are able to do that, of course. But to think that we can build a society on this platform, with no protections, is fanciful.”26

Many of Uber’s drivers continue to fight for the right to bargain with their employer. Freelancers in media from all over the United States have created their own iteration of a union, in which they collectively set rates and, when media employees are laid off or go on strike, refuse to “scab” by filling their former roles. More and more freelancers, gig economy laborers, and temps are realizing that flexibility is meaningless without stability to accompany it.

But the only way to call for these types of action is to have leverage: to have options, but also to be acknowledged as an employee. This means overhauling our current system, an action that may require governmental intervention. If lawmakers forced companies like Uber to stop misclassifying their employees as independent contractors, it would reinforce the social contract between companies and laborers—the idea that companies are responsible for the livelihoods of those who labor for them, and that the profits gleaned through that labor should trickle down, in some form, to them. That might seem incredibly radical, but if you look back just sixty years, it was also an incredibly American way of conceiving of profits.

It’s a solution that’s especially difficult to implement when the head of the company is saying there’s no problem in the first place: “I think a lot of the question about whether this is employee versus independent contractor misses a little bit of the point,” Tony Xu, CEO of DoorDash, told ReCode Decode. “I mean, if you think about what is the root problem, the root problem is, how do we maximize all this flexibility, which Dashers love, and provide a security blanket for those who need it?”27

One very obvious way: Hire them as employees. Masking exploitation in the rhetoric of freelancing and independent contractors’ “flexibility” avoids talking about why that flexibility is coveted: because the supposedly “thriving economy” is built on millions of people being treated as robots. “What worries me most is that this is just the beginning,” Manjoo wrote in the aftermath of the DoorDash tipping backlash. “The software-driven policies of exploitation and servility will metastasize across the economic value chain. Taking DoorDash workers’ tips today will pave the way for taking advantage of everyone else tomorrow.”

Manjoo’s right. But the people it’s most poised to take advantage of in the immediate future are those who have no other options—and those, like millennials and Gen Z, who don’t realize there’s any other way. Which underlines the current conundrum: Shitty work conditions produce burnout, but burnout—and the resultant inability, whether through lack of energy or lack of resources, to resist exploitation—helps keep work shitty. Significant legislation to update labor laws to respond to current workplace realities can and will help. But so will solidarity: an old-fashioned word that simply means consensus, amongst a wide variety of people of like mind, that resistance is possible.