CHAPTER 9

PLAYBOR OF LOVE

Technology

VIDEO-GAME PROGRAMMERS LEARN TO CELEBRATE “CRUNCH” FROM THE get-go. Like many of his peers, Kevin Agwaze went to a specialized school that taught coding for games, rather than a traditional university. Such schools normalize a brutal workweek, treating high dropout rates as a badge of honor, and instilling the idea that the games industry is a shark tank where only the strong survive. While in his native Germany, he noted, “Uni is free,” the program he attended, a two-year course, costs around €25,000 (about US$29,000). Such programs can cost even more in the United States, where a specialized education might run $100,000.

The schools, Agwaze and other programmers explained to me in a London pub, pump out “eight gazillion” games developer grads, for whom there are not necessarily enough good jobs. By the time they graduate, programmers expect to work long hours to prove themselves, and for those hours to stretch even longer when deadlines loom. To Agwaze, it seemed to be worth it to work in a field about which he was passionate. “I knew it was going to be bad for me,” he said with a lopsided grin. “I thought, ‘I am young, my body is going to be fine. I can do it for a while. I can handle bad conditions.’”

He wanted to work in what they call triple-A games—the video-game equivalent of a blockbuster film, with a big budget and production teams that span multiple countries and studios. He applied for jobs all over and wound up in the United Kingdom at a company called Studio Gobo. The company, which bills itself as “a family of graphics geeks and artistic misfits,” offers “AAA console game development services for a global client base.” What that means, Agwaze explained, is that they work on specific parts of bigger properties for major studios. “We have all the creative freedom but none of the risk, like if Ubisoft [a French video-game company] is going to cancel [a] game, they will still pay us,” he said. He’s pretty happy at his job, all things considered.1

His day-to-day work schedule depends to a degree on other programmers working in offices that might be several time zones away. There’s no time clock to punch, no overtime pay; he comes in to work around 10:00 a.m., he said, and leaves most days around 7:00 or even 8:00 p.m. The late evenings are in part, he explained, because he’s working with developers in Montreal, who don’t arrive at work until after he’s had his lunch. “I come into the office, read all the emails about stuff that happened after I left, when they were still working,” he said. There is, he joked, a 50/50 chance that the thing he’s supposed to be working on will be broken in some way when he arrives and he’ll have to wait for Montreal to be online to fix it; if it isn’t broken, he can do some work before they’re up.

The seemingly inefficient process is common across the industry, he explained. In part, that’s because so many different people work on different parts of big games that it would be impossible to have them all in one office, or even, it seems, one company. There is also the desire for what he called “acculturation” benefits—making sure that games are accessible and interesting to audiences in a variety of locations rather than being so culturally specific to one that players in a different market won’t want it. “If you have people with different backgrounds working on a game,” he said, rather than employing “the same Bay Area American people” each time, “it might just end up being a better game.”

There is also the question of costs—some of the programming is outsourced to countries like India, where the wages are lower and the working conditions less regulated. “Somebody working in India and somebody working in Sweden can have completely different working conditions,” he noted, “even though they are working at the same company on the same game and the same project, maybe even the same feature.”

The grueling hours lead to high turnover in the industry, even higher than at the programming schools. It’s a workload, Agwaze and the others said, designed for young men without families or caring responsibilities, who can dedicate their entire lives to the job. And indeed, the demographics of the industry bear this out: recent surveys of the United Kingdom’s games workforce found that the vast majority were young men. Only 14 percent were women, and as for workers of color, like Agwaze, in 2015 they made up a dismal 4 percent. In the United States, meanwhile, a 2019 study found that only 19 percent of the workforce was female, while a slightly better 32 percent identified as something other than white. When the appeal of working on games no longer trumps the desire to have a life outside of work, programmers leave and go into a different industry. Their skills might have been honed to make blockbuster games, but the same code that makes up the backbone of Red Dead Redemption can also be used to make the latest financial technology app, for more money and shorter hours. “It’s just a different planet,” Agwaze said.2

That turnover itself makes the industry less efficient than it could be: rather than trying to retain experienced workers, companies bring in more young workers like Agwaze to make up the difference. Meanwhile, senior positions sometimes go unfilled for months. It becomes a circular problem: hours stretch longer and longer as junior developers scramble to fix bugs; they get tired of the struggle and quit; and then a new person with even less practice is plugged into their spot. And the companies’ idea of how to make the job more sustainable is to put in a Ping-Pong table and give out free food. Agwaze laughed, “Let’s put a bed in there! Sleepover! Put in showers!” Studio Gobo’s website promotes “Gobo Friday Lunch,” with “Freshly cooked (free!) food by our in house chef, the only rule is you’re not allowed to sit next to the people you did last week. It’s an opportunity to relax and hang out as a team and some of our best ideas have emerged over a warm home-cooked meal.”

But, of course, it’s not home-cooked. Instead, it blurs the distinction between home and work. “I have time periods where, like, I sleep for two or three hours,” Agwaze said. “I’m just going home to bed and waking up and going back again. I don’t remember what happened. I just remember going to bed and being in the office again.” Coworkers become close friends, late shifts can take on a party atmosphere, and the feeling that everyone is part of something important often prevails. Studio Gobo’s website again: “Fun is at the heart of what we do. We know that if we want to make fun games, we also have to have fun making games.”

Yet that fun atmosphere itself is designed to entrap workers into staying later each day, even without direct pressure from the boss. “I had a senior employee tell me, ‘Kevin, I notice that you stay long hours a lot and I think it has a bad impact on the whole team, because if you stay longer, everybody else wonders, “Do I need to stay longer?” It puts pressure on your team. Even if you want to do that, that might negatively affect everybody else.’” At the time, Agwaze said, he shrugged it off. The individual pressures—the need to build one’s CV—militated against collective concern. “I remember being like, ‘Ah, whatever. I am fine. I am doing good.’”

Agwaze’s experience was rare, though, he noted—most employers applied the opposite pressures. Crunch was endemic to the industry: over half of the workers questioned in one survey said they’d worked “at least 50 percent more hours during crunch than the standard work week of 40 hours.” The issue came to the fore in 2004 with a public “open letter” from the spouse of a developer at Electronic Arts (EA), complaining of her partner’s eighty-five-hour crunch weeks. Two class-action lawsuits followed, alleging unpaid overtime. Both were settled out of court, but the practice continued up to 2020. And it’s not clear the practice is even worth it for employers. “Crunch,” Agwaze noted, “produces bad games, a lot of average games, and some good games. Just because you crunch doesn’t mean that the game is going to be any good at all.”3

Beyond their expected loyalty to their own CV, the programmers were encouraged to consider themselves part of the family, and to work hard to pull their weight within it, even if, as Agwaze said with a sardonic laugh, “Maybe I crossed the country to start this job and I was fired in my first week after they told me I had now entered the family.” While this had never happened to him, it wasn’t an uncommon experience in the industry.

Some managers in the industry are starting to realize that they need to figure out better ways to retain experienced developers than trying to make the office feel less office-like. But the culture of the industry remains mired in the idea that putting in long hours is a mark of quality and dedication, rather than burnout and inefficiency. “They can’t even imagine it as a bad thing,” Agwaze said. “This is how it is. How can anybody believe this to be bad or wrong? This is how we need to do it.”

With the arrival of COVID-19 in Britain, Agwaze joined the masses suddenly working from home. For him, that meant an even further blurring of the lines between time on and time off the job. At first, he said, he was told he needed to keep going to the office, but when the government announced its recommendations, he was allowed to stay home. He did some rearranging in his flat: when a roommate moved out, he was able to take over their room for a workspace, and he was able to borrow a computer with a bigger monitor on which to work. “I wake up, go to the other room to the PC. Then, I work for a long while. Then, at some point, I stop working. It might be after eight hours or slightly more or slightly less. I used to pretty rigorously take an hour of lunch break at 1 p.m. sharp with other people from work, but now I’m like, ‘Did I eat anything today? No, I didn’t. I should probably eat. What’s the time? Oh, it’s 2 p.m.’”

And after all the time that he spends dedicating himself to making games, he said, he doesn’t really play them that much anymore. He laughed, “I don’t have time. I sneak one in every now and then.”

* * *

PROGRAMMING, A FIELD CURRENTLY DOMINATED BY YOUNG MEN, WAS invented by a woman. Ada Lovelace was the daughter of Romantic poet Lord Byron, but her mother steered her into mathematics, “as if that were an antidote to being poetic.” Lovelace was inspired by mechanical weaving looms to design a program for Charles Babbage’s “Analytical Engine,” an early idea of a computer. Her insight was that the computer could be used not just to calculate complex equations but to handle music, graphics, words, anything that could be reduced to a code—perhaps even games. Her paper on the subject, now considered the first computer program, was published in a journal in 1843, years before anything resembling a computer had actually been built.4

These days, the tech industry—as the shorthand would have it, leaving aside the question of just what is considered “technology”—is fawned over as the main driver of innovation in the world’s major capitalist economies. Programmers are lionized in the press, their long hours held up as proof of romantic commitment to the work rather than inefficient work processes, their skills envisioned as something between God-given talent and Weberian hard work and grit. Those skilled workers are seen as geniuses the way artists used to be, gifted with superior abilities in a field inherently creative and specialized. Tech jobs are described as dream jobs, where the most skilled workers are wooed with high salaries, great benefits, stock options, and fun workplaces where you can bring your dog, get a massage, play games, and, of course, enjoy the work itself—and all of this leads to more and more work. The obsession with “innovation” is actually less than a century old, but the concept is often used to obscure the way skills become gendered and racialized, associated with a certain image of a certain kind of worker, and how that perception is reproduced along with our attitudes toward work.5

Programming was not always illustrious work, and computers were not always fancy machines. “Computer” was a job title for humans, often women, hired to crunch numbers on mechanical calculators at high volumes. Women did so in the United States during World War II, when men were being sent to the front lines and the first computing machines were being developed. The Electronic Numerical Integrator and Computer (ENIAC) was designed to replace those human computers, but its ability to perform calculations relied on human hands manually moving cables and flipping switches. At the time, the programming of the computer was considered routine work, and men were in short supply, so the University of Pennsylvania, where the ENIAC was born, recruited women with math experience to work on the machine.

In 1945, the first six women learned to be computer programmers: Jean Jennings, Marlyn Wescoff, Ruth Lichterman, Betty Snyder, Frances Bilas, and Kay McNulty. The women flirted with soldiers, argued about politics, and calculated differential equations to make the complicated machine work, learning its inner workings as well as any of the male engineers who’d designed and built the thing. The ENIAC—an eighty-by-eight-foot mass of vacuum tubes, cables, and thousands of switches—“was a son of a bitch to program,” Jennings later commented.6

The women knew their work was difficult, skilled labor, but the male engineers still considered the programming to be closer to clerical work—women’s work, in other words—than the hardware side. Yet it was the women who stayed up late into the night, “crunching,” to make sure the ENIAC was working for its first demonstration—to which they were not invited. “People never recognized, they never acted as though we knew what we were doing,” Jennings said.7

After the war’s end, the women who had been pressed into wartime service were encouraged to return home, free up jobs for men, and start families. Yet the women who worked on the ENIAC had a special skill set that made them harder to replace. “We were like fighter pilots,” McNulty said. Instead, they stayed on and worked to design computers for nonmilitary uses, working alongside mathematics professor and navy reservist Grace Hopper. “Women are ‘naturals’ at computer programming,” Hopper told a reporter in 1967. Yet even then, as software work gained prestige, the men were taking it over.8

Male programmers deliberately sought to shift the image of the field. Men, after all, wouldn’t want to go into a field seen as women’s work. To add cachet to the work, they created professional associations, heightened educational requirements, and even instituted personality tests that identified programmers as having “disinterest in people” and disliking “activities involving close personal interaction.” People skills, like those taken advantage of in the classroom or the retail store, were for women, and apparently just got in the way of programming, a collective task being re-envisioned for solitary nerds. As Astra Taylor and Joanne McNeil wrote, the notion of the computer hacker “as an antisocial, misunderstood genius—and almost invariably a dude—emerged from these recruitment efforts.” Changing the gender profile of programming, Taylor and McNeil wrote, also had the effect of boosting its class status. Rather than work learned by doing, programming was now the purview of rarefied graduate programs at the few research universities able to afford computers of their own.9

By the time the US Department of Defense bankrolled the project that would eventually become the Internet, computing was so thoroughly masculinized that there were no women involved. Instead, the Advanced Research Projects Agency Network (ARPANET) would be, in the words of Katie Hafner and Matthew Lyon, Where Wizards Stay Up Late. The men who built the network—funded by the DOD’s Advanced Research Projects Agency (ARPA) in order to link computer labs up around the country to share research—were “geniuses” whose commitment to their work involved a lot of one-upmanship about who could work longer hours.10

Throughout the 1960s, ARPA’s Information Processing Techniques Office funded cutting-edge research that the private sector, and even the universities, might otherwise have shied away from. Created in reaction to the USSR’s launch of the Sputnik 1 satellite, ARPA reflected the fear that the United States was falling behind. It was this same fear that led to an increase in the education budget and expanded public schooling, but ARPA also funded plenty of research that didn’t have clear military applications. One of those projects was ARPANET.11

Making computers communicate required all sorts of new technologies. At the time, most computers didn’t speak the same language. In Hafner and Lyon’s words, “Software programs were one-of-a-kind, like original works of art.” The innovations that would make the ARPANET, and then the Internet, possible were the result of a collective process between dozens of programmers and graduate students on multiple continents. Despite the tendency to ascribe progress to the unique genius of each of these men, researchers in different countries came up with similar ideas at nearly the same time.12

These computer whizzes were building on one another’s breakthroughs, and the ARPANET would help them integrate their collective knowledge more deeply. In the obsession with the individual genius, we miss the real story, assuming that works of brilliance are the result of singular minds rather than collaboration—a notion that just happens to militate against the idea of organizing. “If you are not careful, you can con yourself into believing that you did the most important part,” engineer Paul Baran said. “But the reality is that each contribution has to follow onto previous work. Everything is tied to everything else.”13

The fetish for the tech innovator who dropped out of college may have begun, too, with the creation of the ARPANET. Bolt, Beranek and Newman, the firm given the contract to make the network a reality, was known for hiring dropouts from the Massachusetts Institute of Technology (MIT) in its hometown of Cambridge. Dropouts were smart enough to get into MIT, but without the degree, they cost less to hire. In just a few short years, the field had gone from instituting degree requirements as a class and gender barrier to entry to preferring those who cheerily tossed those requirements aside—and not long after that, to the legend of the Stanford or MIT dropout who created a company in his garage.14

There were a lot of sixteen-hour days, a lot of late nights and missed dinners, and a lot of sleeping at the desk for the programmers involved in creating the network—as well as for the graduate students who, at the various receiving sites for the ARPANET-connected computers, did much of the work of getting computers to talk to one another. They hammered out protocols, shared resources, and came up with the very first email programs collaboratively, sharing information with one another and hashing out disputes informally. The early Internet took the shape of the men who made it—it was anarchic, a place for sleepless computer nerds to express themselves, and argue for hours, whether it was about their ideas for the network or their political convictions (Defense Department money or no Defense Department money). They even figured out how to make games for it—a stripped-down version of the tabletop game Dungeons and Dragons, called Adventure, for example, was built by one of the Bolt, Beranek and Newman coders and spread widely across the Net.15

Video games were the perfect sideline for workers expected to be chained to their desks late into the night in a field where one’s sleeplessness itself was a status symbol. If the programmers played with the network as much as they worked on it, that was just another way that they expanded its capabilities and kept themselves interested in the work they were doing. Later theorists named this playbor, simultaneously work and play, unforced yet productive. Adventure gaming blurred the lines between work and play just as the lines between work and home were being blurred by all those long nights at the office. That the network could be used for fun made the labor that went into making it seem even more worthwhile.16

Early video-game companies capitalized on these same ideas. As Jamie Woodcock wrote in Marx at the Arcade, “companies like Atari promised ‘play-as-work’ as an alternative to the restrictive conditions of industrial or office-based Fordism.” The 1970s were, after all, the decade in which the rebellion against the Fordist factory was slowly synthesized into the neoliberal workplace. Forming a union was out. Instead, little forms of disobedience, like playing video games on the office computer, would come in and be absorbed into the workflow in the tech industry itself. Atari, which at this time developed early home consoles for playing video games on personal televisions, was the first company to prove that games could be big business. And as the computer business boomed, the tension between work and play, between fun and profits, only continued to grow.17

Programmers had been given a huge amount of freedom in the early days of the ARPANET. Coder Severo Ornstein from Bolt, Beranek and Newman had even turned up to a meeting at the Pentagon wearing an anti–Vietnam War button. But as the private sector began to get into the act (and woo away many of the academics and public employees who had been instrumental to the project), the question of how much power individual workers could be allowed to have was occurring to managers. Far from the purview of a handful of unique “wizards” and “geniuses,” the daily workings of what was now a rapidly growing “tech” industry required a lot of work from a lot of skilled but interchangeable laborers. And those laborers had to be prevented from organizing.18

Silicon Valley eclipsed Cambridge as the tech hub for many reasons, but one of them was that the nonunion atmosphere allowed companies to maintain their cherished “flexibility.” While Massachusetts had a long-established union culture, California was the wide-open frontier. Nevertheless, the 1970s and 1980s saw some attempts to unionize at tech companies from Atari to Intel, stories mostly written out of the history of tech as the industry grew.19

By this time, computers and games were becoming more firmly entrenched as toys for boys (or men who’d never stopped being boys). Women’s participation in computer science programs fell from nearly 40 percent in the 1980s to below 20 percent at present, as personal computers, mostly good for gaming early on, were marketed to little boys, cementing further the idea that it was men who would be the new programmers. Pop culture picked up on this trend, making heroes of white male computer geeks. Anyone who didn’t have a personal computer fell behind when it came to computer skills, erecting a class barrier to go with the gender barrier. Schools tended to accept, and companies tended to hire, people who looked like their idea of a “computer person,” which was, according to science and technology researcher Janet Abbate, “probably a teenage boy that was in the computer club in high school.” The assumption remained that computers, like art, were something one had to have a natural talent for; women were good at community and caring for others, and men were good at things that required an isolated, antisocial genius. The split between the two kinds of laborers of love solidified, keeping them from seeing that they both had similar struggles over long hours, capricious management, and a lack of control over the products of their work. That these gender roles were socially created stereotypes, not innate characteristics, seems not to have occurred to any of these supposedly brilliant men.20

The dot-com boom of the 1990s saw personal computers become ubiquitous, big profits reaped, and then the first big bust, as overvalued companies, inflated with venture capitalists’ cash, deflated or popped. The Clinton administration largely built on the privatization and deregulation of the Reagan-Bush years, but gave them a veneer of cool, and the dot-coms epitomized this trend. During this period, sociologist Andrew Ross was studying the workers of New York’s “Silicon Alley” to understand these new workplace trends, which he dubbed “no-collar.” In the brave New Economy, workers embraced a certain antiauthoritarian perspective, trading in the old status markers of power suits and briefcases for hoodies and T-shirts. The workers adopted the work styles of the bohemian artist, bringing their expectations of creative labor to their new jobs in tech. They also brought a willingness to work in lousier environments in return for deferred financial gain (stock options, in many cases) as long as the work itself was stimulating, creative, “work you just couldn’t help doing.” Ross dubbed this phenomenon the “industrialization of bohemia.”21

These workplaces were designed to incorporate the “playbor” of techies, whose tendency to color outside the lines otherwise might have become more obvious resistance. Let the coder wear his “RESIST” button to the Pentagon, let the developers play games on their work machines, then they’ll be happier to do their work. These “digital artisans,” as Ross called them, were made to feel that they had a level of control over the machines. But unlike the original artisans, whose tools were theirs to control, the tech workers were still laboring for a big employer pocketing the profits. After all, the original Luddites didn’t break machines because they opposed technology, but because the technology was designed to deskill them and make them obsolete. The fun-loving tech workplace, already beginning to be stocked with foosball tables and other games to play, made the programmers feel secure that they were powerful and could never be replaced. Yet companies were already increasing their workplace surveillance, and in many cases already trying to figure out ways to break up tasks and cut into the creative freedom of the programmers.22

These workspaces, researcher Julian Siravo pointed out, take their cues from the spaces that techies themselves created. “Hackerspaces” took inspiration from the 1960s and 1970s protest movements’ tendency to take over public or private buildings for their own use; the emerging computer culture adapted this practice from student radicals and autonomia and began to create its own spaces in the 1970s and 1980s. Groups like the Chaos Computer Club in Germany established regular in-person meetings, which were imitated elsewhere. The spaces continued to pop up all over the world: communal, nonhierarchical locations in which members do a variety of programming and physical construction. Before the Internet, hackerspaces were necessary to share information and skills; after the Internet, they became places in which members are, Siravo wrote, “questioning radically the ways in which we currently live, work and learn,” taking a William Morris–like interest in challenging the divisions in capitalist production. But that freedom is something different in a space that people have designed for themselves in which to explore and create; in trying to replicate those spaces in a for-profit company, the big tech corporations have co-opted this exuberance.23

The boundaries between work and leisure thus blurred even more in the new tech companies, bringing more of the things workers might have done in their spare time into the workplace. The growth of the Internet helped blur these lines even for workers outside of the tech industry, who were now expected to check email at home, or who might play a game or write a personal blog on company time—and, particularly with the growth of social media, sometimes face workplace consequences for things they did in their free time and documented online.24

The lines blurred in another way, too: users’ online behavior, from the items they searched for on Google to their interactions during online multiplayer video games, created value for the tech companies. “Users made Google a more intuitive product. Users made Google,” Joanne McNeil pointed out. But that didn’t mean users owned Google. How was their labor—the labor of producing data, of producing a “user experience” that necessitates other users to be meaningful—to be calculated?25

The values of the early Internet—openness, sharing, collaboration—meant something different on a privatized Web where profit was the name of the game. As the cliché goes, “if you’re not paying for it, then you’re the product,” but users on today’s Internet are something more than just the product—they’re more like a self-checkout counter where the thing they’re scanning and paying for is themselves. The users are being sold to advertisers, but they are also providing the labor that makes these companies profitable—labor that is unpaid, and indeed invisible as labor. Facebook and Twitter would be worth nothing without the people who use them—and the fact that millions do is the reason why these platforms are hard to give up. Yet thinking of those users—ourselves—as workers would require us to understand the “social” part of social media as requiring valuable skills as well, something that tech companies resolutely refuse to do. And, of course, it’s in their interest not to—if they had to pay for the value we create for them, those tech billionaires wouldn’t be billionaires.26

* * *

THE CREATIVE WORK OF THE TECHIES, THEIR MUCH-VAUNTED “INNOVATION,” is the thing that is celebrated in these flexible, toy-filled workplaces, but this emphasis belies the fact that most programming work is, frankly, boring. It’s grueling, repetitive, requiring focus and patience—and often plenty of cutting and pasting or working from pre-prepared kits. Yet the myth of the tech genius obscures much of this labor. Think of how many of Apple’s fantastic devices, for example, are attributed to the singular brilliance of Steve Jobs, who couldn’t write a line of code, rather than the legion of engineers who did the real work. Such hype justified these tech prodigies in hiring little clones of themselves, in never questioning how it was that everyone who counted as a genius was also white and male, and in never asking why women left tech jobs at double the rate of men.27

The reality is that the work—like most creative work, ruthlessly romanticized—is a slog. A New York Times story on Amazon’s work culture featured employees who’d been told that when they “hit the wall,” the solution was to climb it. They spoke of emails arriving in the middle of the night, followed by angry text messages if they did not answer immediately. The staff faced an annual cull of those who purportedly couldn’t cut it. Employees “tried to reconcile the sometimes-punishing aspects of their workplace with what many called its thrilling power to create,” but the exhausting pace made them feel more like athletes than artists. Employees frequently cried at their desks, trapped in something bearing an uncanny resemblance to the ups and downs of an abusive relationship.28

At Facebook, things were a little bit different—at least according to Kate Losse, who detailed her experience as one of the company’s early nontechnical employees in her memoir, The Boy Kings. But the sense of awe at the power in her hands was the same, at least before Losse’s eventual disillusionment and break with Facebook and its founder, Mark Zuckerberg. The work that Losse did—customer service work—was devalued from the very start by Zuckerberg, who fetishized hackers and Ivy Leaguers who he imagined were crafted in his own image. “Move fast and break things,” was his motto, and moving fast and breaking things were things that boys did. Losse nevertheless worked her way in, figuring, “You can’t run a successful company with boys alone.”29

Losse befriended the “hacker boys,” including one particular teenager who was hired after he hacked Facebook itself. She joined them on trips to a Lake Tahoe house that Zuckerberg rented for his employees, as well as to Las Vegas and the Coachella festival. She even convinced Zuckerberg to splurge on a pool house where his employees could move in—the ultimate home office. When Zuckerberg offered to subsidize housing for anyone who moved within a mile of the office, Losse did that, too—even though, as a customer service worker, she at first was excluded from the perk. “It wasn’t enough to work [at Facebook], you had to devote as much of your life to it as possible,” she wrote. To that end, the engineers’ floor at Facebook HQ was littered with toys—puzzles, games, Legos, scooters. New toys showed up constantly to keep the boy kings amused while they worked late. “Looking like you are playing, even when you are working, was a key part of the aesthetic, a way for Facebook to differentiate itself from the companies it wants to divert young employees from and a way to make everything seem, always, like a game,” she wrote. But even at the many parties, the coders had their laptops along and managed to get work done.30

In fact, they loved their work so much that they created new features and new projects without even being asked, and sometimes explicitly without permission. Facebook Video was one such project: it was done after-hours (if there were after-hours at Facebook) as an experiment—at least until Zuckerberg decided to publicly announce it, to much acclaim. At that point, the programmers who’d begun it as a lark worked to the point of collapse to make sure it would launch on time. “It was like my body wouldn’t ever work again,” one of them told Losse.31

The coders who were breaking their bodies were at least lavished with perks and praise. Meanwhile, customer care was women’s work: low paid, undervalued, not really considered work at all. At Twitter, for example, complaints from users about relentless abuse on the platform have been met with a steadfast refusal to hire support staff. Startup founders, Losse wrote elsewhere, have often relied on friends or girlfriends to do any work that required emotional labor. Silicon Valley later outsourced it to other countries, such as the Philippines, or even to refugee camps in Gaza, where the disturbing work of purging social networks of violence, porn, and anything else that might prove offensive to users was done for a fraction of what US wages would be. One article estimated the number of such workers at over one hundred thousand. Astra Taylor called the process Fauxtomation, whereby actual humans perform jobs that most people probably assume are done by algorithm. It is the secret of Silicon Valley, nodded to by Amazon with its Mechanical Turk service—the Mechanical Turk was a gadget created centuries before the computer to, purportedly, play chess. Inside the Turk was a human making the decisions. Now Amazon’s “Turkers,” many of them inside the United States, do repetitive “microtasks” for pennies, but the myth of the genius programmer helps to mystify the work still being done by human hands and human minds.32

The Silicon Valley workplace, created in the image of the boy king, seemed almost designed to erase the caring labor discussed in earlier chapters. No family, no friends, and no responsibilities outside of the office; within the office, all their needs are catered to, and toys are provided to make them feel eternally nineteen. (Facebook and Apple even offer egg-freezing to their employees, offering up a tech fix to the problem of work versus family, at least for a while, so that women, too, can abide by the “no families outside the workplace” rule.) It’s no wonder that the apps designed by all these man-children have been, collectively, dubbed “the Internet of ‘Stuff Your Mom Won’t Do for You Anymore.’” Need laundry done, dinner delivered, your house cleaned? There’s an app for that, and the app’s founders have no doubt been breathlessly hailed as technical geniuses, even though their real innovation is finding new ways to skirt labor laws. The result has been the gig economy—a patchwork of short-term non-jobs performed by nonemployees who are barely getting by.33

Whether they be app-distributed gigs or jobs in Amazon’s warehouses, or even programming jobs themselves, the tech industry’s solution for the continuing need for humans to do deeply un-fun work has been “gamification.” Gamification is almost the antithesis of “playbor”—a way to pretend that the same old backbreaking manual work is “fun,” a game you can win. To make the work of packing boxes at Prime speeds less like, well, hard work, Amazon has introduced video games to the distribution center floor. The games have titles like “PicksInSpace” and “Dragon Duel,” and the employees can play alone or against one another—the latter bit designed to up the competition factor and perhaps encourage faster picking. One gamification expert explained that the games might “give a bump to workers’ happiness,” but can also be used to ratchet up productivity goals: “It’s like boiling a frog. It may be imperceptible to the user.” Uber has used gamification as well; so have call centers. And it’s being applied both in learn-to-code contexts and in the actual workplaces of software developers. Turn work into a game! What could be more fun? The problem, as artist and author Molly Crabapple acidly predicted years ago, is that “the prize is what used to be called your salary.”34

The gamifiers are on to something—people hate drudgery, and no one expects to enjoy packing boxes or lifting them for an eight- or ten-hour shift. But it’s not being plugged into a game that makes work enjoyable or not. It’s autonomy that people value, and that is precisely what is being pitched with all those toys on the Facebook shop floor. “We trust you to get your work done,” the toys and perks imply. “You can decide how and when you do it and how and when you have fun.” With the feeling of autonomy comes the feeling that long work hours are a choice; they become a status symbol rather than a sign of unfreedom. As Miya Tokumitsu wrote, in Do What You Love, “The promise of worker autonomy is embedded in the ‘you’ of DWYL.”35

But surveillance is as rampant in the tech industry as it is elsewhere. As early as the 1990s, Andrew Ross found that tech companies routinely monitored their workers. It shouldn’t be a surprise that companies like Facebook, who make their profits off extracting data, might want to keep an eye on their employees, or that the fallen WeWork, a real estate company that leased coworking spaces yet sold itself to investors as the techiest of tech companies, harvested a wellspring of data from the people who worked—and might have lived—in its buildings. WeWork pitched itself as “creat[ing] a world where people work to make a life, not just a living,” selling a version of the dream tech-industry workplace to the masses of freelancers on their own in the neoliberal economy. And the more time those workers spend at the office, the more data that can be extracted. Sleep pods, rare whiskies, steak dinners, and all the toys are designed to enclose the worker in the workplace, just as the social networks enclose users—they offer free tools that the user then feels unable to give up.36

The company provides everything, in other words, that the tech worker needs to reproduce himself (and the worker is always assumed to be a HIM-self), leaving him free to focus solely on work. In this way, it fills the role less of his mother than his wife. The tendency of companies like Facebook to hire those boy kings means that the company is often shepherding them from youth to adulthood, filling that gap, perhaps, between mother and marriage. As video-game programmer Karn Bianco told me, when it comes time for slightly older workers to consider having a family of their own, they must create distance from the company and its desire to be all things to them.

And while, for now, programmers are lavished with benefits and treated as irreplaceable, the capitalists of tech are also betting that their status won’t last. The plethora of “learn-to-code” boot camps are designed not as altruistic ways to get the working class into high-demand jobs (even the ones that promise to teach girls to code to counteract decades of industry sexism), but to drive down the cost of labor. Programming might be destined not to be a prestige field for wizards and boy kings, but rather, as Clive Thompson of Wired wrote, “the next big blue-collar job.” Some of the boot camps are out-and-out scams, like one that promises to pay you to learn—and then takes a cut of your salary for the next two years. But all of them will have the effect of making coders more common, and thus making the work less rarefied—and less well remunerated.37

Mark Zuckerberg also has a plan to bring in lots of short-term workers from overseas. His immigration nonprofit, FWD.us, was created to lobby for immigration reform. That sounded nice in the age of Trump, but Zuckerberg’s main concern was increasing the number of H-1B guestworker visas for skilled workers. H-1B workers are tethered to a particular job; if they quit or get fired, they have to leave the country, which makes them spectacularly compliant as well as cheaper to hire.38

All of this means that tech workers might have more in common with the industrial workers of midcentury than they might think. Silicon Valley touts itself as the “New Economy,” but it still relies on products that have to be built somewhere, and the tactic of offering perks on the job doesn’t work quite as well on the workers who build them. Elon Musk promised free frozen yogurt and a roller coaster to disgruntled employees at his Fremont, California, Tesla car factory—but the workers were complaining of injuries on the job because of the pace of production, and they didn’t want frozen yogurt to soothe their pains. They wanted a union.39

Yet the hype for Silicon Valley continues, and ambitious programmers don’t want to just be labor, anyway—they want to be startup founders, the next Zuckerbergs themselves. Peter Thiel, the PayPal billionaire and Trump buddy, advises would-be founders to “run your startup like a cult.” Cult devotees, of course, will work their fingers to the bone out of love, not for money. Not many people consciously want to join a cult, but as Losse pointed out, there’s another name for a group that inspires love and commitment and unpaid labor, and it’s one that tech bosses cheerily invoke: the family. As Kevin Agwaze said, though, families don’t lay you off once a year.40

Better by far to be your own boss, and start your own startup, even though startup founders themselves are reliant on the bigger boss—the venture capitalist. Author Corey Pein recalled asking a VC if startup founders were capital or labor. His “cheerfully cynical” reply was this: “For every Zuckerberg there’s one hundred guys who basically got fired from their startups. They aren’t capital. They’re labor.” The wannabe Zuckerbergs are their own kind of gig worker, scrambling individually to make a buck, just on a grander scale.41

Rather than leave to become startup founders, some tech employees have instead taken a page from the Tesla factory workers, or indeed, from the workers who serve them those catered lunches: they’re organizing. The Tech Workers Coalition (TWC) began with an engineer and a cafeteria worker turned organizer who challenged a few of the shibboleths of Big Tech—namely, the idea that different kinds of workers have no interests in common, and the assumption that the programmers have more in common with the Zuckerbergs of the world than they do with the working class. It built slowly for a while, and then, after Trump’s election in 2016, a burst of action drew many new recruits, both to the TWC and to Tech Solidarity, a group begun to help tech workers find ways to act on their anger. The first actions of many tech workers were to challenge their companies not to work with Trump. IBM employees petitioned their CEO, Ginni Rometty, asking that IBM not work with the Trump administration as it had with Nazi Germany and apartheid South Africa. Nearly three thousand workers at a variety of companies, including Amazon, Facebook, and Google, signed a “Never Again” pledge promising they would not work on projects that would aid the Trump administration in collecting data on immigrants or racialized groups. Amazon workers demanded the company not provide facial-recognition software to law enforcement; Microsoft employees called on the company to stop offering its cloud services to Immigration and Customs Enforcement (ICE).42

The first real tech-worker union drive, though, came at a smaller company, Lanetix. The problems began with the firing of an outspoken programmer. Coworkers described her as a stellar employee but said her questioning of company decisions had gotten her sacked “out of the blue.” If she could be fired like that, the others began to worry for their own jobs, and decided to unionize with the NewsGuild. “As soon as they started to compare notes, they realized that each manager was just trying to individualize the complaints that everybody had,” engineer Björn Westergard explained. But after sending a letter to management requesting recognition of their union, they were summarily fired. All fourteen of them. The story spread through the industry, and they filed a complaint with the National Labor Relations Board—retaliation for forming a union is illegal. Before the NLRB hearings could proceed, Lanetix settled with the fired workers, paying out a total of $775,000 to them. One of the former workers called it “a landmark win for tech workers.”43

Which brings us to Google, a company that—with its mini-golf and climbing walls and free food—offers a dream job for many. That is, for those who get the coveted full-time-hire white badge. For others, who come into Google only as temps, there is the red badge, and interns get green. The inequality rumbling through Google, as with Lanetix, wasn’t limited to a few malcontents, and it spilled over in 2018. There was another petition, this time over Project Maven, an artificial intelligence program that was to be used with military drones, and some workers quit in protest before Google gave in. But it was sexual harassment that got the workers to organize as workers.44

There had been rumblings before at Google. A wage discrimination investigation by the US Department of Labor “found systemic compensation disparities against women pretty much across the entire workforce,” according to DOL regional director Janette Wipper. The anger sparked by the investigation was fanned by the distribution of a memo written by a Google employee, James Damore, who insinuated that the gender gap in tech labor was due to inherent differences between men and women. But the Google walkout—by tens of thousands of employees across multiple countries—came after the New York Times published a report of widespread sexual harassment and impunity for perpetrators at the company. The $90 million golden parachute given to one executive, who was forced out after he was accused of sexual assault, was too much.45

The walkout took place at 11:10 a.m. in every time zone, rippling across the world (and Twitter) in an impossible-to-ignore wave. The organizers gave credit to the women who’d organized in the fast-food industry through the Fight for $15, as well as to the #MeToo movement, which began online after Hollywood mogul Harvey Weinstein was accused of numerous instances of sexual assault. “A company is nothing without its workers,” the Google organizers wrote. “From the moment we start at Google we’re told that we aren’t just employees; we’re owners. Every person who walked out today is an owner, and the owners say: Time’s up.”46

That organizing was followed by demands, in the summer of 2020, that Google end its contracts with police departments. Tech workers joined in solidarity with protesters across the country, calling for defunding and abolishing policing after a Minneapolis officer killed George Floyd. A letter signed by more than 1,600 Google employees read, in part, “Why help the institutions responsible for the knee on George Floyd’s neck to be more effective organizationally?” Amazon programmers, meanwhile, had been organizing to support the company’s warehouse workers, protesting their dangerous working conditions during the COVID-19 pandemic. In both cases, the tech workers were taking their lead from those on the front lines.47

Suddenly the tech industry no longer seemed so impenetrable. After all, these behemoth companies operate with a relatively tiny workforce. Google’s parent company only broke the hundred-thousand-employee mark in 2019, and Facebook had a little under forty-five thousand employees at the end of 2019. This means big profits, as Moira Weigel noted in The Guardian, but it also means that individual workers still have quite a lot of power, and it doesn’t take many of them to shut things down. If workers could organize at Google, one of the world’s most powerful corporations, and pull off a massive collective action that spanned continents, what else is possible?48

* * *

THE FIRST STRIKE IN THE VIDEO-GAME INDUSTRY WAS CALLED BY VOICE actors. Members of one of the old Hollywood unions, the Screen Actors Guild–American Federation of Television and Radio Artists (SAG-AFTRA), struck against eleven of the biggest games companies for just over a year. They were calling for residuals and royalties to be paid to voice actors, like those film actors enjoy, and though they did not win those demands, they did win raises and proved that games companies could be brought to the table to negotiate with a part of their workforce.49

To Kevin Agwaze, at the time, the victory seemed far off from the work he was doing. There was a sense from the developers, he said, that they were the ones doing the real work of making the games, and the voice actors just showed up and talked—a sense that echoed the companies’ treatment of the actors. He’d been in the United Kingdom for just a few months at the time and remembered thinking, “Yeah, it’s bad but that is just how it is.” He thought he’d be able to adjust, to work his way up the ladder. But the discontent was bubbling up around the industry.

It boiled over at the 2018 Game Developers Conference in San Francisco. A panel was scheduled for the conference titled “Union Now? Pros, Cons, and Consequences of Unionization for Game Devs.” The people putting together the panel, Agwaze explained, were closer to management than the rank-and-file developers, and a group of developers who were talking union began to organize around the panel to get pro-union workers to attend and ask questions. What had begun as a Facebook group, and then a chat on the Discord service, became a campaign that now had a name, an official website, flyers, and a goal: Game Workers Unite (GWU).50

After the panel, Agwaze said, the discussion of organizing snowballed. People joined the Discord chat, and then began to start local chapters where they lived. The conference was based in the Bay Area, but as workers in a massive international industry, the developers knew they had to take advantage of their reach on the Internet to start chapters on the ground where they worked. They talked about crunch, but they also talked about sexual harassment and discrimination. And discrimination was something that particularly drove Agwaze to get involved. “A bunch of these problems, they just get progressively worse if you are a person of color and LGBTQIA person,” he said. “They become factors compounding an already shitty environment.” His actual work experience has been fine, though the long hours persist, but, he recalled, “in school, they asked us for a current figure in the industry, in your field, that you look up to, relate to. I couldn’t name a single Black person in games.” He remained, at the time we spoke, the only person of color at his company, and for him the union was a way to speak up for marginalized people in the industry.

Most of the games workers had no experience with unions; the industry’s age skew militates against that, but it is also true that young workers are driving a recent uptick in unionization in many industries. The workers have also needed to be creative about organizing. The UK group moved from the Discord chat into offline spaces, and then into forming an actual trade union for games workers, one of the first in the world. Agwaze is treasurer. After talking with a variety of different unions, the games workers became a branch of the Independent Workers Union of Great Britain (IWGB). A relatively new union begun in 2012, IWGB represents mainly low-paid immigrant workers in fields that had long been nonunion: cleaning workers, security guards, and gig economy workers like Deliveroo bike couriers and Uber drivers. It was both a strange and a perfect fit, explained Game Workers Unite’s Marijam Didžgalvytė.

The games workers in many ways, obviously, are better off than many of the workers who are already part of IWGB, but they bring a militancy that can be infectious, and the union holds social events to bring members together in a solidarity that reaches beyond the picket lines. The games workers’ social media reach is a help for the other workers as well. And social media helps the union reach a key audience: video-game consumers, who are notably vocal when they dislike a game, but could be marshaled, too, to support the games workers. A recent campaign to “Fire Bobby Kotick,” the CEO of Activision-Blizzard, who received a multimillion-dollar bonus after laying off eight hundred employees, drew plenty of attention from gamers and the games press. Laying off workers while juicing stock prices with buybacks and raising investors’ dividends is a fairly common practice in today’s economy, but the campaign aimed to make the human cost of such practices visible to gamers. Didžgalvytė said, “I think the players are beginning to understand that the people creating their games are suffering.”51

The GWU-UK union was helped by the United Kingdom’s labor laws, which do not require the union to win a collective bargaining election in a particular workplace in order for workers to be able to join. Other workers in other countries have different challenges, but the demands of the UK union, voted on by the membership, are largely the same as the demands elsewhere. They include improving diversity and inclusion at all levels; informing workers of their rights and supporting those abused, harassed, or in need of representation; securing a steady and fair wage for all workers; and, of course, putting an end to excessive and unpaid overtime. “We try to avoid the term ‘crunch’ because it sounds so funky,” Agwaze explained. “‘It’s crunchy! It is cool!’ No, it is excessive unpaid overtime.”52

Because of their relative power in the industry, the developers have been able to put forward demands on behalf of less powerful workers. Issues like zero-hours contracts—work contracts, common in the United Kingdom, that promise workers neither a set number of hours nor a regular schedule—are still pervasive in the lower levels of the industry, particularly for workers doing quality assurance (QA) testing. Some QA workers, Agwaze said, even get paid per bug found in a game. “This incentivizes the wrong thing,” he noted, and it also means that someone could spend hours poring over a game, find nothing wrong, and make no money. GWU’s concern even extends to professional game players in “e-sports” leagues—which tend to be owned by the companies that produce the games. A company, Agwaze explained, can just wipe an entire league out of existence if it no longer wants to pay for it. And the workers wanted, too, to make demands on behalf of the people who did the work to produce game consoles in the first place, from mining rare minerals in the Congo to assembling the products in factories, often in China.

There is still a tendency in the industry, one that dampens workers’ desire to organize, to pretend that it is apolitical. “We make great art, we don’t make politics,” is how Agwaze summed up this argument. Yet the games, he pointed out, are inherently political, from war games (discreetly funded by the military) to superhero games, like a Spider-Man game that featured Spider-Man using police-operated surveillance towers to track down criminals. “How can this not be a political statement?” he asked. Online gaming culture had a track record of toxicity, particularly in the right-wing “Gamergate” movement, and that kind of culture rubbed off on the workplace. Games companies, in the wake of the 2020 racial justice protests, rushed to put out statements saying Black Lives Matter, but they rarely, Agwaze said, acknowledged the conditions they created inside their companies.53

One of those companies, Ustwo, billed itself as a “fampany,” an awkward portmanteau of “family” and “company.” It proclaimed its commitment to diversity and inclusion, but when it fired Austin Kelmore, GWU-UK’s chair, its internal emails criticized him for spending time on “diversity schemes and working practices,” and for being a “self-appointed bastion of change.” One email, shared in The Guardian, proclaimed, “The studio runs as a collective ‘we’ rather than leadership v employees,” but also said that Kelmore had put “leadership… on the spot.” (The company spokesperson told The Guardian that Kelmore was leaving for reasons unconnected to his union activity.) GWU-UK fought for Kelmore, but even before the pandemic, such processes took time; once the pandemic hit, they were backed up even more.54

Agwaze’s time organizing with GWU-UK had taught him that companies were often less efficient and practical than he’d expected. “They’re more of a chaotic evil,” he laughed. Few of them were aware of the labor laws, or of how their actions would be perceived. Then, as with the Black Lives Matter protests, they scrambled to try to win some goodwill through largely symbolic actions, like donating money to racial justice organizations.55

Still, all of this reflects the start of a change in the industry, signaled by the rise in political awareness within and about games. Members of the UK Parliament have even formed an all-party group to look into the gaming industry, though Agwaze noted that GWU-UK’s invitation to speak to the group had been delayed as a result of Brexit and the general election in December 2019, and then because of the COVID-19 pandemic. Still, it marked a change from the assumption most people had, he said, that “it’s fine, because it is video games. It must be fun, even in its working conditions.”

With the pandemic, Agwaze said, some of the union’s usual means of gaining new members—in-person meetings and speaking engagements—had to be scrapped, and the 2020 Game Developers Conference, where they’d planned a panel, was postponed. New members were finding them anyway, however, because of immediate problems on the job. “They are more like, ‘Oh, shit is on fire right now! I need to find some union assistance!’” he said. Workers at some companies were being furloughed, but being asked to keep working without being paid. Others were being told they had to go to the office despite the lockdown. And then there was the immigration question. The games industry, Agwaze noted, depended on immigrant labor—he himself was an EU migrant living in the United Kingdom, a status that could be disrupted by Brexit and, under Prime Minister Boris Johnson, the government’s intention to crack down on migrants. The pandemic exacerbated these problems: workers who lost jobs were unsure about their visa status, and with the backlog at both the Home Office and employment tribunals, there was a lot of uncertainty among workers that brought them to the union for help.

All of this meant progress—and more challenges—for Agwaze and the union. The workers at games companies, and in the broader tech industry, were finally starting to understand themselves not as lucky to have a dream job, but as workers who are producing something of value for companies that rake in profits. After all, as Agwaze noted, “for the one and a half years we’ve been around now, we’ve been the fastest-growing branch of the IWGB. We’re the fastest-growing sector that they’ve ever had.” The union is a crucial step toward changing power in that industry and claiming more of it for themselves.