CHAPTER 8

PROLETARIAN PROFESSIONALS

Academia

Katherine Wilson has an intense gaze; when you find out she’s spent much of her life in and around the theater, it comes as no surprise.

Sitting in her small, spare office at Fordham University’s Manhattan campus, just steps from Lincoln Center, she explained to me that soon she would have to give up the space. She’s not a full-time professor but an adjunct, which means that she’s paid by the class, not a full-fledged university employee. That gig doesn’t come with permanent office privileges.

“Open an old encyclopedia and it says, ‘Academic’ or ‘Professor.’ What does that look like?” she asked. “It looks like an office lined with bookshelves, supplies, paper—nowadays, their own printer so they can print twenty-five copies if they need to or something. It has the desk surface. This is what academic work is. Suddenly, they are expecting over 50 percent of their teaching faculty to work as if that is not how this labor was designed. For some reason, we can all work out of our cellphone or we can all work out of a satchel and we can do our grading on the subway.”

And the subway is where she spends no small amount of her time. Besides Fordham’s Manhattan campus, she also goes to its main campus in the Bronx, and then there’s the class she teaches at Hunter College of the City University of New York (CUNY), also in Manhattan but on the other side of town. She’s taught at other schools, too—in every borough of New York City except for Staten Island, and in New Jersey and on Long Island as well. Given that travel schedule, necessary just to cobble together a living, she’d have to grade on the train even if one of those jobs did give her a proper office. “Sometimes it was LaGuardia, which is in Queens, out to Brooklyn College on the same day,” she said. Another time, it was Hunter up to Fordham in the Bronx, shifting from the mostly working-class city kids at the CUNY schools to Fordham students who lean more well off—and whiter—sporting Fordham gear and spending their weekends at sports games.

This has been her life since 2002 or 2003, when she returned to academia. It took her a minute to recall all the different places she’d taught since then, in multiple departments. For an adjunct to teach in multiple schools isn’t rare, particularly in New York, where there are so many universities from which to choose. But, Wilson noted, “I teach three unrelated classes in three distinct departments. That is not so common.” Much of her bread and butter has been composition courses, a core requirement nearly anywhere, but the semester we were discussing also had her teaching Arabic cinema in translation and a course she designed that she described as the anthropology of fashion. Fridays were her rough day, with her first class at Fordham at 8:30 a.m., and then another class later in the day, and then a double session on Saturday. “The point for the adjunct life is you take what you can get,” she said.

The grading, course preparation, and any other work she needs to do happens at home. Where a full-time professor would be paid a salary that was expected to cover teaching, course preparation, and advising students, plus their own continuing research and publishing, adjuncts are paid by the class. That pay structure leaves Wilson falling well short of what a full professor would make even with three different jobs, and on top of that she still has to cover more of her own supplies. “A lot about the pay structure and the resources is designed as if we appear in class and God had given us all our materials for that lesson,” she said. “It just emerged from nowhere.”

Wilson grew up with a single father who was an English professor at CUNY, and he encouraged her and her twin sister to follow their dreams. “It was not the right spirit for the age, for the post-Reagan society I was living in,” she said. She studied philosophy as an undergraduate, and then, like many a liberal arts grad, spent a bit of time figuring things out—time further complicated by a chronic illness. “I was groping toward art,” she said. Landing in Boston, she “fell into theater quite accidentally. I never took a class. It wasn’t on my horizon.” The theater she did was politically radical and experimental but humorous, something she described as the “feminist granddaughter of Brecht.”

Such theater, of course, didn’t lend itself to a stable income. Alternative theater is rarely celebrated at the time of its creation, Wilson noted. What the theater world celebrates from times past, it mostly ignores in the present. After about a decade of doing work that didn’t make money, she decided to pursue a master of fine arts (MFA), hoping it would lead to something more stable than the ad hoc jobs she was pulling together. That was 2003, and it was then that she began teaching, but the MFA didn’t satisfy her.

“I realized in the MFA that what I was, was an intellectual, and theater had been my vehicle,” she said. But the theater world no longer felt comfortable to her. “The university looked like a haven.” Much of her social circle and even her family consisted of academics—besides her father, her sister had gone straight through school to a PhD and become a (twice) tenured professor.

Hoping for a career in academia, she enrolled at CUNY’s Graduate Center. But, she said, the program didn’t really prepare her for the academic job market she’d be turned loose into. Some of that was timing, of course—while she was working on her PhD, the 2007 economic crisis hit—but the field had been changing for a while. “We would start to get these emails saying, ‘Alt Academic,’ like ‘Market yourself, brand yourself for an alternative academic career!’ ‘Oh, you can try nonprofits!’ and it was very frustrating for me because anything they named, and then some, I had done that before,” she said. “I came in saying, ‘This is certainly the best fit in contemporary United States society for me.’ To have all that discourse of ‘Branding yourself, get your website, and be something besides what we have been educating you to be’ was very, very demoralizing and painful.”

Despite the struggles, she said, “I loved being in the classroom. It was challenging, but I like teaching a lot.” She’s never taught anything related to her dissertation, though—a study of the way a play script moves in the world—and only twice taught in theater departments. Her focus was Arabic theater in translation: she laughed, “I thought that after September 11, that theater departments would snatch up Arabic. Right.” Her varied background gives her more options for departments in which to work, but the variation becomes just one more thing to juggle, one more gear to shift between classes. Between schools, her students vary: at the CUNY schools, she’s often teaching them “how to be a student,” while the Fordham students approach her very differently: “It is more litigious. A lot of them run to authorities here. An adjunct lives in terror of the student evaluations.” Because, of course, bad evaluations could mean not being asked back. This is the kind of thing, she noted, that also leads to grade inflation: to many adjuncts, there are more incentives to make students happy than to grade accurately—though, she said, “I try to hold out and I think I pay the price for it.”

Besides the different student bodies and the departments, there are also small yet frustrating differences from school to school. “Everything now is privatized,” she explained. “For a grading system, it is not an internal university program. It is a rented program and they all have their own little names. You have to learn those programs.” She has multiple email addresses, too, one for each of the different schools. Two or three of those a semester add up. “Mentally it is very challenging, and I think for most of us, our instincts would be to blame ourselves. For example, I will be in University A and I need to enter into the system and I will type the password for University B and I say, ‘Oh, Kate, you are so stupid.’” The universities continue to shift costs onto the faculty and students, nickel-and-diming them for little things like photocopies. “Everything just multiplies about the bureaucracy,” she said. “It is too much bouncing around. I think it is very much characteristic of postmodern life—fragmented, scattered, not coherent. And all of that amounts to—obviously—exhaustion. With no sick days.”

Then there are the little slights of the obvious two-tier nature of the system. Department chairs who have tenure come off as oblivious to the conditions of the people who work under them. Teaching, of course, is hard work whatever you’re paid, but it’s frustrating to hear complaints from senior faculty when the adjuncts constantly have to “do more with less.” For Wilson, it’s an issue that is even closer to home: her partner is tenured faculty at Fordham. Yet that experience has also made it clearer to her that the distinction between tenure track and adjunct track is an accident of timing. “If anything, it has sort of helped me think, ‘Yes, I would at least deserve to be on the pathway,’” she explained. “I think if I weren’t so close maybe I would slide into that paranoia of, ‘Oh, I must not be good enough or smart enough.’”

When the COVID-19 pandemic hit New York, all the uncertainty was magnified. “Fordham faculty were hurtled into ‘remote’ teaching and the school closed. CUNY followed, after wavering uncertainty, a week or so later, though my own spring class had been canceled so I wasn’t personally affected,” Wilson explained. “Those who straddled jobs at different schools had to juggle the different policies and technical parameters of that shift.” How to teach, she said, was left up to faculty members; she opted for the virtual classroom, while others used recorded lectures or just posted assignments online. For adjuncts, too, there was the question of technology. “Obviously with our low salaries we’re not likely to have state-of-the-art, top-of-the-line equipment. In my case, my mouse died, and I prayed my anemic Wi-Fi would endure the session of every class.” And then there was the added burden of emotional labor as she tried to help students navigate the crisis—something, she said, that likely fell harder on women, whether adjunct or otherwise. Tenure-line faculty faced the same issues, but, she noted, “their exertions earn a livable wage, and greater inclusion in university processes.”

The question of reopening was painful, too. As schools discussed a “hybrid” system for the fall semester—and as the different universities where she worked tossed around different plans—the adjuncts were left feeling powerless. The idea of going back to in-person teaching in a pandemic was, naturally, unnerving, but the loss of the classroom stung because Wilson put a lot of effort into making her time with students meaningful. When we sat down in her office, before the pandemic, she’d just come from her Arabic cinema class, where her students were making animations. Their engagement and pleasure in the material was fulfilling for her, too. “Most of the time,” she had told me, “I leave that class happy. I have that.”

* * *

The writer and professor Stanley Aronowitz once called academia “the last good job in America.” At its best, the academic workplace allows the professoriat a great deal of autonomy at work, the ability to pursue projects that intrigue and inspire them with single-minded focus and little need to compromise. Historical precedent gives them a great deal of involvement in university governance, and their work has long been seen as more of a vocation than simply a job. In this way, though the teaching part of the job is not actually all that different from teaching a grade-school class, the image of the academic has much in common with the image of the artist as lone genius, though perhaps swathed in tweed in a corner office stuffed with books rather than a paint-splattered garret studio.1

Higher education has a long history as a tiered, hierarchical structure: after all, it’s there in the name. Higher education was, from imperial China to the pre-Columbian Americas, a way to train the upper castes of society first and foremost. Only later did it develop into a place for the kind of intellectual pursuits Katherine Wilson was looking for: independent scholarly work, with knowledge production more or less for its own sake seen as a social good.2

Even when higher education became a place for experimentation and debate, it was still restricted to society’s elites. From India, where Hindu and Buddhist centers of learning also taught arts, mathematics, astronomy, and more, to ancient Greece, Plato’s Academy, and later the Musaeum of Alexandria, where students came from far away to study, the ruling classes were able to pursue knowledge largely because someone else did all the work. Han dynasty China’s imperial academy admitted students based on skills, providing some form of social mobility, but this was far from mass public higher education.3

The university as such was born in Italy in the eleventh and twelfth centuries, an offshoot of the guild system and existing religious education. It taught what were known as the seven liberal arts (“arts” in the original meaning of skills): grammar, rhetoric, and dialectic, along with music, arithmetic, geometry, and astronomy. In Paris and Bologna, centers of learning expanded and drew students from across Europe. The university developed at the intersection of Christian, Muslim, and Jewish influences, where the resurgence of classical Greek was made possible through translations back from the Arabic, where it had been preserved.4

Still, these universities existed to train the elites, and the intellectual curiosity of the academics was limited by the rules of the church and the state. Some who pressed too far into church terrain were even burnt as heretics. It is not surprising, then, that the academics organized into guilds—in part as a response to the way students also organized themselves, and occasionally raised hell, causing riots that in turn drove scholars to migrate to new towns and set up new universities. The students were roughly equivalent to guild apprentices in the hierarchy; those who had passed through one level of schooling were equal to journeymen or bachelors (hence “bachelor’s degree”), and those who had studied all the arts became masters. But power struggles continued, between city and university, master and student, church and university, state and university—power struggles that shaped the university as a space of contention.5

Fights between church and state also shaped the early universities: Oxford was opened in twelfth-century England after students returned home from France, driven by a spat between Henry II and the pope. The University of Naples, meanwhile, was founded as a public institution, perhaps the first secular university, though virtually all institutions taught religion as part of their curriculum. The Protestant Reformation hit the universities hard, and the 1600s generally were a low point in attendance and production. Most of the scientific discoveries of the period were made outside of the university’s bounds, in the new (and mostly amateur) academies of science. There was even a lull in student riots.6

The French Revolution’s leveling of French institutions also helped to revive the university as a center of learning—the new government nationalized universities and fired church-backed teachers, with the intent of creating a new state-run system. That system, like many of the plans of the revolutionaries, didn’t quite come to pass, yet it helped clear space for the development of the modern university. The reforms of Wilhelm von Humboldt, who as part of his work in the Prussian Ministry of the Interior revamped the country’s education system, enshrined in the University of Berlin the model that bears his name. The Humboldtian university combines research and teaching, expecting each professor to produce knowledge rather than simply passing it on. With this ideal was born the concept of academic freedom—freedom to learn and to teach. The mission of academe, to pursue truth, was supposed to set the university and its workers apart from the masses.7

By the late eighteenth century there were over 140 universities across Europe, and more and more of them were constructed upon this model. The beginnings of the research journal could be spotted, as academics began to publish their work for broader sharing across the community of scholars. Academic freedom of a sort was guaranteed, and some protection from interference instituted, though it did not, notably, extend to protection for political expression. Professors began to specialize in one subject and to combine their teaching with specialized research; this started as a way to save money in Scottish and German universities, and then scholars began to make names for themselves—including some we still know of, like the political economist Adam Smith. Access to higher education expanded, becoming available to a growing middle class as industrial capitalism developed, and this meant more jobs for professors.8

The first universities in what would become the United States were elite institutions, religious in nature, but the United States’ real contribution to higher education was the state university system. Beginning in North Carolina and Georgia in the 1780s, the state-funded institution helped to make higher education accessible to a broader swath of the country. City College of New York, now the CUNY system in which Katherine Wilson adjuncts, was founded in 1847 to educate, tuition-free, the children of the modest classes. Then the Morrill Act of 1862 created the “land grant” universities, paid for through the sale of public land granted to the states to fund “Colleges for the Benefit of Agriculture and the Mechanic Arts.” The sixty-nine schools funded through this act include the Massachusetts Institute of Technology, Cornell University, and the University of Wisconsin at Madison. (This was land, it’s important to note, that was seized from Indigenous nations and sold at a profit—a reminder once again that the new universities were never intended for everyone to access.) The American-style research university was a new kind of institution, better funded than universities had been and producing work, particularly in the fields of science and technology, that became a draw for scholars from around the world. Privately funded institutions and state-backed ones competed for students and research accolades. By the 1920s, the proportion of students in higher education in the States was five times higher than in Europe. Of course, those schools were still racially segregated—separate was certainly not equal—and women made up a much smaller proportion of the student body than men.9

Higher education was slowly becoming a path to upward mobility for a small but growing fraction of the working class. The post–World War II period brought more and more students into colleges and universities, across Europe and particularly in the United States, thanks to the act commonly known as the GI Bill, which provided military veterans with college funding. But despite the bill’s facially race-neutral language, in practice Black veterans were excluded, often formally rejected or forced into vocational programs rather than universities. Historically Black colleges and universities (HBCUs), which would have happily taken on more students and where Black faculty were welcomed, were underfunded and could not accommodate all of the would-be attendees, leaving many more out in the cold.10

The Cold War brought new funding into universities as the United States and the Soviet Union competed for scientific (and thus military) superiority. States and the federal government both committed substantial funds to higher education, including student loans and direct university subsidies, and most students attended public institutions. But as the university expanded and became less elite, its professors began to lose status. Institutions and the students in them were ranked in terms of prestige, and that prestige would largely define working conditions. Still, upward mobility through the university into what Barbara and John Ehrenreich dubbed the “professional-managerial class” (PMC) was a fact of twentieth-century life, and more and more people wanted in.11

The PMC, according to the Ehrenreichs, consisted of those service and management professionals whose jobs required some schooling and gave them some degree of power, usually over those further down the class ladder than themselves, and who retained some degree of autonomy on the job. Teachers, doctors, journalists, social workers, and of course college professors were part of the class. As opposed to those in the “managerial” part of the PMC, the professionals mostly considered themselves outside of the battle for profits and saw their work as having intrinsic social value. “Educational work,” the Ehrenreichs wrote, “was highly labor intensive, and there was no obvious way, at the time, to automate or streamline student-teacher interaction and make universities a profitable undertaking.” Perhaps because of their status as a temporary respite from profit-seeking, universities began to be a home for dissent and rebellion, as well as agitation for the university itself to open up further to those long excluded.12

Faculty fought for tenure protections, in particular, to preserve their job security and academic freedom. Despite the caricature—like that lobbed at public school teachers—of tenure as a protection for “lazy” professors, much lampooned in the years of right-wing budget-cutting and culture-war mania, tenure allows a modicum of independent thought in the university. Through the 1950s, Stanley Aronowitz wrote, most faculty existed on year-to-year contracts, keeping them toeing the line. The American Association of University Professors (AAUP) agitated for tenure not to protect the radicals but to make everyone’s job more secure; nonetheless, tenure has always been particularly valuable to academia’s rebels. “Well into the 1960s, for example, the number of public Marxists, open gays, blacks, and women with secure mainstream academic jobs could be counted on ten fingers,” Aronowitz archly noted. “The liberal Lionel Trilling was a year-to-year lecturer at Columbia for a decade, not only because he had been a radical but because he was a Jew. The not-so-hidden secret of English departments in the first half of the twentieth century was their genteel anti-Semitism.” Yet tenure still did not protect the radicals from the pressures that the job itself placed on them, the conformity encouraged by academia’s own traditions of peer review, and the hoops to be jumped through while on the tenure track itself.13

The AAUP’s definition of academic freedom, so precious to the university professor, holds up professionalism—the judgment of one’s peers, in essence—as the standard to which academics should be held. An expansion on the Humboldtian concept, dating back to 1940, the AAUP statement on the subject “maintains that a professor’s research and teaching should be free from outside interference as long as he or she abides by the academy’s professional standards of integrity, impartiality, and relevance,” though as scholar Ellen Schrecker noted, those protections were less regularly applied to what a professor did off campus—meaning they could still be fired for political activities or speech. But theoretically at least, a professor was supposed to be free to teach and research what she liked, as long as she upheld her duties to the university—which meant committees, peer review, a variety of governance duties that professors complained about but nevertheless valued as signs that it was they who ran the university.14

The public university, accessible to broad swaths of the working classes, reached its heights in California and in New York, in the system where Katherine Wilson still teaches. The CUNY system was considered the “proletariat’s Harvard” in its heyday; children of immigrants with dreams of scholarship and middle-class life, those who didn’t make it into the Ivies, moved through its halls. It was also, from 1969 onward, fully unionized, with faculty, graduate students, and staff all members of the Professional Staff Congress (PSC). The year after the union was founded, CUNY gave in to pressure from Black and Puerto Rican student organizers and formally opened up to all New York City high school graduates who wanted to attend. “By combining an open admissions policy with free tuition, CUNY broke new ground in democratizing access to higher education in the United States,” wrote CUNY professors Ashley Dawson and Penny Lewis. “And in 1973, after voting to strike, CUNY faculty and staff won their first contract.” The University of California system, too, was free; its Master Plan (enshrined in 1960) committed to educate anyone who wanted to be educated, though the burgeoning New Right took aim at this ideal nearly as soon as it was written into law. One of Ronald Reagan’s campaign aides, as he ran for governor, laid out the stakes clearly: “We are in danger of producing an educated proletariat. That’s dynamite! We have to be selective on who we allow to go through higher education.”15

In 1975 the right was able to strike back against CUNY. New York City’s fiscal crisis—one of the turning points of the decade—marked a shift away from funding public goods that were accessible to the working class and toward the neoliberal politics we now know today. As the infamous newspaper cover had it, President Gerald Ford had told the city to “Drop Dead,” leaving New York to fill its budget holes however it could, meaning deep austerity for public services and a turn to “business-friendly” policies. CUNY tuition was one of the first things to be instituted—just a few brief years after it had truly been opened up to the working class. Bondholders had to be paid off; students, meanwhile, would start taking out loans of their own, or more likely, for many of them, skip higher education altogether. The faculty union fought to keep its protections but could not stave off the institution of tuition, nor stop the firing of hundreds of young professors, only recently brought on to handle the expansion.16

In a way the Reagan aide was right: the rebellions of the 1960s and early 1970s, which had helped to create the open admissions period of CUNY’s history, and had shaken up many other college campuses as well, had in part emanated from a newly educated stratum of society no longer content to simply move into professional-status jobs. Their idea of changing the world was different from that of the Progressive Era reformers: they wanted revolution and they wanted it now. Angela Davis became one of the early targets of the counterrevolution when Reagan sought to have her fired from her position at the University of California, Los Angeles. Davis had a PhD and a stellar record, but was a Communist and associated with the Black Panthers, and Reagan was able to chase her out, academic freedom be damned. The university had been a target of the McCarthy-era witch hunts, but by the 1970s it had become easier to strip it of funds than to try to get individual professors fired one by one. Reaganism was tested out on the Cal system, as Aaron Bady and Mike Konczal wrote at Dissent: “The first ‘bums’ he threw off welfare were California university students.”17

Margaret Thatcher too took aim at British professors. In what one researcher called “one of the most dramatic systemic changes in the terms of academic appointments,” in 1988 the Thatcher government eliminated tenure for university faculty. Ostensibly, this was to reduce distinctions between the traditional—and traditionally prestigious—universities and newer institutions, and to introduce “accountability” for faculty, which, as it does for other teaching staff, tends to mean “making them easier to fire.” The argument was the same as it is everywhere that the elimination of job security is debated: that “deadwood” tenured faculty who weren’t up to internationally competitive standards should be cleared away to save money. An otherwise Thatcher-supporting professor from the London School of Economics argued to reporters at the time that eliminating tenure would “make British universities into something very second rate,” and that the reforms would direct money to profitable programs while hacking away at the liberal arts. It was not the last time that refrain would be heard.18

The right had taken the analysis of the professional-managerial class from leftists like the Ehrenreichs and twisted it into useful form in order to attack the university. The right in the United States railed against, the Ehrenreichs wrote, “a caricature of this notion of a ‘new class,’ proposing that college-educated professionals—especially lawyers, professors, journalists, and artists—make up a power-hungry ‘liberal elite’ bent on imposing its version of socialism on everyone else.” That the people doing the excoriating were, in fact, members of this class themselves was perhaps lost on them, but it is a reminder that just because one wants to call a group of vaguely similar people one opposes a “class” doesn’t make it so. Classes, we recall, are composed, and as neoliberalism hit, the PMC was beginning, in fact, to be decomposed.19

While the managerial side of the PMC was doing better than ever—executive pay headed back upward in the late 1970s and kept going up—the professions were undergoing a very different process, one in which job security and pay rates were falling, and their treasured autonomy disappearing. Academia was at the very heart of this transformation. After all, education was the very thing that made one into a member of the PMC in the first place, as Barbara Ehrenreich noted, which made the university a central location of these changes as it trained the doctors, lawyers, social workers, and professors of the future. The academic profession itself, like many others, was becoming polarized into a handful of stars at the top and a vast academic proletariat at the bottom, made up of people like Katherine Wilson, cobbling together a living if they could, and feeling a sense of shame at not having achieved the career they’d aimed for. The middle class—a better term than “PMC”—as Ehrenreich wrote in Fear of Falling, was still “located well below the ultimate elite of wealth and power.” Further, she wrote, “Its only ‘capital’ is knowledge and skill, or at least the credentials imputing skill and knowledge. And unlike real capital, these cannot be hoarded against hard times.” A PhD might have been a symbol of so-called human capital, but its value could not be guaranteed.20

Just as the vaunted “knowledge economy” was making headlines, in other words, the labor of knowledge workers was being devalued and deskilled. Doctors became more likely to work for large institutions, lawyers to work in massive firms or in-house at corporations. We started to hear more about “stress” and mental health on the job than physical injury. Until the aftermath of World War II, the term “stress” was rarely used to describe something that happened to humans; researchers, though, began to apply the term to the wear and tear on the human body caused by, among other things, psychological strain on the job. By the 2000s, it had overtaken physical ailments as a cause of absence from work. Like “burnout,” we can understand this concept as a side effect of the cracks in the labor-of-love myth. Fewer of us may be getting physically injured on the job, but more of us are struggling with the emotional toll of work.21

Professional workers were becoming subject to the controls of capital, and yet, as more and more people made it through higher education, the demand for credentials only grew. More and more universities were opened across the world, and the percentages of school-age cohorts attending them exploded, from under 10 percent in 1960 to around 50 percent in many countries by the twenty-first century. Something like 3.5 million professors taught over 80 million students worldwide by 2000. Yet their working conditions were, in many ways, getting worse.22

For one thing, even as access appeared to be expanding, a degree was also becoming more expensive. The cost of a degree in the United States spiked between 1987 and 2007, from less than $3,000 a year for public universities, and less than $7,000 at private ones, to nearly $13,000 a year for public and nearly $35,000 for private. Since then, and in the wake of the global financial crisis and austerity, those numbers have ballooned again—by nearly 25 percent. In the United Kingdom, university fees were reintroduced in 1998, and have expanded since. Yet that money was not going to pay more qualified professors better salaries; instead, teaching faculty were facing cuts. Complaints of lower quality at the universities were used as justification for public budget cuts, firing professors, and raising tuition. Universities competed for a few prestigious faculty members, offering not just excellent pay but lowered teaching loads, an ability to focus on research, and the opportunity to mentor graduate students who might enhance their own reputations. Meanwhile, that teaching load being removed from the fancier professors fell on the shoulders of those same graduate students, adjuncts, or junior professors scrambling for the tenure track. The resulting competition meant that research requirements were going up even as fewer people were given the kinds of job supports that would allow them to do that research.23

The Humboldtian ideal of the university professor has always been a combination of two related but distinct forms of work: part of it in front of a classroom, part of it hidden away in the lab or the office with a stack of books. To Aronowitz, who enjoyed both of these parts of the job, the two parts were complementary. “I am one of a shrinking minority of the professoriat who have what may be the last good job in America,” he wrote. “Except for the requirement that I teach or preside at one or two classes and seminars a week and direct at least five dissertations at a time, I pretty much control my paid work time.… I work hard but it’s mostly self-directed. I don’t experience ‘leisure’ as time out of work because the lines are blurred.” He described “writing days” spent composing articles, reading, and working on longer book projects, as well as time devoted to fundraising, advising students, and examining grad students. Academic labor, he noted, bled, for professors like him, into everything; anything he read might make it into the classroom or into a piece of writing. For Aronowitz, teaching was a genuine pleasure; for many others, it’s simply a distraction from the research they’d prefer to be doing. One might be a good teacher and a brilliant lab scientist; it’s certainly possible, but nothing about the one suggests, necessarily, the other.24

The splintering of the academic workforce into tiers suggests that these two parts of the job have in fact come apart. The adjuncts and the full-time faculty, noted part-time lecturer and union activist Amy Higer, from Rutgers University, have a symbiotic relationship: full-time professors often don’t want to be in the classroom. “Some of them like teaching, but I would say most of them don’t. And it’s a research institution; that’s fine. I love to teach. This is what I wanted to do with my PhD.” The problem was not the split workload, to her, as much as it was the devaluation of the part of the work that she did—the feminine-gendered work of teaching. Adjuncts are paid per class for their teaching and given no support at all for their research. For Katherine Wilson, research was something she’d hoped to do, and she found herself stymied by the demands of adjunct work. She agreed with Higer that the research part was often seen as the higher-level part of the job, teaching the lesser.25

The “last good job in America” (or in England, or France) is now reserved for a few: all over the world, academics face the increase of part-time positions and the loss of autonomy and power. Increasing enrollment has not come along with increased full-time staffing, and salaries have stagnated as class sizes have increased. While European universities still offer more security than many US institutions, the situation of part-time faculty in the Americas (Latin America, too, has a long history of so-called taxicab professors, part-timers with little attachment to their institutions) is a bellwether for the rest of the world. By 1999, an estimated one-fifth to one-half of European countries’ academic staff were “nonpermanent.” In the United States between 1975 and 2003, according to the AAUP, “full-time tenured and tenure-track faculty members fell from 57 percent of the nation’s teaching staffs to 35 percent, with an actual loss of some two thousand tenured positions.” Professors don’t always have to be laid off; attrition does a lot of the work as tenured professors retire, and their jobs are filled by temporary staff. Meanwhile, much of the expansion of college access has been at community colleges, where even if tenure exists, the job is nothing like that of a professor at a top-tier research university.26

And that more prestigious research part of the job is increasingly commodified. In 1980, the US Congress passed the Bayh-Dole Act, which allowed universities to generate funds from licensing intellectual property and the sale of research. Universities like Columbia—which made it policy almost immediately that the school had rights to faculty inventions, though they generously granted them royalties—generate hundreds of millions of dollars from their professors’ intellectual work. Outside funding for research also has an effect: drug companies subsidize research at universities that they then get to patent. Research funds are scarcer in general, and often tied to specific outcomes—which may incentivize tweaking of results. Graduate students, paid far less than professors, do much of the actual work of laboratory research. And sometimes sponsors demand such secrecy that researchers cannot always publish what they’ve discovered—it’s all swept up into the company, rather than the tenure file. Even in the humanities, corporate donations wound up shaping policy; sponsorship of “chairs” meant influence over which professors got to sit in them. The fears of that British professor in 1988 were well founded, as funds pumped into the sciences are often drained away from the humanities, creating yet another form of tiering in the workplace.27

All of these changes crept in under the mantle of “accountability,” to use Margaret Thatcher’s words. Accountability meant stripping away the traditional faculty governance in favor of external boards and executives who come in from other parts of the business world or from government. As it happened to public school teachers, and indeed, to the autoworkers at Lordstown, so it did in the university: demands for reform and accountability to the community or the workers themselves were turned into excuses to impose “flexibility” on the workforce. The protest movements of the 1960s, led by radical students demanding control over curricula and challenging the power structures of the university, were turned, in the hands of the right, into letting “the market” decide what should be taught, while challenging the usefulness of education for its own sake. Thus, Aronowitz wrote, “neoliberalism entered the academy through the backdoor of student protest.”28

Globalization has in one way brought the university back to its roots. Medieval European professors taught in Latin, and faculty and students crossed borders freely in pursuit of education; now a global labor market for academic work has opened up, and students regularly study outside of their countries of origin. The European Union has instituted regulations requiring comparability of degrees, and some American schools, including New York University (NYU), have campuses littered across the world. This phenomenon has created an international job market for academic workers, such that conditions in one place wind up linked to conditions elsewhere.29

In academia, as in many other professional fields, the tradition of a period of apprenticeship dates back to the medieval guilds. In the modern university, PhD students teach and grade and research while they earn their degrees, doing the work that allows full professors to focus on their own projects; indeed, grad students do some of the research that goes out with the professor’s name on it. This hierarchy is justified as paying one’s dues, but it also, importantly, functions to maintain quality control. Not just anyone can be a professor; one must have done research judged to count by one’s peers, passed through hurdles set by accomplished mentors, smiled through the long hours, and pretended to be cheerful while eating ramen noodles, all this hope labor performed in what used to be more than a hope of a career. Passing through the set of qualifications to a good job at the end was, for a time, a ritual that one could more or less count on. Nowadays, this isn’t true.30

Graduate students who are funded receive a stipend, and in return, they do plenty of teaching and grading as well as their own research. Yet university administrators will argue, if those graduate students try to unionize, that they are not really working at all—that their funding is not a wage, but a grant to subsidize their education. Their labor is not really labor, but a privilege. Sociologist Erin Hatton called this double bind—applied not only to graduate students but also to student athletes (as well as to prisoner laborers and workfare recipients after welfare reform)—“status coercion,” because their status as something other than workers allows their supervisors extra punitive power. “The education, degree conferral, and future employment of science graduate students are in the hands of the faculty advisors for whom they labor,” she wrote. “Such advisors can dismiss them from the PhD program as well as delay their graduation because they have become productive workers in the lab.” This kind of coercion links the working conditions of graduate students to other precarious workers—to retail workers and domestic workers as well as interns—as it primes them to accept undervalued and insecure work in the future. And like welfare-to-work programs, graduate programs mobilize both moralistic language about hard work and a labor-of-love rhetoric that denies certain work is work at all by denying that what workers are paid is indeed a wage.31

To Aronowitz and others, the “last good job in America” could be a guidepost for all: shorter working hours and more autonomy could be key demands to improve others’ working conditions. Instead, though, the opposite has happened in the twenty years since he wrote about it: the academic workplace has become more like the rest of the service sector. For those who held that last good job, as with other parts of the PMC, it had been, for a while, easy to ignore the struggles of those outside the university, those with fewer credentials, doing manual labor, perhaps, or caring work. Even on campus, tenured faculty could be prone to ignoring the conditions of those serving the food or keeping the lecture halls clean; off campus, the tradition of seeing the university as a location apart meant that too few professors realized that the downward trajectory of other knowledge work was connected to their own. The university’s culture of individualism—particularly the intense focus on individual research, created in part by the endless pressure to produce and publish such research—militated against academics’ collective action for a while. But as the conditions of academic workers began more and more to resemble those of other workers, academic workers began to reach for the tool of the working class: labor unions.32

Union density was in decline across the United States by the 1990s, but two hundred thousand or so faculty and staff at universities were still union members. Graduate students organized in large numbers, challenging the idea that their work was not work. They were aware of their importance in the institution, the amount of work that, if they refused to do it, would simply not be done. The corporatization of the university, by requiring graduate students to produce useful research, had hastened their realization that they, too, were necessary workers. By 2000 there were more than thirty graduate assistants’ unions with contracts across the country. Most of that unionization came at public institutions, though; private institutions have had a different war to fight. The National Labor Relations Board, with its Yeshiva University decision in 1980, ruled that faculty at private universities were management and therefore ineligible for union protections. Looking over the arguments for faculty governance of the university, the board decided to take professors at their word, despite the trimming away of those privileges and duties as the tenure track declined.33

When NYU’s graduate student union, the Graduate Student Organizing Committee / United Auto Workers (GSOC-UAW) Local 2110, struck in 2005, it made visible several of the many fault lines of academic labor. The strike dragged on for seven months before being broken by the university through a process of intimidation, media battering, firings, and more. GSOC had been the first union to win a contract at a private university in the wake of the 2000 NLRB decision that graduate assistants were workers, and the university management was determined not to have a second contract. And the board had changed by then—in 2004, with a majority of new conservative appointees, it reversed itself and ruled that graduate assistants at private universities had no right to union recognition. Even at the time, NYU was a popular and newly prestigious university, an emblem of the corporate or neoliberal turn in the academy, with soaring application numbers and students who graduated with the highest debt load in the country. It also had one of the highest percentages of courses taught by non-tenure-track staff, including the striking grad students. Its president at the time had made an argument for the role of the university as anchoring a new key sector of the economy: he called it “ICE” (intellectual, cultural, and educational) as a complement to New York’s famed FIRE sector (finance, insurance, and real estate), from which the university drew most of its trustees. He had thus concretized the argument that the university was key to the new “knowledge” economy, melding it with the “creative class” even as he tried to worsen working conditions for those knowledge producers.34

There was a core group of some 220 tenured and tenure-track faculty who supported the strike, even moving classes off campus so as not to cross the grads’ picket lines. In the midst of the strike, the union managed to win another majority vote among graduate students, reiterating that it had wide support even if the university wouldn’t recognize it and some of the grad students (particularly international students) had been bullied into returning to work. Protests by undergraduates and union activists, including the president of the AFL-CIO at the time, John Sweeney, supported the graduates, but in the end, the university held out and broke the strike. The union did not give up, though, and eventually, in 2014, after protracted battles and a new union vote, it won another contract.35

The question of what higher education is for is intimately tied up with the questions of the conditions of its work. If higher education is to be, as the students of the 1960s demanded, open to all, a place to explore and to learn and to challenge, that means it should be taught by faculty who are supported and encouraged in their own learning, who are challenging and exploring themselves. If, however, the university is simply a machine for producing credentials, with degrees like commodities to be purchased by students shopping in the market, then it is harder and harder to argue for the necessity of faculty who have time and resources to develop their own minds. What political theorist James Cairns called “Austerity U” is, he wrote, about “teaching disentitlement,” not only to students, but also to faculty.36

Perhaps the best example of the simultaneous deskilling and deprofessionalization of higher education and its corporate takeover is the for-profit college. In her book Lower Ed, sociologist Tressie McMillan Cottom dissected the for-profit college industry, pointing out the ways in which it is a logical outgrowth of both the “education gospel” and the neoliberal turn. When demand expanded for education in the 1960s, she noted, public universities expanded. But now, with public funding on the decline, and the stagnant economy making a new credential more appealing, for-profits have stepped into the gap. As for work at the for-profits, recruitment is more of a focus than teaching, and forget about research. The courses remain the same, but the faculty are constantly changing: they are even more temporary than the adjuncts at more traditional institutions.37

In 2013, the Ehrenreichs revisited the professional-managerial class and found it much decomposed. In a report titled “Death of a Yuppie Dream: The Rise and Fall of the Professional-Managerial Class,” they documented the “devastating decline” of many of the professions they’d originally tracked. “In this setting,” they wrote, “we have to ask whether the notion of a ‘professional-managerial class,’ with its own distinct aspirations and class interests, still makes any sense, if it did in the first place.” The replacement of tenure-track professors by low-wage adjuncts and the increasing concentration of control at the top of the university were high on their list of changes in the class’s expected privileges. The cost of college itself was also part of the problem—it was increasingly untenable for the PMC to reproduce itself, with the price of a degree spiking nearly eight times faster than wages were rising. By 2020, a degree was 1,410.83 percent more expensive than it had been when the Ehrenreichs first coined the term PMC. Those who can, therefore, jump ship from academia into “direct service to capital,” becoming analysts for finance or working exclusively for the wealthy. Those who can’t wind up as adjuncts, in the service industry, or sometimes both. In a 2019 interview, Barbara Ehrenreich explained, “I would say that what happened to the blue-collar working class with deindustrialization is now happening with the PMC—except for the top managerial end of it.” In other words, instead of a professional-managerial class, you have management—and everyone else.38

Years after her fight with Ronald Reagan for her academic post, Angela Davis suggested, as the Ehrenreichs did, that academics had to answer for their own elitism. The solution to the problems of academic labor, and particularly for Black women in the academy, she wrote, would come not simply from defending their individual rights to exist there, but through collective struggle—a struggle that should include university workers from the cafeteria and cleaning staffs to the professors. “I include workers because it would be a mark of our having reproduced the very elitism which excluded and continues to exclude so many of us if we assumed that there is only one group of Black women whose names are worth defending in the academy,” Davis wrote. In the United Kingdom, meanwhile, lecturer and social theorist Mark Fisher noted that teaching itself was becoming a service industry, with teachers required to treat students as customers rather than encouraging them to challenge themselves. “Those working in the education system who still want to induce students into the complicated enjoyments that can be derived from going beyond the pleasure principle, from encountering something difficult, something that runs counter to one’s received assumptions, find themselves in an embattled minority,” he lamented. Once the United Kingdom’s vibrant art school culture allowed the working classes to create; now education was being restratified, restructured along lines dictated in a report that was overseen by a former British Petroleum executive: the 2009 “Browne Report,” commissioned by a Labour government but released under the Conservative–Liberal Democrat coalition. It recommended the series of changes, including huge tuition hikes, that would spur a massive student protest movement in 2010.39

The proletarianization of big chunks of the PMC makes them dangerous even as it strips away their power. As Nixon and Reagan and their advisers once worried about an educated working class, so today’s politicians face uprisings of what journalist Paul Mason called “the graduates with no future.” In response, they have cracked down further on the university. In Britain the student movement of 2010 was a response to student fee hikes and broader austerity and laid the groundwork for the left turn of the Labour Party. In Wisconsin, where Governor Scott Walker all but eliminated tenure and public-sector collective bargaining in 2011, faculty, and particularly graduate students, led the protests and the occupation of the statehouse that ensued. (Politicians also pursued access to faculty emails, in a breach of both academic freedom and privacy that seems both small and telling.) In Quebec, student protests against tuition hikes brought down the provincial government after weeks of strikes. As long as academia provides some top-tier positions to aspire to, hope labor may keep some graduate students and non-tenure-track faculty scrambling along, cobbling together a living and eking out research. But for how long?40

Even tenured faculty are feeling the crunch, the pressure to do more. As one professor wrote, “We live a day-to-day illusion that we don’t have a boss. We have only ‘self-imposed’ deadlines. Everything we do is our choice.” After her physical collapse due to overwork, she tabulated what she had been doing regularly: “In the fall semester, I taught two graduate courses. My department has three programs, and I was running one of them, with its 10 faculty members and about 50 master’s and doctoral students. I served on two committees, one in the department and another at the college level. I completed six manuscript reviews for leading journals, serving as a deputy editor on one of them. I had four doctoral and two master’s advisees and served on 15 graduate committees—providing feedback and writing letters of recommendation. Last fall I wrote close to 40 such letters. Add to that the steady stream of emails I must read and respond to every day.” Her job description said that 60 percent of her time should be spent on research, but where were the extra hours in the day to come from? Faculty of color, particularly women of color, do even more of this “invisible work” of “making the academy a better place,” everything from serving on diversity committees to extra mentoring for students of color. Such invisible work eats up their time and hinders, rather than helps, their chances at promotion up the ladder. And then there’s yet another form of labor expected of faculty in the digital age—becoming a social media star. McMillan Cottom noted that, like these other forms of extra work in the academy, such a burden falls hardest on Black women.41

In higher education organizing, it is common for adjuncts or others to note their educational attainments as they draw comparisons to “other” low-wage workers. The implication can often be that perhaps low wages are fine for some, but those who have jumped through the university’s hoops are entitled to the middle-class trappings to which they’d aspired. This can be an insidious argument, although, as the theorist (and occasional adjunct herself) Yasmin Nair reminded us, “we might seize this opportunity to reconfigure the terms of academic success to signify a system that allows everyone opportunities to do the work they desire, without holding ourselves up to mythical standards of class empowerment.” Nair called this feeling “class shock,” and we might note it as a symptom of the decomposition of the PMC, of downward mobility, or at least thwarted upward mobility. That sense of middle-class scarcity can lead at all levels to wanting to pull the ladder up behind you, whether that be the tenured professor ignoring the struggles of the adjunct down the hall, or the graduate student breaking the strike, or the adjuncts themselves casting aspersions on “those” workers.42

The coronavirus pandemic, as Katherine Wilson noted, sent universities scrambling. In many cases, the workers wound up pitted against each other as budget cuts loomed, particularly at public institutions. Because, as usual, the crisis did not hit all workers, or all departments, equally. Graduate students worried that their funding would disappear, that the research they could not do from home would be irreparably damaged. Adjuncts, whose contracts are renewed semester by semester, held their breath. But in some cases, the workers took Angela Davis’s advice and tried to unite up and down the university, with tenured faculty standing up for service jobs. At Rutgers in New Jersey, the campus unions joined in a coalition twenty thousand workers strong, fighting to hold onto jobs for temporary faculty and maintenance workers alike, with the best-off volunteering to take furloughs to save the money for the most vulnerable. The coalition, said faculty member and historian Donna Murch, was giving the workers “a way to fight something that often feels abstract, which is this politics of corporatization, privatization, de-unionization, with real people that you know and that you see in regular meetings.”43

Fighting for the university in a moment of crisis would take more than just convincing arguments. Adam Kotsko argued that this moment was an opportunity to reestablish faculty governance and potentially bring back “the last good job in America”—but, he noted, “that can’t happen as long as we allow cost-cutting administrators to divide us into a privileged minority of tenured and tenure-track faculty and a disposable majority of contingent faculty and graduate students.” Fully inclusive unions and coalitions—like the one at Rutgers—were necessary, and a reminder that “the answer is not persuasion, but power.”44

The ideal academic workplace is often one that comes about not by following the rules, but by resisting them. Philosopher Amia Srinivasan found a vision of the university that she wanted to see during a 2019 strike. Part of the University and College Union and a professor at Oxford, Srinivasan was on strike for eight days with colleagues from sixty institutions across the United Kingdom. At issue, she wrote, were “pensions but also pay cuts, casualization, overwork and the gender and racial pay gap.” Claire English, an associate lecturer at Queen Mary University of London, was one of those casualized workers, on a year-by-year contract, and to her, the strike helped break through the shame many of them felt at not having permanent positions. “It’s been an amazing experience to be on the picket line, to find that there are so many other people in the same position as me and all of us being jerked around in terms of our pay, getting paid a month late, not getting our contracts until well after we’ve started teaching,… being told that we’ll have five hours of seminars and then student numbers change and you only get three.” Despite the exhaustion and the constant paring away of hard-won conditions, Srinivasan wrote, academic labor “contains a spirit of vocation and reciprocity” that is why people still aspire to it. Yet, she noted, “when people insist that the university is simply a place of love, and not also a place of work, they offer cover to exploitation—of staff, of students, and of the ideals of the university itself.… Those who insist that striking lecturers do not love their students fail to see that love can still be work, and that the picket can be a classroom.”45


KATHERINE WILSON HAD BEEN AN ACTIVIST ON MANY FRONTS FOR MOST of her life, but labor hadn’t really been one of them. She’d been a feminist and an activist for LGBT rights. She’d been part of Palestine and Latin American solidarity movements, and much of her theater work had been in collectives, where putting in time and sweat equity mattered and the ethos, she said, was “do what needs to be done.” But at CUNY, the union felt distant from her; it included everyone from tenured faculty on down, and she never felt particularly drawn to it. That changed at Fordham, though.

Sitting in her office, she was wearing a maroon sweater to which she’d hand-applied varsity-letterman style letters, reading “FFU” across the back. FFU stands for Fordham Faculty United, the name of the union that includes adjuncts like Wilson as well as non-tenure-track lecturers, who were contingent as well but a step up from the adjuncts—“They get health benefits and they teach four courses a semester, whereas we are capped at two.”

Getting involved in the union at Fordham was very different from the kind of “sitting and talking or screaming in the streets” that had made up her prior activism. She said, “It was fascinating to me. Like, ‘Oh, we can’t just scream our top ideals. We have to actually come in here and think of how this would work.’”

The organizing process began at Fordham with a handful of adjuncts, but initially they had a hard time getting a union to work with them. Wilson wasn’t involved early on, but after the group connected with the Service Employees International Union’s new Faculty Forward campaign, focusing on precisely their kind of precarious faculty, she was drawn in. Eventually, through a combination of public pressure from students and tenure-track faculty and actions by the adjuncts, they got the university to agree to allow a union election. The instructors voted 16 to 1 to unionize with SEIU Local 200.46

It was 2018 when bargaining began, and Wilson found the process engrossing. Their union representative held open negotiating sessions, so anyone could come and watch bargaining unfold. “I felt that I learned from every single session I attended,” she said. “Every now and then we might say something, but it was mainly observing. But [our representative] would consult us and occasionally we would vote on something, like, ‘Would we be willing to strike?’ or ‘Did we want to fight for health insurance or higher per-course wages?’” Only later did she realize that the speed and success of the Fordham process had been inspirational to other schools. The contract they won included raises of between 67 and 90 percent for adjuncts over its three-year duration; that would bring most of them to between $7,000 and $8,000 per class by the contract’s end. Full-time lecturers would reach a minimum salary of $64,000 by the third year of the contract, an increase of roughly $14,000 a year for the lowest paid. They won just-cause protections, meaning they couldn’t be fired without a reason given, as well as some professional development funding and paid professional leave for full-timers.47

When they held their first election for union officers, Wilson found herself recruited to be cochair alongside French lecturer Josh Jordan, and they went on to work closely together on both big-picture and day-to-day issues. The process had taught her a lot about the campus: adjuncts tended to be isolated from one another as well as from other tiers of faculty in the stratified system. “I had to learn who my brothers and sisters were, so to speak,” she said. That involved getting to know the different campuses and schools, and learning that the social-work adjuncts “were paid a pittance” compared to her and others in the humanities, even though the adjuncts in the humanities were in turn paid little compared to those at the business school. Fordham had wanted to have the adjuncts and the lecturers in different bargaining units, but the contingent faculty stuck together, Wilson said, and made sure that while the contracts had different details, they “tied them together”: “We hinged the calculus that determines our salaries so that if one goes up, the other one goes up. You are not going to divide and conquer.”

Professional development funds were huge for Wilson. “Since 2002, I have been presenting at conferences, [and] I have never gotten a dime from a school for that,” she said. “CUNY has it, but it is stringent about who qualifies for it.” Of course, she laughed, now that she’s a union officer, that takes up a lot of her time outside the classroom and she has less time for conferences. But the prospect of funding means a recognition that adjuncts and lecturers, too, are scholars doing research as well as teaching. They also won a level of security against last-minute canceled classes: the university has to tell them by a certain time whether they’ll be teaching, and if they cancel a class, they still get paid a fraction of the salary. (The scheduling fights are reminiscent of those among retail and service workers, and were no doubt aided by SEIU’s experience organizing other parts of the service sector.) Finally, they have the right to union representation at every step of their process. “That forces them to recognize us, that we are working people, that we have pasts and futures,” said Wilson.

Implementing the contract has taught her about what needs to be improved next time. From the limitations on professional development funds to the impossibility of office hours without office space, she’s realized more and more what it would actually take to make adjuncts equal to the rest of the faculty. In their first year on the job, she said, she and Jordan were expected to be the “pretty face” of the union, to do things like going to Central Labor Council meetings, meeting with administrators, and connecting with other unionists across the city. But instead they’ve been involved in the nuts and bolts of implementation. “I just said last week to Josh, ‘Our real work starts now. A year and a half in, this is when we are finally turning to our real work.’”

Academia, she noted, often draws people who bought into the ideal of the lone intellectual: “We worked solo. We did a dissertation solo. We did the loneliest five years, eight years, whatever.… And now, you are throwing us together and saying, ‘Oh, yeah, you will work together fine.’” She was one of the few involved in the union’s leadership with activist experience, and even then, she hadn’t been involved with many formal organizations. Within the contingent faculty, she noted, titles didn’t determine rank, but there were in practice differences in whether one got involved with the union. Many universities justify hiring part-time adjuncts by arguing that they are working professionals in their field who supplement that work with teaching, and sometimes that’s even true. The “moonlighters,” Wilson said, were harder to sign up and get interested in the union than those like her and her cochair, who had gone through the PhD process and had looked forward to traditional academic careers.

Although there are union stewards on each campus, it can still be a challenge to get faculty involvement. Old-style union tips for organizing, Wilson noted, don’t work for contingent faculty. There is little shared space, and schedules vary wildly. “We don’t have a clique. We don’t have anything. We have to invent the watercooler,” she said. “That is about the structure of what they do with us and spatially… we sometimes don’t share anything. The fact that you have to invent it is very, very different from what [happens in] a factory or hospital, where the organization already wants them to be a well-oiled cadre, and now if you can bend that cadre toward a union, you have terrific strength. Science is not solitary, but science doesn’t get very involved. For those of us in humanities and social sciences, the nature of our work is very, very isolated.”

The way the university has changed shaped Wilson’s experience beyond just the nature of her own job, she said. She had let go of the desire for that window office with the bookshelves, the dedicated workspace. But what still got to her was how the students came in anxious and stressed, and how that led to less risk-taking, less learning, and more playing it safe to get the grade that would get them the credential. “It is chores you do to get the grade. I find it so alienated from working with knowledge and working with literature and working with writing,” she said. “Also, arts are about adding—I don’t mean beauty in the sense of pretty flowers, but about adding a kind of beauty and recognizing pleasure. That is quantified and that is commodified and commercialized, but in my mind—obviously—is not about price.”

That kind of pricelessness, she said, connects to the dignity that adjuncts are fighting for. “It is not just about the pay. I don’t put poetry on par with housing and food, but it is not far behind,” she said. “Bare existence, bare subsistence, bare life, that is not our vision of humanity, and particularly for academics, we’ve immersed ourselves in the fruits of human creation and civilization. It is a painful oxymoron that then our daily lives and subsistence had become close to abject.”

The union itself served to break up their isolation and provide something beyond that bare life. “We have monthly happy hours down the avenue. The social is a little slower, but we are trying to start to build those,” Wilson said. “I would like to organize—in Fordham—adjuncts who straddle schools and start to build that.” But it takes work to build a union that isn’t just seen by most of the faculty as a service provider. “We get members approaching us like, ‘Do this for me,’ or ‘Provide me this service.’” Contingent faculty still hang on to the hope that their gig will be short term; even if they want the protections of the union, Wilson said, they don’t want to invest too deeply in it. There was still a sense of shame that made the adjuncts less inclined to identify with their role; no one, Wilson noted, said with a sense of pride that they were an adjunct. They had this in common with other parts of the precarious workforce: a need to break through the disappointment and decide that the way to change things was to improve the adjunct job, not just to keep hope-laboring toward escape from it.

She understood this feeling. “I did give up looking,” she said. “After this contract it will be, ‘So, do I continue this or don’t I?’ I don’t know. Is this happy or content or how miserable and undignified is this?” Fifty-seven, she noted, would be a difficult age at which to be considering another change from work she trained for and is good at, but it is also the reality she faced.

“There is no question that the only time I have felt dignified working in a university was the activist work in the union here. That is the most dignified relationship I have had. My own work as an adjunct, I can’t think of a semester or a month where I would say, ‘That felt dignified,’” she said. “That is what most political struggles have been fighting for, in addition to the material gains and substantial rights.” The five, ten, sometimes twenty hours a week of unpaid work for the union was the thing that mattered most to her. “People say, ‘Why do you do it?’ That is my answer. This is where the dignity comes from.”