1

Our Burnt-Out Parents

“You think you’re burnt out? Try surviving the Great Depression and World War II!” In the wake of the millennial burnout piece, that was the most common critique in my inbox. The sentiment usually came from boomers, who, somewhat ironically, had endured neither the Great Depression nor World War II. Other greatest hits: “Buck up, life is hard” and “I worked my tail off in the ’80s, and you don’t see me complaining about being burnt out.” These statements are variations of what I’ve come to understand as the boomer refrain: Stop whining, millennials—you don’t know what hard work is.

The thing is, whether they realize it or not, boomers were the ones who taught us not only to expect more from our careers, but to consider our thoughts on the state of work, and our exhaustion, important: worth expressing (especially in therapy, which was slowly becoming normalized) and worth addressing. If we’re as special, and unique, and important as we were told we were throughout childhood, it’s no surprise we refuse to shut up when our lives don’t make us feel that way. And that can oftentimes sound like complaining, especially to boomers.

In truth, millennials are boomers’ worst nightmare because, in many cases, we were once their most well-intentioned dream. And in conversations about boomers and millennials, that’s the connection that’s often left out: the fact that boomers are, in many ways, responsible for us, both literally (as our parents, teachers, and coaches) and figuratively (creating the ideologies and economic environment that would shape us).

For years, millennials and Gen-Xers have chafed at critiques from boomers but couldn’t do much about it. Boomers had us outnumbered and surrounded: Our parents were boomers, but so were so many of our bosses, and professors, and superiors in the workplace. What we could do was roast them online using memes. “Old Economy Steve” first appeared on Reddit in 2012, pairing a 1970s high school portrait with a caption suggesting he’s now your market-loving dad who won’t shut up about how you should really start putting money into your 401k. Subsequent iterations narrativized his economic privilege: DRIVES UP FEDERAL DEFICIT FOR 30 YEARS / HANDS THE BILL TO HIS KIDS, one version of the meme exclaims; “WHEN I WAS IN COLLEGE MY SUMMER JOB PAID THE TUITION” / TUITION WAS $400 says another.1

More recently, on TikTok, Gen Z popularized the phrase “OK Boomer” as a reaction to someone with an outdated, intractable, and/or bigoted point of view. It could be directed, as Taylor Lorenz pointed out in the New York Times, toward “basically any person over 30 who says something condescending about young people—and the issues they care about.” But the contemporary connotation of “boomer” as condescending and single-minded is worth noting.2

It’s not just that boomers are old or uncool; every generation gets old and uncool. Boomers are increasingly positioned as hypocritical, unempathetic, completely unaware of just how easy they had it—the generational equivalent of being born on third base and thinking you hit a triple. This criticism emerged forcefully in 2019: the year boomers were projected to cede their status as the largest generation to millennials. To be fair, Gen-Xers have a long and glorious history of boomer antagonism. Yet this particular strain of criticism has gained traction, especially online, as the tangible differences between boomers’ and millennials’ financial situations have become more pronounced.

Whether or not someone is familiar with the stats—that, say, the net worth of millennials, according to a 2018 study commissioned by the Federal Reserve, is 20 percent lower than that of boomers at the same point in their lives, or that boomers’ family income was 14 percent higher when they were millennials’ current age—they can still intuit boomers’ role in our current generational divide. As the comedian Dan Sheehan put it in 2019, in a tweet that’s been liked more than 200,000 times, “Baby Boomers did that thing where you leave a single square of toilet paper on the roll and pretend it’s not your turn to change it, but with a whole society.”

I shared that animosity—and reading all those emails from boomers only stoked my anger. But as I began reading more and more about the currents that contributed to the massive expansion of the American middle class, it became clear that while boomers, as a generation, grew up in a period of unprecedented economic stability, their adulthoods were marked by many of the same pressures as our own: generalized scorn from their parents’ generation, particularly around their perceived entitlement and aimlessness, and panic over their ability to maintain (or obtain) a spot in the middle class.

Boomers were anxious and overworked and deeply resentful of the critiques leveled at them. The problem, and why it’s often hard to think of them charitably, is their inability to tap that experience in order to empathize with their own children’s generation. But that doesn’t mean that their anxiety, or attitude toward work, didn’t influence us. The boomer ethos of the ’80s and ’90s was the backdrop of our childhood, the foundation for so many of our ideas about what our future could look like, and the roadmap for achieving it. To understand millennial burnout, then, we have to understand what shaped—and, in many cases, burnt out—the boomers who made us.


Boomers were born between 1946 and 1964, the eighteen-year “baby boom” that began with the economic recovery spurred by World War II and accelerated as soldiers returned home. They became the biggest, and most influential, generation the United States had ever seen. Today, there are 73 million boomers in America, and 72 percent of them are white. Donald Trump is a boomer—so is Elizabeth Warren. They’re now in their sixties and seventies, parents, grandparents, and in some cases great-grandparents, retiring and grappling with the aging process. But back in the 1970s, they were in the position that many millennials find themselves in now: entering the workplace for the first time, getting married, and figuring out what raising a family might look like.

The cliched understanding of the ’70s is that society was, as a whole, in retreat: still recovering from the hangover of the ’60s, backing away from activism, and embracing a newfound focus on the self. In New York Magazine, the author Tom Wolfe famously dubbed the ’70s “The Me Decade,” describing, in hypnotic detail, boomers’ obsession with self-improvement through threesomes, spiritualism, Scientology, or organic co-ops.3 “The old alchemical dream was changing base metal into gold,” Wolfe wrote. “The new alchemical dream is: changing one’s personality—remaking, remodeling, elevating, and polishing one’s very self . . . and observing, studying, and doting on it. (Me!)” Self-care, but with a very ’70s hue.

It will surprise no one that the tendencies Wolfe described and softly lampooned in his article were actually those of the professional middle class: people with the means, financial and temporal, to pay more for groceries or spend their weekends attending deep-breathing seminars in hotel ballrooms. But beneath that supposedly self-obsessed turn was a shared anxiety, spreading across the nation: a creeping realization that after decades of prosperity, things in America seemed to be getting markedly worse.

More specifically: the train ride of growth and progress that had marked boomers’ entire lives had significantly slowed. There were multiple, interlocking reasons for this deceleration, and they all come back to versions of the same narrative, which begins something like this: Amidst the Depression, one of the most significant bills signed into law by President Franklin D. Roosevelt was the National Labor Relations Act of 1935, which granted legal protections to many employees in the private sector if and when they attempted to organize or join a union. The act also gave those unions “teeth”: from that point forward, business owners were legally required to participate in collective bargaining, in which union representatives negotiate with management to establish a pay and benefits structure that applies to all union members. If an agreement could not be reached, union members could go on strike—and be legally protected from losing their jobs—until one was. Before 1935, you could organize or join a union only at considerable risk. After 1935, you could do so with the law on your side.

A single employee could never stand up to the whims of management, but when every union employee stood together, they were all the more powerful. And between 1934 and 1950, unions leveraged that power toward favorable working conditions. Depending on the workplace, “favorable” could mean a few things, all related to the general health and well-being of the worker: increased safety on the assembly line, say, or recourse for mistreatment, or regular breaks. It could mean an hourly wage high enough to support a middle-class lifestyle, what was colloquially known as the “family wage.” Or, as stipulated by the Fair Labor Standards Act of 1938, getting paid overtime if your workweek exceeded forty-four hours, which helped prevent overwork simply because it was more expensive for the company. “Favorable” could also mean healthcare, so you wouldn’t go bankrupt paying medical bills or devote significant mental energy to worrying what would happen if you did, and a pension, which would keep you out of poverty as you aged. (It did not mean Ping-Pong tables at work, or free cab rides home after nine p.m., or catered lunches on Monday and Wednesday, or any of the other employee “perks” so often sold to millennials today as a means to paper over the fact that the employer is paying barely enough to afford rent in the city where it’s located.)

Favorable work conditions were the result of robust unions, but they would’ve been impossible without what the labor scholar Jake Rosenfeld calls “an active state”: a government invested in growing the middle class, working with big, healthy employers across the economy. Which is part of why this postwar period has become known as a time of “economic miracles,” when unprecedented growth meant “average people everywhere had reason to feel good.”4 As you aged and grew weary, you could retire with a pension and/or Social Security, easing the burden on your children. Some call it the “Great Compression,” a reference to the ways rich people became less rich and poor people became less poor as income distribution “compressed” into the middle class.

During this period, the Greatest Generation achieved the closest thing to an equitable distribution of wealth that this country has ever seen. Companies were allocating more money to paying wages and benefits; CEOs were paid relatively little, especially compared with today, and in proportion to the rest of the employees of the company. (In 1950, CEOs made about 20 times as much as the average employee; by 2013, they made more than 204 times as much.)5 Corporations enjoyed “unrivaled economic progress,” generated steady profits, invested in their employees, and experimented and innovated—in part because they were far less beholden to shareholders, who didn’t yet expect the endless, exponential growth of today. “The jobs might have been repetitive, but so were the paychecks,” the labor historian Louis Hyman writes. “Capitalism worked for nearly everyone.”6

To be clear, the benefits of the Great Compression were not equally distributed. The protections fought for by unions, and granted by the US government, did not extend to the millions of workers in the home and in the field. When Social Security was first signed into law, it excluded federal and state employees, agricultural workers, and domestic, hotel, and laundry workers; those exclusions remained in place until 1954. As Hyman points out, the reforms of the 1930s may have been a “turning point” for white men, but not for the Black men and women who, in many parts of the country, were still governed by restrictive Jim Crow laws. There were still deep pockets of poverty across the United States; employees, union or not, were periodically subject to layoffs during mini-recessions; the “family wage” was still a pipe dream to anyone working outside of a major corporation.

 

The 1950s and ’60s weren’t some immaculate golden age. But general volatility for companies—and for those on the job—was significantly lower than it is today. The political scientist Jacob Hacker argues that, following the economic and societal catastrophe of the Great Depression, “political and business leaders put in place new institutions designed to spread broadly the burden of key economic risks, including the risk of poverty in retirement, the risk of unemployment and disability, and the risk of widowhood due to the premature death of a breadwinner.”7 Some of these programs, like Social Security, would be “paid into” with every paycheck; others, like pensions, would be part of the employment contract. But the idea was the same: Some risks are just too great for the individual, on their own, to bear; instead, the risk should be spread across a much broader pool of people, thus blunting the effect when and if individual catastrophe does arrive.

When people talk about the growth of the middle class after World War II, then, they’re talking about some sort of economic utopia—a massive growth in the number of people (largely, but not exclusively, white men) across the country, with or without college degrees, who were able to find economic security and relative equality for themselves and their families.8 And as Hacker explains, it briefly expanded the “fundamental expectations” of the American Dream to millions.

This was the environment in which middle-class boomers grew up. It was also why, when some of them reached college age, they felt increasingly comfortable pushing back on the status quo. As the economist Marc Levinson explains, this era of economic stability “arguably engender[ed] the confidence that brought vocal challenges to injustices—gender discrimination, environmental degradation, repression of homosexuals—that had long existed with little public outrage.”9 But when these boomers began to protest segregation, or patriarchal norms, or American engagement in Vietnam, or even just the perceived conformity of the suburban existence, they were labeled as ungrateful and spoiled. The renowned neoconservative sociologist Edward Shils called student protesters of this era “a uniquely indulged generation”; in a passage that should sound familiar to any millennial, another sociologist, Robert Nisbet, placed the blame on “massive doses of affection, adulation, devotion, permissiveness, incessant and instant recognition of youthful ‘brightness’ by parents.”10

To these critics, whose generation had weathered the deprivations of the Great Depression and World War II, these boomers were simply ungrateful. They’d been given the keys to the American Dream but failed to cultivate any sort of work ethic, or the sort of deferred gratification that would allow them to pass their middle-class status down to the next generation. Instead, boomers “dropped out” of society in their early twenties. They opted for “occupations,” like cabdriver or house painter, instead of white-collar work. They ignored social mores, and stayed in seemingly endless graduate programs instead of pursuing honorable careers.

Or at least that was one way of looking at it, codified in books like Midge Decter’s Liberal Parents, Radical Children, released in 1975. Decter detailed the various archetypes of disappointment: There was the new graduate who “once made his parents the envy of all the rest, handsome, healthy, gifted, well-mannered, winner of a scholarship to Harvard,” who “languishes now in a hospital where the therapists feel that in another few months he might attempt a few tasks and ultimately—for the prognosis is good—even hold down a job,” and another son who “lately sent a postcard to his sister announcing that he had taken up photography and that as soon as he gets some work he plans to buy himself a piece of land and build himself a house on it.” There was a daughter living with a divorced older man, and the other daughter on her “third—or is it her fourth?—postgraduate degree.”11

This discourse—articulating the fear that white bourgeois boomers had gone “soft” in some way—was like so many conversations about child-rearing and generational expectations: moralizing in tone, but deeply rooted in class anxiety. The unique thing about the middle class, after all, is that middle-class-ness must be reproduced, reclaimed, with each generation. “In other classes, membership is transmitted by simple inheritance,” Barbara Ehrenreich writes in Fear of Falling: The Inner Life of the Middle Class. “If you are born into the upper class, you can expect to remain there for life. Sadly, too, most of those born into the lower classes can expect to remain where they started out.”12 But the middle class is different. Its form of capital “must be renewed in each individual through fresh effort and commitment. In this class, no one escapes the requirements of self-discipline and self-directed labor; they are visited, in each generation, upon the young as they were upon the parents.”13 The son of a lawyer must work just as many years as his father did, for example, to sustain the same position in society.

The middle-class boomers who refused that path were perceived as neglecting that lifelong slog to stay in the middle class. Or at least that was the view of a handful of jaundiced conservative critics writing the 1970s equivalent of a David Brooks or Bret Stephens op-ed bemoaning the state of kids these days. But that sentiment was just part of a much larger, creeping societal anxiety, one that boomers would internalize as they came of age. The postwar expansion and solidification of the American middle class—which had lasted just long enough for people to believe that it could last forever—was over.

Consider the psychological impact of this downturn on the American worker: Thanks to stagnant wages and rising inflation, the amount of money you receive every month stays the same, or even goes up, but its actual worth, along with the rest of your savings, goes down. Unemployment hit 8.5 percent in 1975, as American jobs began their slow migration overseas, where corporations could pay less (and avoid unions) to manufacture similar products. But that wasn’t all. In the wake of the civil rights and women’s movements, more people of color and women were competing for jobs, from manufacturing to medicine, that had formerly been limited to (white) men. And all of this took place against the backdrop of the Vietnam War, Watergate, the resignation of Nixon, and generalized disillusionment with the government at large. Major demographic change, declining trust in public institutions, financial precarity—all of this should sound familiar.

And so, after years of post-Depression, post–World War II collectivism, many in the middle class began to turn inward. Culturally, and somewhat superficially, that looked a lot like what Wolfe described as “The Me Decade.” But it also manifested as a rightward shift in their politics: the embrace of Reaganism and “market-oriented thinking,” the idea that the market should be allowed to work things out without government intervention, as well as the union busting and massive cuts to public programs that accompanied it.

In The Great Risk Shift, Hacker maps the concurrent development of the “Personal Responsibility Crusade,” or the increasingly popular idea, articulated in various forms across culture and society and evident in everything from the tax code to reigning economic thought, that “government should get out of the way and let people succeed or fail on their own.”14

Central to this framework, Hacker argues, was the notion that “Americans are better off dealing with economic risks on their own, without the overweening interference or expense of wider systems of risk sharing.” In other words, risk sharing, be it in the form of robust funding for higher education or company-run pensions, was presumptuous, and indulgent, and unnecessary. And then there was the argument, now so familiar to conservative thought as to feel mundane, that safety nets make people lazy, or ungrateful, or self-indulgent—and are thus, at their heart, un-American. “By protecting us from the full consequences of our choices,” Hacker explains, insurance was thought to “take away our incentive to be productive and prudent.”15

The risk shift also took the form of transferring the responsibility for training to the individual, rather than the employer. In the past, many companies would hire workers with or without college degrees and pay them while training them for a specific job. In a factory, someone hired as a packager could get trained up to inspector; a receptionist at an accounting firm could eventually get her CPA. A mining company, for example, would help fund engineering programs at local colleges, and create scholarships for students to attend them. They might not be doing the training themselves, but they were effectively paying for it—with the “risk” (i.e., the cost) falling on the company, not the worker.

These days, the vast majority of employers require applicants to shoulder the burden of their own training. We pay for undergraduate degrees, certificates, and graduate degrees, but we also foot the bill for internships and externships, in which a person “self-finance[s] their own training in the workplace,” either in the form of paying for college credits (to provide free labor in an internship that doubles as a “class”) or just providing uncompensated labor.16 Some companies still train workers out of necessity (highly specific trades, like solar panel work), and some white-collar employers foot the bill for employee MBAs. And there’s always, of course, the military. But the responsibility for the vast majority of training now falls on the worker—and even then, that training is no assurance of a job. This shift happened so gradually that it’s hard to see how profound a change it is, and how much student debt has resulted from it, but it started, however quietly, as boomers came of age.

The most obvious by-product of the risk shift is the fate of the pension, which has become so rare in today’s economy, so wholly outside what we can imagine, that for many, it feels gluttonous even to think about, let alone expect. When I think of my Granddad’s pension—which he began receiving when he retired, at age fifty-nine, from his job at 3M—my immediate reaction is that it was preposterous. But the idea of a pension was not, and is not, extravagant. It’s premised on the idea that some of the profits you help produce for a company should go not to stockholders, or the CEO, but back to longtime workers, who would continue to receive a portion of their salary even after they retire. In essence, the worker commits years of their life to making the company profitable; the company, in return, commits a portion of its profits to the worker’s years after retirement.

Combined with Social Security—which every worker pays into for their entire working life—most unionized and professional workers during the postwar period were able to retire in comfort. They weren’t sent to the literal poorhouse, as many elders were before the Depression and the passage of the Social Security Act; nor were they forced to depend on their children. But as the economy shifted in the 1970s, companies began to see the pension as a liability. Starting in 1981, some companies exchanged pensions for 401k programs, which allow workers to save pre-tax dollars for retirement. A portion of those companies also provided “matching” dollars up to a certain point: If you put one dollar in, they’ll put in anywhere from five to fifty cents.

But more and more companies began to offer nothing at all. In 1980, 46 percent of private-sector workers were covered by a pension plan. In 2019, that number had fallen to 16 percent.17 A Pew Charitable Trusts analysis of data from the 2012 Survey of Income and Program Participation found that 53 percent of private-sector employees had access to a “defined contribution” plan, like a 401k or a Roth 401k. And while many celebrate the ability to move from job to job instead of sticking with an employer simply to maximize pension benefits, that flexibility creates significant 401k “leakage”: employees forget to roll over a 401k, or withdraw it to cover “hardship” expenses, from college tuition to medical emergencies.18 And access to a plan is different from participation: Only 38 percent of private-sector workers actually enrolled in the defined contribution plans offered to them. It’s difficult, after all, to force yourself to save for future security when your present feels so incredibly insecure.

When my other set of grandparents retired in the late ’80s, they were able to live—not luxuriously, but live—on their Social Security benefits. Today, to survive on Social Security alone often means barely covering basic expenses. And yet the idea of personal responsibility has persisted: If you plan well and start saving when you first start working, theoretically you should be fine. But you might also end up living Social Security check to Social Security check, even after a lifetime of hard work. Before the Great Depression, that was the American way: abject insecurity for the vast majority of the country. That’s what the Greatest Generation lived through; those are the stories that were passed down, with reverence rivaling any war story, to their boomer children. Which is why it can feel so mind-boggling that either generation would willingly return to that American way.

But like so many contradictory ideological turns, it’s mind-boggling and yet readily understandable. Americans, after all, love the idea of the self-made, bootstrapping American whose success could be linked to dogged perseverance no matter the barriers. But the myth of the wholly self-made American, like all myths, relies on some sort of sustained willful ignorance—often perpetuated by those who’ve already benefited from it.

The endurance of the “pull yourself up by your bootstraps” narrative, for example, has always relied on people ignoring who’s allowed boots and who’s given the straps with which to pull them up. The cult of the individual elides all the ways in which the individual’s hard work was able to take root and flourish because of federally implemented programs and policies, from the Homestead Act to the G.I. Bill—programs that often excluded people who were not white or male.

But it’s easier—and more heroic—if the story of middle-class ascendancy is all about individual hard work. And no one wants to lose any of the hard-won benefits of that work. Which helps explain the popularity of the Personal Responsibility Crusade amongst both boomers and their parents: Members of the middle class were so freaked out by seeping economic instability that they started pulling the ladder up behind them. They helped elect leaders, like President Ronald Reagan, who promised to “protect” the middle class through tax cuts, even though Reagan’s policies, once put into practice, worked to defund many of the programs that had allowed the middle class to achieve that status in the first place. On the state level, they elected lawmakers who passed “right to work” legislation to defang unions, which were increasingly depicted as greedy, corrupt, and destructive to American competitiveness in the global market.

Pulling up the ladder also meant justifying the elimination of social services by demonizing “welfare queens,” and signing on to the newly accepted wisdom that programs intended to alleviate poverty actually kept people in it. It meant deep cuts to departments like Housing and Urban Development, cuts that disproportionately affected Black communities. As Maurice A. St. Pierre, writing in the Journal of Black Studies, explained in 1993, “The policies of the Reagan administration—based on the philosophy of hard work, independence, thrift, minimum government intervention in the lives of citizens, and making America strong again—affected the poor, many of whom are Black, more negatively than the economically better-off.”19

The best way to the collective good, according to Reaganism, was through an eagle-eyed focus on the cultivation of me and mine, with little thought of how the reverberations of those actions would affect their children and grandchildren in the years to come. This notion developed into the only-kinda-joking argument that (white, middle-class) boomers are, at their heart, sociopaths: lacking in empathy, egotistical, with a marked disregard for others. In his book A Generation of Sociopaths: How Baby Boomers Betrayed America, published in 2017, Bruce Gibney argues that boomers are also antisocial: not in the “doesn’t want to go to the party” sense of the term, but in the “lacks consideration for others” sense.

It’s not a scientifically rigorous hypothesis, but today, Gibney’s overarching thesis feels more and more credible. All the way back in 1989, Barbara Ehrenreich had articulated a similar idea. Tracing the development of the student protest movement, the backlash against it, and the anxiety over the newly expanded and newly threatened stability of the middle class, she argues that boomers retreated from the liberalism of the ’60s into “a meaner, more selfish outlook, hostile to the aspirations of those less fortunate.”20 They broke the “social contract” that, according to the economists Matthias Doepke and Fabrizio Zilibotti, had defined the postwar period, “and decided to look out for themselves: they invested more in their education and individual success, while deeming social protection less important.”21

Critics and scholars of this time are careful to note, however, that this was largely the trajectory of the rich and the “professional” middle class, the mix of managers and college graduates and professors and doctors and writers and consultants whose class status was “confirmed” through the production of organization and knowledge. They were mostly but not exclusively white; they were most likely suburban, but scattered throughout the United States, endemic to college and factory towns alike. They were salaried, as opposed to paid by the hour, and unlikely to be part of a union.

While these professional middle-class boomers were by no means the majority—making up just 20 percent of the population—their proximity to levers of power and cultural visibility gave them, and the ideologies they embraced and propagated, outsize force. They were “the elite,” and as Ehrenreich argues, “an elite that is conscious of its status will defend that status, even if this means abandoning, in all but rhetoric, such stated values as democracy and fairness.”22

Such hostility toward others was motivated, at least in part, by their fear of falling from their class perch, and the social humiliation that would follow.23 In order to avoid that fate, some of those young boomers—graduating into the late ’70s and early ’80s—began to adopt a different understanding of the purpose of education and consumption. Like millennials graduating into and after the Great Recession, they finished high school or college only to find that the jobs they had long assumed would be waiting were nowhere to be found. They were the first boomers to enter the workplace after the “miracle economy” had ended, and they understood, in some way, that they’d have to chart a different route than their parents toward middle-class security.

Ehrenreich calls that new mindset “the Yuppie Strategy.” Like the hipsters of the late 2000s, yuppies (or young urban professionals) were a social category to which few willingly admitted membership, mercilessly satirized in texts like The Yuppie Handbook. But their popularity—as the subject of media trend stories, as a cultural punching bag—suggested a new societal direction, at once disconcerting and aspirational.

The most stereotypical yuppies were college educated, lived in New York, and worked in finance or consulting or law. They consumed in a way that rejected the thrift of their parents, spending lavishly on gadgets (the Cuisinart) and specialty food items (sun-dried tomatoes, sushi) and status-oriented vacations (the Bahamas) and purchases (Rolexes). They got into wine, house plants, and the newly cool hobby of “jogging.” They bought up real estate in gentrifying neighborhoods, making prices unaffordable for everyone except other yuppies. (If all of this sounds like a slightly dated version of our current consumer culture, that’s because it is.)

Most important, they were unembarrassed about loving money. As an iconic Newsweek cover story put it, the yuppies had “marched through the ’60s, then dispersed into a million solitary joggers, riding the crests of their own alpha waves, and now there they go again, barely looking up from the massed gray columns of the Wall Street Journal, they speed toward the airport, advancing on the 1980s in the back seat of a limousine.” They weren’t necessarily Gordon Gekko in Wall Street, a movie released in 1987, but Gekko was a distillation of their worst traits. Unlike earlier boomers, “they did not waste time ‘finding themselves’ or joining radical movements,” Ehrenreich writes. “They plunged directly into the economic mainstream, earning and spending with equal zest.” That “yuppie” was a play on “Yippie”—the name of one of the radical protest groups of the 1960s—was part of the point. The hippies had gone corporate.

The first step of the yuppie strategy, according to Ehrenreich, was a sort of “premature pragmatism”: choosing a major based on which one would land them in a position to make a lot of money very quickly. Between the early 1970s and the early 1980s, the number of English majors declined by nearly 50 percent, as did the number majoring in the social sciences. During the same period, the number of business majors doubled.24

This “pragmatism” should be familiar to millennials. Yuppies wanted what they’d been trained to want, which is the same thing that middle-class millennials were trained to want: a middle-class lifestyle like their parents’, if not membership in an even higher socioeconomic bracket. But because of the shifting economy, a college degree was no longer enough to assure that lifestyle. They had to choose the right major, and get the right job to shore up that elite status—and start treading water fast enough to keep afloat.

Yet the “right job” was often one that exacerbated the conditions that made yuppies so frantic in the first place. As the historian Dylan Gottlieb points out, yuppies were “the beneficiaries of the unequal social order they helped to create.”25 For yuppies to keep treading water, others had to sink below the surface—economic casualties of yuppies’ on-the-job actions as stockbrokers, consultants, and corporate lawyers.

This is why yuppies became such a flash point in conversations about the ’80s and boomers in general: “Talking about yuppies was a means to make sense of the eclipse of manufacturing and the rise of the financial, professional, and service industries,” Gottlieb explains. “Yuppies were a way to signify the growing inequality between the college-educated upper middle class and those who were being left behind.”

Not all boomers were yuppies—not even close—but thinking through the actions of the yuppies gives us a window into the larger anxieties of the boomer middle class. Those anxieties took form over the course of the ’70s, metastasized in the ’80s, and became the base temperature of the ’90s. Sometimes the blame for the end of prosperity was placed on “big government,” sometimes on vague understandings of global competition. The anxiety became more acute during small economic recessions, but the “recoveries” offered only slight relief. Some boomers managed to cling to their parents’ class status, while others became part of what came to be known as the “disappearing middle class,” a.k.a. the working middle class, whose jobs and class security had been jeopardized and then, in many cases, completely destroyed. But the animating, enervating question for this generation remained the same: Where did our security go, and why can’t we get it back?

Navigating a baseline nervousness about your class position, and struggling to find a job that will allow you to try to maintain it—that was the boomer iteration of what we now know as burnout. They didn’t have cell phones or massive piles of student debt to exacerbate it, but they did have the fundamental unease, the psychological toll of dealing with everyday precarity.

Examining boomers through the lens of economic history helps explain so much: their voting habits, their turn inward. But if you’re still wondering what any of this has to do with millennial burnout, consider what came next. Surrounded by perceived threats and growing uncertainty, middle-class boomers doubled down on what they could try to control: their children.