CHAPTER I
PULLING APART: THE STATE OF DISUNITED AMERICA
Though many things change with time, some truths appear to be largely unaltered by the turning of the hands on a clock or the progression of a calendar. It has been more than two millennia, after all, since the Greek philosopher Plato gave voice to a social reality easily recognizable in each generation from his time to the present:
Any city, however small, is in fact divided into two, one the city of the poor, the other of the rich; these are at war with one another.1
Like Plato, philosophers, novelists and poets down through the ages have taken note of inequality. While the work of Dickens, mentioned previously, is perhaps the obvious referent here—and indeed we will come back to him—many others have written and spoken just as descriptively about the reality of class division. At the turn of the twentieth century, Theodore Dreiser described the divide as it ran through New York City in his novel Sister Carrie:
Along Broadway men picked their way in ulsters and umbrellas. Along the Bowery, men slouched through it with collars and hats pulled over their ears. In the former thoroughfare businessmen and travelers were making for comfortable hotels. In the latter, crowds on cold errands shifted past dingy stores.2
Seventy-five years ago, in The Grapes of Wrath, novelist John Steinbeck described in visceral prose the way that economic division so often plays out, with the rich unaware of the strain and suffering felt by those struggling to survive:
The fields were fruitful, and starving men moved on the roads. The granaries were full and the children of the poor grew up rachitic, and the pustules of pellagra swelled on their sides. The great companies did not know that the line between hunger and anger is a thin line. . . . On the highways the people moved like ants and searched for work, for food. And the anger began to ferment.3
James Baldwin, whose graphic depictions of America’s racial divide were among the most searing ever produced, famously discussed the difference between Park Avenue uptown, in Harlem, and Park Avenue midtown, where the affluent and white caroused in a universe quite their own: one city but two worlds, separated by gulfs of race and class, as foreign to one another as persons living in lands divided by vast oceans:
I still remember my first sight of New York. It was really another city when I was born—where I was born. We looked down over the Park Avenue streetcar tracks. It was Park Avenue, but I didn’t know what Park Avenue meant downtown. The Park Avenue I grew up on, which is still standing, is dark and dirty. No one would dream of opening a Tiffany’s on that Park Avenue, and when you go downtown you discover that you are literally in the white world. It is rich—or at least it looks rich. It is clean—because they collect garbage downtown. There are doormen. People walk about as though they owned where they are—and indeed they do. And it’s a great shock. It’s very hard to relate yourself to this. . . . You know—you know instinctively—that none of this is for you. You know this before you are told.4
Far from seeking to inspire the reader to rediscover great literature, my purpose here is to establish the way in which scholars, artists and public intellectuals have long recognized inequity as a serious social problem; and just as in their respective times, so too today the economic inequities to which these authors gave voice are as real as ever, and in some ways more deeply entrenched than before. This is not because such vast inequities are natural or inevitable—a commonly believed but altogether false assumption—but because of decisions we have made within political and civil society, decisions that can be just as readily undone through collective action once we recognize the source of the trouble.
Don’t misunderstand: a certain degree of inequality between persons is to be expected. We all have different talents and interests, after all; some can sing, some cannot; some are artists, some are not; some simply work harder than others. But the extremes between rich and poor to which we are exposed today are unlike anything that can be written off to the normal distribution of abilities. It is not the simple fact of inequality that should give us pause, but the extremity of the gap, the shape of those disparities, and their increasing impermeability. There is nothing normal or acceptable about those things, however much we may allow for a reasonable range of talents and rewards based upon them. Consider, too, that what we value in society—which work, for instance, is most amply rewarded in the market—has been the result of choices we’ve made, rather than some natural process. As such, the inequities we can readily see all around us reflect little about the individual worth of people at the top or bottom of the scale; rather, they reflect social power relationships that have elevated the work product of some above others, even when (as we’ll see) many of those “others” perform work generally acknowledged to be more socially valuable than the work performed by the wealthy economic minority. So even if a certain degree of inequality is inevitable in any remotely free society, we should not extrapolate from that fact the notion that the inequities that currently exist are preordained.
To get a sense of the “two cities” nature of modern American life, consider the following: In 2014, the stock market reached an all-time high.5 Corporate profits as a share of the overall economy have risen to a level unseen since the late 1920s,6 and as a share of all national income those profits are higher than at any point in recorded history.7 For the wealthiest one percent of Americans (roughly three million people), incomes rose by about a third from 2009 to 2013, largely making up for whatever stock market–related losses they suffered during the recent Great Recession.8 And yet, while corporate profits are at their highest level in the past eighty-five years, worker compensation as a share of the economy remains at its lowest point in the past sixty-five. For millions of average working people the recession never really ended: far from a one-third increase in average wages, income for the bottom ninety-nine percent of us rose only four-tenths of one percent (0.4 percent) from 2009 to 2013.9 In other words, virtually all the income gains during the first few years of the recovery flowed to the nation’s top one percent.10
Even the gains that did accrue outside the top one percent were received almost exclusively by the next nine percent. From 2009 to 2012, the bottom nine-tenths of the wage-earning population saw their incomes actually fall, meaning that a seemingly impossible but arithmetically accurate 116 percent of all income gains in the first years of the recovery went to the highest-earning tenth of Americans.11 In 2013, hourly wages grew at only one-fifth the rate of corporate profits, barely staying ahead of inflation, suggesting that the economy is producing far higher returns at the top than in the middle and bottom of the distribution.12 Although economists have pointed to recent Labor Department data to suggest that things are getting better—so, for instance, as of January 2015, wages seemed to be finally ticking upward—it remains to be seen whether this trend will last, and whether the wage gains will extend to the lowest rungs of the job ladder.13 Despite claims of recovery, from January through April wage growth bounced around from 0.5 percent down to 0.1 percent,14 back to 0.3 percent15 and finally to virtually no growth at all by late spring.16 But even if higher gains manage to return, such that wage growth might reach as much as 2.2 percent per year, this would remain well below normal economic recovery targets,17 and after inflation would come to only about one percent annually in real terms, hardly sufficient to reverse the slide of the past decade.18
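How can one group capture more than 100 percent of all income gains? The arithmetic is worth spelling out, and the short sketch below does so with deliberately simplified, hypothetical dollar figures (not the actual 2009–2012 data): when the bottom ninety percent loses ground, the top decile's gains can exceed the economy-wide net gain.

```python
# Illustration of how a group can capture more than 100 percent of income
# gains. The dollar figures are hypothetical, chosen only to show the math.

top_decile_gain = 116.0    # income gained by the top 10 percent of earners
bottom_90_change = -16.0   # the bottom 90 percent collectively LOST income

net_gain = top_decile_gain + bottom_90_change   # economy-wide net gain: 100.0

share_to_top = top_decile_gain / net_gain
print(f"Top decile's share of all net income gains: {share_to_top:.0%}")
# -> 116%. Because the denominator (net gains) shrinks when most incomes
#    fall, the top group's share can arithmetically exceed 100 percent.
```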
Even more disturbing, there is good reason to believe that the job and wage recovery of the last few months (as of this writing) won’t last long, if current rumblings from the Federal Reserve—the nation’s central bank—are to be believed. Although the Fed has been holding interest rates down for years in the hopes of spurring businesses and consumers to borrow so as to boost consumption, production and economic growth, now that things are beginning to look up, the Fed seems concerned that more jobs and rising wages might push up prices. Recently, Fed chair Janet Yellen signaled the bank’s intention to begin raising interest rates so as to keep inflation low by putting the brakes on borrowing a bit, and thus on economic growth. The thinking here is that if the labor markets tighten too much and employee pay grows “too fast” (a concept that must seem laughable to workers given the last two decades of wage stagnation), people will spend their increased earnings and inflation will spiral out of control, thereby damaging the economy. And this is feared, even as wage-related inflation has been largely nonexistent for several decades. Ultimately, if the Fed hikes interest rates (and this appears a certainty as of this writing), the result could be the loss of hundreds of thousands of jobs that would otherwise have been created, as borrowing for the purpose of new job creation and business expansion becomes too expensive.19 Such a move could easily choke off the job and wage recovery, long before it has time to filter throughout the ranks of the working class. In short, despite recent signs that things may be getting better for American workers, the long-term prospects for fundamental gains in wages and living conditions remain sketchy at best.
Joblessness and Underemployment in Post-Recession America
While the rich ride high, there are still millions of Americans whose economic situation is grim. As of April 2015, there were still about 8.5 million people who were officially unemployed (which means jobless and actively seeking employment), and several million more who say they want a job but have given up looking for one at present. On top of these there are about 6.6 million additional workers who say they desire full-time employment but are having to settle for part-time jobs.20 So although the official unemployment rate is only 5.4 percent—a definite improvement since early 2013 when it was still hovering around eight percent, and far superior to the ten percent rate in 201021—once we consider the plight of involuntary part-timers and discouraged workers, the true rate of joblessness and underemployment is likely to be nearly twice as high.
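A rough back-of-the-envelope calculation using only the figures above shows why the broader measure approaches twice the official rate. This is a sketch, not the Bureau of Labor Statistics' official U-6 methodology; the 2.5 million figure for discouraged workers is an assumption standing in for the "several million" cited above.

```python
# Rough estimate of joblessness plus underemployment from the figures cited.
# Not the official BLS U-6 formula; the discouraged-worker count is assumed.

unemployed = 8.5e6              # officially jobless and actively searching
involuntary_part_time = 6.6e6   # want full-time work, stuck in part-time jobs
discouraged = 2.5e6             # assumption: want a job, gave up looking

labor_force = unemployed / 0.054    # ~157 million, implied by the 5.4% rate

broad_rate = (unemployed + involuntary_part_time + discouraged) / \
             (labor_force + discouraged)
print(f"Official rate: 5.4%  |  Broader rate: {broad_rate:.1%}")  # ~11.0%
```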
And even though recent reports suggest that jobs are beginning to come back to the private sector, it is worth noting how these jobs differ from those lost during the slowdown. Although jobs in lower-wage industries (paying less than $13.33 per hour) represented only twenty-two percent of job losses during the recession, they have accounted for forty-four percent of new jobs since 2010. Today, lower-wage industries are employing nearly two million more workers than they were in 2008. As for mid-range-paying jobs (paying as high as $20 per hour), these have actually slipped in the recovery, and now account for nearly a million fewer jobs than at the outset of the recession. And while higher-paying jobs (paying up to $32 per hour on average) represented more than forty percent of job losses in the recession, they have accounted for only thirty percent of recent job growth. As a result, there are nearly a million fewer higher-paying jobs now than in 2008 when the recession began.22 In other words, even when people are finding work, it is often at income levels well below those they had been earning prior to the economic collapse. In the most recent jobs report as of this writing, only 1,000 new jobs were created in manufacturing, out of 225,000 new jobs in all (this, after an actual decline in manufacturing positions during the previous month). Meanwhile, some of the biggest gains were in areas such as retail sales, low-paid health care jobs like physician’s assistants and home health care aides, temporary services, and jobs in restaurants and bars. Indeed, more than half of all jobs created in the most recent month were in these categories.23 On average, new jobs created during 2014 paid about twenty-three percent less than the jobs lost during the recession.24 Unless the recent bump in wages and employment continues and accelerates, the hollowing out of the middle class is not likely to be arrested, nor is rising income inequality likely to abate.
Although the job picture has been bleak for Americans of all races and ethnicities, communities of color are having an especially difficult time. Latinos are about sixty percent more likely than whites to be unemployed (so much for the often-heard refrain that they’re “taking all the good jobs”), and African Americans are almost two and a half times as likely as whites to be out of work: nearly ten percent unemployment for blacks as opposed to just a bit more than four percent for whites.25 Even when comparing only whites and persons of color possessing the same degree of education, racial gaps persist. Latinos and Latinas with a high school diploma have an unemployment rate more than twenty percent higher than that of similar whites, while Latino/a college graduates are fifty percent more likely than comparable white graduates to be out of work. Meanwhile, black high school graduates are twice as likely as comparable whites to be unemployed, and even black folks with college degrees are seventy percent more likely than white college graduates to be out of work.26 Things are especially troubling for recent black college graduates. Despite persistent cries about “reverse discrimination” from whites who seem to feel they are being bumped from jobs by less qualified African Americans, recent black college grads are more than twice as likely as comparable whites to be unemployed. For graduates between the ages of twenty-two and twenty-seven, unemployment rates for blacks are 12.4 percent, compared to only 4.9 percent for comparable whites—a quintupling of the jobless gap between white and black recent college graduates just since 2007.27 This pattern holds true regardless of major: even black folks who obtained engineering degrees are nearly twice as likely as white engineering grads to be out of work.28 Among millennials (ages eighteen to thirty-four), racial disparities remain stark: white male high school dropouts have the same chance of finding work as black males with two years of college.29
For many, the stint on the unemployment line is no brief interlude between jobs. Millions find themselves out of work for half a year, a full year, even two full years, no matter how hard they look for a job. As of April 2015, nearly thirty percent of the unemployed—about 2.5 million people in all—had been out of work more than twenty-six weeks, despite actively looking for a job the entire time, and forty-two percent had been out of work for at least fifteen weeks. Indeed, unemployed persons are now just as likely to be out of work for twenty-seven weeks or more as for less than five weeks: long-term unemployment has become as prevalent as short-term joblessness.30 And while unemployment is always stressful, long-term unemployment is especially crushing. Those who suffer this fate will typically experience impaired emotional and physical well-being, significantly elevated rates of suicide, and substantial family dysfunction as a result of their prolonged joblessness.31 Even when the long-term unemployed finally do find work, it is usually at wages well below what they were earning previously, and often without the benefits available in their old jobs.32
Poverty, Wage Stagnation and Deprivation Amid “Recovery”
By the end of 2013, there were forty-five million Americans officially living below the poverty line—about one in every seven persons in the country.33 To understand what this means in practical terms, consider that to be officially poor that year a single individual would have to have made less than $11,188; the threshold for a two-person household averaged $15,142; for three persons $18,552, and for a family of four $23,834.34 In other words, if you made even $12,000 in 2013 as a single individual, you would not be considered poor in America despite how incredibly difficult it would be to live on such an income. Likewise, if you and your partner had one child and your combined income reached even $19,000, you would no longer be considered poor, despite your precarious economic station; so too in a family of four earning even $24,000 a year. So when we speak of poverty, we are talking about substantial financial insecurity. Worse still, a growing number of Americans are not simply poor but are living in extreme poverty, defined as income less than half the poverty lines above. As of 2013, nearly twenty million people lived in this state of destitution,35 which is an increase of about eight million since 2000.36
While the national poverty figures are disturbing enough, the picture is even more distressing for persons of color. Although whites make up the largest group of people living in poverty at nineteen million, or forty-one percent of the total,37 the rate of poverty is far higher for Americans of color. According to the Census Bureau, African Americans are nearly three times as likely as whites to be poor, and Latinos are 2.5 times as likely as whites to live in poverty. Approximately one in four Latinos and twenty-seven percent of blacks officially live below the poverty line.38 Among American Indians and Alaska Natives, between twenty-five and thirty percent are poor, and in some indigenous communities—particularly reservation lands on which about a third of the nation’s Indian peoples live—nearly half of the community lives in poverty.39
Although some have pointed to Asian American income as proof of equal opportunity in America—and to suggest that there is something wrong with blacks, Latinos and Indian folks who lag behind—the data marshaled for this purpose is misleading. To begin, as mostly voluntary migrants, Asian Americans are a more self-selected group than blacks, Latinos or indigenous persons. They include more persons who came to the country with middle-class status, had college degrees, or were in the process of obtaining those degrees upon arrival. So naturally, we would expect Asian Americans in the aggregate to appear more “successful” than groups whose members represent more of a cross-section of class status and experience. That said, when we actually examine Asian American status relative to white status, we discover persistent evidence that Asian folk too, despite claims of their “success,” are struggling and lag behind the dominant group. For instance, according to the most recent data on earnings, when we compare whites and Asians of the same age and with the same degree of education, whites routinely earn more than their Asian American counterparts. For those with high school diplomas only, white males between the ages of thirty-five and thirty-nine earn twenty-three percent more than comparable Asians—a gap that grows to a nearly fifty percent advantage between the ages of forty and forty-four. For those with undergraduate degrees, white males between thirty and thirty-four earn twenty-two percent more than comparable Asian Americans, and by the time those white men are in their mid-forties they are earning forty-six percent more than their Asian American counterparts—almost $30,000 more each year, on average.40
Additionally, claims that Asian American households are doing as well or better than even white households—because they have higher aggregate income than white households nationwide and poverty rates that are only slightly higher than whites’—rely on data that masks substantial disparities at the state and local level.41 About half of all Asian Americans live in the higher-income (and higher-cost-of-living) West, with roughly sixty percent residing in just six states: California, Hawaii, New York, New Jersey, Illinois and Washington. As a result, they will tend to have higher incomes and lower poverty rates than members of other groups who are more geographically dispersed in much lower-income and lower-cost areas.42 However, if we examine income and poverty data in the places where so many Asian Americans actually live, thereby comparing them to others who live in those same higher-income areas, things change dramatically. In cities with heavy Asian American presence like Los Angeles, San Francisco and New York, Asian American poverty rates are roughly double the rates for whites.43 In other words, despite claims of Asian “success” and the attempts of some to cast them as a “model minority” to be emulated by other more presumably problematic ones, Asian Americans too are struggling relative to whites.
As with poverty in general, extreme poverty is a particular concern for people of color. In fact, blacks and Hispanics are more likely to live in extreme poverty than whites are to be poor at all: one in eight African Americans are extremely poor and one in eleven Latinos are living at half the poverty line or below, compared to only about one in twenty-five whites who are that impoverished.44 Among the impoverished, people of color are also far more likely to live in high-poverty neighborhoods than are whites, further deepening the severity of their economic condition and limiting their ability to escape impoverishment. Impoverished African Americans are more than seven times as likely as poor whites to live in high-poverty neighborhoods, while poor Latinos are nearly six times as likely to do so.45 Although deprivation is always stressful for those experiencing it, living in communities of heavily concentrated poverty magnifies those stresses many times over. Such communities have fewer hospitals per capita than other communities, are less likely to have access to healthy food, and are less likely to have adequately resourced schools, in part because school funding is so over-reliant on property taxes in most places. Residents in concentrated-poverty neighborhoods are also cut off from the job and opportunity networks that exist for middle-class families, and even for lower-income families who live in communities where middle-class families are still largely present. Additionally, impoverished urban communities are far more likely to be places where there are waste facilities that directly compromise the health of residents, particularly children and the elderly.46
An especially disturbing number of the nation’s poor are children. About fifteen million children, or nearly one in five kids in the U.S., live in poverty,47 and since 2013, slightly more than half of all children in the nation’s public schools live in poverty.48 Far from an abstract concept, poverty has a long-term impact on child development. Research has found that children in poverty are significantly more likely than their middle-class and affluent peers to show signs of impaired brain development in the pre-frontal cortex, which is critical for problem-solving and analytical skills.49 Independent of other factors known to impact neural development, poverty appears to have a uniquely debilitating impact on kids, due to the stresses of life in a low-income family and the subsequent lack of opportunities to which such children are exposed. Additional research finds that growing up in poverty can result in an unhealthy level of stress hormones being released into the bloodstream, which can impair neural development and contribute to a number of health problems, including heart disease, hypertension and stroke.50 The poor, and especially the extremely poor, are in many cases subjected to environments that produce a form of Post-Traumatic Stress Disorder (PTSD) similar to that experienced by combat veterans.51
Even those who aren’t “poor” are struggling to keep their heads above water. According to one recent survey, roughly three in four Americans live paycheck to paycheck, meaning they either have no savings or so little in savings that they could not withstand a layoff or medical emergency. Only one in four have sufficient savings to cover six months of expenses, half could only survive a three-month loss of income, and about twenty-seven percent have no savings at all.52 When we include those who are no more than fifty percent above the poverty line, and are therefore intensely vulnerable to a layoff or economic downturn, more than seventy-six million Americans, or nearly one in four, are poor or near poor.53
Meanwhile, even as local papers across the country herald the beginnings of a new boom in housing construction and a rejuvenated real estate market, at least fourteen million Americans continue to face the real prospect of losing their homes, equity, and access to credit due to foreclosure.54 They are unable to make their mortgage payments and have received little or no relief from the government, even as that government bailed out the very bankers whose actions helped to precipitate so much of the pain felt by homeowners. Additionally, rents in many areas have soared past the point of affordability,55 and in other cases tenants are being evicted from apartments under cover of local nuisance laws, solely for calling police “too many times”—a practice that is forcing poor women facing domestic violence to live with their abusers rather than face being put on the street.56 Having lost their homes to foreclosure, tens of thousands in the past several years have spent some portion of time without a place to live or in makeshift tent cities reminiscent of those that sprouted up with regularity during the 1930s,57 and on any given night, more than 600,000 Americans are homeless.58 Even as the nation’s wealthiest often have the option of luxuriating in one of many homes, Americans without housing security are quite literally dying on the street from exposure to the winter cold.59 As of 2013, 2.5 million children (an all-time record) experienced homelessness at some point during the year—approximately one of every thirty children in America.60
Despite assurances by billionaire investor Peter Schiff that “people don’t go hungry in a capitalist economy”—this from the same guy who says “mentally retarded” people should be paid $2 per hour—food insecurity and inadequate nutrition persist for far too many Americans.61 In the last few years there have been as many as seventeen million households, comprising more than forty-five million people, that faced real difficulty affording adequate nutrition.62 As of 2013, there were about five million people living in households with such low food security that they had to substantially reduce their food intake, skip meals altogether on certain days, or in some cases even go several days at a time without eating, all because of the financial condition of their families.63 Meanwhile, homeless Americans who rummage through garbage cans in search of food are subjected to arrest, as happened to homeless veteran James Kelly in Houston last year,64 while McDonald’s is counseling its employees to break food into smaller pieces so as to “keep them full,” rather than paying them enough of a wage to allow them to buy more food.65 As for health care, although the Affordable Care Act has removed millions from the ranks of the uninsured—with about ten million people added to the health care rolls just since 201366—there remain millions more who are falling through the cracks of the system due to the refusal of mostly conservative governors to extend Medicaid in more than twenty states.67 In a country where most personal bankruptcies are caused by a medical emergency for which patients have insufficient funds, failing to ensure comprehensive and affordable care for all is to force too many of the ill into destitution.68
Still more worry about how, or if, they’ll be able to send their children to college, especially as higher education continues its three-decades-long drift toward loan-based financing and away from grants, thereby burdening students with crushing debt. Today’s typical college graduate finishes school with nearly four times the debt of counterparts from the early 1990s—about $35,000 as compared to a bit more than $9,000.69 In 1987, tax dollars covered more than three-fourths of the cost of operating public colleges and universities, but by 2012, states had slashed their support for higher education to such an extent that only fifty-three percent of such costs were covered by taxpayers, with the rest made up by tuition and fee hikes.70 Average tuition and fees for both public and private colleges have more than doubled since the early 1980s, widening the gap between the share of affluent kids and poor kids who are able to attend.71
Some of the problems that we can see so clearly in today’s economy—especially wage stagnation—have been a long time coming. Ever since the early 1970s, real wages for average American workers have been largely flat.72 This has been true even as average worker productivity has roughly doubled in that period.73 If workers’ wages had kept pace with productivity and continued to grow along with wages at the top, as they had for the previous several decades, incomes for middle-class Americans would be about $18,000 higher than they are today.74 While standard economic theory holds that wages and productivity should rise together as workers earn a commensurate share of the value they produce, this relationship between pay levels and productivity has been shredded over the past few decades. Likewise, wages have remained flat even as employees are working more hours than ever before. From the early 1970s until 2007, the average annual number of hours worked rose by seventeen percent.75 Workers are working harder and producing more than ever, but they are making few if any real gains in financial well-being.
Things have only gotten worse since the most recent recession. From 2007 to 2012, wages fell for the bottom seventy percent of the wage distribution despite productivity growth of 7.7 percent. When these data are combined with the wage stagnation that had already occurred since 2000, it is no exaggeration to say that for most workers, the first ten years of the new millennium amounted to a lost decade for wages.76 Median income today is $3,600 lower than it was in 2001, adjusted for inflation, and has fallen $2,100 just since 2009.77 Things have been especially grim in terms of income stagnation for American men. In 1972, the median income for men between the ages of thirty-five and forty-four was equivalent to more than $54,000 today. But now, in large part because of the decline in manufacturing employment (a subject to which we’ll return), the median for such men stands at just above $45,000.78 The only reason that median income has been able to nudge up slightly for American families on the whole has been the entry of more women into the workforce; there are more two-earner families today than in the early 1970s. On the one hand, expanded opportunities for American women are obviously a positive and needed development. But on the other, if families today need two incomes to remain at the same level they enjoyed forty years ago with only one income-earner, something is clearly wrong with the larger economy.
Income and Wealth Inequality: Long-Term Trends and Current Realities
Among the things most Americans have long seemed to believe about our country is the idea that in some sense, we’re all part of one big team. Nods to national unity are common, and surely it isn’t hard to recall how, in the days following the terrorist attacks of 9/11, millions of Americans slapped bumper stickers on their cars sporting the slogan UNITED WE STAND. One part nationalistic and militaristic hubris, one part a genuine expression of emotional empathy with the victims and their families, the slogan and the concept behind it spoke to a deep-seated component of the nation’s ideology: the notion of reciprocity, or, more simply put, the idea that “we’re all in this together.”
Of course, in the wake of the 9/11 tragedy not all Americans shared this sentiment equally, and there was a marked gap between the willingness of white Americans to adorn their vehicles in such a manner and that of people of color. Non-whites, more viscerally aware of the ongoing inequities between their own life conditions and those of most in the white majority, were not as likely to sport such stickers or engage in the flag-waving that became so commonplace in the aftermath of the attacks. Unity, after all, is not something that can be wished into existence, or something that manifests simply because a tragedy has transpired. For many African Americans and other people of color there had been many 9/11s, so to speak, throughout their history on this continent, none of which had brought real unity or equity of experience.
That said, and with exceptions duly noted, the notion of unity, togetherness and reciprocity is something to which we have all been exposed and to varying degrees have likely internalized. While the ideology of unity and reciprocity hardly fits with the lived reality of those belonging to marginalized groups, the aspirational if not existential lure of the dominant narrative remains strong, so much so that many of our most recognizable national slogans over the years conjure this notion, from “What’s good for General Motors is good for America” to “A rising tide lifts all boats.”
Yet, in recent years, the idea that America is one big team has been increasingly difficult to accept, because of the rapidity with which disparities of income and wealth have been growing, opening up a vast chasm between the nation’s wealthiest and everyone else. Between late 2007 and 2009, the economy imploded, doubling unemployment rates and destroying more than a third of the nation’s housing value (particularly among the middle and working class), and yet Wall Street profits rose by 720 percent.79 When the majority of the American people can be thrown into the worst economic situation of the past seventy-five years, even as a small economic minority can enjoy massive profits due to their deliberate and predatory actions, the idea of America being one big unified homeland becomes almost impossible to swallow.
Economic injustice, though increasingly exposed since the onset of the Great Recession, has been emerging as a serious and intractable national problem for several decades. Whereas incomes of those in all income quintiles grew together from the late 1940s until the late 1970s, after that period, incomes for all but those at the top began to stagnate.80 By 2007, right before the collapse of the economy, the richest one percent of Americans was already receiving twenty-three percent of national income. This nearly one-quarter share of national income was the highest percentage received by the top one percent since immediately prior to the onset of the Great Depression,81 and nearly three times the share that was being received by this wealthy group just thirty-one years earlier in 1976.82 From 1979 to 2007, the richest one percent of Americans (2.5 to 2.8 million people during that time) nearly quadrupled their average incomes. Meanwhile, the middle three-fifths of Americans only saw a forty percent gain in average incomes over that time—less than 1.5 percent income growth per year.83 From 1993 to 2012, adjusted for inflation, real incomes for the bottom ninety-nine percent of American families grew by less than seven percent while incomes for the wealthiest one percent nearly doubled.84
To put income inequality in graphic terms, consider that in 2013, 165,000 Wall Street bankers took home average bonuses of $162,000 each, resulting in an overall bonus bonanza of nearly $27 billion: that’s nearly double the amount taken home annually by all 1.1 million Americans working full-time at the minimum wage combined.85 Even more disturbing, the most successful hedge fund managers—a group that manages investment portfolios for the super-rich, and about whom there will be more to say later—quite typically can make in one single hour of work what the average American family earns in twenty-one years.86
Sadly, income inequality is only the tip of the iceberg when it comes to understanding the depths of disparity that plague modern America. Much more substantial is the vast inequity in tangible assets from which families derive long-term financial security. Wealth disparities, in other words, represent the much larger portion of the iceberg—the part that remains under the metaphorical water, often unseen. Even before the economy cratered, disparities in wealth—from housing value to stocks and bonds to commercial real estate—were already significant. Once the housing bubble burst, taking with it about $6 trillion in lost assets (and often the only assets held by middle-class and working-class Americans), that gulf grew even wider.87 As of 2010, the bottom half of the American population owned only about one percent of all national wealth, while the wealthiest one percent possessed more than a third of all wealth in the nation.88 As for those assets most likely to generate substantial income, meaning investment assets like stocks, financial securities, and business equity and trusts, the wealthiest one percent of Americans own just over half of all such assets in the nation.89 Today, wealth inequality in America stands at a level double that of the Roman Empire, where the top one percent owned about sixteen percent of all assets.90
However significant this level of disparity may sound, it actually understates the problem. Within the top one percent of wealth holders there is a big difference between those who barely make it into this group, and those at the pinnacle who reside in the top one-tenth (0.1) or top one-hundredth (0.01) of one percent. As of 2012, the top one-tenth of one percent (roughly 160,000 families) owned about twenty-two percent of the nation’s assets, which is equal to the share of national wealth possessed by the poorest ninety percent of Americans. Meanwhile, the richest one-hundredth of a percent (about 16,000 families) owned 11.2 percent of all national assets.91
To visualize what this means, we can analogize the distribution of wealth to the distribution of seats in a football stadium. Let’s imagine we were going to the Super Bowl in a stadium that seats 65,000 people. If the seats in the stadium were distributed the way that wealth is in America, just sixty-five fans would get to share 14,300 of the seats in the stadium. In fact, forget sharing seats: they could knock out the seats entirely and bring in big lounge chairs, umbrellas, Jacuzzis and their own personal cabanas instead. They would have so much space they could play Frisbee during commercial breaks or time-outs if they felt like it, never worrying about bumping up against the rest of us. Six or seven of these people would actually be able to cordon off 7,280 of these seats for themselves. This would leave the other fifty-seven or fifty-eight fans within the top 0.1 percent to fight over the remaining 7,020 seats (tough, but I suppose they’d manage). Meanwhile, the poorest half of the fans, or roughly 32,500 of them, would be struggling to fit into only 650 seats, representing the one percent of the seats they own. Think of it as the absolute worst musical chairs game ever. People would have to sit on top of each other, more than fifty deep, just to make the math work. This is the extent of wealth inequality in America today; only in the real world, the disparities obviously have more consequence than the distribution of stadium seats.
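The seat counts in that analogy follow directly from the wealth shares cited earlier, as this quick check confirms (a minimal sketch using only the 65,000-seat capacity and the shares given in the text):

```python
# Verifying the stadium analogy: allocating 65,000 seats by U.S. wealth shares.
seats = 65_000

print(seats * 0.001)           # 65 fans make up the top 0.1% of the stadium
print(seats * 0.22)            # 14,300 seats: their 22% share of all wealth
print(seats * 0.112)           # 7,280 seats: the 11.2% held by the top 0.01%
print(seats * (0.22 - 0.112))  # 7,020 seats left for the rest of the top 0.1%
print(seats * 0.01)            # 650 seats: the 1% owned by the poorest half
print((seats * 0.5) / (seats * 0.01))  # ~50 fans per seat in that section
```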
For a few more examples to illustrate the astounding depths of wealth inequality in modern America, consider:
•As of 2014, the four hundred wealthiest Americans were worth $2.3 trillion. This is more than double what the same group was worth a decade ago, $300 billion more than what they were worth just one year earlier,92 and $600 billion more than in 2012.93 The average member of the Forbes 400 now has 70,000 times the wealth of the typical American family, no doubt because they have worked exactly 70,000 times harder or are exactly 70,000 times smarter.94
•As of 2013, the wealthiest thirty people in the United States owned $792 billion worth of assets, which was the same amount owned by the poorest half of Americans: about 157 million people in all.95
•From 2011 to 2014, nine of the wealthiest people in America—Bill Gates, Warren Buffett, Mark Zuckerberg, the two Koch brothers and the four principal Walton heirs—gained an average of over $13 billion from capital gains on pre-existing assets. These gains did not flow from new work on their part, nor an increase in their personal productivity or particular genius. They weren’t working more hours, and they didn’t come up with some new and innovative technological breakthrough in that time. They simply owned a bunch of stuff, and over a three-year period that stuff became more valuable because of gains in the stock market. Considering that the median income for American workers was $51,000 in 2013, it would take a quarter of a million years—which is about 50,000 more years than humans have even existed—for the typical American to earn as much as the average capital gain earned by these nine people just since 2011.96
•For a visual understanding of what all that means, consider that if the typical American stretched his or her annual income out, in one-dollar bills, from end to end, it would stretch roughly 25,500 feet, which is about 4.8 miles. Over three years at the same income, those bills would stretch about 14.5 miles. Meanwhile, if we took the average amount of money gained by those nine super-rich Americans mentioned above over that same three-year period, and stretched it out, in one-dollar bills, from end to end, the money chain would stretch 1.2 million miles—a money chain long enough to circle the earth forty-eight times,97 or alternately, to stretch from the earth to the moon and back twice, and then stretch around the globe a few more times for good measure.98 (A short calculation following this list checks these figures.)
•In all, the six heirs to the Walmart fortune are worth as much as the bottom forty percent of the American population, or roughly 120 million people.99 In fact, the Walton heirs, who are rich simply because of the family into which they were born (or in the case of Christy Walton, the one into which she married), have so much wealth at their disposal that they could buy every house, condo and townhome in Seattle or Dallas or Miami and still have $40 billion to spare, with which they could buy all the homes in Anaheim, California (if they love Disneyland), or Napa (if they really like wine). Just to put the Waltons’ wealth in perspective, while the six heirs could purchase every home in these major U.S. cities, someone like Oprah Winfrey (whom most people think of as fabulously rich) could only afford to buy up all the homes in Mokena, Illinois, wherever that is.100 In fact, the combined wealth of Oprah, Steven Spielberg, Donald Trump, Ted Turner, Howard Schultz (the longtime chief executive of Starbucks), Mark Cuban (owner of the Dallas Mavericks), Jerry Jones (owner of the Dallas Cowboys), Phil Knight (co-founder of Nike), and Mark Zuckerberg (founder of Facebook)—a total of about $77.5 billion as of 2015—does not equal even half the wealth held by the Walton heirs. Even if we added the wealth of Bill Gates to the mix—the world’s wealthiest individual—the combined wealth of these ten would still fall about $15 billion short of Walton money.101 Meanwhile, most Walmart employees work for wages that leave them near the poverty line if not below it, forcing many of them to rely on food stamps to supplement their meager incomes, as we’ll explore later.
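The dollar-bill arithmetic in the bullets above is easy to reproduce. The sketch below uses the actual 6.14-inch length of U.S. currency; the chapter's rounder figures imply roughly six inches per bill, so its totals run slightly lower, but the orders of magnitude are identical:

```python
# Checking the "money chain" arithmetic: years-to-earn and miles of bills.
# Uses the true 6.14-inch bill length; the text's round numbers imply ~6 in.

BILL_INCHES = 6.14
EARTH_CIRCUMFERENCE_MILES = 24_901

median_income = 51_000   # median U.S. worker income, 2013 (cited above)
average_gain = 13e9      # average 2011-2014 capital gain of the nine listed

def miles_of_bills(dollars):
    """Length, in miles, of that many one-dollar bills laid end to end."""
    return dollars * BILL_INCHES / 12 / 5280

print(f"Years to earn the gain:  {average_gain / median_income:,.0f}")  # ~255,000
print(f"One year's income:       {miles_of_bills(median_income):.1f} miles")
print(f"Three years' income:     {miles_of_bills(3 * median_income):.1f} miles")
print(f"The capital gain itself: {miles_of_bills(average_gain):,.0f} miles")
print(f"Trips around the earth:  "
      f"{miles_of_bills(average_gain) / EARTH_CIRCUMFERENCE_MILES:.0f}")  # ~51
```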
Wealth disparities are especially stark when examined racially. Because of the nation’s history of enslavement, lynching, segregation and overt racial discrimination, families of color did not have the same opportunity as whites to accumulate land and other tangible assets. Although civil rights laws were passed in the 1960s to prohibit formal discrimination in employment and housing, the head start afforded to whites over many generations obviously did not evaporate simply because anti-discrimination laws were passed. Due to a history of unequal opportunity to accumulate assets,102 and the racially disproportionate impact of the recession on the real estate values of people of color,103 the median net worth of white American households as of 2011 stood at 15.7 times the median for blacks and 13.3 times the median for Latinos.104 As for financial assets such as stocks and investments other than home equity, the ratio is nearly two hundred to one in favor of whites, with the median financial wealth for whites standing at about $36,000 and the median for blacks a virtually non-existent $200, which in most cases represents merely the money in their bank accounts.105 Even when black households are comparable to white households in terms of income, vast wealth discrepancies remain due to a history of unequal opportunity to accumulate and pass down assets. Comparing households that are middle class in terms of income, whites still have three times as much wealth as blacks, and among those in the top ten percent of income earners, white households have a nearly five-to-one advantage over black households.106 Most disturbing, white families with a high school dropout as the head of household have a median net worth of $51,300, while the median for black families with college-educated heads of household is only $25,900.107 In other words, black households headed by a college graduate have half the net worth of white households headed by someone who never finished high school.
But it’s not only the weight of past racism that explains current wealth gaps. In recent years, wealth disparities between whites and blacks have been intensified because of blatant discrimination in mortgage lending. During the run-up to the housing collapse, even African American borrowers with solid credit were given subprime, high-interest loans, often by lenders who were deliberately targeting them for these purposes, such as Wells Fargo. While only about six percent of white borrowers with credit scores above 660 were given subprime loans, over twenty-one percent of blacks with comparable credit received these higher-cost mortgages.108 Most recently, discrimination testing conducted by the Fair Housing Justice Center in New York uncovered strong evidence of racial bias against potential homebuyers of color. According to a recent lawsuit against M&T Bank, prompted by the testing:
[The bank] sent out trained actors to explore whether white and non-white homebuyers would be treated differently when trying to prequalify for a mortgage. All followed a similar script, telling bank officers they were married with no children and were first-time home-buyers. The black, Latino and Asian testers presented slightly better qualifications when it came to income, credit and additional financial assets. In nine separate interactions recorded either with a camera or an audio device, employees at M&T Bank’s New York City loan office can be seen or heard treating the white applicants differently than the others, according to the suit. In one instance, a black candidate was told she did not have enough savings to buy a home. A white applicant with slightly lower income and credit scores and $9,000 less in savings was pre-approved for a loan. In another case, a Latina candidate was told she would qualify for a mortgage $125,000 less than the test’s white candidate with lower income, poorer credit and less cash.109
Although the type of disparate treatment evident in the M&T case may not be as egregious as that of Wells Fargo, which a few years ago had been deliberately steering low-income African Americans (whom they called “mud people”) into so-called “ghetto loans,” it nonetheless suggests ongoing obstacles to equal housing opportunity.110 As such, a significant portion of disparities in home ownership and net worth must be laid at the feet of discrimination in the present, and not seen merely as the residue of the past. The combined effects of past and present racial bias on the financial position of persons of color should not be underestimated as we examine why so few black and brown folks sit atop the nation’s economic structure. Of all persons in the top one percent of national wealth holders, ninety-six percent are white.111 Indeed, the four hundred wealthiest white people in America were worth approximately $2 trillion as of 2014: approximately the same amount as all forty million African Americans put together—no doubt because those four hundred white people have worked just as hard as all black people combined.112
Despite the evidence just examined, however, many continue to insist that America is a land of opportunity, and uniquely so, compared to the rest of the world. Such persons claim that even the poorest here are better off than virtually anyone else in the world, and that inequities between the haves and have-nots are smaller than they are elsewhere. But there is growing reason to doubt this rosy image. As for poverty, among industrialized nations, the United States has the third-largest percentage of citizens living at half or less of the national median income—the international standard for determining poverty. Only Mexico and Turkey rate worse among thirty-four modern, industrial democracies in terms of poverty rates.113 While conservatives claim that even the poor in America live better than the middle class elsewhere—a subject to which we’ll return in the next chapter—this argument simply isn’t true. Compared to those industrialized nations with which the United States likes to compare itself, not only are the poor here doing worse than the middle class elsewhere, they are also doing worse than the poor elsewhere, in large measure because of less complete safety nets in America. For instance, before the effect of taxes and various welfare benefits are considered, twenty-seven percent of Swedes are poor, which is slightly more than the twenty-six percent of Americans who are; but after the effects of taxes and transfers are considered, the poverty rate in Sweden plummets to only five percent, while safety nets in the United States only bring our poverty rate down to seventeen percent. Likewise, thirty-four percent of Germans are poor prior to the effects of social safety net efforts, but only eleven percent remain poor after them. In the U.K., where the poverty rate is the same as in the United States, safety nets cut poverty by more than two-thirds to only eight percent, which is twice as big a cut as that afforded by such programs in the United States.114
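A quick tabulation of the figures just cited makes the point in percentage terms (a minimal sketch, using only the pre- and post-transfer poverty rates given above):

```python
# How much each nation's taxes and transfers cut its poverty rate,
# using the before/after percentages cited in the text.
rates = {
    "Sweden":         (27, 5),
    "Germany":        (34, 11),
    "United Kingdom": (26, 8),
    "United States":  (26, 17),
}
for country, (before, after) in rates.items():
    cut = (before - after) / before
    print(f"{country:15} {before}% -> {after:2}%  (poverty reduced {cut:.0%})")
# Sweden's safety net cuts poverty by ~81%, Germany's ~68%, the U.K.'s ~69%;
# the U.S. system manages only about a 35% reduction.
```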
As for inequality compared to other nations, here too America’s contemporary record is not enviable. Among industrialized countries, the United States ranks fourth worst in income inequality between top earners and those at the bottom, and inequality here is actually growing much faster than in those other nations.115 At present, the poorest half of Americans own a smaller share of their nation’s wealth than the poorest half of people on the continents of Asia and Africa, and a smaller share than the poorest half in India, the U.K. and China. In other words, wealth inequality is actually more severe in America than in much of the rest of the world.116 Importantly, it isn’t just the gap between the rich and poor in America that signifies our nation’s greater inequality relative to other countries; we also have the greatest wealth gaps between the middle class and the wealthy of any industrialized nation.117 Recent evidence suggests that this gap between the wealthy and the middle class is only getting larger; since 2010, middle-class wealth has remained flat, while wealth at the top has continued to grow, producing the largest gap between the affluent and the middle class in recorded U.S. history.118
But What About Mobility? Aren’t the Poor Just Temporarily Embarrassed Millionaires?
There is a long-disputed (and likely apocryphal) quote attributed to John Steinbeck to the effect that the reason socialism never took root in the United States was that workers didn’t consider themselves an exploited class, but rather, simply “temporarily embarrassed millionaires.” In other words, the poor and struggling may be poor and struggling today, but since America is a land of opportunity where one can climb from rags to riches with the right combination of effort and skill, there is no reason to fight for major social change or equality—just work harder so that you can be the one on top next time. Putting aside the dubious provenance of the quote itself, it is hard to dispute that a faith in upward mobility has always been strong in America; so too, it is hard to doubt that such faith could have the effect of dampening concern over inequality, even the incredibly deep divisions documented thus far, and the poverty, unemployment and wage stagnation that have marked the last several years. If one believes that a little extra effort will allow one to move up the ladder, after all, then one is free to view inequality and poverty as temporary way stations on the road to prosperity. Certainly this is the mindset encouraged by Florida senator and presidential candidate Marco Rubio when he says, echoing the apocryphal Steinbeck sentiment, in language all too real:
We have never been a nation of haves and have-nots. We are a nation of haves and soon-to-haves, of people who have made it and people who will make it.119
Yet, putting aside the sincerely held faith in American upward mobility, just how often do people rise in the economic hierarchy? Although the idea of upward mobility as a unique and central feature of the American experiment is long-standing, the sad truth is that such mobility seems to be less common in the United States today than elsewhere. According to the Global Wealth Databook for 2013, the likelihood of persons moving up in the wealth distribution is actually lower in the United States than in any other industrialized nation.120 Indeed, chronic inequality appears to be a central cause of limited mobility here. The available evidence suggests that in nations with greater inequality, intergenerational mobility is far less common than it is in more egalitarian societies. In more equitable nations such as Finland, Denmark, Sweden and Norway, the correlation between your parents’ income when you were a child and your own income as an adult is less than half as strong as in the United States and United Kingdom, both of which have far greater levels of inequality between the top and bottom.121
Although most kids born into the poorest fifth of American families will make it to a higher income level as adults, nearly four in ten will not. Even those who manage to improve their status economically don’t improve it by much: sixty percent of persons born into the poorest fifth will remain in the lower two-fifths as adults, meaning that at most they will move from poor or near-poor to lower middle class. Only one in ten will make it into the top fifth of earners. On the other end of the spectrum, nearly a third of all persons born into the top quintile will remain there, and about six in ten will remain in the top two quintiles—in other words, the upper middle class at least—while only one in nine will fall from the top to the bottom.122 For those in the bottom fifth of the income distribution, only 0.2 percent will climb into the top one percent of earners, while about eighty-three percent of those who started in the top one percent will manage to remain at least in the top ten percent as adults.123 So while some will go from rags to riches or riches to rags, the influence of parental status on one’s own status is strong. Poor families are ten times more likely to remain poor than to move into the highest income quintile, and those who start out rich are five times as likely to remain there as to fall into either of the lower two quintiles of earners.124 As for wealth, research that examined families from the 1980s through 2003 found that about three-fourths of the variation in where an individual ends up in terms of wealth is explained by the wealth of one’s parents.125
Intergenerational mobility is especially limited for persons of color. For instance, half of black children born into families in the bottom fifth of income earners will remain there as adults, compared to only twenty-three percent of white children. Nearly eight in ten African Americans born poor will remain in the bottom two-fifths of income earners (poor or lower-middle-class) as adults, compared to only forty-two percent of whites. Only an anemic three percent of blacks born poor will ultimately make it to the top quintile, and only one in ten will make it into the upper middle class or better. Although most whites born to impoverished families won’t gain access to the upper middle class or the ranks of the affluent either, they are five times as likely as blacks to go from the bottom to the top, and more than a third will attain upper-middle-class status at the very least.126
Mobility also plays out differently by race among those in the middle class. For whites born middle-class, forty-four percent will move up and only a third will fall backward, while for middle-class blacks, only one in seven will move up and a stunning sixty-nine percent will fall backward. In short, upward mobility is more than three times as likely for middle-class whites as for blacks, while downward mobility is more than twice as likely for middle-class blacks as for whites.127 When it comes to wealth, the ability to retain high status also differs by race: according to research that followed persons over a fifteen-to-twenty-year period, sixty percent of whites who were in the top fourth of wealth holders at the beginning of that period remained there by the end, compared to only twenty-two percent of African Americans.128
As for why mobility levels differ so markedly by race, the biggest culprit appears to be the effect of concentrated poverty and disadvantage in the geographic spaces where most African Americans live. Of whites born between 1955 and 1970, only five percent were raised in high-disadvantage neighborhoods (places with high poverty rates, lower school quality, less social capital, etc.), compared to eighty-four percent of blacks raised in such spaces. In contrast, only two percent of blacks born in that period were raised in low-disadvantage communities, compared to nearly half of whites.129 If blacks are seventeen times as likely to have been raised in high-disadvantage neighborhoods, while whites are twenty-two times as likely to have been raised in low-disadvantage neighborhoods, it isn’t hard to figure out why the opportunity structure remains skewed from year to year and generation to generation. Although rates of concentrated poverty have fallen somewhat, as noted previously, the ratio of concentrated poverty for blacks relative to whites remains high, ensuring ongoing inequity into the future.
Whodunit? Exploring the Causes of Growing American Inequality
All of this raises the obvious question: Why? Why does America appear to be pulling apart, with ever-increasing levels of inequality in terms of both income and wealth? Why does the position of American workers seem to be declining while the position of the affluent rises ever higher? There are several explanations for the economic maladies thus far discussed, some better understood and recognized than others. Among the most commonly discussed, and certainly important, are the decline of manufacturing employment since the early 1970s—spurred in large measure by trade policies that opened up American markets to lower-cost goods from abroad—as well as the decline in the real value of the minimum wage, the weakening of the power of labor unions, and the preferential treatment afforded in the tax code to income from capital gains as opposed to labor. Less appreciated, but perhaps even more critical to the process of growing inequality, is the long-term economic trend away from a production-based economy toward an economy focused on financial services. This trend, which amounts to a casino-ization of the American economy, puts a premium on rampant speculation in stocks and various investment instruments and thereby disproportionately benefits the relatively small sliver of the population who make their living on Wall Street. Let’s examine each of these in turn.
Looking first at the role of trade: Increased trade with poorer nations can exacerbate inequalities in richer countries for two reasons. First, increased trade results in an influx of low-cost goods from abroad, which undermines high-paying employment at home; and second, the offshoring of production undermines the job status of workers in wealthier nations as corporations use low-wage employees abroad for work that would previously have been done by higher-paid persons in the richer nation. Far from a theoretical abstraction, this appears to have been the concrete reality in America since the 1980s, at which point trade barriers were lowered both for U.S. goods abroad and for other nations’ goods here. Following a spate of free trade agreements, inequality in the United States has increased, with as much as forty percent of the growing gap between the haves and have-nots attributable to trade policy and its impact on certain job sectors.130
It is estimated that trade policy accounts for about one-fifth of the decline in manufacturing employment during the 1990s, and another third of the decline that continued from 2000 to 2007. As a result of the loss of higher-paying manufacturing jobs and their replacement with lower-paying service-sector jobs, wages have stagnated at the bottom and middle of the income pyramid, contributing to the overall growth in income disparity.131 While increased trade and the offshoring of production have been lucrative for corporate America and those with significant stockholdings in companies engaged in global trade, the impact on most has been quite the opposite. The ability to buy goods more cheaply—the supposed benefit of trade liberalization, which we’re told makes up for the decline in manufacturing employment in America—seems weak compensation for wage stagnation and little or no job security. After all, being able to buy things for less doesn’t matter much if the service-sector job you managed to get pays only a fraction of what your previous position did.
In addition to the effects of increased trade, the decline in the value of the minimum wage has contributed to growing economic injustice and chronic poverty. When the minimum wage was first created in the 1930s, it was set at a level that came to about half of the nation’s average wage. By 2010, at $7.25 per hour, the minimum pay level came to only thirty-nine percent of the average national wage, and five years later it has continued to decline in real terms. Indeed, the minimum wage has lost more than a fifth of its real value since the late 1970s.132 Economists at Princeton and the University of California estimate that at least twenty percent of the recent rise in wage inequality can be traced to the falling real value of the minimum wage.133 The refusal by conservatives to support a proposed hike to just over $10 per hour (still less than what the minimum wage would be had it kept pace with productivity) is literally keeping people poor. Although conservative talking heads like Bill O’Reilly claim that a boost in the minimum wage would have little impact on the well-being of workers (since relatively few earn it),134 such hikes in the wage floor would raise the pay of more than just the lowest-paid workers themselves. Those who earn slightly above the minimum would also see upward pressure on their wages as employers scrambled to keep ahead of the rising minimum. The Congressional Budget Office estimates that a higher minimum wage (somewhere in the $10-per-hour range) would boost pay levels for about 16.5 million workers and lift about a million people from the ranks of the poor,135 while the Economic Policy Institute predicts that the overall effect of such a minimum wage hike would be to lift the wages of nearly twenty-eight million workers: those currently receiving the minimum wage, those whose earnings are between the current minimum and $10 per hour, and even many whose current earnings are slightly above $10 per hour.136
Contrary to claims that such a boost might have a negative impact on overall employment—principally by raising labor costs for employers—evidence from sixty-four separate studies suggests the impact of minimum wage hikes on employment and unemployment levels is negligible at most, and certainly offset by the boost in income to those who depend on such wages for survival.137 In fact, the most recent evidence clearly debunks the notion that higher minimum wages destroy jobs. At the beginning of 2014, thirteen states increased their minimum wages: four because of new legislation, and nine because their minimums are automatically pegged to the inflation rate. After evaluating the impact of the increases in these states, economists found that states where the minimum wage went up actually had faster employment growth than states where it did not.
Perhaps the strongest rebuttal to the claim that a hike in the minimum wage will harm the economy comes from Seattle. The city, which will be phasing in a $15-per-hour minimum wage over the next several years—and has been pilloried in the business press for supposedly signing its economic death warrant by doing so—already had a much higher minimum wage than the national average, even before the recent hike. But has that fact harmed the city? Hardly. As billionaire investor Nick Hanauer—an avowed and proud member of the nation’s one percent—explained it to his fellow plutocrats recently:
Most of you probably think that the $15 minimum wage in Seattle is an insane departure from rational policy that puts our economy at great risk. But in Seattle, our current minimum wage of $9.32 is already nearly 30 percent higher than the federal minimum wage. And has it ruined our economy yet? Well, trickle-downers, look at the data here: The two cities in the nation with the highest rate of job growth by small businesses are San Francisco and Seattle. Guess which cities have the highest minimum wage? San Francisco and Seattle. The fastest-growing big city in America? Seattle. Fifteen dollars isn’t a risky untried policy for us. It’s doubling down on the strategy that’s already allowing our city to kick your city’s ass.
Hanauer then went on to explain the economic logic behind a higher, rather than a lower and stagnant, minimum wage. Putting it in terms that even the most jaded of corporate executives should be able to comprehend, he notes:
If a worker earns $7.25 an hour, which is now the national minimum wage, what proportion of that person’s income do you think ends up in the cash registers of local small businesses? Hardly any. That person is paying rent, ideally going out to get subsistence groceries at Safeway, and, if really lucky, has a bus pass. But she’s not going out to eat at restaurants. Not browsing for new clothes. . . . Please stop insisting that if we pay low-wage workers more, unemployment will skyrocket and it will destroy the economy. It’s utter nonsense. The most insidious thing about trickle-down economics isn’t believing that if the rich get richer, it’s good for the economy. It’s believing that if the poor get richer, it’s bad for the economy.138
Although conservatives recently latched on to an article in Seattle Magazine suggesting that restaurants were closing their doors because of the wage hike there (and that restaurant owners were “panicked” at the new policy), an investigation by the Seattle Times debunked the claim. Indeed, the very restaurant owners whose decisions to close certain locations had been chalked up by conservatives to the increase in the minimum wage told the Times exactly the opposite: they supported the wage increase and were in the process of opening entirely new locations or restaurants elsewhere in the city.139 If anything, both logic and experience tell us that policies to reduce inequality by boosting wages at the bottom can be expected to spur job creation and economic growth rather than suppress it.140 When workers have more, they spend more, which in turn allows companies to produce more and provide more services to more people.
As for the importance of unions: their power from the 1940s through the 1960s allowed workers to successfully demand higher pay and to ensure that as their productivity rose, so would their income and benefits. Shared prosperity between workers and owners served the nation well, as the economy was strong throughout the period of growing unionization. Indeed, most of the basic protections of human dignity that we now take for granted were the result of union efforts. As Tim Koechlin, director of the International Studies Program at Vassar, explains:
A hundred years ago, U.S. workers—including millions of children—worked long hours for low wages in unsafe workplaces. Because of organized labor, the prospects for working Americans improved dramatically over the course of the twentieth century. Because of unions, millions of U.S. workers were able to achieve a middle-class life—economic security, home ownership, health insurance, vacation time and, perhaps, a college education for their children. From 1948 to 1973, the incomes of working-class families in the U.S. nearly doubled. In addition to higher wages, the struggles of organized labor have delivered virtually every protection and benefit enjoyed by U.S. workers. Unions have brought working Americans the forty-hour week, paid vacation, Social Security, Medicare and Medicaid, overtime pay, child labor laws, the Occupational Safety and Health Act, whistleblower protection laws, sexual harassment laws, lunch breaks and coffee breaks, wrongful termination protection, sick leave, the Americans with Disabilities Act, the weekend and much more. These rights, benefits and norms were not gifts from employers. They are the result of relentless organized struggle by working Americans.141
Unfortunately, union membership has declined, in large part due to deliberate efforts by corporations to break organizing drives by their employees, leaving workers vulnerable to stagnant wages, benefit give-backs, longer hours and less job security than in the past. Beginning in the 1970s, corporate America waged a concerted campaign to blame unions for a loss of competitiveness in American industry. Companies also regularly threatened to move plants overseas if workers formed unions in places where they didn’t yet exist. Even though the evidence suggested that mismanagement at the top was to blame for the slide in the domestic auto industry, for instance, blaming the United Auto Workers for the loss of domestic car sales became a dominant narrative and fed anti-union efforts.142 Although many of the resulting tactics of intimidation and union busting have been illegal, there has been little enforcement of labor law by either the Justice Department or the National Labor Relations Board, both of which have been largely starved of funds for the purpose, and both of which are populated mostly by corporate attorneys disinclined to side with workers over management.143
The effects of anti-union efforts by corporate America have been substantial. While unions represented almost a third of the nation’s workforce in the late 1940s, by 2010 fewer than twelve percent of all workers were unionized. Even this number exaggerates union strength, because it encompasses both private sector unions and public unions, like those for municipal workers, teachers, police and firefighters. Looking only at private sector workers, membership in unions has fallen from about one in four in the early 1970s to less than seven percent today.144 As rates of union membership have declined, wages for most workers have stagnated, in part because labor’s relative strength, and with it workers’ ability to obtain a fair share of their increased productivity, has been diminished.
As for taxes, over the last several decades the nation’s tax burden has shifted off the backs of wealthy individuals and corporations and onto those of average workers and families, thereby contributing to overall income inequality. Income taxes were steeply progressive for most of the mid-century period, with the top marginal rate reaching ninety-one percent in 1957 on all income over $300,000 for an individual and $400,000 for a married couple.145 But by the 1970s, conservatives were pushing for drastic cuts in income taxes for those at the top, as well as cuts in corporate income taxes and the creation of loopholes to allow companies to avoid taxes they would otherwise owe. Among the most important policies contributing to rising inequality, the preferential treatment of capital gains income ranks at the very top. A recent analysis of rising inequality of incomes since the early 1990s, for instance, found that capital gains income, along with the preferential tax treatment such income receives, has been the largest single contributor to growing income disparities during that time.146
The wealthy have long argued that income from capital gains—that is, income derived from increases in the value of investments—should be taxed either not at all, or at a lower rate than regular labor income. According to those who advocate such policies, low or no taxes on capital gains will spur the wealthy to invest more of their resources, thereby creating jobs and boosting the economy. In actuality, there is no evidence that lower capital gains tax rates spur investment or economic growth, and quite a bit of evidence to the contrary. Indeed, the economy has generally performed better in years when capital gains and labor income were taxed at the same (and higher) rates than in years when taxes on capital gains were lower.147 Nevertheless, policymakers—lobbied by the business class to create such a two-tiered tax structure—have been taken in by the argument for years, and so today capital gains are taxed at a maximum rate of twenty percent (and most often as low as fifteen percent), as opposed to a 39.6 percent maximum for regular labor income above roughly $407,000.
The effect of preferential treatment for capital gains income has been to treat the wealthy minority far better than the rest of the country’s people when it comes to taxation. Although middle-income families occasionally possess assets that produce capital gains, the overall distribution of financial assets remains incredibly unequal. Due to the far greater likelihood that the wealthy will own stocks, pooled investment funds or other income-producing assets, the median value of financial assets for the middle fifth of income earners is only about $17,000, while the median value of such assets held by the wealthiest tenth of earners is about $551,000.148 The top tenth of income earners are four times as likely as middle-income earners to own stock, and more than twelve times as likely as persons in the bottom fifth of income earners to do so.149 So economic policies that favor income derived from the stock market will produce a disproportionate benefit for the wealthiest Americans, while meaning little in practical terms to the rest of us.
Because capital gains income receives such preferential treatment, America’s wealthiest families—a group whose incomes average more than $345 million annually and stem mostly from rents, interest and dividends rather than active labor—actually pay taxes at a lower rate than households with incomes as low as $75,000. In fact, the nation’s wealthiest four hundred families have a total effective tax rate averaging about 16.6 percent of their income, essentially the same as the 16.3 percent average paid by households with annual earnings as low as $50,000.150 Fully ninety percent of all benefits from the preferential treatment of capital gains income accrue to taxpayers with more than $200,000 in annual income, with seventy-eight percent of those benefits received by people earning more than a half-million dollars per year. By 2015, it is estimated, the typical taxpayer with more than $1 million in income will save more than $131,000 in taxes each year, all because of this provision in the tax code that treats investment income more favorably than income earned from work.151 Currently, the wealthiest one percent of Americans make more annually from capital gains—not from actual work, but from interest, dividends and rents on things they already possess, even if they didn’t work one hour of the year—than the entire cost of all safety net program payouts in the United States combined, including Social Security, Medicare and Medicaid.152 And yet this income, received by about three million people, is taxed at a lower rate than the income earned by construction workers, physicians, food inspectors or law enforcement officials.
Other tax policies, including corporate tax loopholes that shelter offshore profits from taxation, or allow for accelerated depreciation write-offs, or permit deductions for the cost of advertising, have lowered the tax burden on American companies relative to average American families and individuals. Companies can avoid taxes in the U.S. by claiming huge losses at home while declaring massive profits abroad. Even though eighty-two percent of Bank of America’s revenue is earned in the United States, the company was recently allowed to claim that all of its profit was made overseas, where it is untouched by U.S. taxes, while it supposedly suffered $7 billion in losses stateside. By rigging its balance sheets this way—a practice that is entirely legal under existing law—BoA was able to avoid billions in tax liability, as did other large corporations. Pfizer, for instance, made more than forty percent of its revenue domestically and had $31 billion in profits overseas in 2011–2012, but declared $7 billion in American losses so as to avoid taxes.153 In all, according to the Wall Street Journal, sixty of the largest corporations in the United States “parked a total of $166 billion offshore” in 2012, thereby protecting more than forty percent of their profits from domestic taxation.154
No doubt it is policies like these that help explain why corporate taxes as a share of overall taxes, as a share of national income, and as a share of corporate profits have all declined dramatically. Although financial aristocrats and their media defenders often complain about U.S. corporate taxes being too high—since the thirty-five percent statutory rate is higher than the rate in other industrialized nations—few companies actually pay anywhere near that percentage of their income in taxes, thanks to generous loopholes, shelters and gimmicks that allow them to substantially reduce their burdens. For instance, the 288 corporations in the Fortune 500 that were consistently profitable from 2008 to 2012 ultimately paid taxes at a rate of only 19.4 percent of their income over that period, with one-third of them paying less than a ten percent rate. Twenty-six companies, including Boeing, General Electric and Verizon, paid no federal income tax at all over that five-year period, and roughly forty percent of the companies that remained profitable from 2008 to 2012 had at least one year during which they paid no taxes.155
No matter how one examines the data, there is simply no doubt that the tax picture for U.S. corporations is an increasingly rosy one. Whether measured as a share of the economy,156 as a share of all income taxes,157 or as a share of all federal tax revenue, corporate taxes are at historic lows. Today, corporations contribute only about one-fifth as large a share of overall federal revenue as they did in the mid-1940s.158 And in 2012, corporate taxes fell to only 12.1 percent of profits, the lowest level since 1972 and about half the norm that held from the late 1980s until the economic collapse in 2008.159
Indeed, even as America’s wealthy minority complains about corporate taxes, many American companies actually pay less in taxes to the government than they pay to their chief executives each year. In 2013, for instance, seven of the nation’s largest firms, despite earning more than $74 billion in pre-tax profits in the United States, received nearly $2 billion in federal tax refunds, while paying their CEOs over $17 million in average compensation. Of the one hundred highest-paid CEOs in the U.S., twenty-nine were paid more in 2013 than their companies paid in federal income taxes. Their companies made average pre-tax profits of $24 billion that year, while raking in, on average, $238 million in tax refunds. In other words, these mega-wealthy corporations had effective tax rates of negative one percent.160 No wonder inequality is increasing in a tax environment such as this.
Finally, the increased financialization of the American economy has dramatically skewed economic returns to the rich relative to the rest of us. By financialization I mean the move away from productive economic activity—the manufacture of tangible goods for sale or the provision of basic services used by broad swaths of the public—and toward speculative investment and the buying and selling of companies to pump up stock values. Especially notable is the increasingly popular practice of company stock buy-backs, in which companies use their profits not to increase production or hire new workers but to repurchase their own stock, driving up share prices by shrinking the supply of shares on the open market. This inflation of share prices then delivers substantial gains to executives and shareholders while doing absolutely nothing for average Americans.
In the 1960s, forty percent of company earnings and borrowing went to investments in new production, equipment and company growth. By the 1980s, only ten percent of earnings was going back into investment, thanks to the growing power of shareholders demanding dividend payouts and higher stock values, which could be procured via stock buy-backs. Today, companies are literally borrowing money to pay shareholders and buy back their own stock, rather than investing in new production.161 According to economist William Lazonick, co-director of the Center for Industrial Competitiveness at the University of Massachusetts–Lowell, the 449 publicly traded companies in the S&P 500 Index used more than half of their profits—nearly $2.5 trillion in earnings—from 2003 to 2012 simply to repurchase their own stock shares, and another thirty-seven percent to pay dividends to shareholders. Less than one dollar in ten earned by these firms was put back into production, expansion, hiring, research or development. And some companies were actually spending more to repurchase stock in a given year than their firms were making in net income.162
But as disturbing as those numbers may sound, 2014 was even worse. Last year, the same companies spent ninety-five percent of profits on stock repurchases and dividend payouts.163 Obviously, when earnings are used to buy up stock or pay shareholders rather than to expand a company or produce goods (which would require more workers, or raises for those already working), the result will be increasing inequality between those at the top, who make their money from things like stock value, and the masses, who make their money from labor income. The strategy works well for companies in the short term—by inflating stock values, share buy-backs help firms hit quarterly earnings-per-share targets—but in the long run it can undermine their financial health by diverting resources from growth and development into activity that benefits only a very narrow stratum. Because the practice of open-market stock buy-backs has been largely unregulated since 1982, companies increasingly use the practice less to help stabilize undervalued shares (one of the arguments made in favor of the practice) than to manipulate stock prices for the benefit of executives and short-term stockholders,164 contributing significantly to rising incomes and wealth at the top and stagnant income growth for everyone else.
Ultimately, the post–World War II consensus—a social contract of sorts—which held that working-class persons should have access to a growing piece of the economic pie, along with safety net protections for when they fell through the cracks, has been largely abandoned. Throughout the mid-twentieth century there was broad agreement among members of both main political parties in America as to the importance and value of strong unions, rising minimum wages, Social Security, government housing and jobs initiatives, and even progressive taxation. Far from left-wing concepts, these were seen as fully American ones. However hard this may be to believe, the evidence is right there, embedded in the 1956 Republican Party platform, which included the following line:
We are proud of and shall continue our far-reaching and sound advances in matters of basic human needs—expansion of social security—broadened coverage in unemployment insurance—improved housing—and better health protection for all our people. We are determined that our government remain warmly responsive to the urgent social and economic problems of our people.
Later in the platform, the GOP bragged that under the leadership of President Eisenhower, “the Federal minimum wage has been raised for more than 2 million workers. Social Security has been extended to an additional 10 million workers and the benefits raised for 6½ million. The protection of unemployment insurance has been brought to 4 million additional workers,” and there had been “wage increases and improved welfare and pension plans for federal employees.” Not content to stop there, Republicans trumpeted the fact that union membership was up by two million since 1952. Later still, the platform called for “equal pay for equal work regardless of sex,” maintaining prevailing union wages for employment on public contracts, extending minimum wage laws even further, and providing “assistance to improve the economic conditions of areas faced with persistent and substantial unemployment.” It also called for revisions to existing labor law that would protect the “rights of workers to organize into unions and to bargain collectively,”165 and actually called for an Equal Rights Amendment so as to essentially outlaw institutional sexism in the labor market.
Of course, today the Republican Party leadership would reject its own 1956 platform, and the conservative commentators who hold sway on talk radio would roundly condemn it. It is hard to imagine such a pro-union platform, for instance, surviving amid the likes of reactionary talking heads such as Ann Coulter, who says the nation’s largest labor federation represents “useless” workers—including kindergarten teachers—rather than “men who have actual jobs,”166 or the even more influential Rush Limbaugh, who insists that unionized workers are “freeloaders,” as opposed to “real, working, non-unionized people.”167 But conservative hostility to unions and those who belong to them is not limited to the merely rhetorical; indeed, lawmakers in New Jersey, Wisconsin and Illinois have been leading an active assault on the rights of unionized workers, raiding pension funds for teachers and child welfare caseworkers (among others), and seeking to break unions altogether by allowing workers who are covered by a union contract to avoid paying dues even as they reap the benefits of collective bargaining. Likewise under attack is the requirement that workers who choose not to join a union still pay their “fair share” portion of collective bargaining costs, given that they benefit from the bargaining done on their behalf. If successful, such an attack on union funding would functionally destroy organized labor by giving workers the benefits of unionization without asking them to shoulder any of the expense.168
Opposition to organized labor is so intense, in fact, that a conservative state senator in Tennessee recently complained about a plan by Volkswagen to bring thousands of new jobs to the state, precisely because the German company is supportive of labor unions. As lawmaker Bo Watson put it, the VW plant would be a “magnet for unionized labor,” which might alter the “culture” of Tennessee—yes, apparently by creating jobs and boosting wages and benefits for its autoworkers, thereby undermining the “culture” of low-wage employment preferred by reactionaries like Watson.169 This is how far to the right the Republicans have moved in a half-century or so, and how glaringly “under the affluence” the nation’s mindset toward working people has become in the same period.
The longstanding and relatively liberal post-war consensus had been developed in large measure to co-opt the rising militancy of the working class in the wake of the Depression—in other words, to limit the threat of class warfare from below—but by the mid-1970s and 1980s, the rich had opted to abandon the consensus and wage their own brand of class warfare from above. Today, rather than supporting previously settled matters like the value of the minimum wage, politicians and commentators on the right often openly question the very existence of a minimum earnings floor—a position historically associated only with the most extreme and marginal libertarians.170 Even basic protections against the use of child labor are no longer considered sacred, with Maine’s far-right Governor Paul LePage proposing changes in labor law that would allow twelve-year-olds to work up to fifteen hours weekly.171 It is almost as if—as several commentators have put it—conservatives are seeking to repeal the twentieth century in the interests of the affluent minority, with no concern for the well-being of the masses, who increasingly suffer the consequences of rising inequality and economic insecurity. Though such a charge may sound hyperbolic, it is hard to avoid that conclusion when Eric Bolling of FOX can say, as he did recently, that rather than emulating European nations seeking to cut back on hours in the workweek—a step that has been shown to boost productivity—we should emulate China by repealing all labor laws, including minimum wage protections, child labor laws and any upper limits on how many hours an employee can be made to work.172
Distressingly, even as the social contract between the business class and the working class has been torn up—at least insofar as aristocratic obligations to the public are concerned—that same public is still expected to come to the aid of the wealthy, as with the $800 billion no-strings-attached taxpayer bailout of the very financial institutions that were responsible for the economic crisis in the first place. While defenders of the bailout insist it was necessary and that every dollar has been paid back, or is being paid back with interest, such an argument misses the larger point: namely, if America can bail out the wealthiest individuals and institutions on the face of the earth as a way to prevent financial catastrophe, why can’t that same nation bail out homeowners facing foreclosure? Why can’t we bail out the long-term unemployed, or those working at minimum wage? The rich may well pay back all the bailout money and then some—they surely should, and hardly deserve thanks when they do, as if they had done us some great favor—but it must mean something that so many policymakers think nothing of forking over hundreds of billions of dollars to institutions already noted for their illegal, unethical and irresponsible behaviors, while resisting the same for the poor and struggling. After all, such a bailout for the rest of us would also likely “pay the country back” in economic stimulus, consumer spending, greater tax revenues, reduced reliance on safety net programs, reduced health care costs and a host of other benefits. But in a society with an increasingly tattered social contract, it is apparent that sacrifice is only expected to run in one direction.
Some Final Words About Race and the Economic Crisis
Before concluding this chapter, it is important to note that for millions of Americans the downturn of the past several years, with its wage stagnation, persistent unemployment, and struggles to afford health care, housing and higher education, is nothing particularly new. For millions of people of color, such economic insecurity has been distressingly normal, generation in and generation out, for all of American history. Regardless of the health of the economy, it is virtually a truism that African American unemployment and poverty levels continually hover at or above recession levels.
In fact, it could be argued that part of the reason so many have taken notice of the crisis in recent years, and why it has become such a topic of concern, is precisely that normatively black and brown economic conditions have bled over into the white community. So long as economic pain was localized in subgroups with less power and influence—especially subgroups that have long faced a history of discrimination and stigma—it failed to register on the radar screens of the larger citizenry. But once the insecurity began to be shared a bit—even then not equitably, but more so than white Americans had been used to—the magnitude of the problem suddenly appeared more obvious. Double-digit unemployment in the white community, even for a brief time, was truly new for many. White Americans on the whole had not experienced that kind of insecurity in the job market for three generations, since the Great Depression.
While people of color fared far worse during the recent collapse than whites—they were still the first to lose their jobs and the last to be hired back, and saw the vast majority of their already minimal wealth wiped out, particularly in terms of home value—the downturn seems to have had a greater psychological impact on whites. Precisely because of the relative advantage most white Americans have long taken for granted, we were less prepared for the kind of setback to which we were subjected in recent years. This is no doubt part of the reason recent surveys have found that despite ongoing relative advantages over persons of color in the job market, housing market, educational system and elsewhere—about which I have written extensively in my previous books, and which I further document herein—white Americans are more pessimistic about the future than ever, and far more pessimistic than members of other racial groups who are doing quite a bit worse.173
For a graphic and telling indication of just how distressing the downturn seems to have been, especially for whites, one need look no further than the lead story in Newsweek from mid-April 2011, concerning what the cover referred to as “Beached White Males.” Therein, the author suggested that the economic meltdown had now become a real crisis, because whites—even whites in the managerial class—were feeling the pinch, with some even experiencing the long-term unemployment that had previously been seen as the purview of only the lesser classes.174 There is a distressing and even heartbreaking irony to the article once one sifts through the self-loathing of corporate executives who can’t seem to cope with having to pound the pavement looking for work like mere mortals. Reading the piece, it becomes obvious just how dangerous it can be to have blind faith in the system, as apparently many of the men in the article long had. When they came to realize that hard work and playing by the rules were not enough—something people of color and even poor whites have long understood—they were ill-prepared for it.
None of this is to dismiss the real stresses faced by white Americans under current economic conditions; rather, it is to say that our current predicament may be worse precisely because we paid so little attention to the crisis when it was affecting only those other people. In fact, not only did we pay insufficient attention, but in many cases the government directly helped facilitate the crisis. In 1999, for instance, North Carolina passed a law prohibiting banks from offering predatory and deceptive loans to homeowners, in large measure because lenders were targeting the poor and people of color with these instruments. Rather than applaud the law and seek to extend it nationwide through comparable federal legislation, the federal government overrode it, paving the way for several more years of these kinds of loans, which ultimately became the fulcrum of the economic meltdown.175 Had we cared more, attended to the warning signs, and resisted the growing culture of cruelty with regard to the needy, perhaps we wouldn’t be in this predicament at all. It is that culture of cruelty to which I now turn.