IRONICALLY, IT WAS a former Hollywood star who came to embody Rocky Balboa in real life, and, at the same time, to embody the racist backlash to Black Power in politics. This real-life Rocky decided to challenge incumbent Gerald Ford for the Republican presidential nomination in 1976. Reagan had fought against all those empowerment movements brewing in his home state of California and across the nation. Hardly any other Republican politician could match his law-and-order credentials, and hardly any other Republican politician was more despised by antiracists. When Reagan first campaigned for governor of California in 1966, he had pledged “to send the welfare bums back to work.” By 1976, he had inflated his fictional welfare problem enough to attract Nixon’s undercover racists to his candidacy, gaining their support in cutting the social programs that helped the poor. On the presidential campaign trail, Reagan shared the story of Chicago’s Linda Taylor, a Black woman charged with welfare fraud. “Her tax-free cash income is over $150,000,” Reagan liked to say. Actually, Taylor had been charged with defrauding the state of $8,000, an exceptional amount for a crime that rarely happened. But truth did not matter to the Reagan campaign as much as feeding the White backlash to Black Power.1
Gerald Ford used every bit of his power as the incumbent president to narrowly stave off Ronald Reagan’s challenge at the 1976 Republican National Convention. But Nixon’s pardoner and the steward of a poor economy lost to the “untainted” and unknown former Democratic governor of Georgia, Jimmy Carter. Black hopes were high until the austere Carter administration, to boost the economy, began making unprecedented cuts in social welfare, health care, and educational programs while increasing military spending. After the Black poverty rate reached its lowest point in US history in 1973, the decade ended with record unemployment rates, inflation, falling wages, rising Black poverty rates, and increasing inequality. At the local level, struggling activists and residents partially or totally blamed corporate-friendly Black politicians for the growing poverty. There was supposedly something wrong with Black politicians. Unsurprisingly, no one ever uncovered any evidence to substantiate this political racism. Black politicians and the Black elites they largely served were hardly different from the politicians and elites of other races, selling out to the highest bidders or sticking to their antiracist and/or racist principles.2
While racist Blacks blamed Black politicians—and increasingly Black capitalists—for their socioeconomic struggles, racist Whites blamed Black people and affirmative action for their own struggles in the 1970s. Racist ideas put all of these Americans out of touch with reality—as out of touch as one White male aerospace engineer who wanted to be a doctor. Allan Bakke was over thirty-three when the medical school at the University of California at Davis turned him away a second time in 1974, citing his “present age” and lukewarm interview scores as the main factors in the rejection. By then, more than a dozen other medical schools had also turned him away, usually because of his age. In June 1974, Bakke filed suit against the University of California Regents—the body that had fired Angela Davis four years earlier. He did not allege age discrimination. He alleged that his medical school application had been rejected “on account of his race,” because UC Davis set aside sixteen of its one hundred admissions slots for “disadvantaged” non-Whites. Agreeing, the California courts struck down the “quota” and ordered his admission.
The US Supreme Court agreed to hear Regents v. Bakke. Bakke’s lawyers argued that the quota system had reduced his chances for admission by forcing him to compete for eighty-four slots instead of the full one hundred. The Regents’ lawyers argued the state had a “compelling . . . interest” in increasing California’s minuscule percentage of non-White doctors. Since they generally received inferior K–12 educations, non-Whites tended to have lower college grade point averages (GPAs) and test scores than Whites—thus the need to set aside sixteen seats. And despite their lower scores, these non-White students were indeed qualified, said the Regents’ lawyers. Ninety percent of them graduated and passed their licensing exams, a rate only slightly lower than that of their White classmates.
The biggest irony and tragedy of Regents v. Bakke—and of the affirmative action cases that followed—was not Allan Bakke’s refusal to look in the mirror at his age and his lukewarm interviewing skills. It was that no one challenged the admissions factors themselves: the standardized test scores and GPAs that had created and reinforced the racial disparities in admissions in the first place. The fact that UC Davis’s non-White medical students had much lower Medical College Admission Test (MCAT) scores and college GPAs than their White classmates, but still nearly equaled their graduation and licensing-exam passage rates, exposed the futility of the school’s admissions criteria. Since segregationists had first developed them in the early twentieth century, standardized tests—from the MCAT to the SAT and IQ exams—had failed time and again to predict success in college and professional careers, or even to truly measure intelligence. But these standardized tests had succeeded in their original mission: figuring out an “objective” way to rule non-Whites (and women and poor people) intellectually inferior, and to justify discriminating against them in the admissions process. The tests had become so powerfully “objective” that non-Whites, women, and poor people would accept their rejection letters without questioning the admissions decisions.
Standardized exams have, if anything, predicted the socioeconomic class of the student and perhaps a student’s first-year success in college or in a professional program—which suggests the tests could be helpful after students are admitted, to assess who needs extra assistance in the first year. And so, on October 12, 1977, a White male sat before the Supreme Court requesting slight changes in UC Davis’s admissions policies to open sixteen seats for him—not a poor Black woman requesting that standardized tests be dropped as an admissions criterion to open eighty-four seats for her. It was yet another case of racists v. racists that antiracists had no chance of winning.3
With four justices solidly for the Regents, and four for Bakke, the former Virginia corporate lawyer whose firm had defended Virginia segregationists in Brown decided Regents v. Bakke. On June 28, 1978, Justice Lewis F. Powell sided with four justices in viewing UC Davis’s set-asides as “discrimination against members of the white ‘majority,’” allowing Bakke to be admitted. Powell also sided with the four other justices in allowing universities to “take race into account” in choosing students, so long as it was not “decisive” in the decision. Crucially, Powell framed affirmative action as a “race-conscious” policy while treating standardized test scores as race-neutral, despite common knowledge of the racial disparities in those scores.4
The leading proponents of “race-conscious” policies to maintain the status quo of racial disparities in the late 1950s had refashioned themselves, by the late 1970s, as the leading opponents of “race-conscious” policies—still to maintain the status quo of racial disparities. “Whatever it takes” to defend discriminators had always been the marching orders of the producers of racist ideas. Allan Bakke, his legal team, the organizations behind them, the justices who backed him, and his millions of American supporters were all, in 1978, in the mode of proving that the earth was flat and that the United States had moved beyond racism. These racists happily consumed the year’s most prominent and acclaimed sociological text on race relations, The Declining Significance of Race, and spun William Julius Wilson’s arguments to proclaim that race no longer mattered. The University of Chicago sociologist had attempted to solve the racial paradox of the late 1970s: the rise of the Black middle class alongside the fall of the Black poor. Wilson characterized the post–World War II era “as the period of progressive transition from racial inequalities to class inequalities.” The “old barriers” of racial discrimination that had restricted “the entire black population” had transformed into the “new barriers” restricting the Black poor. “Class has become more important than race in determining black access to privilege and power,” Wilson wrote.
Wilson did not acknowledge the racial progress for some and the progression of racism for all. As Wilson’s antiracist critics pointed out, he neglected the evidence showing the rising discrimination faced by rising middle-income Blacks—a point Michael Harrington’s The Other America had already made in 1962. Wilson focused his scholarly lens on the economic dynamics that created an urban Black “underclass,” a class made inferior, behaviorally, by its wrenching poverty.5
Assimilationist underclass scholarship of the late 1970s and early 1980s drew on “ghetto ethnography,” the work of those assimilationist anthropologists reconstructing the supposedly substandard cultural world of non-elite urban Blacks. “I think this anthropology is just another way to call me a nigger,” complained a factory worker in the introduction to the classic antiracist ethnography of the era, Drylongso (1980). Its author, Syracuse anthropologist John Langston Gwaltney, who was blind, allowed his Black interviewees to construct their own cultural world. The New York Times characterized Drylongso as “the most expansive and realistic exposition of contemporary mainstream black attitudes yet published.”6
On the thirty-third anniversary of The Declining Significance of Race, when scholars were once again privileging class over race to explain racial inequities, Wilson did what only the best scholars have found the courage to do: he admitted the book’s shortcomings and confessed that he should have advanced “both race- and class-based solutions to address life chances for people of color.”7
It was these race- and class-based solutions that Justice Thurgood Marshall had tried to will into existence in his separate dissenting opinion in Regents v. Bakke. The dissenting opinion of Harry Blackmun, the author of the Roe v. Wade opinion, came last. Blackmun gave America a timeless lesson: “In order to get beyond racism, we must first take account of race. There is no other way. And in order to treat some persons equally, we must treat them differently. We cannot—we dare not—let the Fourteenth Amendment perpetuate racial supremacy.” But that was exactly what racists intended to do. Supporters of affirmative action were “hard-core racists of reverse discrimination,” argued Yale law professor and former solicitor general Robert Bork, who, in the Wall Street Journal, ridiculed the Supreme Court’s decision to keep a limited form of affirmative action. Bork and others like him used the Fourteenth Amendment to attack antiracist initiatives over the next few decades, leaving behind only the wreckage of widening racial disparities. Four years after Regents v. Bakke, White students were two and a half times more likely than Black students to enroll in highly selective colleges and universities. By 2004, that racial disparity had doubled.8
AS 1960S GAINS unraveled and poverty spread in the late 1970s, a growing number of Black people became alienated from the US political system. As their alienation grew, so did the racist ideas about them. Black voters looked down on Black nonvoters as inferior. The nonvoters, they believed, had callously disregarded the blood shed for Black voting rights, had stupidly given up their political power, and as such were immoral and uncaring. But Black nonvoters—or third-party Black voters like Angela Davis—clearly were not going to be driven to the polls by fear of Republican victories. They seemed willing to vote only for politicians who actually represented their interests, as Angela Davis began to realize.9
On November 19, 1979, the Communist Party announced its presidential ticket for the 1980 election. Sixty-nine-year-old Gus Hall, the longtime head of the CPUSA, was once again running for president. His newest running mate had reached the constitutionally required age of thirty-five on January 26. She had just joined the faculty at San Francisco State University, the historic campus where Black Studies had been born thirteen years earlier. Angela Davis agreed to take part in her first campaign for public office. But that did not mean Davis and other non-White members were totally happy with the CPUSA. The lack of diversity in the CPUSA leadership remained a source of conflict within the party into the 1980s.10
Nor was Davis happy with the decline of antiracist activism, which was slowing in the midst of—or rather, because of—the growing production and consumption of racist ideas in the late 1970s. “In a racist society it is not enough to be non-racist, we must be antiracist,” thundered Angela Davis in September 1979 at the Oakland Auditorium. She joined with Bay Area politicians and activists in urging protests against the upcoming Nazi rally nearby. All decade long, Davis’s National Alliance Against Racist and Political Repression had steadily challenged the growing Klan and Nazi groups. The Klan almost tripled its national membership between 1971 and 1980, unleashing its gun-toting terrorism in more than one hundred towns to try to destroy the gains of the 1960s. Racist killings had not stopped: at least twelve lynchings were committed in Mississippi in 1980, twenty-eight Black youngsters were killed in Atlanta from 1979 to 1982, and random street-corner executions took place in Buffalo in 1980. But Klan violence and lynchings by private citizens paled in comparison to the terror being perpetrated by gangs of policemen across the nation, from the strip-searches and sexual abuse of Black women to the pistol-whipping of Black males. By the early 1980s, one study showed that for every White person killed by police officers, police killed twenty-two Black people.11
“We can break this vicious cycle of racism, sexism, unemployment and inflation created by those who always put profits before people,” blared the posters announcing Davis’s campaign rallies in 1980. The Communist politicos had to get the word out about their campaign stops themselves, because their party received far less media attention than President Jimmy Carter, who was campaigning for reelection, or Ronald Reagan, who had finally secured the Republican nomination. In early August 1980, Angela Davis brought her “People Before Profits” campaign back to the place where her public life had begun: UCLA. She lamented the media’s poor turnout. “It’s part of a conspiracy to prevent us from getting our message to the people,” she said, sitting at a table with undistributed press packets. “If Ronald Reagan were holding a press conference here you wouldn’t have been able to see anything for blocks, there would have been so much press here.”12
Days earlier, on August 3, 1980, the press did show up in full force when the former California governor more or less opened his presidential campaign at the Neshoba County Fair. The event was just a few miles from Philadelphia, Mississippi, where three civil rights activists had been killed in 1964. It was a clever strategy that improved on the tactics Nixon had mastered before him. Reagan never mentioned race when he looked out at some of the descendants of slaveholders and segregationists, people who had championed “states’ rights” to maintain White supremacy for nearly two centuries since those hot days in the other Philadelphia, where the US Constitution had been written. Reagan promised to “restore to states and local governments the power that properly belongs to them.” He then dodged Carter’s charges of racism. Thanks in part to southern support, Reagan easily won the presidency.13
Reagan wasted little time in knocking down the economic gains that middle- and low-income people had made over the past four decades. Seemingly as quickly and deeply as Congress allowed and the poor economy justified, Reagan cut taxes for the rich and social programs for middle- and low-income families, while increasing the military budget. Reagan seemingly did off-screen what Sylvester Stallone had done on-screen, first knocking out elite Blacks the way Rocky had knocked out Apollo Creed in Rocky II (1979). And then, amazingly, Reagan befriended these Creeds—these racist or elite Blacks he had knocked down in previous fights—and used them to knock down the menacing low-income Blacks, as represented by Rocky’s opponent in Rocky III (1982), Clubber Lang, played by Mr. T.14
During Reagan’s first year in office, the median income of Black families declined by 5.2 percent, and the number of poor Americans in general increased by 2.2 million. In one year, the New York Times observed, “much of the progress that had been made against poverty in the 1960s and 1970s” had been “wiped out.”15
As the economic and racial disparities grew and middle-class incomes became more unstable in the late 1970s and early 1980s, old segregationist fields—evolutionary psychology, preaching genetic intellectual hierarchies, and physical anthropology, preaching biological racial distinctions—and new fields, like sociobiology, all seemed to grow in popularity. After all, new racist ideas were needed to rationalize the newly growing disparities. Harvard biologist Edward Osborne Wilson, an expert on ants and other social insects, published Sociobiology: The New Synthesis in 1975. Wilson more or less called on American scholars to find “the biological basis of all forms of social behavior in all kinds of organisms, including man.” Though most sociobiologists did not apply sociobiology directly to race, the field’s unproven underlying theory allowed believers to apply its principles to racial disparities and arrive at racist ideas that blamed Blacks’ social behavior for their plight. Sociobiology was the first great academic theory of the post-1960s era whose producers tried to avoid the label “racist.” Intellectuals and politicians were producing theories—welfare recipients are lazy, inner cities are dangerous, poor people are ignorant, one-parent households are immoral—that allowed Americans to call Black people lazy, dangerous, and immoral without ever saying “Black people,” and thereby to deflect charges of racism.16
Assimilationists and antiracists, realizing the implications of Sociobiology, mounted a spirited rebuttal, which fueled a fierce academic and popular debate over the book’s merits and political significance during the late 1970s and early 1980s. Harvard evolutionary biologist Stephen Jay Gould, who released The Mismeasure of Man in 1981, led the rebuttal in the biological sciences against segregationist ideas. Edward Osborne Wilson, not to be deterred, emerged as a public intellectual. He no doubt enjoyed hearing Americans repeat unproven statements that showed how popular his theories had become, as when someone quipped that a particular behavior “is in my DNA.” He no doubt enjoyed, as well, taking home two Pulitzer Prizes for his books and a National Medal of Science from President Jimmy Carter. Wilson’s sociobiology promoted, but never proved, the existence of genes for behaviors like meanness, aggression, conformity, homosexuality, and even xenophobia and racism.17
Angela Davis joined other antiracist scholars in fighting back against these segregationist claims inside (and outside) the academy. Her most influential academic treatise, Women, Race & Class, appeared in 1981. It was a revisionist history of Black women as active historical agents despite the prevailing sexism and exploitation they had faced, and despite the racism they had faced from White feminists in the suffrage struggles and in the recent reproductive-rights and anti-rape struggles. Davis showcased the irony that the most popular pieces of anti-rape literature of the 1970s—Susan Brownmiller’s Against Our Will, Jean MacKellar’s Rape: The Bait and the Trap, and Diana Russell’s The Politics of Rape—had reinvigorated the “myth of the Black rapist.” This myth, Davis said, reinforced “racism’s open invitation to white men to avail themselves sexually of Black women’s bodies. The fictional image of the Black man as rapist has always strengthened its inseparable companion: the image of the Black woman as chronically promiscuous.” Davis’s wide-ranging account of Black women activists provided a powerful response to Michele Wallace’s—and patriarchal historians’—racist pictures of Black women as “passive” during racial and gender struggles. Along with bell hooks’s Ain’t I a Woman: Black Women and Feminism, also published in 1981, Davis’s Women, Race & Class helped forge a new method of study in American scholarship: an integrative race, gender, and class analysis. As hooks indelibly penned, “racism has always been a divisive force separating black men and white men, and sexism has been a force that unites the two groups.”18
But no great work of antiracist feminist scholarship—and Ain’t I a Woman and Women, Race & Class were instant classics—stood any chance of stopping the producers of the segregationist ideas defending Reagan’s racist and classist policies. In 1982, Reagan issued one of the most devastating policy declarations of the twentieth century. “We must mobilize all our forces to stop the flow of drugs into this country” and to “brand drugs such as marijuana exactly for what they are—dangerous,” Reagan said, announcing his War on Drugs. Criminologists hardly feared that the new war would disproportionately arrest and incarcerate African Americans. Many of them were publishing fairy tales dressed up as studies, finding that racial discrimination no longer existed in the criminal justice system.
“We can fight the drug problem, and we can win,” Reagan announced. It was an astonishing move. Drug crime was declining. Only 2 percent of Americans viewed drugs as the nation’s most pressing problem. Few considered marijuana to be a particularly dangerous drug, especially in comparison with the far more addictive heroin. Substance-abuse therapists were shocked by Reagan’s unfounded claim that America could “put drug abuse on the run through stronger law enforcement.”19
REELING FROM THE ANNOUNCEMENT, Angela Davis ran again for vice president on the CPUSA ticket in 1984. “Bring to victory the defeat of Ronald Reagan,” the “most sexist[,] . . . racist, anti–working class[,] . . . bellicose president in the history of this country,” she charged at a Black women’s conference in August. But the racial story of the 1984 elections was the stunning primary-campaign success of Martin Luther King Jr.’s former aide, the spellbinding orator and civil rights leader Rev. Jesse Jackson. Neither Jackson nor Davis garnered enough votes. Too many Americans fell for the “morning in America” myth Reagan was selling them about a rebounding economy.20
It may have been morning in America again in certain rich and White neighborhoods, which had awakened to prosperity repeatedly over the years. But it was not morning in America again in the communities where the CIA-backed Contra rebels of Nicaragua started smuggling cocaine in 1985. Nor was it morning in America for Black youths in 1985: their unemployment rate was four times what it had been in 1954, while the White youth unemployment rate had only marginally increased. Nor was it morning in America when some of these unemployed youths started cooking the expensive cocaine into cheaper crack to sell so they could earn a living. And the Reagan administration wanted to make sure that everyone knew it was not morning in America in Black urban neighborhoods, and that drugs—specifically, crack—and the drug dealers and users were to blame.
In October 1985, the US Drug Enforcement Administration (DEA) charged Robert Stutman, the special agent in charge of the DEA’s New York City office, with drawing media attention to the spread of crack (and to the violence of dealers trying to control and stabilize drug markets). Stutman drew so much attention that he handed Reagan’s slumbering War on Drugs an intense high. In 1986, thousands of sensationally racist stories engulfed the airwaves and newsstands, describing the “predator” crack dealers who were supplying the “demon drug” to incurably addicted “crackheads” and “crack whores” (who were giving birth to biologically inferior “crack babies” in their scary concrete urban jungles). Not many stories reported on poor White crack sellers and users. In August 1986, Time magazine deemed crack “the issue of the year.” But in reality, crack had become the latest drug addicting Americans to racist ideas.21
If Reagan’s take on drugs was the overreported racist issue of the year, then the Free South Africa Movement (FSAM) made apartheid—and Reagan’s financial and military support of it—the underreported antiracist issue of the year. The FSAM brought out into the open the long-standing ethnic racism between African Americans and African immigrants, an ethnic racism Eddie Murphy displayed in his box-office smash of 1988, Coming to America, which became one of the most beloved Black comedies of all time. The love story of a rich African prince coming to Queens in search of a wife, the film hilariously mocked African Americans’ ridiculously untrue racist ideas of animalistic, uncivilized, corrupt, and warlike people in Africa, racist ideas that Roots had not managed to fully expunge.
Weeks after passing the most antiracist bill of the decade over Reagan’s veto—the Comprehensive Anti-Apartheid Act with its strict economic sanctions—Congress passed the most racist bill of the decade. On October 27, 1986, Reagan, “with great pleasure,” signed the Anti-Drug Abuse Act, supported by both Republicans and Democrats. “The American people want their government to get tough and to go on the offensive,” Reagan commented. By signing the bill, he put the presidential seal on the “Just say no” campaign and on the “tough laws” that would now supposedly deter drug abuse. While the Anti-Drug Abuse Act prescribed a minimum five-year sentence for a dealer or user caught with five grams of crack, the amount typically handled by Blacks and poor people, the mostly White and rich users and dealers of powder cocaine—who operated in neighborhoods with fewer police—had to be caught with five hundred grams to receive the same five-year minimum sentence. Racist ideas then defended this racist and elitist policy.22
The bipartisan act led to the mass incarceration of Americans. The prison population quadrupled between 1980 and 2000, owing mainly to stiffer sentencing policies, not more crime. Between 1985 and 2000, drug offenses accounted for two-thirds of the spike in the inmate population. By 2000, Blacks made up 62.7 percent and Whites 36.7 percent of all drug offenders in state prisons—and not because they were selling or using more drugs. That year, the National Household Survey on Drug Abuse reported that 6.4 percent of Whites and 6.4 percent of Blacks were using illegal drugs. Studies of drug dealers usually found similar racial parity. One analysis of 2012 data from the National Survey on Drug Use and Health found that White youths (6.6 percent) were 32 percent more likely than Black youths (5 percent) to sell drugs. But Black youths were far more likely to get arrested for it.23
During the crack craze in the late 1980s and early 1990s, the situation was the same. Whites and Blacks were selling and consuming illegal drugs at similar rates, but the Black users and dealers were getting arrested and convicted much more. In 1996, when two-thirds of the crack users were White or Latina/o, 84.5 percent of the defendants convicted of crack possession were Black. Even without the crucial factor of racial profiling of Blacks as drug dealers and users by the police, a general rule applied that still applies today: wherever there are more police, there are more arrests, and wherever there are more arrests, people perceive there is more crime, which then justifies more police, and more arrests, and supposedly more crime.24
Since heavily policed inner-city Blacks were much more likely than Whites to be arrested and imprisoned in the 1990s, and since more homicides occurred in their neighborhoods, racists assumed that Black people were actually using more drugs, dealing more drugs, and committing more crimes of all types than White people. These false assumptions fixed in people’s minds the image of the dangerous Black inner-city neighborhood, as well as the contrasting image of the safe White suburban neighborhood, a racist notion that shaped so many decisions by so many Americans, from housing choices to drug policing to politics, that its effects cannot be quantified. The “dangerous Black neighborhood” conception is based on racist ideas, not reality. There is such a thing as a dangerous “unemployed neighborhood,” however. One study, for example, based on National Longitudinal Survey of Youth data collected from 1976 to 1989, found that young Black males were far more likely than young White males to engage in serious violent crime. But when the researchers compared only employed young males, the racial differences in violent behavior disappeared. Certain violent crime rates were higher in Black neighborhoods simply because unemployed people were concentrated in Black neighborhoods.25
But Reagan’s tough-on-crime Republicans had no intention of committing political suicide among their donors and redirecting the blame for violent crime from the lawbreakers onto Reaganomics. Nor were they willing to lose their seats by trying to create millions of new jobs in a War on Unemployment, which would certainly have reduced violent crime. Instead, turning the campaign for law and order into a War on Drugs enriched many political lives over the next two decades. It hauled millions of impoverished non-White, nonviolent drug users and dealers into prisons where they could not vote, and later paroled them without their voting rights. A significant number of close elections would have come out differently if felons had not been disenfranchised, including at least seven senatorial races between 1980 and 2000, as well as the presidential election of 2000. What an ingeniously cruel way to quietly snatch away the voting power of your political opponents.26
Even the statistics suggesting that more violent crime—especially against innocent victims—was occurring in urban Black neighborhoods were based on a racist statistical method rather than reality. Drunk drivers, who routinely killed more people than violent urban Blacks did, were not regarded as violent criminals in such studies, and 78 percent of arrested drunk drivers were White males in 1990. In 1986, there were 1,092 “cocaine-related” deaths and another 20,610 homicides. That added up to 21,702, still fewer than the 23,990 alcohol-related traffic deaths that year (not to mention the serious injuries caused by drunk drivers that did not result in death). Drug dealers and gangsters primarily killed each other in the inner cities, whereas the victims of drunk drivers were often innocent bystanders. It was therefore an open question in 1986, and thereafter, whether an American was truly safer from lethal harm on the inner city’s streets or on the suburban highways. Still, White Americans were far more likely to fear those distant Black mugshots behind their television screens than their neighborhoods’ White drunk drivers, who were killing them at a greater rate.27
Since Reagan never ordered a War on Drunk Driving, it took a long and determined grassroots movement in the 1980s, forged by Mothers Against Drunk Driving (MADD), and countless horrible incidents—such as the drunk driver who killed twenty-seven school bus passengers in 1988—to force reluctant politicians to institute stronger penalties. But these new penalties for DUIs and DWIs still paled in comparison with the automatic five-year felony prison sentence for being caught, even for the first time, with five grams of crack.
AS IT WAS, the media’s attention in 1986 was not on the drunk drivers but focused narrowly on sensational crack crime stories and their supposed effects on the Black family. In a CBS special report, “The Vanishing Family: Crisis in Black America,” the network presented images of young welfare mothers and estranged fathers in a Newark apartment building, stereotypical images of Black female promiscuity, Black male laziness, and irresponsible Black parenting—the pathological Black family. It was these types of tales that prompted an aggravated Angela Davis to write an essay on the Black family in the spring of 1986. The percentage of children born to single Black women had risen from 21 percent in 1960 to 55 percent in 1985, Davis noted. Birthrates among Black teenagers could not explain this increase (those rates had remained virtually unchanged from 1920 to 1990). Davis explained that the “disproportionate number of births to unmarried teenagers” reflected the fact that older, married Black women had started having fewer children in the 1960s and 1970s. It was therefore the percentage of Black babies born to young, single mothers as opposed to married mothers—not the sheer number of babies born to single Black mothers—that had dramatically risen.28
But to Reagan propagandists, welfare caused the nonexistent spike in single Black mothers, and the nonexistent spike had made the Black family disappear. “Statistical evidence does not prove those suppositions [that welfare benefits are an incentive to bear children],” admitted Reagan’s chief domestic policy adviser, Gary Bauer, in The Family: Preserving America’s Future (1986). “And yet, even the most casual observer of public assistance programs understands there is indeed some relationship between the availability of welfare and the inclination of many young women to bear fatherless children.” Evidence hardly mattered when convincing Americans that there was something wrong with Black welfare mothers—and therefore, with the Black family.29
Even the adored civil rights lawyer Eleanor Holmes Norton felt the need in 1985 to urge the restoration of the “traditional Black family.” “The remedy is not as simple as providing necessities and opportunities,” Norton explained in the New York Times. “The family’s return to its historic strength will require the overthrow of the complicated predatory ghetto subculture.” Norton provided no evidence to substantiate her class-racist claim that “ghetto” Blacks were deficient in the values of “hard work, education, respect for the Black family and . . . achieving a better life for one’s children,” in comparison to Black elites or any other racial class.30
This racist drug of the declining Black family was as addictive to consumers of all races as crack—and as addictive as the idea of the dangerous Black neighborhood. But many of the Black consumers hardly realized they had been drugged. And they hardly realized that the new television show they thought was so good at counteracting unsavory thoughts of Black people was just another racist drug.