BY THE TIME Biggie Smalls’s “Juicy” was released in 1994, a growing number of academics were accepting the truth that “intelligence” was so transient, so multifaceted, so relative, that no one could accurately measure it without being biased in some form or fashion. And these revelations were threatening the very foundation of racist ideas in education (as well as sexist and elitist ideas in education). These revelations were endangering the racist perceptions of the historically White schools and colleges as the most intelligent atmospheres; the contrived achievement gap (and actual funding gap); the privileged pipelines for Whites into the best-funded schools, colleges, jobs, and economic lives; and the standardized testing that kept those pipelines mostly White. Harvard experimental psychologist Richard Herrnstein and political scientist Charles Murray watched the growth of these endangering ideas in the 1980s and early 1990s. In response, they published The Bell Curve: Intelligence and Class Structure in American Life, a landmark book that gave standardized tests—and the racist ideas underpinning them—a new lease on life.
In the first sentence, Herrnstein and Murray took aim at the spreading realization that general intelligence did not exist, and as such, could not vary from human to human in a form that could be measured on a single weighted scale, such as a standardized test. “That the word intelligence describes something real and that it varies from person to person is as universal and ancient as any understanding about the state of being human,” Herrnstein and Murray wrote at the beginning of their Introduction. They went on to dismiss as “radical” and “naïve” those antiracists who rejected standardized test scores as indicators of intelligence and thus the existence of the racial achievement gap. For Herrnstein and Murray, that left two reasonable “alternatives”: “(1) the cognitive difference between blacks and whites is genetic” (as segregationists argued); “or (2) the cognitive difference between blacks and whites is environment” (as assimilationists argued). Actually, Herrnstein and Murray reasoned, “It seems highly likely to us that both genes and the environment have something to do with racial differences.” They claimed that “cognitive ability is substantially heritable, apparently no less than 40 percent and no more than 80 percent.”
The growing, genetically inferior “underclass” was having the most children, and as it did, the great White and wealthy “cognitive elite” was slowly passing into oblivion. “Inequality of endowments, including intelligence, is a reality,” Herrnstein and Murray concluded. “Trying to eradicate inequality with artificially manufactured outcomes has led to disaster.”1
In fact, it was the resistance to egalitarian measures by those all-powerful beneficiaries of inequality and their producers of racist ideas, like Herrnstein and Murray, that had led to disaster. The book was well marketed, and initial reviews were fairly positive. It arrived during the final straightaway to the 1994 midterm elections, around the time the New Republicans issued their extremely tough “Contract with America” to take the welfare and crime issue back from Clinton’s New Democrats. Charles Murray started the midterm election cycle whipping up voters about the “rise of illegitimacy,” and ended it by rationalizing the “Contract with America,” especially the New Republicans’ tough-on-crime “Taking Back Our Streets Act” and tough-on-welfare “Personal Responsibility Act.”2
The term “personal responsibility” had been playing minor roles for some time. In 1994, Georgia representative Newt Gingrich and Texas representative Richard Armey, the main authors of the “Contract with America,” brought the term to prime time—to the lexicon of millions of American racists—targeting not just Black welfare recipients but Black people generally. The mandate was simple enough: Black people, especially poor Black people, needed to take “personal responsibility” for their socioeconomic plight and for racial disparities, to stop blaming racial discrimination for their problems, and to stop depending on government to fix them. The racist mandate of “personal responsibility” convinced a new generation of Americans that irresponsible Black people, not discrimination, caused the racial inequities—thereby convincing a new generation of racist Americans to fight against irresponsible Black people.
It made sense to encourage a Black individual (or non-Black individual) to take more responsibility for his or her own life. It made racist sense to tell Black people as a group to take more personal responsibility for their lives and for the nation’s racial disparities, since the irresponsible actions of Black individuals were always generalized in the minds of racists. According to this racist logic, Black people and their irresponsibility were to blame for their higher poverty and unemployment and underemployment rates, as if there were more dependent and lazy Black individuals than dependent and lazy White individuals. Slaveholders’ racist theory of African Americans as more dependent had been dusted off and renovated for the 1990s, allowing racists to settle into the hollow belief that African Americans were not taking enough personal responsibility, and that this was why so many were dependent on government welfare, just as they had once been dependent on their masters’ welfare.
It was a popular racist idea—even among Black people who were generalizing the individual actions of someone around them. In the 1994 midterm elections, voters handed Republicans and their dictum on personal responsibility control of Congress. After the New Democrats got tougher than the New Republicans by passing the toughest crime bill in history, New Republicans pledged to get even tougher than the New Democrats. Both angled to win over one of the oldest interest groups—the racist vote—which probably had never before been as multiracial as it was in 1994.
As 1995 began, the critical and affirming responses to The Bell Curve began to fly back and forth. It is hard to imagine another book that sparked such an intense academic war, possibly because the segregationists in their think tanks, the assimilationists in universities and academic associations, and the antiracists in their popular Black Studies and critical race theory collectives were all so powerful. In his revised and expanded 1996 edition of The Mismeasure of Man, Stephen Jay Gould maintained that no one should be surprised that The Bell Curve’s publication “coincided exactly . . . with a new age of social meanness.” The Bell Curve, said Gould, “must . . . be recording a swing of the political pendulum to a sad position that requires a rationale for affirming social inequalities as dictates of biology.” He criticized the proponents of this new meanness for their calls to “slash every program of social services for people in genuine need . . . but don’t cut a dime, heaven forbid, from the military . . . and provide tax relief for the wealthy.” British psychologist Richard Lynn defended the social meanness and The Bell Curve, asking, in an article title, “Is Man Breeding Himself Back to the Age of the Apes?” The “underclass,” he wrote, was only “good” at “producing children,” and “these children tend to inherit their parents’ poor intelligence and adopt their sociopathic lifestyle, reproducing the cycle of deprivation.” The American Psychological Association (APA)—representing the originators and popularizers of standardized intelligence testing—convened a Task Force on Intelligence in response to The Bell Curve. “The differential between the mean intelligence test scores of Blacks and Whites does not result from any obvious biases in test construction and administration, nor does it simply reflect differences in socio-economic status,” the assimilationist and defensive APA report stated in 1996. “Explanations based on factors of caste and culture may be appropriate, but so far there is little direct empirical support for them. There is certainly no such support for a genetic interpretation. At this time, no one knows what is responsible for the differential.” No one will ever know what doesn’t exist.3
While congratulating and lifting up Herrnstein and Murray for The Bell Curve, Republican politicians tried to unseat Angela Davis after UC Santa Cruz’s faculty awarded her the prestigious President’s Chair professorship in January 1995. “I’m outraged,” California state senator Bill Leonard told reporters. “The integrity of the entire system is on the line when it appoints someone with Ms. Davis’ reputation for racism, violence, and communism.” Davis, he said, was “trying to create a civil war between whites and blacks.” Southern segregationists had said that northern integrationists were trying to create a civil war between the races in the 1950s. Enslavers had said that abolitionists were trying to create a civil war between the races back in the 1800s. Both northern and southern segregationists had regarded Jim Crow and slavery as positively good and claimed that discrimination had ended or never existed. As much as segregationist theory had changed over the years, it had remained the same. Since the 1960s, segregationist theorists, like their predecessors, had been all about convincing Americans that racism did not exist, knowing that antiracists would stop resisting racism, and racism would then be assured, only when Americans were convinced that the age of racism was over.4
After Herrnstein and Murray decreed that racial inequality was due not to discrimination, but to genetics, Murray’s co-fellow at the American Enterprise Institute, almost on cue in 1995, decreed “the end of racism” in his challenging book, which used that phrase as its title. “Why should groups with different skin color, head shape, and other visible characteristics prove identical in reasoning ability or the ability to construct an advanced civilization?” asked the former Reagan aide Dinesh D’Souza. “If blacks have certain inherited abilities, such as improvisational decision making, that could explain why they predominate in certain fields such as jazz, rap, and basketball, and not in other fields, such as classical music, chess, and astronomy.” These racist ideas were not racist ideas to D’Souza, who wrapped himself in his Indian ancestry on the book’s first page in order to declare that his “inclinations” were “strongly antiracist and sympathetic to minorities.” D’Souza, the self-identified antiracist, rejected the antiracist notion that racism was “the main obstacle facing African Americans today, and the primary explanation for black problems.” Instead, he regarded “liberal antiracism” as African Americans’ main obstacle, because it blamed “African American pathologies on white racism and opposes all measures that impose civilization standards.”5
With his incredible writing, speaking, and marketing talents—and powerful backers—D’Souza managed to get many Americans to ponder the issues discussed in The End of Racism. But discrimination was everywhere in 1995 for people who cared enough to open their eyes and look at the policies, disparities, and rhetoric all around them. How could anyone claim the end of racism during one of the most racially charged years in US history, with racist ideas swinging back and forth like Ping-Pong balls in the media coverage of the criminal trial of the century? From the opening statements on January 24 to the live verdict on October 3, 1995, the O. J. Simpson murder trial and acquittal became the epitome of softness on crime for upset racist Americans.6
The O.J. case was not the only evidence for the progression of racism that D’Souza wisely omitted. Florida’s Don Black established one of the earliest White supremacist websites, Stormfront.org, in 1995. Informing the views of this new crop of “cyber racists,” as sociologist Jessie Daniels termed them, were segregationists like Canadian psychologist J. Philippe Rushton, who argued that evolution had given Blacks different brain and genital sizes than Whites. “It’s a trade-off; more brain or more penis. You can’t have everything,” Rushton told Rolling Stone readers in January 1995. In March, Halle Berry starred in Losing Isaiah, which brought the spiraling debate over interracial adoptions to theaters. The film was about a Black mother on crack whose baby is adopted by a White woman. And while the idea of Black parents adopting a White child was beyond the racist imagination, assimilationists were not only encouraging White savior parents to adopt Black children, but claiming that Black children would be better off in White homes than they were in Black homes.7
When asked in 1995 to “close your eyes for a second, envision a drug user, and describe that person to me,” 95 percent of the respondents described a Black face, despite Black faces constituting a mere 15 percent of drug users that year. But racist Americans were closing their eyes to these studies, and opening them to pieces like “The Coming of the Super Predators” in the Weekly Standard on November 27, 1995. Princeton University’s John J. DiIulio—a fellow at the Manhattan Institute, where Charles Murray had resided in the 1980s—revealed the 300 percent increase in murder rates for Black fourteen- to seventeen-year-olds between 1985 and 1992, a rate of increase six times greater than that of Whites. He did not explain this surge in violence by revealing the simultaneous surge in unemployment rates among young Black males. Nor did DiIulio explain the violent surge by revealing that drug enforcement units were disproportionately mass incarcerating young Black drug dealers, in some cases knowing full well that the consequence of breaking up a drug ring was a violent struggle for control of the previously stabilized market. DiIulio explained this violent surge by sensationalizing the “moral poverty” of growing up “in abusive, violence-ridden, fatherless, Godless, and jobless settings.” When we look “on the horizon,” he said, there “are tens of thousands of severely morally impoverished juvenile super-predators” who “will do what comes ‘naturally’: murder, rape, rob, assault, burglarize, deal deadly drugs, and get high.” What was DiIulio’s solution to “super-predators”? “It’s called religion.”8
In the eyes of DiIulio, in the eyes of millions of people of all races, the baggy-clothes-wearing, Ebonics-swearing, Hip Hop–sharing, “Fuck tha Police”–declaring young Black male did not have to wear a costume on Halloween in 1995. He was already a scary character—a “menace to society”—as a 1993 film had depicted (Menace II Society). And his young mother was a menace for giving birth to him. The main female and male prey of predatory racism were effectively stamped “super-predators.” As an antiracist teacher in Menace II Society told young Black males, “The hunt is on and you’re the prey!”9
In the midst of all of these proclamations about the end of racism in 1995, African Americans engaged in the largest political mobilization in their history, the bold Million Man March on Washington, DC. It had been proposed by Louis Farrakhan after the smoke cleared from the 1994 midterm elections. March fever quickly enraptured Black Americans. Antiracist feminists, Angela Davis included, ridiculed the gender racism of the march’s unofficial organizing principle: Black men must rise up from their weakened state of emasculation to become heads of households and communities and uplift the race. “Justice cannot be served by countering a distorted racist view of black manhood with a narrowly sexist vision of men standing ‘a degree above women,’” Davis said at a Midtown Manhattan press conference on the eve of the march. But some critics went too far. As some Black feminists were erroneously calling march organizers sexist for mobilizing just Black men, some White assimilationists were erroneously calling march organizers racist for mobilizing just Black men.10
Some activists who split over the Million Man March did come together in the summer of 1995 to defend the life of the world’s most famous Black male political prisoner, Mumia Abu-Jamal, who had been convicted of killing a White police officer in Philadelphia in 1982. “These are America’s death row residents: men and women who walk the razor’s edge between half-life and certain death,” Mumia said in Live from Death Row, a collection of his commentaries. “You will find a blacker world on death row than anywhere else. African-Americans, a mere 11 percent of the national population, compose about 40 percent of the death row population. There, too, you will find this writer.”11
Weeks after Live from Death Row appeared to a shower of reviews in May 1995, and days before Mumia’s lawyers filed an appeal for a new trial, law-and-order Pennsylvania governor Thomas Ridge, a Republican, signed Mumia Abu-Jamal’s death warrant. His execution was set for August 17, 1995. Protests erupted around the world that summer for Mumia’s life and for the death of capital punishment. Among the protesters were graying activists, some of whom had screamed “Free Angela” decades earlier, and younger ones, some of whom had helped to mobilize the Million Man March. But before the National Day of Protest, scheduled for August 12, could take place, Mumia was granted an indefinite stay of execution.12
At the end of that volcanic summer, the vast majority of African Americans were supportive of the doubly conscious Million Man March, doubly conscious of racist and antiracist ideas. Arguably, its most pervasively popular organizing principle was personal responsibility, the call for Black men to take more personal responsibility for their lives, their families, their neighborhoods, and their Black nation. Many of the roughly 1 million Black men who showed up on the National Mall on October 16, 1995, showed up believing the racist idea that something was wrong with Black men and Black teens and Black boys and Black fathers and Black husbands. But many of those marchers who stood there and listened to the fifty speakers also believed the antiracist idea that there was something wrong with rampant discrimination. As Louis Farrakhan thundered at the climax of his two-and-a-half-hour oration, “The real evil in America is not white flesh or black flesh. The real evil in America is the idea that undergirds the setup of the Western world, and that idea is called white supremacy.”13
Bill Clinton did not greet the million Black men or hear their exclamations of racism’s persistence on October 16. Instead he gave a racial progress speech at the University of Texas, pleading in the heart of evangelical America for racial healing, egging on the mass evangelical crusade for racial reconciliation in 1996 and 1997. Crusading evangelicals would go on to preach that the so-dubbed problem of mutual racial hate could be solved by God bringing about mutual love. Clinton, at least, did recognize in his Texas speech that “we must clean the house of white America of racism.” But he surrounded one of the most antiracist statements of his presidency with two of the most racist statements of his presidency. Instead of relaying the statistics showing that Whites usually suffered violence at the hands of other Whites, Clinton legitimized the “roots of white fear in America” by saying that “violence for . . . white people too often has a black face.” And then he went on the defensive: “It’s not racist for whites to assert that the culture of welfare dependency, out-of-wedlock pregnancy, and absent fatherhood cannot be broken by social programs unless there is first more personal responsibility.”14
Clinton officially declared himself a supporter of the racist idea of personal responsibility when he signed the Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) into law on August 22, 1996, with the next presidential election on the horizon. The bill was a compromise between Newt Gingrich’s New Republicans and Clinton’s New Democrats. It limited federal control of welfare programs, required work for benefits, and inserted welfare time limits. Even though programs for the poor represented only 23 percent of the non-defense budget, and had suffered 50 percent of the spending cuts over the past two years, welfare reform remained the leading domestic issue for the majority of White Americans. From Barry Goldwater’s “animal creature” to Reagan’s “welfare queen,” producers of racist ideas had done their job on non-Black Americans. Republican congressman John L. Mica of Florida held up a sign that said it all during the congressional debate on the bill: “Don’t Feed the Alligators / We post these warnings because unnatural feedings and artificial care creates dependency.”15
The same producers of racist ideas had also done their job on Black Americans, averting a march against welfare reform, and causing some African Americans to hate irresponsible, dependent, violent “niggers” as much as racist non-Blacks did. “I love black people, but I hate niggers,” jabbed a relatively unknown Black comedian, Chris Rock, on HBO’s “Bring the Pain” on June 1, 1996. The unforgettable performance began with a litany of antiracist jabs at Blacks and Whites over their reactions to the O.J. verdict and catapulted Chris Rock into the pantheon of American comedy. It marked the beginning of a revolution in Black comedy and introduced the three main comedic topics for a new generation: relationships, the racism of White people, and what was wrong with Black people. Out of “Bring the Pain,” doubly conscious Black comedy emerged as one of the most dynamic arenas of antiracist and racist ideas, with listeners laughing at, or with, the comedians.16
ANTIRACISTS SUFFERED A crushing loss in California on election night in 1996. California voters banned affirmative action, or “preferential treatment,” in public employment, contracts, and education. Neither funding allocation policies for public colleges and K–12 schools nor standardized tests—both of which preferentially treated White, rich, and male students—were banned. The percentage of African Americans at University of California campuses began to decline.
The campaign for California’s Proposition 209 ballot initiative displayed the progression of racist ideas in their full effect: its proponents branded antiracist affirmative action as discriminatory, named the campaign and ballot measure the “civil rights initiative,” evoked the “dream” of Martin Luther King Jr. in an advertisement, and put a Black face on the campaign, University of California regent Ward Connerly. It was a blueprint Connerly would take on the road to eliminate affirmative action in other states, but not before receiving a public rebuke from the sixty-nine-year-old Coretta Scott King. “Martin Luther King, in fact, supported the concept of affirmative action,” she said. “Those who suggest he did not support affirmative action are misrepresenting his beliefs and his life’s work.”17
On November 6, 1996, a day after passage of the proposition and the reelection of Clinton and a Republican Congress, quite possibly the most sophisticated, holistically antiracist thriller of the decade appeared in theaters. Directed by twenty-seven-year-old F. Gary Gray, who was already well known for Friday (1995); written by Kate Lanier and Takashi Bufford; and starring Jada Pinkett, Queen Latifah, Vivica A. Fox, and Kimberly Elise, Set It Off showcased just how and why four unique Black women could be motivated by Los Angeles’s job, marital, and gender discrimination; class and sexual exploitation; and racist police violence to commit a violent crime—in their case, well-planned armed bank robberies—in an attempt to better their lives and get back at those who were trying to destroy them. Set It Off did what law-and-order and tough-on-crime racism refused to do: it humanized inner-city Black perpetrators of illegal acts, and in the process forced its viewers to reimagine who the real American criminals were. While Pinkett played an erudite, independent, sexually empowered heterosexual woman in all her normality among male lovers and abusers, Latifah portrayed a mighty butch lesbian in all her normality among poor Blacks. In the end, three of the women die, but the shrewd Pinkett escapes with the stolen money into the sunset, away from American racism.
Critics and viewers fell in love with the tragedy and triumph of Set It Off. Even film critic Roger Ebert “was amazed how much I started to care about the characters.” If only law-and-order America, seeing the structural racism, had started to care about the real characters. But the producers of racist ideas seemed determined to make sure that never, ever happened.18
BILL CLINTON WAS sadly mistaken about the root of the “problem of race” when he made a stunning announcement on the subject on June 14, 1997. In his commencement address at Angela Davis’s alma mater, UC San Diego, Clinton pledged to lead “the American people in a great and unprecedented conversation on race.” Racial reformers applauded Clinton for his willingness to condemn prejudice and discrimination and for his antiracist ambitions of building “the world’s first truly multiracial democracy.”19
Upward of 1 million Black women made sure to inject their ideas into the conversation, gathering in Philadelphia on October 25, 1997. Congresswoman Maxine Waters, Sister Souljah, Winnie Mandela, Attallah and Ilyasah Shabazz (daughters of Malcolm X), and Dorothy Height spoke to the Million Woman March. At one point, a helicopter flew down low to drown out their words. Thousands shot up their arms, as if to shoo the helicopter away like a fly. It worked. “See what we can do when we work together,” intoned the passionate director of ceremonies, Brenda Burgess of Michigan.
The calls for Black unity resounded in Philadelphia as they had two years earlier among those million men in Washington, DC—as if Black people had a unity problem, as if this disunity was contributing to the plight of the race, and as if other races did not have sellouts and backstabbers. The nation’s most unified race behind a single political party was never the most politically divided race. But, as always, racist ideas never needed to account for reality.20
“Racism will not disappear by focusing on race,” House speaker Newt Gingrich argued in the wake of Clinton’s national race conversation. Reactions like this to Clinton’s conversation crystallized into a newly popular term: color-blind. “Color-blindness” rhetoric—the idea of solving the race problem by ignoring it—started to catch on as logical in illogical minds. “Color-blind” segregationists condemned public discussions of racism, following in the footsteps of Jim Crow and slaveholders. But these supposedly color-blind segregationists were much more advanced than their racist predecessors, announcing that anyone who engaged Clinton’s national discussion in any antiracist way was in fact racist. In his 1997 book Liberal Racism, journalist Jim Sleeper argued that anyone who was not color blind—or “transracial”—was racist. In their runaway success of the same year, America in Black & White, Manhattan Institute Fellow Abigail Thernstrom and Harvard historian Stephan Thernstrom said that “race-consciousness policies make for more race-consciousness; they carry American society backward.” “Few whites are now racists,” and what dominates race relations now is “black anger” and “white surrender,” the Thernstroms wrote, echoing the essays in The Race Card, an influential 1997 anthology edited by Peter Collier and David Horowitz. Criers of racial discrimination were playing the fake “race card,” and it was winning because of liberal “white guilt.”21
All this color-blind rhetoric seemed to have its intended effect. The court of public opinion seemed to start favoring the color-blind product nearly a century after the Supreme Court had ruled in favor of the product “separate but equal.” The millennium was coming, and people were still being blinded to human equality by colors.