3
The Unintended Consequences of the Law
Over the course of five decades, the civil rights movement has come to enjoy the prestige of a national epic: its leaders are revered as heroes and as saints; its pivotal moments have become the stuff of legend and myth; its accomplishments, canonized; its guiding principles have acquired the status of scripture. Law is, of course, at the very center of the civil rights movement, and the defining legal text of the civil rights movement is the opinion of the Supreme Court in Brown v. Board of Education. If Martin Luther King Jr.’s “I Have a Dream” speech is the movement’s Sermon on the Mount, the Court’s taut opinion in Brown is its Golden Rule.
In 2004, the nation celebrated the fiftieth anniversary of Brown v. Board of Education. Most reasonably well-educated Americans knew that racial segregation in public schools ended when the Supreme Court decided Brown in 1954. Even for those who did not pore over every line of celebratory text in the scores of books, editorials, blogs, magazines, newspapers, academic journals, and law reviews—even for those who did not sit glued to their televisions and radios, basking in the glow of patriotic self-congratulation—it would have been hard not to notice how far we’ve come as a nation. Even for those people cloistered in convents without access to media or holed up inside log cabins and tin shacks in remote locations, it would have been difficult to avoid the contagious swelling of pride and sense of goodwill. And for this remarkable progress in social justice, this unprecedented moral achievement, we have to thank our singular, wise, and time-honored tradition of constitutional rights, guided by the careful, learned, and even hand of the Supreme Court. Before Brown ours was a racially segregated nation in which public education reinforced racial hierarchy and undermined the dignity of black students. Since Brown we have become a racially integrated society in which, as the Court in Brown had hoped, public education is “a principal instrument in awakening the child to cultural values, in preparing him for later professional training, and in helping him to adjust … to his environment.”1
But there were a few naysayers and malcontents spoiling Brown’s fiftieth-birthday party. Some people had to focus on the empty half of the glass, as the Harvard Civil Rights Project did when it pointed out that “a substantial group of American schools … are virtually all non-white … These schools educate … one-fourth of black students in the Northeast and Midwest.”2 Critics complained that in the decades since Brown the courts had slowly but steadily backpedaled, weakening Brown’s integrationist mandate. Others groused that far from awakening children to healthy cultural values and preparing them for professional training, many public schools—especially those that served poor blacks—introduced their pupils to dysfunctional values and even crime while failing to teach such rudimentary skills as reading, writing, and arithmetic. According to such detractors, many students left school with a first-rate education in vice and unable to write a simple sentence. Marring Brown’s fiftieth-birthday celebration like a preacher at a stag party, the law professor and veteran civil rights lawyer Derrick Bell went so far as to suggest that the nation’s minority children might have been better off if the plaintiffs in Brown had lost and the nation had committed itself to fulfilling the breached promise of segregation: separate but equal. He lamented that “Brown [is] a magnificent mirage … to which all aspire without any serious thought that it will ever be attained.”3
What Brown accomplished was and is important. But its accomplishments are more modest, the progress toward racial justice it inspired more halting, and the meaning of Brown both more limited and more mixed than the typical triumphal account suggests. Brown did not integrate public education with the strike of a gavel: in fact, in 2006, more than fifty years after Brown, two of every five black and Latino public school students attended a school that was over 90 percent nonwhite.4 Brown provided the civil rights movement with renewed energy and sharper focus, but it also helped to divert the attention of the movement away from the economic injustices affecting working-class black people and toward the social and psychological preoccupations of black elites. Brown did not undo Jim Crow segregation; in fact, Brown may actually have strengthened Jim Crow by inspiring a backlash among Southern whites, many of whom were turning against racial segregation before the Brown litigation. Brown v. Board of Education—perhaps the most celebrated victory of the American civil rights movement—is not an unimpeachable example of the virtues of legal rights. In many ways it is a cautionary tale of the limitations and hazards of legal rights, a story of rights gone wrong. Indeed, the malcontents were more completely vindicated than they could have imagined possible, three years after Brown’s fiftieth anniversary. In 2007 the Supreme Court held that the legal precedent of Brown v. Board of Education—which in 1954 had required the end of segregation—prohibited policies that advanced racial integration.
* * *
When Brown was decided in 1954, many people attacked the Court’s decision for all-too-predictable reasons: they opposed racial mixing and resented the inevitable demise of Jim Crow, of which Brown was a conspicuous symbol, if not an indispensable cause. But others criticized Brown for more surprising and more respectable reasons. The most famous of the respectable critics was the eminent Columbia Law School professor and labor rights lawyer Herbert Wechsler. Wechsler was one of the nation’s most formidable legal minds: an expert in constitutional law, criminal law, and the federal courts and a technical adviser to the judges in the Nuremberg trials. He would go on to win one of the century’s most important cases involving freedom of expression, New York Times v. Sullivan, and to write the American Law Institute’s Model Penal Code, which fulfilled the ambition that its name suggests and prompted criminal law reform in dozens of states.
In 1959, Wechsler delivered an address at Harvard Law School. In it he attacked the now-familiar idea that “racial segregation is … a denial of equality” and quoted approvingly from Plessy v. Ferguson, which had established the doctrine of “separate but equal” that Brown had overturned. “Is there not a [valid] point in Plessy,” Wechsler asked, “that if ‘enforced separation stamps the colored race with a badge of inferiority’ it is solely because its members choose ‘to put that construction upon it’?” Wechsler questioned the reasoning in Brown, attacking the psychologist Kenneth Clark’s contention—central to the Court’s holding in Brown—that segregation was psychologically damaging to black children. He queried:
Was he [Clark] comparing the position of the Negro child in a segregated school with his position in an integrated school where he was happily accepted and regarded by the whites; or was he comparing his position under separation with that under integration where the whites were hostile to his presence and found ways to make their feelings known? And if the harm that segregation worked was relevant, what of the benefits that it entailed: sense of security, the absence of hostility?… Suppose that more Negroes in a community preferred separation than opposed it? Would that be relevant to whether they were hurt or aided by segregation as opposed to integration?5
Wechsler insisted that the strongest case against segregation wasn’t that it denied blacks equality—equal resources could be provided in segregated schools—it was that it curtailed the freedom of association guaranteed by the First Amendment. But this rationale had its problems too, Wechsler cautioned:
If the freedom of association is denied by segregation, integration forces an association upon those for whom it is unpleasant or repugnant. Is this not the heart of the issue involved…? Given a situation where the state must practically choose between denying the association to those individuals who wish it or imposing it on those who would avoid it, is there a basis in neutral principles for holding that the Constitution demands that the claims for association should prevail? I should like to think there is, but I confess that I have not yet written the opinion.6
Future generations have not been kind to Wechsler’s critique of Brown: the more charitable treat it as the work of a great intellect with sadly limited vision; the less charitable, as an apologia for bigotry. Because of the Harvard speech, published in the Harvard Law Review under the title “Toward Neutral Principles of Constitutional Law,” Wechsler—despite his many impressive accomplishments—is probably best known today as a man on the wrong side of history. But in the decades since Brown, many of Wechsler’s concerns have been proven valid.
For instance, many psychologists today would question the Brown Court’s confidence that segregated schools were in and of themselves psychologically damaging to black children.7 It was the entire social structure of the Jim Crow South and indeed of mid-century America generally—the pervasive racial stereotypes, the contempt of whites in positions of power and influence, and the daily humiliations, of which segregated schools were hardly the most severe—that undermined the self-esteem of black children, as it did of blacks generally. Moreover, Brown’s focus on psychological injury was a symptom of a new and questionable shift for the racial justice struggle, away from a long-standing commitment to economic justice and in pursuit of the less tangible goals of personal dignity and that elusive grail of pop psychotherapy, self-esteem. In the decades that followed, as pop psychotherapy gave the veneer of science to the preoccupations of the newly affluent Me Generation, self-esteem became a major preoccupation of the American mainstream. The obsession with self-esteem and psychotherapy grew to feed not only a significant cadre of medical professionals but also a multimillion-dollar semipro industry of 12-step programs, TV talk show confessionals, encounter groups, self-actualization clinics, lecture circuit gurus, and uncredentialed self-help experts. The culture of therapy nurtured and exploited the idea that self-esteem was a natural entitlement. The ruling in Brown made it into a civil right, giving it the moral authority of the racial justice struggle and the imprimatur of law.
The legal historian Risa Goluboff points out that civil rights litigation before Brown had emphasized “the economic harms that segregation entailed: less work, worse work, inadequate salaries, and economic insecurity and lack of advancement.”8 Civil rights lawyers in the early twentieth century saw the exclusion of blacks from the labor market as the greatest evil of Jim Crow segregation. The “right to work” in the face of whites-only labor unions and racially stratified workplaces was the paradigmatic civil rights goal. Given the nature of American racism, the centrality of labor rights was inevitable and appropriate. Labor, beginning with slavery, defined the black experience. After emancipation and the brief period of Reconstruction, Southern farmers and Southern state governments joined forces to keep blacks on plantations in conditions that mirrored slavery. Sharecropping replaced slavery as the involuntary labor regime of American agriculture. In theory, a sharecropper freely agreed to work for a farmer on mutually acceptable terms. But in practice, sharecropping was a trap: farmers “advanced” sharecroppers their meager wages at usurious rates of interest, sold them supplies and the necessities of life at drastically inflated prices on credit, and insisted that the sharecropper was obligated to work until his debts had been paid. Because the sharecropper would need to buy more goods on credit in order to keep working, each additional day on the job could actually put him deeper into debt. The sharecropping system relied on laws in place throughout the American South that made it illegal to leave a job before working off an advance from one’s employer. The legal fiction justifying these laws—which effectively enforced indentured servitude—was that an employee who accepted an advance from an employer and quit without repaying it must have intended to defraud the employer from the outset. Once convicted of criminal fraud, the hapless worker would be forced back into the unpaid labor force on a prison chain gang. Despite such ingenious rationalizations, many employers dispensed with legal contrivances and simply held black laborers by force.
When World War II created labor shortages throughout the United States, Southern blacks moved north in droves to find employment in wartime industries. Southern states responded with beefed-up enforcement of nineteenth-century vagrancy laws and new wartime “work or fight” laws that empowered police to arrest able-bodied people who appeared to be unemployed and forcibly put them to work.9 Civil rights lawyers, both in government and in nonprofit civil rights organizations such as the NAACP, attacked these practices with a variety of political and legal tools. Mobility and access to jobs were crucial civil rights issues: if blacks were to escape slave-like conditions as agricultural laborers, they would have to find work as wage laborers in the industries of the North. The NAACP Legal Defense Fund sued racially exclusive unions under federal and state laws, arguing that the federal laws that allowed them to organize and represent workers in collective bargaining obliged them not to discriminate on the basis of race.10 But shortly after World War II, civil rights organizations soured on labor-focused litigation. The NAACP became politically allied with organized labor and was anxious to demonstrate that it was not antiunion. At the same time, organized labor was losing its former clout: as a result, the right to join a closed shop was worth less than it once had been. And some historians speculate, perhaps uncharitably, that the relatively privileged NAACP lawyers simply began to find the concerns of working people less compelling than the kinds of injuries to dignity and status that professional blacks like themselves suffered under Jim Crow in racially segregated bathrooms, public transportation, restaurants, and schools.
As civil rights lawyers moved away from the focus on equal access to jobs, their bête noire became the 1896 case Plessy v. Ferguson, which upheld the racial segregation of Louisiana railroads under the doctrine of “separate but equal.”11 Infamously, the Court in Plessy opined that in the absence of objective differences between black and white facilities, segregation stigmatized blacks only because “the colored race chooses to put that construction upon it.” Earlier challenges to Jim Crow had accepted Plessy and attacked the objective inferiority of segregated black facilities. But in Brown v. Board of Education, the NAACP Legal Defense Fund lawyers wanted to demonstrate that segregation by its very nature harmed blacks. The growing prestige of social science—and of psychology in particular—offered civil rights lawyers a battery of evidence that was unavailable to Homer Plessy.
Today popular psychotherapy is so commonplace as to be background noise, but in the mid-twentieth century psychotherapy was new and fashionable. The rigors and deprivations of World War II had forced Americans to focus on the needs of the body, but in the postwar years Americans had the leisure and the resources to devote to the health of the mind. The American Psychiatric Association began work on the first Diagnostic and Statistical Manual: Mental Disorders in 1948 and published it in 1952. The number of licensed psychiatrists in the United States doubled in the 1950s, and the therapist’s couch became a staple of popular culture. Psychotherapy grew from a somewhat suspect indulgence to an accepted part of bourgeois hygiene. The psychotherapy zeitgeist inspired public policy: Congress created the National Institute of Mental Health in 1946. A 1950 White House conference promoted “a healthy personality for every child.”
The Supreme Court was primed to treat psychological evidence that segregation caused mental injury as hard scientific fact. In the 1930s and 1940s the psychologists Kenneth and Mamie Clark had gathered just such evidence. They conducted a series of heartbreaking experiments with black children in the South. They presented the children with two dolls, one black and one white. When asked which doll was the nice doll and which he or she would like to play with, the typical child chose the white doll. When asked which doll was the bad doll, the child chose the black doll. When asked which doll was most like him or her, the child chose the black doll. The Clarks concluded that segregation undermined the self-esteem of black children, warping their personalities and damaging their ability to learn.
The Clarks’ psychological studies provided the previously missing link between segregation and tangible inequality: segregation harmed the self-esteem of black children, which in turn harmed their ability to learn, thereby depriving them of equal educational opportunities. Because they stigmatized black children, segregated schools necessarily offered them an inferior education.
Brown established the idea that such psychological injury was the defining evil of Jim Crow segregation—and by extension of racism generally. Echoes of the earlier civil rights emphasis on employment remained audible in the historic 1963 March on Washington for Jobs and Freedom, and Martin Luther King’s attempt to help lead the Poor People’s Campaign in the late 1960s was arguably a return to these concerns. But the post-Brown civil rights movement shifted away from this focus on economic inequality born of the specific historical practices of American racism and set its sights on more abstract targets: the psychology of racism, the mental state of the bigot, and the stigma that results from racial discrimination. This new approach was dramatically successful at first, but it proved costly in the long run. By downplaying economic justice, civil rights lawyers escaped being vilified as communists—a charge segregationists had made persistently—and successfully won over the courts and eventually the American mainstream. But they left some of the most severe racial injustices unaddressed. And because their choices shaped civil rights law—and, for many Americans, defined racial justice—they may have also made it harder for future generations to confront those injustices.
The Civil Rights Act of 1964 prohibits discrimination in employment and housing, addressing these economic considerations—but only to a limited extent. The Civil Rights Act was organized around the post-Brown idea that racial injustice was a matter of stigmatizing animus and stereotypes. Its effectiveness hinged on elusive conceptual and evidentiary questions. How could a plaintiff establish the prohibited mental state when an employer or landlord declined to announce it? Did unexplained racial segregation or exclusion imply discriminatory animus? Was discrimination unlawful because it was motivated by animus or because of its effects, or was it objectionable per se? Such conceptual ambiguities continue to limit the effectiveness of the Civil Rights Act today.
Civil rights inspired by psychotherapy evolved in a very different—and less useful—way than they might have, had the movement remained focused on economic justice and the tangible evils of the American racial hierarchy. Stigmatic injury, stereotypes, and subjective emotional harm acquired a centrality in civil rights law they never enjoyed in other areas of law and that they did not merit. Today the influence of psychology is felt in age discrimination law, where “stereotypes” define the legally actionable discrimination instead of tangible injury; in sex harassment doctrine, where emotional harm due to sexual expression outweighs objective impediments to women’s advancement; and in disability law, where legal intervention responds less to social injustice than to the extent a disability limits self-actualization.
* * *
The courts heard challenges to segregated schools against a political backdrop of intense resistance and desperate hope—and one in which the always-fragile legitimacy of the federal courts was at stake. Overreaching might lead the courts into administrative complexities that they could not manage and that other governmental officials would refuse to take on; it could even lead to outright defiance and a crisis of authority for the courts. On the other hand, retreat would encourage recalcitrance, again undermining the authority of the courts.
As a result, the courts adopted a series of aggressive half measures that managed to infuriate whites, disappoint civil rights advocates, and do little to address inequalities in education or segregation of the races. Brown’s bold pronouncement that separate schools were inherently unequal and offended the constitutional guarantee of equal protection was quickly limited as a practical matter by the Court’s holding one year later that desegregation should proceed with “all deliberate speed”—a statement that sounded assertive but was in fact a calculated retreat that allowed schools to remain segregated for years to come. This did not mollify Southern bigots, who threatened “massive resistance” to integration. In 1957, Governor Orval Faubus of Arkansas called out the National Guard to prevent the menace to public order and decency threatened by nine black students who were set to attend the all-white Little Rock Central High School. A federal court ordered the National Guard to stand down, but an angry mob took their place for several weeks until President Dwight Eisenhower, fearing for America’s public image overseas, sent in the 101st Airborne Division to enforce desegregation.
In 1968 the Court began to answer the most important question posed by Brown: Did the Constitution require racially integrated schools, or did it simply forbid explicit, state-enforced segregation? New Kent County in eastern Virginia was a perfect place to pose this question. Roughly half black and half white, New Kent County had only two public schools, each of which combined elementary and high school. Before Brown, one—the New Kent School—had been reserved for whites and the other—Watkins—for blacks. Virginia had required racial segregation in public schools until Brown was decided in 1954 and continued to encourage segregation in resistance to Brown. As of 1964 no black student had ever attended the New Kent School, and no white student had ever attended Watkins.
The NAACP Legal Defense Fund sued the New Kent County School Board, which responded by adopting a new “freedom of choice” plan. Students in the first through eighth grades were required to choose a school; high school students would attend the school they had previously attended unless they requested a transfer. No whites requested to attend Watkins, and only 15 percent of blacks requested New Kent: the county still ran an exclusively black school and an overwhelmingly white one.
In Green v. County School Board of New Kent County, the Supreme Court invalidated this plan. The Court held that under Brown public schools had an “affirmative duty to take whatever steps might be necessary to convert to a unitary system in which racial discrimination would be eliminated root and branch.”12 For the first time the Court explicitly required schools to do more than eliminate formal race-based exclusion; it also required the schools to counteract the racial patterns that decades of segregation had put in place.
Ironically, the Court found support for this more aggressive desegregation requirement in the “all deliberate speed” language, which allowed school districts to desegregate gradually instead of ending race discrimination immediately. Under “all deliberate speed,” instead of ordering immediate desegregation, the federal courts were to “consider problems related to administration, arising from the physical condition of the school plant, the school transportation system, personnel, revision of school districts and attendance areas … and revision of local laws and regulations … to effectuate a transition to a racially nondiscriminatory school system.”13 The federal courts had to evaluate the details of the more gradual desegregation efforts to assess their compliance with the “all deliberate speed” mandate. They discovered numerous and varied attempts to delay or skirt Brown’s mandate and avoid desegregation. Unsurprisingly, such foot-dragging and recalcitrance often relied on private intimidation and inertia to keep segregation in place as a matter of fact, even as it was slowly abolished as a matter of law. For instance, assigning students to the school they had previously attended wasn’t explicitly discriminatory, but there was no doubt that it would—and was intended to—keep the schools racially segregated.
In order to fulfill Brown’s promise to eliminate segregation, the courts had to do more than outlaw explicit racial classifications. Because the courts couldn’t evaluate the policies on form alone, they had to look to their likely effects and reject policies that would predictably maintain racial segregation. Practically speaking, this came close to requiring integration.
It came closer still in 1971 when the Supreme Court considered the pupil assignment policies of North Carolina’s Charlotte-Mecklenburg Board of Education. The city of Charlotte merged its public schools with those of the surrounding Mecklenburg County in 1961 to create one of the nation’s largest school districts. The district served 71 percent white and 29 percent black students in 1968. In response to earlier legal challenges, the district assigned students to neighborhood schools without regard to race. But almost all of the district’s black students lived in the city of Charlotte, most in a single section of the city; as a result, two-thirds of the district’s black students attended schools that were 99 to 100 percent black.
The federal district court that first heard the challenge to the Charlotte-Mecklenburg district’s school assignment policy considered several desegregation plans. The school board proposed a relatively modest plan that divided the district into pie-shaped zones that joined the heavily black inner city with white neighborhoods in suburban Mecklenburg County. This plan would have desegregated the district’s ten high schools, but more than half of black elementary school students would attend schools that were over 85 percent black, and half of white students would attend schools that were over 85 percent white. More ambitious plans combined rezoning with long-distance busing from inner-city schools to suburban schools to achieve between 9 percent and 38 percent black enrollment at every school in the district. The district court required the board to choose between three more aggressive plans, all of which required busing. The Court of Appeals for the Fourth Circuit worried that the long-distance busing was too burdensome and sent the case back to the district court for reconsideration. But the Supreme Court stepped in and affirmed the district court’s order in full in Swann v. Charlotte-Mecklenburg Board of Education.14
Was this a logical extension of Brown? The Court in Green had invalidated New Kent County’s race-neutral “freedom of choice” plan because it perpetuated past racial segregation. One could say the same of the Charlotte-Mecklenburg school assignment policy: given the residential segregation of the county, well-known to school officials, neighborhood schools would inevitably be racially homogeneous schools. Moreover, the district court found that past federal, state, and local policies had contributed to that residential segregation. The school board’s race-neutral assignment policies worked hand in glove with the segregated housing patterns put in place by government: “Locating schools in Negro residential areas and fixing the size of the schools to accommodate the needs of immediate neighborhoods, resulted in segregated education.”15
But the freedom-of-choice plan invalidated in Green seemed designed to facilitate private racial intimidation and anxiety. By contrast, the Charlotte-Mecklenburg schools had adopted a commonplace policy of assigning pupils to schools in their neighborhoods—precisely the kind of policy one would expect a color-blind school district to adopt. Indeed, one commonly voiced objection to segregation was that students were required to attend geographically distant schools, often walking past more convenient schools in their neighborhoods that were reserved for another race.
Swann was only a mild extension of the Court’s logic in Green, but it required a dramatic extension of the intervention of the federal courts as a practical matter. Because racial discrimination had been widespread and severe, especially in the South, one could plausibly attribute almost any racial disparity to past governmental action. Did the Court’s admonishment in Green to eliminate discrimination “root and branch” entail eliminating all racial imbalances? The Supreme Court offered only obscure guidance: it insisted that “the constitutional command to desegregate schools does not mean that every school … must always reflect the racial composition of the school system as a whole” but also opined that “the racial composition of the whole school system is likely to be a useful starting point in shaping a remedy to correct past constitutional violations.”
As civil rights litigators pressed for integration after Brown, black parents reluctantly but consistently voiced concerns that echoed those of Wechsler. Following Swann, the civil rights lawyers who sued recalcitrant school districts across the United States insisted that Brown guaranteed nothing less than complete racial integration of the public schools. In most cities, integration would require packing black kids onto buses and driving them long distances to schools in white neighborhoods. In some school districts, integration required busing black children out of town to lily-white suburbs.
The time, inconvenience, and expense involved in mass busing were not the greatest prices to be paid for integration. Whites in many schools were hostile and often violent: the black children who integrated schools in the South after Brown faced angry and violent white mobs on their way to school. Once inside the classrooms, they were taunted and often attacked by their white classmates.16 The legal historian Michael Klarman makes a convincing case that Brown breathed new life into a dying segregationism. Brown, seen by many white Southerners as a new form of Yankee carpetbag interference, inspired resentment and drove moderate politicians into the segregationist camp or out of politics.17 In the early 1950s, many Southern cities and states had relaxed or eliminated Jim Crow laws involving public transportation, sports, higher education, and voter registration, either voluntarily or, in response to court orders, without great resistance; after Brown, voters and politicians reimposed strict segregation and intensified discriminatory practices. The Ku Klux Klan, which was slouching toward extinction before Brown, “reappeared in states such as South Carolina, Florida, and Alabama, where it had rarely been observed in recent years.”18
White hostility to racial integration was hardly exclusive to the South. In fact, some of the worst cases were in the heart of liberal New England. In South Boston, some whites boycotted the public schools in response to desegregation, and white mobs threw stones at school buses carrying children from the predominantly black Roxbury district. Protesters planned to overturn and burn school buses carrying black students, a plan that was foiled only because an anonymous tip warned black community leaders, who arranged for the black children to spend the school day elsewhere.19 Similar incidents occurred in the Charlestown section of Boston.
According to the law professor Derrick Bell, who consulted with Boston’s black community groups at the time, many black parents were wary of ambitious busing plans: “They did not wish to back away [from] … efforts to desegregate Boston’s schools, but they wished to place greater emphasis on upgrading the school’s educational quality … and to minimize busing to the poorest and most violent white districts.”20 Yet civil rights lawyers, convinced that Brown required integration at all costs, pressed for a plan that required extensive busing.
Similarly, in Atlanta, the local NAACP hammered out a compromise with the school board that called for full desegregation of faculty and employees and an increased number of blacks in top administrative positions but only partial desegregation of students. Local black leaders and school board members supported the plan for pragmatic reasons. But the national NAACP and Legal Defense Fund lawyers saw the compromise as a betrayal of civil rights principles. They fired the Atlanta-branch NAACP president, who had supported the compromise, and filed an appeal to block it.
Black parents, teachers, and students were often ambivalent about desegregation for sensible, practical reasons. Black schools, even when they suffered from lack of funds, often offered black students intangible benefits that integrated schools did not. Black faculty and administrators were positive role models for black students, and all-black schools gave black students a safe haven from white racism and its debilitating effects on self-esteem. Ironically, although desegregation was premised on the belief that school segregation hurt the self-esteem of black students, Kenneth and Mamie Clark’s studies cited in Brown actually showed that the self-esteem of black children in segregated schools was typically higher than that of black children who attended integrated schools in the North.21 It was not segregation in and of itself that damaged the hearts and minds of black students, but rather the racism that underlay it. Black students who could escape that racism while in school had healthier self-esteem than those who were constantly exposed to it.
After desegregation, it was almost always the underfunded black schools that were closed, and the black faculty and administrators often lost their jobs. In many cases, the black community bitterly complained about the loss of a neighborhood school in which they took great pride. For instance, former students of the black Second Ward High School in Charlotte remembered “nurturing in the classroom without the strife of racial overtones.” When the school closed, one graduate recalls, “it was heartbreaking.” Another insisted, “We thought that it was the utmost in betrayal.” A teacher recalled, “When we made the transition from Second Ward to West Charlotte, you had … trauma. I still kept contact with those kids from Second Ward, and they would call and sometimes cry.” A former student from Tampa, Florida, complained, “It was bad. It was terrible … What we thought is that they would improve our school and bus in some white kids.” A former teacher at a Florida high school insists, “Blacks were seeking for equality of facilities, equipment and on and on. I know they never asked for their schools to be closed.”22
Desegregation also faced practical impediments. White flight to surrounding suburbs in the years since Brown had drastically changed the demographics of many American cities, making integration more costly and less effective: for instance, in 1952 the Atlanta school district was 32 percent black; by 1974, when the desegregation compromise was proposed, it was 82 percent black. The Detroit school district was over 70 percent black in the early 1970s when civil rights lawyers pressed for desegregation there. Economic and technological changes were pushing people out of the cities and into the suburbs across the country. Encouraged by recently built highways and inexpensive real estate, both the middle class and industry left the inner city to relocate in roomier and less costly digs in the suburbs. Integration was unworkable in such cities given the small number of whites, and potentially counterproductive given the ease of white flight to neighboring suburbs.
The unusual facts of 1971’s Swann case had allowed the Court to avoid the widespread problem of white flight and racially homogeneous school districts. The Charlotte-Mecklenburg school district was the result of the consolidation of the formerly separate Charlotte and Mecklenburg County school districts in 1961. The new, merged district combined the overwhelmingly white suburbs of Mecklenburg County with the racially mixed city of Charlotte. But for the consolidation, any desegregation plan would almost certainly have been limited to Charlotte alone—eliminating many of the white schools included in the Charlotte-Mecklenburg district, without which meaningful integration would have been impossible. This made Swann somewhat misleading as precedent: the fortuity of a recently consolidated district that combined a heavily black city and heavily white suburban and rural communities made a roughly 70 percent white to 30 percent black mix mathematically possible within a single district. But in cities such as Atlanta and Detroit, the urban school districts had not been consolidated with their surrounding suburbs.
Civil rights lawyers convinced a federal district court that because “relief of segregation in the public schools of the City of Detroit cannot be accomplished within the corporate geographical limits of the city,” the law required a desegregation plan that included suburban school districts in the Detroit metro area. The federal Court of Appeals for the Sixth Circuit agreed, noting that “if we [were to] hold that school district boundaries are absolute barriers to a Detroit school desegregation plan, we would be opening a way to nullify Brown v. Board of Education.”23
Apparently unafraid of blazing this trail, the Supreme Court reversed, in Milliken v. Bradley, holding that federal courts could not require a district that had never discriminated to participate in a desegregation plan. In his dissenting opinion, Justice Thurgood Marshall lamented, “In Brown v. Board of Education, this Court held that segregation of children in public schools on the basis of race deprives minority group children of equal educational opportunities … After 20 years of small, often difficult steps toward [equality], the Court today takes a giant step backwards.”
By the mid-1970s the Supreme Court was already in retreat from its earlier commitment to widespread integration. Most American public schools remained both separate and unequal. A cruel irony of the Court’s conflicted desegregation jurisprudence is that while it ended legally mandated segregation, it also may have sped the process of white and middle-class flight from many cities, making matters worse for those left behind. Civil rights required aggressive desegregation measures such as long-distance busing in districts that had been segregated by force of law, but not in suburban school districts that had not used the law to enforce segregation. As a result, desegregation was effectively stymied in metropolitan areas marked by the familiar “chocolate city, vanilla suburb” pattern. The people responsible for de jure segregation in the urban districts had, in many cases, decamped for the suburbs, in part in order to avoid desegregation, leaving the victims of segregation behind to face the lawsuits and clean up the mess.
Consequently, busing integrated only schools attended by the relatively poor whites who lived in inner cities; wealthier suburban whites continued to attend virtually all-white schools. This created the impression that racial integration was another of the many indignities and inconveniences that the rich could afford to avoid. Region-wide integration would have brought busing to the wealthier neighborhoods, possibly improving its prestige. And practically speaking, it would have reduced the effect of busing on urban schools by dramatically increasing the number of white students in the pool of students to be integrated. The combination of aggressive desegregation within urban districts and exemption for suburban districts created a powerful incentive to leave the inner city. Perhaps worst of all, the Supreme Court’s desegregation jurisprudence encouraged the nation to prematurely celebrate victory over racial inequality: because racial justice had been defined in terms of formal civil rights and desegregation, unequal resources in the nation’s still effectively segregated schools looked like a secondary consideration.
This may have been the most the courts could do. Busing was unpopular and cumbersome even when limited to a single school district; it could easily become unmanageable if extended to several districts or an entire metropolitan region. Students would be forced to travel greater distances, white resentment might be even greater, and inner-city black students would almost certainly feel even more out of place in leafy all-white suburbs than in white urban neighborhoods. Evaluating these complex policies and their likely consequences would have required detailed sociological study. Courts faced with live controversies have neither the time nor the resources for such analyses, and conventional civil rights analysis does not easily accommodate such considerations. Given these daunting conditions in the trenches, the courts beat a face-saving retreat from Brown’s more ambitious implications. In the heady and optimistic days of the 1950s and 1960s, the Warren Court promised unequivocal victory over racial injustice. By the mid-1970s, a chastened Burger Court was ready to accept a thinly disguised defeat, the judiciary’s iteration of peace with honor.
Some had hoped for a more pragmatic approach. Judge J. Braxton Craven was appointed by President John F. Kennedy to the federal District Court for the Western District of North Carolina and had heard one of the cases that led up to the Swann litigation. Judge Craven suggested that the Civil Rights Act of 1964, which enforced Brown’s mandate by withholding federal funding from segregated school districts, offered a chance to take desegregation out of the courts and place it in the hands of “administrators, especially if they have some competence and experience in school administration[, who] can likely work out … the problems of pupil and teacher assignment in the best interests of all concerned better than can any District Judge operating within the adversary system. The question before this court … is not what is best for all concerned but simply what are plaintiffs entitled to have as a matter of constitutional law.”24 After over fifty years of frustrating struggle, resulting in controversial and often ineffectual judicial remediation, this suggestion seems apt. An administrative approach may not have done as much as quickly as the courts did. But administrators might have accomplished more profound and more durable change in the long run by encouraging compromise and skirting backlash.
* * *
In 2007 the Supreme Court of the United States, inspired by the legacy of Brown v. Board of Education, invalidated the racial school assignment policies of two school districts—one in Seattle, Washington, and one on the edge of the Old South, in Louisville, Kentucky. Both districts had been sued for racial segregation, and both had developed integrationist policies in response. In Louisville the school district did so under court order; in Seattle, to settle a civil rights lawsuit. Over time, both districts abandoned aggressive desegregation policies such as busing in favor of milder policies. In Seattle, the school board tried to combine individual choice with racial integration by allowing incoming high school students to choose from among any of the district’s high schools, ranking them in order of preference. For the most popular schools, places in the entering class would be filled using a series of tiebreakers: the district gave preference to students with a sibling attending the school, to students who lived nearby, and to students who would contribute to the racial diversity of the student body. In Louisville, the district adopted a desegregation plan of its own design after a court order imposing busing was lifted in 2001. Like the Seattle plan, it considered student preferences, residential proximity to the school, and racial integration in making school assignments and approving transfers. These desegregation policies were weak tea compared with the strong medicine of mandatory busing that they replaced. And their effect was correspondingly mild: according to Chief Justice John Roberts, in the 2000–2001 school year only fifty-two Seattle students were assigned to a school they hadn’t chosen because of the district’s mild attempt to promote integration. Similarly, only about 3 percent of school assignments in Louisville were affected by the district’s racial integration plan.
Like Chief Justice Earl Warren more than fifty years earlier, Chief Justice John Roberts offered a ringing endorsement of racial equality in condemning these policies. He quoted the plaintiffs in Brown v. Board of Education, the landmark case that brought an end to school segregation, at length. But Parents Involved in Community Schools v. Seattle School District No. 1 wasn’t the typical civil rights case. This time the school districts were under attack for promoting racial integration. The problem, according to a majority of the Supreme Court, wasn’t that the districts hadn’t done enough to integrate the schools; it was that they had done too much. Using the arguments developed in Brown, Roberts insisted that “‘the Fourteenth Amendment prevents states from according differential treatment to American children on the basis of their color or race.’” Using the legacy of the civil rights movement as its guide, the Supreme Court held that the Constitution of the United States prohibits racial integration.
After John Roberts had been nominated to lead the high court by President George W. Bush and confirmed by the Senate, liberals and civil rights lawyers held their breath. During his confirmation hearings, Roberts came across as the exemplary jurist: humble, self-effacing, mild mannered. In a comment that would become famous, he compared the role of a Supreme Court justice to that of an umpire: “Umpires don’t make the rules; they apply them … [This] is a limited role. Nobody ever went to a ball game to see the umpire.” But the umpire had come out swinging for the bleachers, attacking long-standing liberal precedents and establishing new conservative ones with a vigor that was well captured by the word “active”—if not “activist.” With Parents Involved, Roberts had stolen home base, turning the most cherished liberal Supreme Court precedent against liberals. With the definitiveness of an umpire declaring a pitch outside the strike zone, he insisted that “the way to stop discrimination on the basis of race is to stop discriminating on the basis of race.”25 Roberts also quoted directly from the historic argument of the plaintiffs’ lawyers in Brown: “‘We have one fundamental contention which we will seek to develop in the course of this argument, and that contention is that no State has any authority under the equal-protection clause of the Fourteenth Amendment to use race as a factor in affording educational opportunities among its citizens.’”26
The segregated schools at issue in Brown classified students by race in order to keep the races separate; the Seattle schools in Parents Involved classified students by race in order to bring the races together. Yet according to the Supreme Court, both uses of race are unconstitutional. Justice Roberts warned against unwarranted “confidence in [the] ability to distinguish good from harmful … uses of racial criteria.”27 Justice Clarence Thomas’s concurring opinion drove the point home: “Can we really be sure that the racial theories that motivated Dred Scott and Plessy [notorious opinions that reinforced slavery and Jim Crow segregation] are a relic of the past or that future theories will be nothing but beneficent and progressive? That is a gamble I am unwilling to take, and it is one the Constitution does not allow.”28
* * *
Is it really so hard to tell the difference between defensible and pernicious uses of race? When Brown was decided, everyone understood that the holding called for an end to a specific practice: segregation. Segregation was a brute and tangible fact of life, and the reason for prohibiting it was obvious: it was inspired by racial contempt and was a crucial medium of such contempt. “Racial classification,” by contrast, is a lawyer’s construction that covers a host of very different policies: de jure segregation and affirmative action, racial profiling by police, and the use of race in the census. It’s not at all obvious that we should categorically condemn “racial classification.” Race discrimination was demeaning and hurtful in the context of a social system that was designed to demean and injure. Race discrimination is objectionable because of the history of slavery, Jim Crow segregation, and widespread invidious prejudice. Civil rights, at their best, are a response to such concrete injustices, not a mechanical application of an abstract principle. In the social context surrounding Brown it’s clear that segregation—not racial classification generally—was the problem: the Court in Brown explicitly noted that “the policy of separating the races is usually interpreted as denoting the inferiority of the Negro group.”29 Brown v. Board of Education, read in its proper historical context, is inconsistent with Justice Roberts’s bloodless interpretation of “equal protection of the laws” as prohibiting segregation and integration alike, equally hostile to the poison and its antidote.
After Brown, federal courts struggled to devise desegregation plans that local communities would accept. They often failed, but eventually some cities like Seattle and Louisville authored desegregation plans of their own. The local democratic process yielded proposals that most people supported, and local administration promised a smoother and more efficient implementation than any court order could. These plans are just what one would hope a mature civil rights tradition would produce: local communities embracing their constitutional obligations, taking desegregation out of the courts and, as Judge Craven suggested over thirty years earlier, handing it off to “administrators … [with] some competence and experience in school administration [who] can likely work out … the problems of pupil and teacher assignment in the best interests of all concerned better than can any … Judge operating within the adversary system.” But tragically, civil rights have often required cumbersome, unworkable, and counterproductive judicially imposed mandates, and today they prohibit the egalitarian policies developed and blessed by the people through the democratic process.
* * *
Civil rights litigation has never been the main engine of social justice in the United States; at most, it has occasionally jump-started a stalled political process or boosted sluggish change in social attitudes. But judges and lawyers have taken credit for the accomplishments of social activists who struggled against racism and of average Americans who struggled against their worst prejudices and nurtured their best instincts. And because civil rights have become synonymous with social justice, the courts, by virtue of their legal authority, have also asserted authority over the cultural meaning of social justice itself. When Justice Roberts insisted that the true meaning of Brown v. Board of Education was to prohibit racial classifications—including those that would advance racial integration—he opined not only on a question of constitutional law but also on the meaning of decades of historical struggle against racism. As a result, men and women who spent much of their lives suffering through and resisting the evils of segregation can be told that in fact they struggled to prohibit integration.
The civil rights movement is heroic and profound because of its specifics: the struggle against Jim Crow, the lunch counter sit-in, the freedom ride, and the public demonstration, all of which required personal sacrifice and entailed ever-present danger. The constitutional guarantee of equal protection of the laws would mean something very different—and something much less profound—but for the struggles and triumphs of the civil rights movement. If the struggle for racial justice relied on rights and the courts for its victories, today rights and the courts rely on the racial justice movement for their prestige and their legitimacy. Yet, with alarming frequency, civil rights and courts now bite the hand that feeds them, undermining racial justice and equality. When civil rights deprive long-oppressed groups of their still young and meager victories, they have gone very badly wrong.
Sex as a Weapon
When Lois Robinson started working as a welder at Jacksonville Shipyards in September 1977, she was one of a tiny vanguard of women working in such a position there—or anywhere. In 1980 she was one of two women in “skilled crafts” in a place where 958 men were employed. Despite the challenges of breaking through the gender line, Robinson was a success: she was promoted twice, attaining the status of first-class welder while at Jacksonville Shipyards.
But being a woman in what her co-workers described as a “man’s world” wasn’t always easy. At Jacksonville Shipyards, as at many predominantly male workplaces, the etiquette of the locker room prevailed. Pornography was the decoration of choice: “pictures of nude and partially nude women appear[ed] throughout the … workplace,” including “two plaques of naked women,” lovingly glued to a wood backing and varnished for long-term durability. As if the monthly centerfold weren’t sufficient, many of Jacksonville Shipyards’ business partners supplied the shop with advertising calendars that featured naked women striking sexually suggestive poses while holding various power tools, valves, gauges, and other equipment. Robinson was treated to daily demonstrations of her co-workers’ refined appreciation of the female nude, such as a photograph of a woman with the words “USDA Choice” stamped on her naked chest and a picture of a woman with a meat spatula pressed against her crotch. Her co-workers’ artistic evaluation of this dime-store erotica was no less subtle than the images themselves, including observations such as “I’d like to get in bed with that” and “I’d like to have some of that.”
One day when Robinson was on the job, a co-worker jovially passed around a photograph of a woman with long blond hair wearing high heels and holding a whip: Robinson had long blond hair and regularly worked with a welding tool known as a whip. Another co-worker extended Robinson this friendly invitation: “Hey, pussycat, come here and give me a whiff.” Another shared his wish that Robinson’s shirt would blow over her head so he could admire her unadorned form, and yet another remarked that for similar reasons, he wished her shirt were tighter. Robinson’s co-workers and supervisors also reserved distinctive endearments just for her, such as “honey,” “dear,” “baby,” “sugar,” and “momma.”
When Robinson told them that she found the pet names, ribald witticisms, and ubiquitous porn offensive, her co-workers responded to her concerns immediately. They put up more pornographic images, making sure to leave them in Robinson’s workstation and on her toolbox. One ambitious co-worker found a photograph of a nude woman with a welding shield, remarking, “Lois would really like this.” Robinson complained to management and the union; both ignored her. Finally, she sued Jacksonville Shipyards for sex discrimination. In 1991, a federal court agreed with Robinson that Jacksonville Shipyards was a discriminatory, hostile environment for women and ordered the company to clean up its act—and its workplace.
Robinson v. Jacksonville Shipyards was a significant victory for feminists, who had agitated for a more aggressive response to sex harassment for years.30 Although the law had acknowledged for over a decade that a hostile environment could be a distinctive form of discrimination, many courts had required plaintiffs to prove tangible injury or show that they had been subject to explicit sexual overtures or targeted intimidation. Some courts opined that the open display of pornography was a form of constitutionally protected free expression and therefore beyond the reach of the law. Jacksonville Shipyards seemed to vindicate the feminist contention that pornography could be, in and of itself, harmful to women.
In this, Jacksonville Shipyards indirectly furthered a more ambitious feminist agenda that targeted pornography wherever it appeared, whether in the workplace or outside it. Feminists had lobbied against the porn industry for years, working to close down peep shows and topless clubs, shutter theaters that specialized in blue movies, and stymie the sale of girlie magazines. They argued that porn led to sexual assault, citing studies that found that sexual predators were disproportionate consumers of pornography. (Skeptics questioned the relationship of cause and effect: maybe sexual predators were more likely to read porn because they were generally obsessed with sex. Some argued that porn might actually provide a release for urges that would lead to rape if left unsatisfied.) They argued that the average smutty movie or Playboy centerfold was on a moral continuum with the “snuff film,” in which an actual sexual assault that climaxed in the murder of the female victim was captured on celluloid (skeptics doubted that any such snuff films existed or had ever existed).
For these antiporn feminists, suppressing pornography was a civil rights issue. The law professor Catharine MacKinnon authored local ordinances that would have virtually banned pornography in Minneapolis; two such ordinances were passed by the city council, but the mayor vetoed both. In Indianapolis, a similar law was enacted with the support of an odd coalition of feminists and religious conservatives. The “antipornography civil rights ordinance” made it illegal to “traffic in … produce, sell, exhibit or distribute pornography.” A federal court, predictably, invalidated the Indianapolis ordinance as a violation of the First Amendment, taking the air out of the antiporn movement (at least in the United States: the Supreme Court of Canada accepted feminist antiporn arguments and upheld the obscenity conviction of a pornography distributor in 1992). Jacksonville Shipyards gave the antiporn movement new momentum: if pornography could not be outlawed outright, at least it could be punished as sex harassment.
Many feminists advanced the idea that sexually suggestive or erotic expression is distinctively harmful for women and should be banned from the workplace. For instance, the law professor and former Democratic Party adviser Susan Estrich insisted that sexual conduct is “more offensive, more debilitating, and more dehumanizing … precisely [because] it is sexual” and argued in favor of “rules which prohibited men and women from sexual relations in the workplace” whether or not both parties wanted the relationship.31
Two years after Lois Robinson won her discrimination lawsuit against Jacksonville Shipyards, Jerold Mackenzie lost his job after nineteen years at the Miller Brewing Company. Mackenzie started down the road to unemployment when he turned on his television on March 18, 1993, and sat down to watch the situation comedy Seinfeld. The sexually suggestive episode shocked Mackenzie, and the next day he took another step closer to a pink slip when he decided to discuss the Seinfeld show with his co-worker Patricia Best. Mackenzie was surprised that the racy episode had made it past the network censors and asked whether Best shared his opinion. Best said she hadn’t seen the show, so Mackenzie filled her in: Jerry Seinfeld has forgotten the name of his latest date. He can remember only that her name rhymes with a part of the female anatomy. Eventually, the woman begins to suspect that Jerry can’t remember her name and demands that he repeat it. Jerry makes several desperate guesses—“Gipple?” “Mulva?” As the woman storms out in disgust, vowing never to see him again, Jerry calls after her: “Dolores!”
Best said she didn’t get the joke. Then Mackenzie took the final unwitting leap into joblessness when he copied a dictionary page with the entry “clitoris” and gave it to Best. Best complained to her supervisor, who suggested she confront Mackenzie directly. She did so and he apologized, but he also remarked that because Best herself frequently used vulgar language at work, he had assumed she wouldn’t be offended. The half apology didn’t satisfy Best, who complained again. This time Best’s supervisor reported the incident to the personnel department. Before Mackenzie had any inkling that his job was in jeopardy, the personnel department had discussed the incident with Miller’s in-house counsel, its CEO, and a private employment law firm. The next day Mackenzie was summoned to a meeting with a personnel manager and two lawyers. After an hour he was fired and shown to the front door of the building.
Mackenzie sued and convinced a jury that he was treated unfairly and deserved a total of over $26 million. But Miller appealed the verdict, arguing that Mackenzie was an employee “at will,” which means that Miller wasn’t legally required to be fair to him: it was legally entitled to fire him for a good reason, a bad reason, or no reason at all. An appeals court reversed the judgment for Mackenzie, and the Wisconsin Supreme Court affirmed the reversal. Mackenzie wound up with nothing.
Mackenzie v. Miller Brewing Company was the tip of an iceberg, visible only because Mackenzie sued his employer rather than quietly accept a severance package. Below the choppy surface of a volatile labor market, the threat of sex harassment litigation inspired thousands of employers to crack down on off-color jokes, racy artwork, and office romances. In the fearful minds of employers, civil rights law no longer simply prohibited sex discrimination; it had become a general civility code that potentially outlawed any expression with sexual or erotic content. One commentator described Mackenzie’s case as an example of “the inconsistency between the anything goes freedom of expression that reigns in the media and the sexually neutral atmosphere that [the] law requires employers to enforce in the workplace … the law’s insistence on sexual neutrality in the workplace has reached an absurd and unworkable limit. Certainly it seems unfair for a man to lose his job for saying in an office in Wisconsin what Jerry Seinfeld can say on national television.”32 This fear was confirmed by a common slippage in nomenclature: what was originally called sex harassment had come to be known, in popular and in legal parlance, as sexual harassment. This small difference was telling: sex harassment suggests that the legal wrong is harassing a woman because of her sex. By contrast, sexual harassment suggests that it’s the sexual content of the harassment that makes it unlawful.
Emphasizing the threat of expensive and embarrassing lawsuits, human resources consultants advised a “zero tolerance” approach to sexual expression and fraternization in the workplace. The prudent employer would prohibit dirty jokes, racy gossip, sexy pictures, any physical contact more intimate than a handshake, and unchaperoned social contact between male and female employees after hours. Office doors were to be kept open when two members of the opposite sex met, and if possible a third party would be present to ensure that no one felt uncomfortable (and to provide a witness to events in case of possible litigation).
If employers didn’t forbid office romances outright, lawyers and consultants recommended that the lovers sign “love contracts,” which would explicitly spell out the terms of the relationship—and clarify what would happen if Cupid’s spell wore off. For example, one employment law firm advised its clients to draft letters based on this template:
Dear [Name of object of affection]:
As we discussed, I know that this may seem silly or unnecessary to you, but I really want you to give serious consideration to the matter as it is very important to me. [Add other material as appropriate.]
It is very important to me that our relationship be on an equal footing and that you be fully comfortable that our relationship is at all times fully voluntary and welcome. I want to assure you that under no circumstances will I allow our relationship or, should it happen, the end of our relationship, to impact on your job or our working relationship. Though I know you have received a copy of [our company’s] sexual harassment policy, I am enclosing a copy [Add specific reference to policy as appropriate] so that you can read and review it again. Once you have done so, I would greatly appreciate your signing this letter below, if you are in agreement with me.
[Add personal closing.]
Very truly yours,
[Name]
I have read this letter and the accompanying sexual harassment policy, and I understand and agree [that] … my relationship with [Name] has been (and is) voluntary, consensual and welcome. I also understand that I am free to end this relationship at any time and, in doing so, it will not adversely impact on my job.
[Signature of object of affection]33
Love contracts should have surprised no one. They arrived, on cue, a few years after colleges and universities began to regulate sexual intimacy between students and professors and, in some cases, between students and other students. The most notorious of the many collegiate sexual regulations was Antioch College’s Sexual Offense Policy. A student group known as the Womyn of Antioch agitated for and drafted the policy, which required explicit, verbal consent to any and all intimate contact. Like any good legal document, the policy was drafted to deal with a dynamic relationship between the parties: new and unanticipated hazards lurked behind every whispered endearment and affectionate coo. The policy hence required renewed consent with “each new level” of sexual activity: explicit, verbal consent to kiss on the lips did not imply acquiescence to use of the tongue; heavy petting did not imply agreement to unbutton garments; fellatio did not—contrary to the almost unanimous advice of sex therapists—suggest a reciprocal desire for cunnilingus. Nor could the universal language of lovers take the place of unambiguous and literal verbiage. “Body movements and non-verbal responses such as moans are not consent,” the policy warned. As students emerged from the protective jurisdiction of campus sexual codes such as Antioch’s, many expected, and some demanded, similar regulation in the work world.
Sex harassment law developed in response to blatant sexism: bullying and intimidation designed to drive women out of the workforce and sexually predatory behavior, such as demands for sexual favors backed by blunt threats of retaliation. The law also addressed variations on these ugly themes: amorous supervisors who hinted at career advantages or impediments they could bestow or impose, sexual taunts used as a way of maintaining male dominance. But as sex harassment evolved into sexual harassment, the target of litigation was no longer just sex discrimination or male chauvinism; it was sexuality generally. Sexual harassment law didn’t just treat sexuality as a means to the prohibited end of sex discrimination; it insisted that the offending jokes and images were demeaning and dehumanizing because they were sexual. This reformulation made any sexually charged encounter, image, or statement a target of legal prohibition, whether it reinforced male privilege and sex segregation or not.
This was a direct reaction against the sexual revolution that took hold of the nation in the Roaring Twenties and reached its apotheosis with the free love of the youth counterculture in the 1960s. Some feminists insisted that sexuality was a dangerous force that had to be kept in check with strict rules and attacked modern sexual freedom as little more than a ruse by unscrupulous men out for an easy lay. This attempt to repress sexuality in general was one of the most successful feminist agendas because it, unlike much of modern feminism, was extremely familiar. The religious right tilted at the windmill of the modern sexual revolution for most of the twentieth century, but it was radical feminists, armed with the lance of civil rights, who did real damage to modern sexual licentiousness. If conservative sexual repression tried to toss tepid water on the sexuality of other people, antisex feminists volunteered to take the cold shower themselves as they hosed down their peers.
Feminists could not maintain a monopoly on—or control over—the idea that sexual expression was a violation of civil rights. Prudish men used sex harassment law to attack women who made crude sexual jokes in predominantly female workplaces, proving that turnabout is, if not fair play, then inevitable. Effeminate men sued for sex harassment when they were made the butt of locker room jokes. Straight men who were propositioned by gay supervisors or overheard the raunchy sexual conversations of their gay co-workers made a federal case of their injured—and quite likely homophobic—sensibilities. Indeed, a legal regime that punished unwelcome sexual expression regardless of its objective effects predictably fell most heavily on unconventional forms of sexual expression, such as those typically exhibited by minority groups, gay men, and lesbians. As Harvard law professor Janet Halley put it, “Sexual harassment law has become … sexuality harassment.”34 None of this bothered committed antisex feminists; in fact, some argued that these claims were really claims for women’s rights because anyone who suffered unwelcome sexual advances or expression was “feminized” by the unwelcome encounter.
But some claims of sexual victimization were in direct conflict with feminism.
* * *
Part of Yale College’s extraordinary prestige owes to its unique extracurricular traditions. Yale’s Skull and Bones society, an exclusive club for undergraduates, is shrouded in intrigue and mystery. Since the 1970s its membership has been a secret, although very high-ranking military officers, heads of the CIA, CEOs of Fortune 100 firms, and American presidents are said to be members. Its members are sworn to secrecy and are reportedly instructed to make sure no one sees them enter or leave the club’s well-known windowless clubhouse, or “tomb”—in its many years of existence few students have seen anyone do so (giving rise to further rumors that the building is served by subterranean entrances, or that it is an unused decoy, left vacant to fool the uninitiated masses). Skull and Bones does not accept applications: members are “tapped” in a selection process as secretive as the society itself.
But Yale does not leave the less conspiratorial or less well connected adrift in the chilly New Haven social climate. Every Yale undergraduate is assigned a residential college and resides there for his freshman and sophomore years. The residential colleges are more than dormitories: each college has a distinctive collective personality, and students retain a relationship with their college throughout their undergraduate years and beyond. Each of the twelve colleges has its own dean, lectures, tea parties—even its own endowment (some of which are larger than those of entire universities). The residential colleges are a big part of the distinctive Yale experience, a proudly and jealously guarded tradition. When the university admitted its first female undergraduates in 1969, it also integrated the residential colleges, assigning the handful of female students randomly, just as it had long done with the male students. Traditionalists worried that Yale would suffer a decline in the esprit de corps that made it the envy of the Ivy League. It didn’t, but the change did sow the seeds of a threat to Yale’s residential colleges, which came to fruition almost thirty years later.
Because of the residential colleges’ centrality to the carefully cultivated Yale experience, affiliation with one is not optional. But in 1996 five admitted students refused to live in their assigned colleges. Their complaint: Yale’s coed residential colleges were a Sodom and Gomorrah of sexual temptation. According to the Yale Five, as they came to be known, the erotically charged and permissive atmosphere of coed living offended their religious sensibilities and was therefore a form of antireligious discrimination. Conservative commentators quickly championed the cause of the Yale Five. The noted antifeminist Phyllis Schlafly praised the Yale Five’s “courage to challenge the rule that requires students to live in coed dormitories, where many engage in casual sex without shame and most use coed showers and toilets.”35 In fact, Yale’s residential colleges had not only single-sex bathrooms but single-sex floors available for students who wanted them. One suspects that the real cause of the Yale Five’s distress was not a university policy forcing them into intimate contact with the opposite sex, but rather the reaction of their classmates: in the heady atmosphere of sexual opportunity that saturated the colleges, their religious commitments to modesty and chastity would provoke ridicule and social isolation—or, worse, those commitments would yield to the daily onslaught of carnal temptation. The Yale Five made a federal case of that universal tribulation of adolescence: peer pressure.
When Yale admitted its first female students, sex integration was a civil rights imperative. Now it was a civil rights violation. Religious conservatives turned the idea of a sexually hostile environment, first advanced by feminists, against a fundamental commitment of feminism. But religious conservatives hadn’t distorted the feminist antisexuality agenda; they had simply taken it to its logical conclusion. If erotically charged situations were dangerous because of their sexual electricity, then wasn’t the sexual polarity—and mutual attraction—of male and female the underlying cause of the hazard? If so, then the solution was obvious: keep the sexes safely insulated. If this happened to coincide with a long-standing and independent conservative position on appropriate gender roles, well, that was just a happy coincidence.
The Yale Five were just the beginning. Across the nation, evangelical Christians, fundamentalist Muslims, and ultra-Orthodox Jews sued their employers to avoid working with women in jobs that required shared sleeping arrangements or changing rooms, such as fire stations, emergency room quarters, and trucking dormitories. Such claims—often inspired by ambiguous religious admonishments to avoid “temptation”—effectively discouraged employers from hiring women in jobs where gains in sex equality were recent and hard-won.
Of course, such claims were only a small part of a larger agenda of sexual repression—an agenda that consistently disadvantaged women. For instance, religious and political leaders have suggested that pharmacists with religious objections should have a right to refuse to fill prescriptions for contraception. Devout police officers have argued that religious liberty entitles them to refuse to protect abortion clinics. And psychological counselors have argued that offering relationship counseling to gay and lesbian couples offends their religious convictions, turning the right to religious freedom into a right to discriminate, in direct conflict with the laws of many states that guarantee equal opportunity for gay men and lesbians. Title VII has always required employers to accommodate the religious practices of their employees, but typically this has entailed things like reorganizing work schedules to allow religious employees to observe a holy day or prayer break. These new claims were more aggressive, expanding on the idea—borrowed from sex harassment law—that offended sensibilities in and of themselves could constitute a civil rights violation.
When sex harassment law morphed into sexual harassment law, with the implication that the enemy was sexuality in general and not just male chauvinism, employers responded by implementing rules that prohibited any and all erotic expression or activity, real or perceived. As the Yale Law School professor Vicki Schultz has observed: “Sex harassment policies now provide an added incentive and an increased legitimacy for management to control and discipline relatively harmless sexual behavior without even inquiring into whether that behavior undermines gender equality on the job.”36
Because most such activity involves male-female relations, these rules scrutinized and regulated sex-integrated interactions much more severely than same-sex interactions. In workplaces and professions where men held most of the positions of influence and authority—in other words, in most workplaces and most professions—this reinforced the existing inclination of many powerful men to prefer other men as their protégés and co-workers. Professional women reported being unable to have private, job-related conversations with their male superiors because of “open door” policies that required male-female interactions to be conducted in public in order to avoid the appearance of sexual impropriety. Because male supervisors worried about inadvertently offending their female employees—and thereby violating increasingly strict workplace rules against sexual misconduct—conversations between male supervisors and female subordinates were often stilted and tense, and comfortable working relationships took longer to develop, if they developed at all. Cautious employers hesitated to send a male and a female on out-of-town business travel together, for fear of what might happen after a long night in a strange city. Because the more senior members of the potential teams were most often men, it was usually women who lost out on valuable work opportunities. Strictly speaking, all of this was sex discrimination, and therefore unlawful. But proving that a mentoring relationship was cool and distant or that a plum assignment went to someone else because of sex is hard. Proving that a boss made a racy joke or that an out-of-town trip mixed business and pleasure is relatively easy. Employers correctly assumed that the risk of sexual harassment litigation was greater than the risk of being sued for subtle sex discrimination.
When the law that protected women from harassment because of their sex morphed into the idea that anything sexual must be harassment, a legal theory that was designed to help women get ahead in the workforce wound up restricting their chances to interact with men as equals. Civil rights law began with a modern, enlightened demand for respect and fairness, but wound up reinforcing the old, chauvinistic idea that women are frail hothouse flowers that require delicate handling and protective isolation from the hardscrabble working world. This perverse metamorphosis of sex harassment law into sexual harassment law—butterfly into worm—is a case of rights gone wrong.
* * *
There’s an argument to be made for most of the decisions I’ve discussed and criticized in this chapter. Mandatory busing was the only way to achieve meaningful integration, and perhaps the objective costs and widespread resentment that came with it could have been managed. Race-conscious public policies, like those invalidated in Parents Involved, strike many people as unfair, even when their effects are mild, and sometimes they do more harm than good to race relations, reinforcing racial divisions. Sexual expression is offensive to many people of both sexes, and it is often used to belittle and threaten women: maybe, on balance, a virtual ban on racy talk and romantic overtures on the job makes sense.
But the advocates of these changes didn’t have to make the arguments in these practical terms, which would have required detailed study of the social context and likely effects of the policies in question. Instead, they bypassed the discussion any sensible person would want to have before making a sweeping policy change and argued their cases in terms of abstract rights and rigid rules: public schools must be integrated as quickly as possible, regardless of the costs to everyone involved or the possibility of mutually preferable compromises; the government may not make “racial classifications,” regardless of the benefits in terms of social justice; romantic overtures and crude jokes are presumptively discriminatory, even if both parties directly involved sought out the interaction.
A more pragmatic approach to these problems might have been more effective and less costly. School desegregation could have employed a range of tools and remedies: integration when feasible, supplemented with better funding for the best predominantly black schools and improved job opportunities for black teachers and administrators in integrated schools. Instead of a sharp distinction between districts that once practiced formal segregation and those that did not, which unreasonably put inner-city schools under strict desegregation orders while leaving nearby suburbs free from any obligation to integrate, administrators could have developed a range of remedies that took local demographics and local preferences into account. Such a practical approach to civil rights would be the antithesis of the simplistic idea that the law forbids “racial classifications” regardless of their effects or the motivations underlying them. And had schools gradually integrated on terms more people would accept, the backlash against integration that inspired reverse-discrimination lawsuits might have been less severe. Likewise, a civil rights approach that focused on a practical goal of improving women’s access to jobs would not have been so easily captured by an antisex moralism. Any viable approach to women’s equality must confront and punish sex harassment like that in Jacksonville Shipyards, but more ambiguous cases involving hurt feelings, offended sensibilities, and consensual relationships are probably better left to the normal forces of the market, informal social pressure, compromise, common decency, and common sense.
At their best, civil rights guarantee both individual and collective justice, ensuring that every citizen gets a fair shake and reversing the effects of widespread bigotry and entrenched social hierarchies. When asserted and interpreted with sensitivity to social context and guided by responsible policy goals, rights improve democratic politics and markets by correcting unexamined biases and unjustified prejudices. But rights go wrong when abstractions take precedence over real-life experience and conceptualism wins out over common sense.
Like cloistered medieval theologians who derive their worldview from the intangibles of ancient scripture rather than from empirical observation, judges and lawyers at their worst allow legalistic conceptualism to blind them to both practical constraints and logical consequences. As a result, rights to racial justice now prohibit some of the most promising means of achieving it, and rights to women’s equality reinforce chauvinistic condescension and women’s isolation.