Chapter 2

The Growth of the Law

The Liability Explosion:
Workers’ Compensation

One of the most striking developments in the twentieth century was the so-called liability explosion: the vast increase in liability in tort, mostly for personal injuries. The nineteenth century—particularly the early part—had built up the law of torts, almost from nothing, as we have seen; courts (and it was mainly courts) created a huge, complicated structure, a system with many rooms, chambers, corridors, and an ethos of limited liability. The twentieth century worked just as busily to tear the whole thing down. The process had started much earlier, as we noted. One of the first doctrines to go was the fellow-servant rule. By 1900, it no longer worked very well. It was still heartless; it still cut off most lawsuits and claims by workers; but it no longer had the virtue of simplicity or efficiency. Lawyers’ fees, insurance, litigation costs: the system was costly and slow, and the middlemen took much or most of the money; it was like a body infested with tapeworms. Congress swept the rule away for railroad workers, in the Federal Employers’ Liability Act, in the first decade of the century.1 Many states also abolished the rule, or limited it severely. Between 1900 and 1910, there was vigorous debate and discussion about alternatives—notably, some sort of compensation scheme. Industry on the whole was opposed to the idea; but business resistance gradually tapered off. Perhaps a compensation scheme would make sense, even for employers. It might buy a measure of industrial peace. England and Germany already had compensation schemes, which seemed to work. New York passed a compensation statute in 1910. It was declared unconstitutional.2 Next came Wisconsin (1911), this time successfully. Other states now passed their own versions, trying to avoid the pitfalls of the New York law. The Supreme Court later held the most common types of plan constitutional. By 1920, almost all states had adopted a workers’ compensation law, of one sort or another. The last holdout, Mississippi, joined the chorus in 1948.

Workers’ compensation did not come out of nowhere.3 Besides the foreign models, there were also domestic models: schemes of insurance, including cooperative insurance plans for workmen.4 In any event, workers’ compensation, when it emerged, was something of a compromise system. Each side gave a little, got a little. The worker got compensation. If you were injured on the job, you collected; you no longer had to show that somebody else was at fault. Nor was contributory negligence an issue: even a foolish, careless worker had a right to claim compensation. In Karlslyst v. Industrial Commission (1943),5 a Wisconsin case, the worker, a truck driver’s assistant, urinated off the side of a moving truck, fell off, and was injured. This was stupid behavior, to say the least. His case would have been doubly hopeless in the nineteenth century: the fellow-servant rule, plus contributory negligence, would have robbed him of any chance at a claim. But this was now, not then, and the worker recovered: He was injured on the job; his own carelessness was completely irrelevant.

Compensation laws did away with the fellow-servant rule and also with the doctrine of assumption of risk. Unless the worker was drunk, or deliberately trying to hurt herself, she was entitled to her compensation. On the other hand, the statute set up a formula that would determine what the employer had to pay: medical expenses and a fixed percentage of the wages she lost; but with a definite ceiling. In Wisconsin, for example, under the 1911 law, a totally disabled worker could collect, as long as her disability lasted, 65 percent of her average weekly wage, up to four times her average annual earnings. But this was as high as she could go.6 The statutes also commonly set a definite price for “permanent partial disability”; and also fixed a price for damaged or missing body parts—an arm, a leg, a finger, a foot, an eye, or an ear. The employee could get no less, but also no more. No more jury trials. No more chance at the litigation lottery. No big recoveries. No money for pain and suffering. For all this the employers were grateful. Both sides, in other words, won and lost.

Only the middlemen—this was the theory—would be the losers. Workers’ compensation would do away with the plague of accident cases. It would eliminate the need for lawyers, claims adjusters, insurance brokers, and so on. In the main, it did exactly what it was supposed to do. Most work accidents—the overwhelming majority, in fact—were, from the start, handled smoothly, and without great fuss, by boards and commissions. But the new statutes did generate enough case law to surprise, and disappoint, some people who had championed the system. The case law—and later, the statutes—also began to push the system into new areas, in directions that would have amazed reformers in the early part of the century. What fueled the movement for workers’ compensation was the classic industrial accident: the thousands of workers mangled, maimed, killed, crippled, destroyed, in mines, factories, railroad yards—the dead and wounded victims of the industrial revolution. Indeed, one early statute, in Oklahoma, was specifically limited to “hazardous occupations,” and even listed what these were: factories, blast furnaces, and so on. There was to be no coverage for “clerical workers.”7 Nor did the early statutes, typically, cover occupational diseases—if the job poisoned the worker, or just slowly wore him out, the worker could not recover. These harms were not “injuries” and they did not come from “accidents.” The “radium girls,” who painted dials on watches that would glow in the dark, and who began to die of cancer, in the 1920s, collected almost nothing.8

But over time, the system evolved toward more and more liability—case law and statutes alike. The statutes, as originally worded, covered injuries “arising out of” (caused by) and “in the course of” (during) employment. These words, or their equivalent, appeared in most of the statutes. The courts gradually stretched the meaning of these words as if they were made out of rubber. If a secretary strains her neck at the office, turning to talk to her girlfriend, she collects compensation. Dozens of cases dealt with accidents at company parties or picnics. Often these were held to be injuries “in the course of” employment. A crazed killer storms into a restaurant, firing at random; bullets hit a bus boy. He collects workers’ compensation.9 Some states began to cover occupational diseases; and this coverage expanded over the years; indeed, in New Jersey, the plight of the “radium girls” helped spur reform of this sort. The courts even began upholding or mandating awards in cases where workers had heart attacks on the job. Sometimes they required evidence of an unusual strain; but as time went on, this doctrine became more and more tenuous, and more and more workers who had heart attacks at work were able to recover.10

How rational is a system where a man who has a heart attack on Sunday, watching football on television, gets nothing; while a man who has a heart attack on Monday, sitting at his desk reading a report, gets coverage under workers’ compensation? In a country without a system of cradle-to-grave security, and in particular without national health insurance, piecemeal bits of what would be a full welfare system start sticking like burrs to existing institutions. In any event, the compensation explosion marched on through the last half of the century. Courts began upholding awards for psychological injury—people who said the job had stressed them to the point of illness, or that the trauma of getting fired, or of losing a promotion, drove them into deep depression, or triggered mental illness, or the like.11 Awards of this type, in the 1980s and onward, drove up the costs of the system. These added costs alarmed and enraged the business community. Business brought its lobbying muscle to bear on the issue. A number of states cut back sharply. In California, for example, from 1989 on, no worker could recover compensation for “psychiatric injury” brought about by a “lawful, nondiscriminatory, good faith personnel action.”12 No longer was it possible to claim that a layoff had driven the worker crazy.

More Explosions: Tort Law

These developments in workers’ compensation ran parallel to, and were dwarfed by, changes in the tort law system in general. These changes did not happen overnight. Early in the century, the tort system was still stingy and withholding; procedural and legal obstacles kept most victims of accident out of court. When a Seaboard Air Line train was wrecked in 1911, carrying 912 black passengers on an excursion, ten blacks were killed and eighty-six injured. Liability was obvious; but the company sent out agents, and settled for tiny amounts—from $1 to $1000.13 These were blacks, in the South, and they settled out of court. But Northern whites were not much better off. The famous Triangle Shirtwaist Disaster, in the same year, in which scores of young women died—needlessly—created a furor, influenced the course of legislation, but resulted in almost nothing at all for the victims. The wrongful death claims settled for about $75 each.14 Here the fellow-servant rule was one of the villains. In general, the culture of total justice did not develop overnight. What produced it was a combination of factors—the development of insurance, the timid but real beginnings of a genuine welfare state; factors which led people to want, and expect, some kind of system of compensation, for disasters that were not their own personal fault.15

The automobile was one of the great innovations of the twentieth century. At the beginning of the century, automobiles were, in essence, curiosities, expensive gadgets. By the 1920s, as we said, this was fast on the way to becoming a society of people with cars. There were forty million cars registered by 1950; by 1990, 123 million cars. The automobile remade America. Its social (and legal) implications are endless. The automobile helped create suburbia. It revolutionized tourism. It led to a demand for roads, more roads, highways, more highways; a federal Road Aid Act was passed in 1916, and another in 1921; and the interstate highway system was created after the Second World War. The automobile transformed the cities. First, it helped them, by getting rid of the horse. Horses at one time dumped two and a half million pounds of manure and sixty thousand gallons of urine, day in and day out, in New York City alone. Cities had to dispose of thousands and thousands of dead horses a year—not to mention the stench, the flies, the dirty straw, and the thousands of stables.16 By the end of the century, the tables were turned: now the automobile was choking the city; and creating its own forms of pollution.

The automobile also created a whole new field of traffic law. There were rules of the road before the automobile, but they were of minor legal importance. By the middle of the century, traffic law touched everybody’s life: drivers’ licenses, stoplights, parking regulations, speed limits, auto insurance—no branch of the law was more familiar (or more commonly avoided and violated). Speed limits and drivers’ licenses came in early in the century. Traffic violations run into the millions every year; they are the plankton, the krill, of criminal justice. Most of these violations are trivial. But drunk driving is taken seriously—indeed, since the 1970s, very seriously, helped along by organizations such as Mothers Against Drunk Driving (MADD). And the automobile accident replaced the train accident as the staple of personal injury law, the bread and butter of the tort lawyer. Here was a machine, tons of metal and rubber, racing along streets and roads, in the millions, and at high speeds. To be sure, the overwhelming majority of accident cases, the fender-benders, the smashed rear ends, and the whiplash injuries, were settled out of court, “adjusted,” compromised, or simply handled with a phone call and a check.17 The real parties in interest were the insurance companies. But a residue of serious and contested cases remained for the personal injury bar to handle. These lawyers, and tort lawyers in general, worked (as we have seen) on the basis of the contingent fee—they earned nothing if they lost the case; but if they won, they took a healthy cut (a quarter, a third, perhaps even a half). The elite bar hated the contingent fee, and looked down on “p.i.” lawyers. But the practice flourished.

Even more dramatic changes took place in aspects of tort law that hardly existed before 1900. One of these was products liability. In 1916, in a decision clearly written for posterity, Benjamin Cardozo, of the New York Court of Appeals, seized an opportunity, and helped change the direction of liability law. The case was MacPherson v. Buick Motor Co.18 MacPherson bought a Buick from a dealer. The wheel was “made of defective wood.” Its spokes “crumbled into fragments.” MacPherson was seriously injured. He sued the car company. There was a long-standing technical barrier to MacPherson’s claim. No “privity,” no direct relationship, connected plaintiff and defendant. MacPherson never dealt directly with the Buick Motor Company; he dealt only with an auto dealer. In the past, that would have made his lawsuit dubious if not impossible. But Cardozo cut through this technicality. When a manufacturer makes a product that is or can be dangerous, and knows that the product will end up, not with a dealer or a middleman, but with a consumer, the manufacturer must suffer the consequences. And the injured party can sue him directly.

This was the core of the decision, cleverly disguised in cautious and craftsman-like language. Within a generation or so, other states had followed Cardozo’s lead; and products liability law had taken a giant step forward (or backward, if you will). “Followed” may be the wrong word. Other courts did cite Cardozo, and refer to his famous decision. His written opinion was seductive, persuasive. But the doctrine spread, not because of Cardozo’s reputation and skill, but because it struck judges as intuitively correct. This was an age of mass production, an age of advertising, of brand names. People associated products with their makers—with Buick, not with an auto dealer. If a can of tainted soup poisoned someone, did it make sense to hold the grocer liable, and only the grocer? Why not sue the manufacturer, the company that made the soup, put it in a can, and sealed the can airtight? It was easy for people in the twentieth century—including judges—to accept the basic idea of products liability. Companies that make a product must bear the responsibility if the product hurts the ultimate consumer. In later times, courts have carried products liability further than Cardozo ever dreamt of. The original standard was the usual one: negligence. But the courts have come closer and closer to a kind of absolute liability. There is a tendency to make the company pay for what a defective product does, whether or not the plaintiff can prove there was negligence in its manufacture. In a few dramatic cases, recoveries have soared into millions of dollars. Companies that mass-produce goods obviously cannot guarantee absolute safety. They cannot guarantee that one widget out of millions made won’t be defective and cause harm. The court put it this way, in a case involving a glass jar of peanuts that shattered: “a seller…is liable for defects in his product even if those defects were introduced, without the slightest fault of his own.”19

The liability explosion was not confined to manufactured products. Particularly in the last half of the twentieth century, more people began to sue their doctors, their lawyers, and their accountants; they sued cities, hospitals, businesses, and labor unions; occasionally, they even sued ministers and priests. Old doctrines that stood in the way were swept ruthlessly aside. Suing a (nonprofit) hospital was at one time impossible: Charities were not liable in tort. But in the second half of the twentieth century state after state abandoned charitable immunity. By 1964, only nine states were still clinging to the doctrine.20 Later, these nine also gave up the ghost. Now nothing stands in the way of lawsuits against hospitals, universities, or other nonprofits.

Medical malpractice—suing your doctor for careless treatment—in a way broke no new ground, doctrinally speaking. A negligent doctor, like anybody else who was negligent, had to take the consequences. But dragging the friendly family doctor into court had been uncommon and somewhat disreputable; moreover, doctors (at least so it was said) were reluctant to testify against each other. Their instinct was to stick together against the lay public. Malpractice cases were rare in the first part of the century. But in the late twentieth century, medicine had become much more impersonal (and more high-tech). Malpractice suits also became more common. In a study of New York City cases (1910), only a little more than 1.1 percent of the tort cases were malpractice cases; in San Francisco, between 1959 and 1980, 7 percent of the civil jury trials came under this heading.21 In one regard, too, the standard did shift: it was also malpractice if a doctor failed to tell his patient about risks and dangers in medicines or procedures. The doctor had to get what came to be called “informed consent.” If not, the doctor might be liable if something went wrong—regardless of fault. This doctrine appeared in the late 1950s, and later spread and became more general. The doctrine of “informed consent” reflected two social norms that were dominant in the late twentieth century: a kind of free-wheeling individualism, which stressed the right to choose; and, a related norm, a suspicion of experts and elites. Contrary to what most people believed, juries in malpractice cases were not pushovers: most plaintiffs lost their cases. Some plaintiffs won only small recoveries. A tiny minority collected huge amounts. But this was enough, apparently, to send insurance premiums through the roof; and to frighten and enrage thousands of doctors.

The liability explosion is real and powerful. But it has not gone unchallenged. Money for plaintiffs and for their lawyers did not grow on trees. The money came from insurance companies and from the coffers of corporations. There was backlash and counterrevolution in the 1980s. Doctors threatened to stop delivering babies. Companies claimed plaintiffs were driving them into bankruptcy. Urban legends spread like wildfire: the burglar who sued a landlord for defective premises; the foolish old woman who won millions because she spilled hot coffee. The Republicans denounced the hungry, parasitic army of lawyers. All this had some results. Some states cut back: on recoveries for pain and suffering, on punitive damages. But the core of the system remained solid.

One particular type of tort case, statistically insignificant but socially and economically important, was the so-called mass toxic tort. The paradigm case is asbestos. Clarence Borel, dying of lung cancer at the end of the 1960s, had been an “industrial insulation worker.” He sued the asbestos companies; he blamed them for the illness that was killing him. By the time he won his case (1973), he was already dead.22 But there were armies of the living that followed him. By the mid-1980s, there were 30,000 claims against asbestos companies. The number rose to over 100,000, and the cases drove asbestos company after asbestos company into bankruptcy. There were other mega-cases, huge affairs with hundreds or thousands of plaintiffs: about the Dalkon Shield, a birth control device; about Agent Orange, a defoliant used in Vietnam; about diethylstilbestrol (DES), a drug used to prevent miscarriages. These cases lumbered through the courts for years. At the end of the century, cancer victims and states were suing tobacco companies, for sums that would dwarf the gross national product of most small countries. On the horizon were mass lawsuits against gun companies; and even against companies that sold junk food and made people fat.

The mass toxic tort cases accentuate a peculiarity of the American tort system. No other common law country—and probably no other country at all—makes such heavy use of the tort system, such heavy use of private lawsuits. The big awards—rare though they are—are unheard of in other countries. Part of this is due to the fragmentary nature of American government; and to the deep streak of stubborn resistance to government. In other countries, the welfare system carries more of the burden. The point we made about workers’ compensation applies to the whole tort system. People are much less likely to sue a doctor in a country with a national health system. America regulates business, sometimes quite effectively; but the country still relies on tort cases to do work and carry out policies that have a more central, comprehensive solution in other countries. The tort system creaks and lumbers under the burden; it is a costly, inefficient way to do the heavy lifting.

The faults of the tort system have led, as we saw, to a major backlash. The pathologies of litigation are a campaign issue; and the trial lawyers are, in some circles, convenient scapegoats. Nobody could really argue that the tort system is perfect or even halfway to perfect. But it is, perhaps, better than nothing.

The Constitution, Rights, and Civil
Liberties in the Twentieth Century

The rush of power to the center meant not only that the president and the executive gained power; so too did the federal courts. The Supreme Court had always been important, always been in the eye of the storm. But it was even more so in the twentieth century. The twentieth century, from start to finish, was a golden age for constitutional law.

Just as state governments gained in absolute power, even as they lost relative to the federal power, so too did state courts. And state constitutional law. This is, to be sure, a rather obscure subject. The state high courts have the final word on the meaning of their own constitutions; and they decide many important, even vital, cases. The public largely ignores them. In one survey, in 1991, only about half the population even knew they had a state constitution.23 The state constitutions, as in the past, were much more brittle than the federal constitution. They were constantly tinkered with. Actual constitution-making—adopting whole constitutions—slowed to a crawl in the twentieth century. Only twelve states adopted new constitutions in the twentieth century. (Five new states—Oklahoma, Arizona, New Mexico, Hawaii, and Alaska—adopted their first constitutions.) The amending process, however, ran wild. The federal constitution is hard to amend, and is rarely amended (the failure, in 2004, of a proposed amendment to ban gay marriage is only the latest example). But this reticence does not apply to the states at all. In New York, between 1895 and 1937, there were 76 amendments to the constitution; between 1938 and 1967, there were 106; between 1968 and 1995, 46.24 In some states (Georgia, for example) there were literally hundreds of amendments—654, by one count, by the end of the 1960s. The result was an incredible amount of constitutional bloat. The Louisiana Constitution contains 254,000 words—it rivals War and Peace, though it is nowhere near as readable. In short, there was nothing fixed and sacred about the state constitutions. Nonetheless, these were significant documents—on most issues, they were the highest law of the state.

The first part of the century was a low point, in many ways, for race relations in the United States. It was a period extremely conscious of race—and not only with regard to what we would now consider racial minorities. There was a sense of crisis among old-line Americans—white Protestants living on farms or in small towns. The poem on the Statue of Liberty talked about welcoming the tired, the poor, the huddled masses; but most old-line Americans wanted nothing to do with huddled masses. Immigration control had begun, as we saw, with laws directed against the Chinese. Now agitation for limits on who came in, and how many, resulted in a series of restrictive laws. Paupers, contract laborers, polygamists, people who advocated “the overthrow by force and violence of the Government,” people with any “loathsome…disease”: all these were excluded. A kind of climax was reached in the immigration law of 1924. This statute limited the sheer numbers of immigrants, and strongly favored people from the British Isles and northern Europe. It did this through a system of quotas. The results were dramatic. About 17,000 Greeks and more than 150,000 Italians had streamed into the country each year; under the 1924 law, Greece was allowed 307 immigrants, Italy a little under 6,000.25

In the South, where most African Americans lived, the early part of the century was the high noon of white supremacy. Blacks had no political power. They had no vote. True, the Constitution supposedly guaranteed the right to vote. The Thirteenth Amendment had abolished slavery; and the Fifteenth Amendment provided that the right to vote was not to be abridged on account of race or color. But the Constitution was, insofar as black voting was concerned, nothing but a piece of paper on display in a museum. Southern whites did not want blacks to vote; and they used every trick in the book to keep black people away from the polls. We have already seen how this was done. Anybody who wanted to vote in Mississippi or South Carolina had to show that they could read and interpret the state constitution. No blacks ever seemed able to pass this test. Some state constitutions embodied the famous “grandfather” clause, as we have seen. In the new state of Oklahoma, for example, prospective voters were supposed to demonstrate their knowledge of the state constitution; but a voter was excused from this awkward test if he was a “lineal descendant” of someone entitled to vote in 1866, or of some foreigner. This covered just about everyone who happened to be white; and just about nobody who was not. The Supreme Court struck down the “grandfather clause” in 1915;26 but this made little or no difference. There were other ways to stop blacks from voting; and the South used all of them to good effect. In Alabama, in 1906, 85 percent of the adult white males of the state were registered to vote—and 2 percent of the adult black males. By 1910, effectively, blacks had been totally shut out of the voting process.27 Southern Democrats had “executed a series of offensives” with the aim of “the elimination of black voting and the emasculation of their political opponents.” For decades to come, the South was a one-party and a one-race region, politically speaking.28

No blacks in the South held office. No office-holder had any need to show sympathy or understanding for the needs and wants of blacks. And since there were no black judges, and no blacks on juries, the whole criminal justice system—the whole weight of state power—could come crashing down on the helpless black population. The criminal justice system was grossly unfair to blacks. When a black man or woman was accused, and the accusers were white, the black had little chance of justice in the white man’s court.

Yet, for much of the white population, criminal justice was not unfair enough: it was too slow and uncertain. Lynch mobs made sure that the message of white supremacy rang through loud and clear. A black who dared to transgress the southern code—or was accused of it—risked swift, brutal death. This savagery had begun its reign of terror in the nineteenth century; and it continued, unabated, in the twentieth. Luther Holbert was seized in Doddsville, Mississippi, in 1904, accused of killing his employer. As a thousand people watched, Holbert and his wife were tied to trees; their fingers were chopped off one at a time; their ears were cut off; they were tortured with corkscrews in their flesh, beaten, and then burned to death. The wife, at least, was completely innocent of this or any crime.29

Some Southerners, to be sure, were appalled by lynching; and organizations of blacks protested and lobbied to get federal legislation. Southern members of Congress blocked any movement in this direction. The federal government showed little or no interest in civil rights, and indeed, Woodrow Wilson, a Southerner by birth, was only too eager to promote segregation in Washington, D.C. Almost in desperation, black America turned to the federal courts. The National Association for the Advancement of Colored People (NAACP) was founded in 1909. Almost from the start, the NAACP used litigation as one of its weapons of choice. They had, after all, nowhere else to turn. This strategy soon began to show some results. In 1915, as we noted, the Supreme Court held the “grandfather clause” unconstitutional. In Buchanan v. Warley,30 in 1917, the Supreme Court struck down a Louisville segregation ordinance. Louisville had enacted an ordinance, ostensibly “to prevent conflict and ill-feeling between the white and colored races,” by making segregation the norm. If a block had a white majority, no black family could move onto the street; and no white could move into a residential block where most of the families were black. But this case, like the case on the grandfather clause, was a victory mostly at the symbolic level. The cities remained rigidly segregated. There were black neighborhoods and white neighborhoods, and very few areas where the races ever mixed.

The cases did suggest, however, that litigation had at least some potential. The Supreme Court, and perhaps the federal courts generally, showed a willingness at least to listen to the claims of black citizens. This seemed to be true of no other institution, no other branch of government. The constitutional war on racism continued. The results were slow and incremental. In a series of cases after the Second World War, the Supreme Court declared this or that situation or practice (segregated law schools, for example) unconstitutional. Still, the Court shied away from the broader issue: whether segregation, under the fig leaf of “separate but equal,” had any warrant in law and morality at all. The NAACP pushed and pulled. The Court was a reluctant bridegroom. The plaintiffs won most of their cases, but on narrow grounds. As early as 1938, the Supreme Court ordered the University of Missouri to admit Lloyd Gaines, an African American, to its law school.31 Gaines never attended—in fact he disappeared, somewhat mysteriously; and Missouri hastily created a law school for blacks.32 Would this technique—quickly providing some sort of school for black students—satisfy the Supreme Court? The answer came in Sweatt v. Painter (1950).33 Heman Sweatt, a mail carrier in Houston, Texas, had ambitions of becoming a lawyer. The University of Texas Law School was open only to whites. Texas, to avoid integration, set up a new law school for blacks. But the Supreme Court would have none of this. The University of Texas was a powerful, unique institution, famous, rich in tradition; a feeble new school could not be in any sense its “equal.” In Oklahoma, George W. McLaurin, an African American, wanted to earn a doctorate in education at the University of Oklahoma, in Norman. The university said no; but a federal court ordered him admitted. Once in, he was treated as an outcast: he ate at a separate table in the cafeteria, and studied at a segregated desk in the library. The Supreme Court ordered the school to give him the same “treatment…as students of other races.”34

These decisions were unanimous—but nonetheless, quite cautious. Caution was, in a way, understandable. The Court had no power to force states to follow through. Every step of the way was bitterly contested. In general, the house of white supremacy stood strong and fast. The United States fought racist Germany with an army, navy, and air force that was rigidly segregated by race. It fought Japan in the same way; and, after an outburst of hysteria on the West Coast, fueled by greed and paranoia as well, the Japanese of the Western states were shipped to dismal internment camps in the blistering deserts of the interior. The Supreme Court supinely upheld this action of the government, in Korematsu v. U.S.35 The government defended its actions vigorously: we were at war with Japan, and “properly constituted military authorities” had raised the spectre of a Japanese invasion of the West Coast; the measures were tough but necessary wartime medicine. A majority of the court went along with these arguments. More than forty years later, in 1988, Congress officially apologized, and even awarded some compensation to the men and women who survived the camps.

Korematsu was a kind of low point. The times were changing, however. American apartheid was an embarrassment in the postwar period. It handed the Soviet Union, during the cold war, a priceless weapon of propaganda.36 The colonial empires of Africa were dissolving; and black sovereign nations appeared all over that continent. President Harry S. Truman, after the end of the Second World War, issued an order, as commander-in-chief, desegregating the armed forces. Blacks had migrated in great numbers to the northern states. In the North, they voted, and exerted, directly or indirectly, more influence on national politics than was possible in the South. And the battle in the courts continued. The Supreme Court had come a bit closer to the heart of segregation, in Shelley v. Kraemer (1948).37 The issue was a restrictive covenant—a clause in a real estate deed, which made landowners promise never to sell or rent the property to blacks. These covenants were extremely common, especially in suburban developments. And they “ran with the land,” that is, they bound all later owners as well as the original buyers. In Shelley, the Supreme Court refused to enforce the covenant. Under the Fourteenth Amendment, states could not deny to their citizens the “equal protection of the laws.” Courts are a vital part of the state government. Race discrimination was a violation of equal protection; and if a court enforced such a covenant, this was state action, and consequently unlawful. Shelley v. Kraemer certainly did not put an end to housing segregation—nothing has—but it did make it easier for blacks to break out of some of the very narrowest of urban ghettos. And a genuine, crucial climax in the long struggle for equality came in 1954, when the Supreme Court handed down its decision in Brown v. Board of Education.38

This was surely one of the most momentous of all Supreme Court decisions. To find a case comparable in importance, and fame, one has to reach as far back as the Income Tax cases, or even Dred Scott; or perhaps forward to Roe v. Wade. The new Chief Justice, Earl Warren, wrote the Court’s opinion in Brown; it was short, and unanimous. In Brown, the Court faced an issue it had dodged before: even if facilities—schools, for example—were equal, is separation itself allowed under the Constitution? No, said the Court, it is not. Segregation is inherently unequal; and inherently unlawful. It is a violation of the federal Constitution. The dual school system had to be ended.

The Court did not order the system to end immediately; it did not tell the schools to open their doors, at once, to all races indiscriminately. The Court left the question of a remedy open. It asked for arguments from all parties, on how to implement its decision. In the second Brown decision,39 the Court dumped the problem into the laps of the local district courts. They were to see to it that schools were desegregated, “with all deliberate speed.” In the event, there was very little sign of speed, especially in the deep South. In fact, the white South reacted to Brown with fury and dismay. At best, Southern states tried delaying tactics; at worst, they resorted to violence. For at least a decade, almost nothing changed in the deep South; and there are those who think that Brown, like so many of the Court’s other decisions on race in the twentieth century, ended up accomplishing nothing.40 Some federal judges—at great personal cost—tried honestly to enforce what was now the official law of the land.41 Others were themselves segregationists, who did what they could to obfuscate and delay. In any event, every attempt to integrate, even at the university level, touched off riots, mob action, and a blizzard of federal writs and orders. In 1956, Autherine Lucy, a young black woman, tried to enroll in the University of Alabama. A mob “roamed Tuscaloosa for several days, burning crosses, waving Confederate flags, and attacking cars driven by blacks.” Lucy was expelled from the university.42

Still, most scholars are not ready to write off Brown as a failure. Brown ended segregation in the border states (the named defendant, one must recall, was the school system of Topeka, Kansas, not Jackson, Mississippi). And the case, and the events and litigation that followed, certainly catalyzed the civil rights movement. Brown did not specifically overrule Plessy v. Ferguson. But in fact, “separate but equal” was dead. Even though the Brown case talked only about education, the Supreme Court soon made it crystal clear that its principle went far beyond the schools. The Fourteenth Amendment meant there could be no segregation by race at all—anywhere, in any aspect of public life. American apartheid was a violation of fundamental rights. Vilification and “massive resistance” in the South did not move the Court. Chief Justice Warren and his colleagues refused to budge. The segregation case, and the cases that followed, also forced the hand of the federal government. President Dwight D. Eisenhower was no fan of the Brown decision. But when states openly defied the federal courts, and federal authority, he was forced to act. Eisenhower sent paratroopers into Little Rock, Arkansas, to enforce a desegregation order directed at Central High School.43

With enormous effort and cost, and great personal humiliation and pain, in the decade after Brown, a handful of black students did manage to force their way into segregated schools and universities—guarded at times by battalions of federal troops. The civil rights movement, and its leaders, including Martin Luther King Jr., struggled to break the stranglehold of white supremacy on the South. It was the age of TV; and the whole country watched as southern sheriffs broke up crowds of peaceful black citizens, hounded and harassed people demanding their rights in a dignified way, sprayed them with hoses, and let loose dogs on small children. A bomb in Birmingham, in 1963, killed four little black girls at Sunday school. This, and other horrors, helped turn public opinion around, in the North. Under President Lyndon Johnson, Congress passed two historic civil rights laws. There was ferocious opposition from Southern die-hards; they hoisted the banner of states’ rights, but everybody knew what they really had in mind. The great Civil Rights Law of 1964 banned discrimination in education, public accommodations, and on the job.44 It was a strong law; and it created a federal agency with power to make rules and regulations, and turn principle into working reality. The Voting Rights Law, in 1965, was if anything even more significant. This was a law that aimed to end the white monopoly on voting and political power. It too was a strong law, with real teeth. It got rid of all those legal tricks of the trade that had kept blacks from voting: poll taxes, literacy tests, and so on. It also contained a unique, and powerful, “trigger”: Any county (or state) where less than half the potential voters were registered or voted had to reform itself; and any changes in voting rules and regulations had to be submitted to federal authorities for clearance.45

These laws were more sweeping than any passed since the false dawn of radical Reconstruction. And they made a difference. Segregation is almost completely dead in hotels, public facilities, and restaurants. It is almost dead in higher education. It is still alive, but less virulent than before, in housing and employment. Blacks vote freely all over the South, and in substantial numbers. The black vote matters. Blacks serve as mayors and city councilors; they sit in state legislatures; they represent black districts in Congress. There are black judges on state and federal benches. Virginia, the heart of the old Confederacy, went so far as to elect a black man as governor. Even the most conservative Southern Senators feel the need to have some black people on their staffs. They may still play the race card—and some do—but they have to play it much more cautiously.

The North had its own brand of apartheid, more subtle than the southern form, but also quite real. Here too the last decades of the twentieth century brought about enormous change. The Civil Rights Act opened many doors for blacks. They were able to get jobs and positions that excluded them in the past. Black salespeople appeared in department stores; black secretaries in offices. Black police appeared on the streets. Blacks increased their role in political life. The civil rights movement, and the civil rights laws, led to profound changes in American culture. Overt discrimination went underground. The black middle class suddenly found itself in demand. Blacks now sing with the Metropolitan Opera, play baseball, run school districts, and work at trades that were once in essence lily-white. A few blacks have become big business executives; or partners in law firms. Blacks appear in TV ads, and in the movies; and interracial love and interracial marriage are no longer subjects of taboo on big and little screens. At the national level, presidents began to feel pressure for “diversity” in judgeships and high federal positions. President Lyndon Johnson, in 1967, appointed Thurgood Marshall, veteran of the civil rights movement, to be the first black justice on the U.S. Supreme Court.46 When Marshall retired, in 1991, President George Bush appointed a conservative black, Clarence Thomas. By the end of the century, black cabinet members were no longer a novelty. President George W. Bush, elected in 2000, appointed the first black Secretary of State, Colin Powell. There were parallel developments at the state level. Black mayors of big cities, by 2000, were no novelty. One or more blacks had been mayors in New York, Chicago, Los Angeles, San Francisco, Detroit, Atlanta, and many other cities. In part, this was because white flight to the suburbs left the cities with black majorities; but in some cities—San Francisco is a prime example—it took generous amounts of white votes to put a black mayor in office.

In many regards, then, there has been enormous progress in race relations. But racism remains a powerful force. It is a source of strong white backlash. The vigor of this backlash should not be underestimated. Also, there are still vast numbers of poor blacks who live in squalid ghettos. For whatever reason, black men and women sit in prison in numbers far out of proportion to their share of the population. White fear of black violence and the social disorder of black ghettos helped trigger white flight. Black poverty and misery in turn feed black anger and alienation.

Within the legal world, no issue touching race has been so divisive and disputed as the issue of affirmative action or reverse discrimination. In the famous Bakke case in 1978,47 the University of California at Davis turned down a white student, Allan Bakke, who had applied for admission to the medical school. Bakke went to court, claiming that the school had discriminated against him. Out of one hundred spots in the entering class, the school had set aside sixteen for minority students. Students who got these sixteen spots had, on average, lower grades and scores than the whites—and lower than Bakke. He won his case; but the Court was badly fractured, and it was not at all clear exactly what the case had decided. It was read to mean this: Outright quotas (which Davis had) were unacceptable, but apparently a state university could take race somehow into account, in the interests of promoting diversity.48 In 1980, the Court also upheld a law of Congress that set aside a percentage of government contracts for businesses owned or controlled by minorities.49 But a more conservative court began to backtrack. Adarand Constructors, Inc. v. Pena (1995)50 was one of a series of cases that gutted the affirmative action doctrine. A federal court in Texas, in 1996, struck down race-conscious admissions to state colleges and graduate schools; the Supreme Court, somewhat surprisingly, refused to review the case.51 Whenever affirmative action comes to a vote, or a referendum, as it did in California (“Proposition 209”) in 1996, it loses, usually badly. Yet in 2003, in a case that came out of the University of Michigan,52 the U.S. Supreme Court did not take the opportunity (as some people had expected) to kill off affirmative action once and for all. Rather, it stuck, more or less, to its Bakke idea: Quotas were bad, wrong, and illegal; but taking race into account—somehow—was all right. Universities and other institutions read this case to mean that affirmative action, provided it was subtle and measured, had the Court’s seal of approval. At least for the time being.

Affirmative action has had its ups and downs; but the civil rights revolution is, in essence, irreversible. Whites have, on the whole, accepted large chunks of it, and have abandoned any thought of segregation, except perhaps for a lunatic fringe holed up in compounds in Idaho or elsewhere. But some—many—whites have never come to terms with other aspects of a multicultural society. They vote for the likes of Proposition 209; and through “white flight,” they vote with their feet. Race feeling lies at the root of some aspects of the law-and-order movement; and contributes to the deep unpopularity of welfare programs. The gap between black and white in America has not been eradicated by any means—not economically, not socially, not culturally. Prejudice is more deeply rooted than Brotherhood Week or Black History Month.

First Nations

The civil rights movement was about black liberation; but it accomplished, in the end, a great deal more. It helped influence the feminist movement, and the liberation movements of Native Americans, Asians, Hispanics, the so-called sexual minorities, old people, and the handicapped—almost everybody with a grievance against what they considered the majority force in society. Perhaps “influence” is not the right word. Rather, the same social forces that created the civil rights movement and led to Brown v. Board of Education were at work on other identity groups.

The legal story of the Native American peoples in the twentieth century is extremely complex. For them, too, the first part of the century was a kind of low point. The native tribes had been defeated militarily, robbed of most of their land, and relegated to “reservations” under the thumb of the Bureau of Indian Affairs. The Dawes Act (1887) set in motion a process of turning native lands into individual allotments. The underlying motive, supposedly, was assimilation—turning “savages” into real honest-to-goodness Americans. The effect, however, was more loss of land. In the early twentieth century, Native Americans were among the poorest of the poor; and the Depression made matters even worse. Navajo income in 1930 was $150 a year, per capita; in 1935, on the Sioux reservations, annual income was a pathetic $67. This was misery with a vengeance.53

But during the New Deal, almost for the first time, the Bureau of Indian Affairs put on a different face, under the leadership of John Collier. Collier, Roosevelt’s choice for the job, was unusual among bureaucrats of Indian Affairs. He admired Indian culture and felt that it was worth preserving—languages, customs, religions, and all. He rejected the idea of assimilation. He did not favor the idea that the natives were doomed or destined to melt into the great American melting pot. The Indian Reorganization Act (1934)54 allowed native peoples to draw up constitutions. Many of them did so—constitutions, and also tribal codes of law. This was an important step toward autonomy. In the 1950s, however, there was another turn of the wheel, and Congress took steps to “terminate” many of the tribes—another attempt to turn the native peoples into just plain Americans. All sorts of benefits were also “terminated” along with tribal status. The results, for many native peoples, were nothing short of disastrous.

But then came yet another turn of the wheel. Lessons of the civil rights movement were not lost on tribal leaders. In 1968, the American Indian Movement was founded; in 1969, Indian activists made headlines by seizing Alcatraz Island, in San Francisco Bay. In the 1970s, Congress put an end to “termination.” More and more, under pressure from activists, public policy moved away from assimilation. This was the period of “roots,” a period in which minority after minority began to celebrate whatever remained of their uniqueness. Hence, society accepted more and more the right of the native peoples to their own languages, culture, and way of life—even the right to reclaim old bones from museums. The Indian Self-Determination and Education Assistance Act of 1975 declared that federal domination had “served to retard rather than enhance the progress of Indian people.”55 The Indian Child Welfare Act (1978) gave tribes jurisdiction over cases of child custody.56 On the whole, too, Indian populations were now growing, not shrinking. “Affirmative action” extended to the native peoples as well as to blacks. Some tribes have struck it rich with minerals; some run lucrative casinos, or sell fireworks to the Anglos. There are quite a number of such stories. Yet, on the whole, poverty still stalks many of the reservations—in some cases extreme, grinding poverty, together with all the troubles that poverty brings in its wake: alcoholism, crime, social disorganization.

Moreover, plural equality has arrived too late to save the cultural heritage of many of the native peoples. A good proportion of the native languages are totally extinct. Others survive only in the mouths of a handful of very old people; in twenty years, most of these languages will be totally gone. Navajo and a few other languages seem, for the moment, secure—they are taught in schools, and there are kids who speak them on the playground and in their homes. Assimilation is no longer official policy. But it marches on, nevertheless. The villain now is not the Bureau of Indian Affairs; it is, rather, the overwhelming dominance of American television and American mass culture.

Asian Americans

Asian American history has many parallels to the history of other minority races. The Chinese were hated and vilified in the nineteenth century, particularly in California. The Chinese exclusion laws made it almost impossible for Chinese to enter the country, and denied them the right to become naturalized citizens.57 On the West Coast, laws were passed to keep Asians—especially the Japanese—from owning land. And, during the Second World War, as we saw, in a shameful episode, the West Coast Japanese were rounded up and sent to camps in dreary and remote desert areas.58 But the racist immigration laws died in the 1960s, and with them, the restrictions on Asians. In the 1980s and 1990s, far more Asians than Europeans entered the country as immigrants. The Chinese have been, on the whole, an economic success story. The same is true for the Japanese, the Vietnamese, and people from the Indian subcontinent—motel-owners, engineers, and on the whole upwardly mobile. The Korean grocery store is as familiar as the Chinese laundry once was. America—and especially such states as California—is more and more a rainbow of racial colors.

Hispanics

After the Mexican War, the United States absorbed a substantial Hispanic population, especially in California and New Mexico. Thousands of Hispanics, mostly Mexicans, also crossed the border, or were imported, to do tough, dirty jobs that most Americans did not want—picking crops, washing dishes, scrubbing floors. The numbers of Spanish speakers swelled even more in the last half of the twentieth century—millions of Puerto Ricans (American citizens by birth), who moved to the mainland, Cubans running away from Fidel Castro, Central Americans and Caribbean islanders, and, again, a flood of Mexicans, fleeing from poverty and overpopulation in their homeland. Most of the millions of illegal immigrants are Hispanic. Mexicans have met with hostility and race discrimination in many parts of the country. In certain California towns, there was outright apartheid: the city of San Bernardino refused to allow people of “Mexican or Latin descent” to use the city’s precious swimming pools and parks. A federal court declared these practices unconstitutional in 1944, ten years before Brown v. Board of Education.59 There were segregated schools as well: in Orange County, in El Modena, poor Mexicans went to one high school; the Anglos went to another. The two schools were basically in the same place, separated by a playing field and an invisible social wall. In the 1940s, federal courts declared this arrangement was a violation of the Constitution.60

Overt discrimination lost any claims to legitimacy in these cases; and then came Brown and the various civil rights laws. But Hispanic issues did not go away. By 2000, immigration politics was dominated by issues of legal and illegal immigration, mostly Hispanic. Questions of bilingual education and the like were also mainly Hispanic issues. The public dutifully voted against bilingual education in California. Apparently, the language of Cervantes and Garcia Marquez posed some kind of awful threat to the polity or to the survival of the English language. California also voted, in 1986, to make English the official language. What that means is unclear. Instructions for voters in California are sent out in both English and Spanish; in California, drivers can take their written driving tests in Spanish, if they wish, and many do so. By 2000, Hispanics were the largest of America’s minorities, and their numbers continued to grow. They were, on the whole, less active politically than blacks, but their voice and vote were bound to be felt in the future.

The Revolt of Otherness

We have mentioned the ripple effect of the civil rights movement—or, as we were careful to add, the effect of the social forces that gave rise to the civil rights movement. At least some members of every group that considered itself an “other” rose up, in their own way, to ask for rights, and for a place at the bargaining table. Some won significant constitutional victories; some won victories in the halls of Congress; some won both. Hardly any got everything they wanted; but all got some. There was a students’ rights movement, and a prisoners’ rights movement. Gays and other so-called sexual minorities, against passionate resistance, won substantial victories. Most states wiped the “infamous crime against nature” off their books; and the Supreme Court, in 2003, finished the job. In some places, “domestic partners” had rights to benefits. Some cities and states had laws or ordinances banning discrimination. At the beginning of the twenty-first century, gay marriage became a fact in Massachusetts.

The gray lobby too had considerable success. This was a matter of both ideology and demographics: People were living longer, and voting longer, too. Social norms changed along with demographic facts. People talk about a “youth culture”; but this means not only worshiping youth, it also means the right to act young, regardless of age. In any event, people over forty got themselves a law forbidding discrimination in hiring and firing in the 1960s. Somewhat later, Congress banned mandatory retirement.61 Congress also passed a strong law to protect the handicapped in 1990. The Americans with Disabilities Act prohibits discrimination against people who are blind or deaf or sick or in a wheelchair, in places of public accommodation or, more significantly, in the job market (the act carefully excepted people with “gender identity disorders,” compulsive gamblers, pedophiles, pyromaniacs, and kleptomaniacs).62 These groups, apparently, will have to wait, perhaps forever.

By far the most important “minority” was in fact a majority: American women. Here too, the results have been both legally and socially revolutionary. The Civil Rights Act of 1964 prohibited discrimination against women, as well as against racial and religious minorities. In 1971, the Supreme Court made one of its periodic discoveries: the Fourteenth Amendment turned out to ban discrimination on the basis of gender. This would probably have been a great surprise to the men who drafted the Amendment; but, whatever the Justices say, their actions show that they believe in an evolutionary Constitution, a living, breathing, changing Constitution. Whether the Court was leading society or following it in this area is a difficult question to answer. The courts, in general, read the Civil Rights Act broadly; and the Equal Employment Opportunity Commission did the same. Women won cases that gave them access to jobs that had been reserved for men; and men won access to jobs (as flight attendants, for example) that had been reserved for women. There were to be no more employment ghettos. Women, like blacks, began to appear on the bench, in high positions in industry, in the professoriate, and in the president’s cabinet. Here too there was a long way to go, at the end of the twentieth century; but gender relations had clearly changed as dramatically as race relations, if not more so.63

Freedom of Speech

A serious body of law on the subject of free speech hardly existed, at the level of the U.S. Supreme Court, until around the time of the First World War. The war set a whole cauldron of chauvinism boiling. Congress passed wildly expansive laws about espionage and sedition. A great witch hunt began, against disloyalists, Bolsheviks, and the like, which lasted into the 1920s.64 The courts were not as heroic in defense of free speech as modern civil libertarians would have liked them to be. In Schenck v. United States (1919),65 a unanimous Court upheld the Espionage Act, and the conviction of Charles Schenck, a Socialist, who had distributed leaflets denouncing the draft and the war in Europe. Oliver Wendell Holmes Jr., writing for the Court, argued that Congress could suppress words “used in such circumstances and…of such a nature as to create a clear and present danger that they will bring about the substantive evils that Congress has a right to prevent.” Fine words, and later famous words. They were little comfort to Schenck. In Abrams v. United States (1919),66 decided shortly afterward, Jewish radicals had written pamphlets, in Yiddish and English, condemning President Wilson for sending soldiers to fight in Soviet Russia. The United States was not at war with Soviet Russia; and yet these pamphlets struck the Court as a clear and present danger (they were written and distributed while the war with Germany was still going on). This time Holmes dissented.

In general, the courts were not much help during the waves of repression and deportation that came right after the First World War—the famous “Red scare.” As we saw, the Supreme Court, during the Second World War, allowed the internment of Japanese Americans. After the Second World War came the Cold War. State and federal committees hunted everywhere for “un-American” activities. During the feverish, paranoid days of McCarthyism, in the 1950s, the Supreme Court was, at first, exceedingly timid; the Court, for example, upheld the Smith Act in Dennis v. United States (1951).67 Under the Smith Act, it was a crime to “knowingly or willfully advocate, abet, advise, or teach the…desirability…of overthrowing…government in the United States by force or violence.” In Dennis, eleven leaders of the American Communist Party were indicted for, and convicted of, this crime. The Communist Party was a danger to the nation, in the view of the Court. The Supreme Court also upheld loyalty oaths and, in general, bent with the winds. The campaign against “reds,” “dupes,” and “fellow-travelers” was more virulent in the United States than, for example, in Great Britain. In part, it was a reaction against the New Deal, and against liberals in general; anti-Communism was a convenient weapon against socialized medicine, civil rights, and whatever else offended the right wing of the political spectrum. To be sure, the Soviet Union was dangerous; and there were Soviet spies in the country; perhaps some had infiltrated the government itself. The most dramatic of the spy trials ended with a death sentence, in 1951, for Julius and Ethel Rosenberg. They were executed in 1953. Another sensation was the trial of Alger Hiss, who had been a high official of the State Department. Hiss was ultimately convicted of perjury, in 1950, and went to prison.68 In later years, as the McCarthy era subsided, the Supreme Court played a somewhat more ambivalent role. It accepted some Cold War outrages, but others it turned aside. In 1957, for example, the Supreme Court overturned the convictions of another group of Communists, who had been charged with conspiracy to violate the Smith Act.69

In general, the concept of free speech was expanding; and the Court ultimately accepted and reflected changes in the social meaning of free speech. The war in Vietnam raged on without benefit of new sedition and espionage acts. The Court also, in its own fumbling way, wrestled with the issue of obscenity and pornography—a free speech issue that the Court had never confronted before the second half of the century. Nor did it ever really work out a coherent constitutional theory. The Court never held, flatly, that states had no power to ban pornography or censor obscenity. Here too what happened in society at large really rewrote the law. The sexual revolution (so-called) was sweeping over most of the country. The courts came to accept books, movies, and plays that would have horrified and scandalized the Victorians, or even Americans of the early twentieth century. In theory, local communities can regulate or even ban hard-core pornography; but at least in most big cities, anything goes.

Religion and the Law

The Supreme Court has also wrestled with another almost intractable problem: church and state. Church and state are constitutionally separate in the United States; but what exactly does this mean? In the nineteenth and early twentieth centuries, public education had a distinctively sectarian, Protestant cast; Bible-reading and prayers were common in many schools in many states. In the 1960s, the Supreme Court exercised one of its most dramatic vetoes. The Court held that prayer in public school was a violation of the Establishment Clause of the First Amendment. Bible-reading suffered the same fate.70 These decisions were wildly unpopular, and continue to be; but the Supreme Court has stuck to its guns. On other issues—aid to parochial schools, vouchers, creches on city hall lawns—the decisions have been more mixed, and the problems of state involvement with religion remain alive and deeply controversial. This is perhaps inevitable. Of all the developed countries, the United States is the most fervently and deeply religious. Yet, of all the developed countries, it is the most religiously diverse. No single religion commands more than a quarter of the population. There is every conceivable denomination of Christian, plus millions of Jews, Muslims, Hindus, and Buddhists, to mention only a few of the minority religions. There are Mormons and Christian Scientists and Jehovah’s Witnesses and dozens of small splinter religions. Under these circumstances, it is not easy to disentangle government from religion—while at the same time recognizing the massive role of religious feeling and religious faith in the United States.

1 34 Stat. 232 (act of June 11, 1906). The Supreme Court, in the Employers’ Liability Cases, 207 U.S. 463 (1908), struck down the law by a narrow 5 to 4 margin. But Congress passed a new law, 35 Stat. 65 (act of April 22, 1908), which successfully met the Court’s objections.

2 Ives v. South Buffalo Railway Co., 201 N.Y. 271, 94 N.E. 431 (1911).

3 On the rise of workers’ compensation, see Price V. Fishback and Shawn Everett Kantor, A Prelude to the Welfare State: The Origins of Workers’ Compensation (2000).

4 See John Fabian Witt, “Toward a New History of American Accident Law: Classical Tort Law and the Cooperative First-Party Insurance Movement,” 114 Harv. L. Rev. 692 (2001).

5 243 Wis. 612, 11 N.W. 2d 179 (1943).

6 Laws Wis. 1911, ch. 50, pp. 46–47.

7 Okla. Comp. Stats. 1926, sections 7283–84, pp. 662–63.

8 On this incident, see Claudia Clark, Radium Girls: Women and Industrial Health Reform, 1910–1935 (1997).

9 Louie v. Bamboo Gardens, 67 Ida. 469, 185 P. 2d 712 (1947).

10 See Workmen’s Compensation Appeal Board v. Bernard S. Pincus Co., 479 Pa. 286, 388 A. 2d 659 (1978).

11 For an example, see Helen J. Kelly’s Case, 394 Mass. 684, 477 N. E. 2d 582 (1985).

12 Cal. Labor Code, sec. 3208.3; Laws Cal. 1989, ch. 892, sec. 25.

13 Edward A. Purcell Jr., “The Action Was Outside the Courts: Consumer Injuries and the Uses of Contract in the United States, 1875–1945,” in Willibald Steinmetz, ed., Private Law and Social Inequality in the Industrial Age (2000), pp. 505, 524.

14 See Arthur F. McEvoy, “The Triangle Shirtwaist Factory Fire of 1911: Social Change, Industrial Accidents, and the Evolution of Common-Sense Causality,” 20 Law and Social Inquiry 621 (1995).

15 Lawrence M. Friedman, Total Justice (1985).

16 Ruth Schwartz Cowan, A Social History of Technology (1997), pp. 233–34.

17 See the classic study by H. Laurence Ross, Settled Out of Court: The Social Process of Insurance Claims Adjustment (rev. 2d ed., 1980).

18 217 N.Y. 382, 111 N.E. 1050 (1916).

19 Welge v. Planters Lifesavers Co., 17 F. 3d 209 (7th Cir. 1994).

20 William Prosser, The Law of Torts (3rd ed., 1964), pp. 1023–24.

21 Randolph Bergstrom, Courting Danger: Injury and Law in New York City, 1870–1910, (1992), p. 20; Michael G. Shanley and Mark A. Peterson, Comparative Justice: Civil Jury Verdicts in San Francisco and Cook Counties, 1959–1980 (1983).

22 Borel v. Fibreboard Paper Products Corp., 493 Fed. 2d 1076 (C.A. 5, 1973).

23 G. Alan Tarr, Understanding State Constitutions (1998), p. 21.

24 Peter J. Galie, Ordered Liberty: A Constitutional History of New York (1996), pp. 228, 306, 357.

25 The law was 43 Stat. 153 (act of May 26, 1924); see Elliott Robert Barkan, And Still They Come: Immigrants and American Society, 1920 to the 1990s (1996), pp. 11, 14.

26 Guinn v. United States, 238 U.S. 347 (1915).

27 Leon F. Litwack, Trouble in Mind: Black Southerners in the Age of Jim Crow (1998), pp. 225–26.

28 Michael Perman, Struggle for Mastery: Disfranchisement in the South, 1888–1908 (2001), p. 328.

29 Litwack, Trouble in Mind, p. 289.

30 245 U.S. 60 (1917).

31 Missouri ex rel. Gaines v. Canada, 305 U.S. 337 (1938).

32 On this, and the integration of Missouri’s law schools in general, see Robert C. Downs et al., “A Partial History of UMKC School of Law: The ‘Minority Report,’ ” 68 UMKC Law Review 508 (2000).

33 339 U.S. 629 (1950).

34 McLaurin v. Oklahoma State Regents for Higher Education, 339 U.S. 637 (1950).

35 323 U.S. 214 (1944). For an exhaustive—and depressing—treatment of this case, see Peter Irons, Justice at War: The Story of the Japanese-American Internment Cases (1983).

36 See Mary L. Dudziak, Cold War Civil Rights: Race and the Image of American Democracy (2000).

37 334 U.S. 1 (1948). On this case, see Clement E. Vose, Caucasians Only: The Supreme Court, the NAACP, and the Restrictive Covenant Cases (1959).

38 347 U.S. 483 (1954).

39 349 U.S. 294 (1955).

40 This view was advanced, very notably, by Gerald Rosenberg in The Hollow Hope: Can Courts Bring about Social Change? (1991).

41 One clear example was Alabama federal judge Frank M. Johnson Jr. See Tony Freyer and Timothy Dixon, Democracy and Judicial Independence: A History of the Federal Courts of Alabama, 1820–1994 (1995), pp. 215–55.

42 Michal R. Belknap, Federal Law and Southern Order: Racial Violence and Constitutional Conflict in the Post-Brown South (1987), p. 29.

43 Belknap, Federal Law and Southern Order (1987), pp. 44–52. The Supreme Court weighed in on the Little Rock issue in Cooper v. Aaron, 358 U.S. 1 (1958). In Loving v. Virginia, 388 U.S. 1 (1967), the Supreme Court struck down one of the last remnants of the old order: the miscegenation laws. The decision was unanimous. Any law that prevented blacks and whites from intermarrying was a violation of the Fourteenth Amendment.

44 78 Stat. 241 (act of July 2, 1964).

45 79 Stat. 437 (act of Aug. 6, 1965).

46 On Marshall’s life and career see Howard Ball, A Defiant Life: Thurgood Marshall and the Persistence of Racism in America (1998).

47 Regents of the University of California v. Bakke, 438 U.S. 265 (1978).

48 In fact, there was no “majority” opinion. There were six separate opinions. This was a 5 to 4 decision. Justice Powell provided the crucial fifth vote to let Bakke into the school. But he agreed with the four dissenters that the state could, under some circumstances, and in some ways, take race into account.

49 448 U.S. 448 (1980).

50 515 U.S. 200 (1995).

51 Hopwood v. Texas, 84 Fed. 3d 96 (C.A. 5, 1996); the Supreme Court denied certiorari, Texas v. Hopwood, 518 U.S. 1033 (1996).

52 In Grutter v. Bollinger, 539 U.S. 306 (2003), the Court approved of the law school’s program of affirmative action, though in Gratz v. Bollinger, 539 U.S. 244 (2003), it disapproved of the undergraduate plan, which was more rigid.

53 John R. Wunder, Retained by the People: A History of American Indians and the Bill of Rights (1994), pp. 62–63.

54 48 Stat. 984 (act of June 18, 1934).

55 88 Stat. 2203 (act of Jan. 4, 1975).

56 92 Stat. 469 (act of Aug. 11, 1978).

57 See Lucy E. Salyer, Laws Harsh as Tigers: Chinese Immigrants and the Shaping of Modern Immigration Law (1995); Bill Ong Hing, Making and Remaking Asian America Through Immigration Policy, 1850–1990 (1993).

58 Korematsu v. United States, 323 U.S. 214 (1944); Peter Irons, Justice at War (1983).

59 Lopez v. Seccombe, 71 F. Supp. 769 (D.C.S.D., Cal., 1944).

60 Westminster School District of Orange County v. Mendez, 161 Fed. 2d 774 (C.A. 9, 1947).

61 The Age Discrimination in Employment Act was passed in 1967, 81 Stat. 602 (act of Dec. 15, 1967); on the background, see Lawrence M. Friedman, Your Time Will Come: The Law of Age Discrimination and Mandatory Retirement (1984); on the subsequent history of this law, see Lawrence M. Friedman, “Age Discrimination Law: Some Remarks on the American Experience,” in Sandra Fredman and Sarah Spencer, eds., Age as an Equality Issue (2003), p. 175.

62 104 Stat. 327 (act of July 26, 1990).

63 The breakthrough case was Reed v. Reed, 404 U.S. 71 (1971). See, in general, Deborah L. Rhode, Justice and Gender: Sex Discrimination and the Law (1989).

64 See Harry N. Scheiber, The Wilson Administration and Civil Liberties, 1917–1921 (1960); Richard Polenberg, Fighting Faiths: The Abrams Case, the Supreme Court, and Free Speech (1987).

65 249 U.S. 47 (1919).

66 250 U.S. 616 (1919).

67 341 U.S. 494 (1951).

68 See Sam Tanenhaus, Whittaker Chambers: A Biography (1997); Chambers was the former Soviet agent who accused Hiss.

69 Yates v. United States, 354 U.S. 298 (1957); Arthur L. Sabin, In Calmer Times: The Supreme Court and Red Monday (1999).

70 The cases were Engel v. Vitale, 370 U.S. 421 (1962), and Abington School District v. Schempp, 374 U.S. 203 (1963); religion had long been an issue—for example, in the famous Scopes trial, in the 1920s, where the issue was whether the state could prohibit the teaching of evolution in the schools. See Edward J. Larson, Summer for the Gods: The Scopes Trial and America’s Continuing Debate over Science and Religion (1997); Lawrence M. Friedman, American Law in the Twentieth Century (2002), pp. 506–16.