Chapter 3

Internal Legal Culture in the
Twentieth Century: Lawyers,
Judges, and Law Books

If the theme of the twentieth century was growth, nowhere was this more evident than in the legal profession itself. At the beginning of the century, there were some one hundred thousand lawyers in the country. At the end of the century, there were about a million—the population had more than tripled, but the number of lawyers had increased by a factor of ten. The growth process accelerated in the last part of the century; in the early 1980s, there were about six hundred thousand lawyers—four hundred thousand more joined them in the next generation.

The profession was transformed in other ways as well. At the beginning of the century, the bar was basically a white male preserve. There were women lawyers and minority lawyers; but freakishly few. In Philadelphia, for example, as we noted, exactly three out of some 1,900 lawyers in 1905 were women.1 In fact, women continued to be rare beasts in the bar until the 1960s. Then the tide turned, and dramatically. By the end of the century, about a quarter of the bar was made up of women, most of them rather young; and there were so many women in the pipeline—half or more of the law students in many schools—that the percentage of women lawyers was bound to rise, perhaps to majority status, in the twenty-first century. The number of black lawyers was extremely small in the first half of the century; and black women were even rarer—they were “doubly marginalized.”2 But black lawyers were in the forefront of the battle for civil rights, and the cadre of minority lawyers (men and women) grew in the last third of the century. In that period, black, Hispanic, and Asian faces began to appear in the offices of law firms, on the bench, and on the law faculties of schools all over the country as well.

What does this army of a million lawyers do? Not all of them practice law. Some, as has always been the case, drift away from the practice—they go into business, or politics, or quit doing anything even remotely resembling law practice. The majority, however, do make use of their training. More and more, they practice with other lawyers: in a firm, or partnership (or professional corporation). The firms have been steadily growing in size.3 In 1900, a firm with twenty lawyers was a giant. In 1935, the largest Chicago firm, Winston, Strawn & Shaw, had forty-three lawyers. In 1950, the largest Chicago firm, Kirkland, Fleming, Green, Martin & Ellis, had fifty-four lawyers. At that date, a firm of one hundred lawyers would have been a super-giant, and such hulking creatures were found only in New York City. But by the 1980s, the situation was dramatically different. Baker & McKenzie had 697 lawyers in 1984.4 The best (or worst) was yet to come. The giant firms grew like weeds. In 2001, it was reported that Baker & McKenzie had grown to a truly mammoth size: 3,117 lawyers. No other firm had reached the 2,000 mark; but the number two firm, Skadden, Arps, had well over 1,000. Nor was this only a phenomenon of Wall Street. There were large firms in every major city, and even minor ones. The largest firm in Richmond, Virginia, at the beginning of the twenty-first century had 584 lawyers; the largest firm in Kansas City, Missouri, had 568. The 250th largest firm in the country had no fewer than 158 lawyers.5

In the old days, a New York firm was a New York firm, a Houston firm was a Houston firm, and that was that. Slowly, a few firms began to branch out. At the end of the twentieth century, the biggest law firms all had branch offices. Sometimes, a firm acquired a branch by swallowing up a local firm; sometimes a firm colonized distant cities. It is unlikely that law firms will match the record of Motel 6, or Kentucky Fried Chicken, let alone McDonald’s; but the big firms by 2000 had a toehold in all sorts of domestic and foreign sites, either far away, or as close as the flashier suburbs. Coudert Brothers, the firm founded in the nineteenth century by three brothers, had 650 lawyers in thirty offices in eighteen countries. Hunton and Williams, a big firm based in Richmond, Virginia (over 700 lawyers), had offices in Atlanta, Miami, Washington, D.C., New York, Raleigh and Charlotte, North Carolina, and Norfolk, Virginia. It also had branches in Brussels, Hong Kong, Bangkok, Warsaw, and London. Skadden, Arps, aside from its many American branches, had offices in Paris, Brussels, Hong Kong, Beijing, Frankfurt, Toronto, Moscow, and Sydney, Australia.6 Coudert Brothers had an office in Almaty, one of eight Asian branches (the others were in less offbeat places—Singapore, for example).

Not all lawyers by any means worked for firms. There were also solo practitioners, “in-house” counsel, and government lawyers. In 1952, about 10 percent of the lawyers worked for government, and about 9 percent for private companies. The percentages did not go up appreciably; but by the end of the century, the absolute numbers were impressive: 80,000 house counsel in 1995, 65,000 government lawyers.7

By the end of the century, too, lawyers seemed to be everywhere in government and in society. They busied themselves with every conceivable kind of work: merging giant corporations, suing the government, handling cases of child custody, drawing up wills, or handling a real estate deal. They gave advice to people who wanted to open a pizza parlor, or set up a private foundation. A few were even traditional courtroom warriors. The most elaborate study of what lawyers actually do, the study of Chicago lawyers in the 1970s, by John Heinz and Edward Laumann, classified lawyers into two big groups (the authors called these “hemispheres”). The larger “hemisphere” was made up of business lawyers. This was the “corporate client sector.” The second, somewhat smaller “hemisphere” did half of its work on business law, but for little businesses, not big ones; and this “hemisphere” devoted about a fifth of its total legal effort to nonbusiness matters—divorces, personal injury work, and criminal defense.8 Relatively few lawyers ever crossed over from one “hemisphere” to the other, or worked in both at the same time.

Legal Ethics

In 1908, the American Bar Association adopted its Canons of Professional Ethics. Most states accepted these canons, or a similar code, as the official rules on the conduct of lawyers. State bar associations, or state judges, had nominal or real power to enforce these canons, and discipline black sheep in the profession. Many of the canons were rules of plain common sense, or ordinary morality. No lawyer was supposed to steal his client’s money. But other rules reflected the norms and ambitions of the elite members of the bar. No lawyer, for example, was allowed to use “circulars or advertisements” (Canon 27). Wall Street lawyers never advertised; big firms had no need to. They had old, stable clients on retainer; and they got new business through informal networks. On the other hand, store-front lawyers, who represented people run over by streetcars, or who wanted a divorce, needed a constant stream of new business; they had “one-shot” clients, and no retainers. Other canons of “ethics” were simply anticompetitive. The rules basically outlawed price-cutting. Bar associations set minimum fees, and they fought against “unauthorized practice,” that is, they defended their turf against other professionals. They wanted a monopoly over all activities that could reasonably be defined as the practice of law.

The bar never gave up the fight against “unauthorized practice”; but in some regards the canons of ethics have been democratized. The Supreme Court gave the trend a big push when it struck down the ban on advertising in 1977. The case involved two Arizona lawyers who put an ad in the newspapers, offering “legal services at very reasonable fees.”9 The Supreme Court saw this as an issue of free speech; the public had a right to hear what lawyers who advertise wanted to say. At the top levels of the profession, this decision made very little difference. Wall Street lawyers still do not advertise; why should they? But it is now common to see lawyers on television, selling themselves like pizza parlors and used-car dealers, scrambling for low-income clients facing drunk driving charges, or people with whiplash injuries, or aliens in trouble with immigration authorities. The lawyers of the old days, the old Wall Street bar, would turn over in their graves.

The Organized Bar

The bar associations began as organizations of elite and high-toned lawyers, as we have seen—they were a reaction to the muck and the scandal of city politics. The bar associations were exclusive in more ways than that. The American Bar Association (ABA), for example, was for white lawyers only. In 1912, the ABA admitted three black lawyers—by mistake. The executive committee, when it discovered this dreadful error, tried to rescind their admission. The three lawyers remained members, in the end; but future applicants were screened by race.10 In response, black lawyers felt compelled to found their own organization, the National Bar Association, in 1925. No women were admitted to the ABA until 1918. Only gradually did the ABA mend its ways, and become more inclusive. By the end of the twentieth century, the situation had changed radically. The ABA was open to any lawyer of any race or sex. Roberta Cooper Ramo became the first woman president of the ABA in 1995–1996.

From about the middle of the century, many state bar associations became “integrated.” This had nothing to do with race; it simply meant that all lawyers in the state had to belong to a single (state) bar association, which collected dues, and claimed the power, at least in theory, to police the bar and discipline members who strayed from the straight and narrow path. By 1960, more than half of the states had integrated their bars. Illinois integrated its bar in the 1970s. Still, compared to the hold of the American Medical Association over doctors (largely through control of hospitals), the legal profession has been fairly loose and free-wheeling. Despite furious membership campaigns, at the end of the twentieth century only a little more than a third of the country’s lawyers even bothered to join their national association. And although the bar tried to speak with one voice, it never quite persuaded the outside world to listen.

This was not necessarily a bad thing. The behavior of the organized bar, compared to its ballyhoo, has been weak and retrograde. It does not take much courage to thunder invective against shysters and ambulance chasers. But when justice or civil liberties were in genuine crisis, the organized bar was not always on the side of the angels. During the McCarthy period, the ABA was eager for loyalty oaths and purges; the House of Delegates recommended disciplinary action, and even expulsion, for lawyers who were Communists, or who advocated Marxism-Leninism.11 But as the bar expanded and took in a more diverse mix of lawyers, it began to change its general orientation. Indeed, toward the end of the twentieth century, the ABA came out in favor of a moratorium on the death penalty, which no doubt startled many people on the right. The bar has also, since the 1950s, played a role in vetting lawyers nominated to the federal courts. Judges are still appointed for political reasons. But the ABA does screen nominees for minimal competence. It has helped keep some egregiously unqualified lawyers off the bench.

Legal Education

In 1900, the apprenticeship system was in retreat; by 2000, it was all but dead. Practically speaking, the law schools now have a virtual monopoly on access to the bar. The sheer number of schools—and students—has grown tremendously, which is not surprising, since the number of lawyers (and of people who wanted to be lawyers) was also growing rapidly. There were about one hundred law schools in 1910 (some of them night schools). In 1980, there were more like two hundred. Most states had one or more law schools; and the states that had done without law schools now mended their ways. Hawaii and Vermont joined the list of states with law schools. Nevada joined them at the very end of the century. In 2000, Alaska was the only state without a law school inside its borders. Most big states had not one but many law schools. California, the biggest state, had the most law schools—more than thirty-five—including a flock of unaccredited ones.

The Langdell method had, as we saw, struggled to stay alive even in its mother church in Cambridge during its early years. By 1900, it was clearly on the move and conquering new territory.12 The armies of Langdell eventually swept the field, spreading the gospel of casebooks and Socratic dialogue throughout the country. Yale was a convert in 1903. Other holdouts gave up and fell into line somewhat later. Schools that switched to the Harvard method also tended to hire Harvard men as their teachers: Harvard, in the first part of the century, supplied almost a quarter of all the country’s law school teachers. All over the country, in 1930 or 1950, small, poorly financed schools, some with night divisions as well as day-time students, pathetically tried to imitate Harvard, buying its methods and its casebooks, rather than searching for their own mission and soul, or asking how they could better serve their own community. A good law school was (and is) supposed to be “national,” that is, to ignore whatever state it happens to be located in, pretending instead to teach more general truths and more national skills.

Meanwhile, it became harder to get into law school. At the beginning of the century, Harvard already required a bachelor’s degree for admittance. Other schools gradually fell into line—Stanford, for example, in 1924; George Washington in 1935. In the 1960s, the ABA and the AALS required four years of college for everybody. The requirement, in short, became universal. After the Second World War, the GI Bill of Rights allowed thousands of bright young veterans to go to any school that accepted them. The government would pay for it all—tuition, books, and living expenses. Elite schools were no longer only for elite people. Any veteran with brains could go to Harvard or Yale. Veterans made up over 90 percent of the Harvard class of 1947.

Applications now flooded the schools. Social background and money no longer did an adequate job of filtering students. What replaced these, in part, was the Law School Admission Test (LSAT). The LSAT and other requirements were meant to eliminate unqualified students. The LSAT was launched in 1948; 3,500 applicants, in sixty-three cities, took the test. In 1952, forty-five law schools required it; the rest eventually followed suit. The University of Georgia adopted the test in 1962.13 The LSAT led to a drastic reduction in flunk-out rates. Schools as late as the 1950s admitted great numbers of students, and flunked out as many as a third of them after their first-year exams. According to a famous story, Professor Edward Warren of Harvard (the notorious “Bull” Warren) used to tell the entering students, “Look to the right of you. Look to the left of you. One of the three of you will be gone within a year.”14 Similar grim warnings—and predictions—were made at other schools as well. In the LSAT era, schools became very selective; it was much harder to get into the top schools. But there was an end to the slaughter of the innocents. In 2000, and for many years before, it was almost impossible—short of outright cheating or plagiarism or, more commonly, a nervous breakdown—to flunk out of Harvard, or most other law schools for that matter.

Legal Literature

During most of the twentieth century, Langdell and his way of thinking had the upper hand in literature as well as in legal education. This was the age of huge, elephantine treatises. Samuel Williston built a monumental structure (1920–1922) out of the law of contracts, volume after volume, closely knit, richly footnoted, and fully armored against the intrusion of any ethical, economic, or social ideas whatsoever.15 On the model of Williston was the ponderous treatise of Austin Wakeman Scott (on trusts); and indeed, each branch or field of law had at least one example of an arid and exhaustive treatise. Arthur Corbin, at Yale, published his own treatise on contracts. It reflected the influence of the legal realist movement (about which more later). Corbin served as a kind of counterpoise to the old man of Harvard. Wigmore’s monumental treatise on the law of evidence also deviated considerably from the Williston model.

Changing times—and the legal realist movement—had a certain impact on the law school curriculum. There was some attempt to change content in older courses; and new courses, like administrative law and taxation, made their way into the ranks. Thus, a professor of 1900, come back to life, would have found the law school curriculum familiar in some ways, strange in other ways. The classroom culture, too, was the same only different. The classic “Paper Chase” professors, hectoring and badgering their students, terrorizing the classroom, had gone to their eternal rewards; many professors in the late twentieth century still used the Socratic method, but in a modified way—much milder and more humane. After the 1960s, it was unfashionable to mock and humiliate students. But many aspects of the core curriculum would be quite familiar to our professor of 1900. The list of first-year courses, in 2000, would not have been that surprising to Langdell. The center seemed to hold, for better or for worse. Many upper level courses, on the other hand, would have struck Langdell, or for that matter, Samuel Williston, as rather odd. Many schools have added “enrichment” courses; some have tinkered with clinical courses and clinical training; a few have gone further and made clinical training a key aspect of legal education. While most students stick to “the basics” (corporations, taxation, and similar courses), those who want to stray off the beaten track can now do so, at least to some extent. The catalog of Cornell’s law school, for 1995–1996, for example, offered courses in Comparative and Transnational Litigation, Economics for the Lawyer, Feminist Jurisprudence, Law and Medicine, and Organized-Crime Control, plus a flock of seminars, clinical courses, and externships.

In part, this reflects the recurrent attempts to bring law closer to the university, that is, to the larger world of theory and scholarship. There were experiments in integrating law and social science in the 1920s and 1930s, at Columbia and Yale.16 The Johns Hopkins Institute of Law, in Baltimore, was established as a research institution, devoted to the empirical study of legal institutions. After publishing a few studies, the Institute died of financial starvation during the Depression. After the Second World War, more serious efforts were made to integrate law and the social sciences. Government and foundation money began to trickle into the law schools. One notable result was The American Jury (1966), by Harry Kalven Jr. and Hans Zeisel, of the University of Chicago Law School, a monumental study of a major institution. Kalven and Zeisel showed that collaboration between legal scholars and social science colleagues was a real possibility.17 But mostly, law and social science remained only a promise. In most schools, it was not even a promise. Only a few schools—notably Wisconsin, Denver, and Berkeley—did anything more than pay lip service to the interdisciplinary ideal.18 In the 1960s, law study did broaden to include more social issues—courses on welfare law, for example, a reflection of President Lyndon Johnson’s “war on poverty”; a few schools offered a richer, more varied dose of foreign law. The social sciences, basically, made hardly a dent. The one exception was economics. Here the University of Chicago was a leader; and Richard Posner, later a federal judge, was a key contributor to the field. His Economic Analysis of Law went through many editions, and made a major splash in the world of legal education. Many law professors, especially those who were politically liberal, resisted and resented the “law and economics” school, which they considered right-wing, and excessively narrow. They also accused it, with some justice, of disguising ideology in the Halloween costume of hard science. Still, by the 1980s, the law and economics movement had won a significant place in legal teaching, thought, and research, and not only in the more obvious fields (antitrust law), but in other courses as well—torts, property, contract. It retained its strong position for the rest of the century.

Of the making of books, quality aside, there was truly no end. The few university law reviews of the nineteenth century multiplied like rabbits over the years. By 1950, there were about seventy; and by the late 1990s, more than four hundred. Virtually every law school, no matter how small or marginal, published a law review, as a matter of local pride. The classier schools had more than one—perhaps as many as ten. One of these was the flagship journal—the Yale Law Journal; the Harvard Law Review; the University of Chicago Law Review. This was the most general (and the most prestigious). The others were more specialized. One of these others was, almost certainly, some sort of journal with an international or global bent. The law journals had also become extremely fat; and fatter and fatter over the years. Volume 15 of the Yale Law Journal, for 1905–1906, ran to 446 pages; by Volume 40 (1930–1931), the journal had swollen to 1,346 pages, and Volume 109 (1999–2000) had no less than 2,024 pages. Many schools could beat this dubious record. Volume 74 of the Tulane Law Review (1999–2000) was 2,268 pages long (Volume 15, for 1940–1941 had been a mere 652).

By law school tradition, it is the students, not the professors, who run these journals. They choose the articles, they edit them, they mangle the prose (when it has not arrived pre-mangled), they check the footnotes and make sure every jot and tittle is correct. Traditionally, too, these law review students formed the student elite of their schools. They were the ones with the best grades, the best (or only) rapport with the faculty; and when they graduated, they were rewarded with the best clerkships and the best jobs with the best of the law firms. Some schools, in a burst of democracy in the 1960s, opened law-review competition to all students who wished to try out, regardless of grades. The law reviews also began to worry about “diversity.” Many of them actively tried to recruit more women and minorities. By the end of the century, the law reviews had lost a little of their glamour—they were less of a club of the chosen, the crème de la crème; and they no longer had a total monopoly on top jobs and top clerkships.

They remained powerful in the world of scholarly publishing, but here, too, they lost their total stranglehold. In the early 1930s, in a pioneering and daring move, Duke University began to publish a journal called Law and Contemporary Problems. Each issue was a symposium on some topical subject; the journal did not set aside any section for student work. Later, more specialized scholarly journals—Law and History Review, for example—began to appear; and journals devoted to law and economics. The first issue of The Law and Society Review appeared in 1966—an outlet for the social scientific study of law. The American Bar Foundation published a research journal, later renamed Law and Social Inquiry. Many of these are peer-reviewed journals; in them, the professors, not the students, play the dominant role.

Conceptualism long retained its baleful influence, and it has never entirely died out. Nonetheless, the twentieth century managed to produce a rich legal literature. One of the most notable scholars was Roscoe Pound (1870–1964), a Nebraska native, who began his scholarly career studying fungi, and then switched to law. Pound became dean of the Harvard Law School, and poured a lot of his energy into legal philosophy. He founded a school of “sociological jurisprudence,” although there was precious little about it that was truly sociological. Karl Llewellyn (1893–1962) was a key figure in the realist movement of the 1920s and 1930s. He was clever and iconoclastic, despite a dense and Teutonic writing style. Others in that movement included Jerome Frank and Thurman Arnold. Realism was, in fact, less a philosophy than an attitude. It rejected the mind-set of judges and scholars of the late nineteenth century, who had emphasized legal logic and the purity of concepts. It rejected, in other words, the philosophy of Langdell. The realists had no great reverence for legal tradition as such. They were skeptical about rules—skeptical whether they worked in practice. They doubted that judges could or should decide cases according to the dictates of legal logic. They had little or no tolerance for artifice, fictions, and irrationalities, real or apparent. Law was a working tool, an instrument of social policy; and it had to be seen in that light. Jerome Frank’s book, Law and the Modern Mind (1930), was one of the key documents of the realist movement. Frank denounced legal formalism, the “illusion” of “certainty” in the law. Rules were only the “formal clothes” in which judges dressed up their actual thoughts. The search for certainty (Frank said) was nothing more than the vain search for a father-figure, something to believe in implicitly. Another leading realist was Thurman Arnold. His books, The Symbols of Government (1935) and The Folklore of Capitalism (1937), written in a witty and sarcastic style, hammered away at the “myths” and “folklore” of legal thought.

These attitudes were not entirely new; and they never captured the minds of all judges and lawyers. But they did affect an important elite. Realism made a difference in the way a small but important group of judges wrote, and perhaps (though here we are on thinner ice) in the way they thought and the way they decided their cases. And a notion of law as instrumental, as a tool of policy, rather than the dry bones of classical legal thought, the ideas of men like Langdell, was a natural fit for the lawyers who flocked to Washington in the early 1930s, who inhabited Franklin Roosevelt’s New Deal, and who were eager to reform the world (or at least the country) through law. These were the lawyers who drafted New Deal legislation, who devised the bold new programs that came out of Roosevelt’s administration, and who then argued and fought for these programs during the long battles in the courts.19

Llewellyn was also a leading figure behind the drafting of the Uniform Commercial Code, of which more later. He coauthored a book on the Cheyenne Indians with the anthropologist E. Adamson Hoebel (The Cheyenne Way, 1941). Between the two world wars, Charles Warren wrote a number of seminal studies of the American legal past. Legal history emerged once more from the shadows after the Second World War. The outstanding practitioner was J. Willard Hurst, who spent almost his entire career at the University of Wisconsin. Hurst and his followers—the so-called Wisconsin school—moved sharply away from case-centered, formalistic, doctrinal history. Hurst focused squarely on the relationship between law and society—in particular, between law and the economy.20 In the 1970s, a group of younger historians, somewhat more radical, began questioning some aspects of the work of Hurst and his school; but this too was, in a way, a mark of its permanent influence.21 The law and society movement gathered scholars from various disciplines—psychology, sociology, political science—who were looking at law with a fresh and sometimes illuminating eye, and (importantly) an empirical eye. Many of these scholars—a majority, in fact—were not lawyers or members of law faculties at all.

In the late 1960s, volcanic rumblings began to disturb the peace of the law schools. Students led the way: they marched under banners of civil rights and civil liberties; then they flew the flag of the war against poverty; then they embraced a more general radicalism, a more general revulsion against the established order. The richest and most famous schools were the most affected; and within the schools, on the whole, it was the most intellectually active students who made the most stir and the most noise. When the Office of Economic Opportunity put money into neighborhood law offices, young lawyers eagerly turned out to work in the ghettos. Wall Street had to raise salaries, to keep its pool of talent from drying up. Things (it appeared) would never be the same. Classical legal education looked like Humpty Dumpty, teetering on the wall. Would it bounce, or would it break?

In the event, it bounced. But this was not obvious at the time. Law seemed caught up in crisis. To some, on the right, the welfare state seemed to be dangerously overextended; most government programs seemed foolish, counterproductive, and in violation of basic economic laws. The left condemned the system as rotten, racist, and unjust. The war in Vietnam was another massive irritant. And many people, left, right, and center, complained about a litigation explosion (largely mythical, perhaps).22 Society seemed to be choking to death on its own secretions.

Yet, when the war in Vietnam ended, the student movement seemed to deflate, like a balloon when the air is let out. In the 1980s, the country seemed remarkably quiescent; there was a hunger for old-fashioned verities; the most conservative president in decades (Ronald Reagan) smiled soothingly from the White House and on TV screens. In many law schools, professors began to complain that their students were boring, complacent, and vocational. New radical movements sprang up among the professorate—Critical Legal Studies, and its black and Latino branches—but the student body (and the bar) hardly blinked. The professors, left, right, and center, continued churning out pages, and filling the law reviews. The students, on the whole, seemed mostly absorbed in finding and keeping a job.

The Twentieth-Century Bench

At the beginning of the century, the reputation of the U.S. Supreme Court was low among labor leaders and, generally, those who identified themselves as progressives. A few of the Supreme Court’s decisions seemed incredibly retrograde, cases like Lochner v. New York (1905), and Hammer v. Dagenhart (1918), the child labor case, which we have already discussed. During the early years of the New Deal, in the 1930s, the Supreme Court—the “nine old men”—hacked away at New Deal programs. The National Industrial Recovery Act was one of the keystones of the early New Deal; but the Supreme Court struck it down in the so-called “sick chicken” case, Schechter Poultry Corp. v. United States (1935).23 The next year, in United States v. Butler (1936),24 the Supreme Court consigned the Agricultural Adjustment Act to the ash heap.

To be sure, even liberals on the Supreme Court did not like some aspects of the early New Deal, which struck them as too much in the direction of corporatism. Moreover, the Supreme Court did sustain many notable labor and social laws, state and federal. Home Building and Loan Association v. Blaisdell (1934)25 was one of these cases. The Court, by a narrow margin, sustained a Minnesota law that tried to delay and avoid foreclosures of houses and farms. This seemed to be in flat contradiction to the contracts clause of the Constitution, but the Supreme Court was only too aware of the country’s economic crisis.26 There was real anger against the Court for blocking much of the New Deal program. President Franklin D. Roosevelt, immensely popular, swept into a second term in 1936 in a landslide. He then made a clumsy attempt at reform and revenge. This was the infamous court-packing plan: the president would get the right to appoint an extra justice for each sitting justice who was older than seventy and a half. The plan would have allowed him to appoint as many as six new justices. But a storm of protest arose, and the plan was shipwrecked from the start.27 He had, in a way, attacked something holy; even his political allies deserted him. Congress scuttled the plan; it was one of Roosevelt’s worst defeats.

But in the event, the plan proved to be unnecessary—even in the short run. In 1937, the Supreme Court, by a 5 to 4 vote, upheld the National Labor Relations Act.28 In this, and a number of other cases, one Justice—Owen Roberts—apparently changed sides. This was the famous “switch in time that saved the nine.” Was there really a “switch” at all? Did Roberts have a genuine change of heart? Was he motivated by political events? Or was the decision, and what followed, unrelated to the court-packing plan—and to Roosevelt’s reelection? The question has been much debated, and remains in dispute.29 G. Edward White takes the position that the court-packing crisis did not produce the “constitutional revolution of the New Deal”; rather, the crisis itself was the “product of a constitutional revolution.” The crisis grew out of an increasing recognition that the Court was, and had to be, deeply political—a “modernist” theory of constitutional law and constitutional interpretation, and “of the nature of legal authority.”30 Indeed, there is no question that the Court is (and always has been) a political institution, though of a rather special kind. The Court is not, to be sure, autonomous—that is, merely applying “law,” in ways that are detached from the swirling, vibrant forces working away in society. Social norms and values are at the heart of the work of the Court. But the Court is independent, that is, the government cannot control it; and, unlike the legislature, it has no irate constituents on whose votes it depends. Nobody can fire a Justice. Nobody can throw him out of office. The Justices serve for life. And they tend to live a long, long time.

Still, Roosevelt had the last laugh, switch or no switch. He was elected four times. In short, he outlasted the “nine old men”; ultimately, he was able to fill the Court with men who were friendlier to the New Deal, and had a less jaundiced view of government regulation in general. The Supreme Court virtually abdicated its role as an economic watchdog; it accepted, basically, whatever the states and the federal government wanted to do, provided the program had any hint of a rational motive behind it. But the Court did not skulk off the stage and hide behind the scenes. It assumed an equally powerful and controversial role, as we have seen: champion of the underdog, the voice of the voiceless, the protector of human rights, the inventor of new human rights. This was the role of the Warren court; and the Courts that followed it—the Burger and Rehnquist courts—though much more conservative, never really abandoned this role.

The prestige of the Supreme Court in the late twentieth century continued to be great. What the Court did about segregation, the death penalty, rights of criminal defendants, voting rights, contraception, abortion, and gay rights—all these were hotly debated and hotly contested. The Court—a deeply secretive body, allergic to press conferences and spin doctors—was nonetheless often front page news.31 At times, the Court did seem timid and deferential (during the McCarthy period, for example); at times, so bold it seemed almost breathtaking (school segregation, voting rights, abortion, sodomy laws). Scholars sometimes worried about the Court’s “legitimacy.” If the Court seems too political, will this kill it in the minds and hearts of the public? Won’t people stop supporting and respecting the Court? But the evidence suggests that not even the most controversial decisions damage the Court.32 “Legitimacy” adheres to the institution, not to individual justices, or individual decisions. This is not surprising. Even people who feel the police in their town are brutal and corrupt want better, more honest policemen; they do not withdraw their support from the idea of police.

The power of the Court is obvious; and more and more so over the years. Appointments to the Court had always been at least somewhat controversial. Controversy did not end in the twentieth century; on the contrary, it probably increased. There was a major tumult over the appointment of Louis Brandeis, the first Jewish Justice, in 1916, and, more recently, over the appointment of Robert Bork in 1987 (Bork was defeated). Another contested appointment was Clarence Thomas, a conservative black lawyer, in 1991 (Thomas got the job, after a bitter struggle).33 The public seems acutely aware that appointments matter—on issues such as abortion, for example. A single vote may make all the difference. In the election campaign of 2000, for example, whether George W. Bush or Al Gore would have power to name justices was a significant talking point. Ironically, events during and after the election showed that this was far from foolish. The Supreme Court, in Bush v. Gore,34 took a decisive step in determining the outcome of this disputed election; George W. Bush won by a single vote in the Supreme Court—and this guaranteed his victory in the electoral college as well.

The Supreme Court is an exceptional court. The federal judiciary, in general, gained power in the twentieth century—an inevitable result of the flow of authority to the center. But the Supreme Court rarely had to share the limelight with the lower federal courts. And few federal judges, below the Supreme Court, ever achieved much fame. They were crucial, however, to judicial policy and to social policy in general. The district court judges in the South bore the brunt of the furor over desegregation. Learned Hand (1872–1961), an appeals court judge, though he never made it to the Supreme Court, was perhaps the most respected federal judge of his day.35 State courts and state court judges were relatively less powerful and significant than they had been in the nineteenth century, compared to the federal judges. But they almost surely gained power in absolute terms. Like federal judges, few state court judges could expect to become famous. They were rarely celebrities. Yet a handful are at least familiar to legal scholars and law students. Benjamin Cardozo (1870–1938) of New York served on the U.S. Supreme Court from 1932 on, but is best known for his work on the New York Court of Appeals.36 In the 1950s and 1960s, Roger Traynor of California had perhaps the largest reputation of any state court judge; few nonlawyers, probably, had ever heard of him. An occasional state court judge was famous or notorious for one reason or another. Rose Bird, first woman chief justice of California, and the storm center of continual controversy, was unceremoniously dumped by the voters in 1986. But in general, state court judges remained obscure and unsung.

The differences between Cardozo and Traynor were, in one respect, highly revealing. The two men lived a generation apart. Both were considered bold, innovative judges. Cardozo, however, was subtle and crafty; he preferred to disguise change in old-fashioned, common law clothes; he liked to show continuity in the midst of change, to argue that the genius or spirit of the common law required him to move as he did. He was an artisan molding traditional clay. At least that is how he liked to position himself. Traynor was more likely to break sharply with the past. He was willing, at times, to say quite openly that times had changed, and the law must change with them. He spoke a different, franker language than Cardozo.

Both judges may have been influenced by the legal realist school, or, perhaps more accurately, by those social forces and currents of thought that created the realist school to begin with. On the surface, decisions of 1970 or 1990 or 2000 were more likely to contain explicit or implicit appeals to “public policy,” compared to decisions written by judges of the late nineteenth century. Judges also seemed more willing to overrule cases they considered outmoded. Overruling had always been possible in America; but the power was, on the whole, rarely used. It became, most probably, more frequent over time. The Supreme Court uses it sparingly, but it is a powerful weapon, nonetheless, even if exercised only once or twice a year.37 Far more often, courts “distinguish” away or simply ignore prior cases they do not care for. Modern high court judges also tend to write more dissenting opinions than their predecessors, and more concurring but separate opinions. In Michigan, in the 1870s, about 95 percent of all high court cases were unanimous; in the 1960s, there were dissents or concurrences in 44 percent of the published opinions.38

The trend toward more dissents and concurrences was most marked in the U.S. Supreme Court. Oliver Wendell Holmes and Louis Brandeis were “great dissenters” in the first third of the century. In their day, however, most of the Court’s decisions were unanimous. In the last half of the century, unanimous decisions were becoming unusual, at least in cases of first importance. The school segregation case, Brown v. Board of Education, was unanimous. This was, however, the result of considerable maneuvering behind the scenes. In the 1973 term, there were 157 full opinions; 33 of them (21 percent) were unanimous; in 79 percent (124 cases) somebody dissented. The Court at the end of the century seemed a bit less conflicted, but probably only because it took so few cases. There were 77 full opinions in the 1999 term, and 58 percent of them had one or more dissenting opinions.39

The Court has often been fractured and splintered—during the New Deal decade, for example, and even at times in the nineteenth century. In the 1990s, the Court again frequently split down the middle. There were 18 cases in the 1999 term that were decided 5 to 4. The Justices seem to feel no particular need to agree with each other—even with those on the same side of the issue. In some cases, the Justices wrote three, four, five, even nine separate opinions. In Furman v. Georgia, the 1972 death penalty case, every single Justice wrote an opinion of his own. Political scientists (and journalists) amuse themselves analyzing blocs on the Court, trying to figure out the games that Justices play. It is not always possible to pigeonhole a particular Justice. But because the Court’s cases are so often deeply political, commentators are able to label Justices as liberal or conservative or in between; and to predict how they will vote in a particular case. Often they have it exactly right.

To say that Justices “write” opinions is not quite accurate. John Marshall certainly wrote his own opinions, and so did Oliver Wendell Holmes Jr. But after the 1920s, the Justices began to rely more on their law clerks—bright young men (later bright young women too) who had just emerged from the cocoon of law school. Clerks typically served a year, then went on to greener pastures. William Rehnquist, later Chief Justice, was a law clerk to Justice Robert Jackson, in 1952–1953.40 In the 1950s, each justice had two clerks, for the most part; the Chief Justice had three. In the 1940s, most Justices were still writing their own opinions; the clerks did legal research. But by 2000, it was the clerks who wrote at least the first drafts of opinions.41 How much actual influence they had is hard to say; and no doubt varied from Justice to Justice. The Justices, and their clerks, of course, also had access to the latest, most up-to-date online services. One result of all this was judicial bloat. Opinions became longer, wordier. Some of them seem to go on forever. For a decided case to take up a hundred pages—majority, concurrences, dissents—is nothing at all unusual.

The Justices also seem less interested in presenting a united front. Collegiality is not gone, but it is not the collegiality of the nineteenth century. The proliferation of concurrences is one sign of this: Justices agreeing with the result but not with the reasoning, or agreeing with this or that bit or section of the majority opinion, disagreeing with this or that; accepting (say) Part IIB and Part IIIC of the majority, withholding acceptance from everything else. If the Justices all insist on their own particular take, it becomes very hard to decide what if anything the Court actually decided. It makes the decisions fragmented, unwieldy. The Supreme Court, more and more, consists of nine little law offices, each one to a degree independent of all of the rest.

In the twentieth century, the bench had become fully professional. In most states, lay judges—even lay justices of the peace—had gone the way of the heath hen and the Carolina parakeet. To be sure, judges were still elected in most states; they were, in this sense, more like politicians than like civil servants. But in most states, most of the time, the election of judges tended to be a dull, routine affair—an electoral rubber stamp. Most often, a judge first got a seat on the bench through appointment, when an old judge died or resigned. After a period of service, the new judge would run for election as a sitting judge, typically without serious opposition.

The Missouri plan (1940) made “elections” even more of a sham. Under this plan, a panel made up of lawyers (chosen by the local bar), a sitting judge, and laymen appointed by the governor would nominate judges. The panel would put forward three names; the governor would pick one of the three. This judge would then serve a year on the bench. At that point he or she would run for reelection, but unopposed. In the 1960s, a number of states adopted the Missouri plan, or something similar. Since it is hard to defeat somebody with nobody, the unopposed judge was supposed to win; and almost always did. There were, however, a few glaring exceptions: in 1986, as we mentioned before, Rose Bird, chief justice of the California Supreme Court, and a very controversial figure, was buried in an avalanche of “no” votes; two other liberal judges went down to defeat along with her.

In states without the Missouri plan, or something like it, judges run for election, just like any other candidate. But not quite the same. Most of these elections are quiet and uneventful. The public is profoundly uninterested. There are few real contests. In some states, the parties divide up the slots. The judges typically serve for long terms. Mostly, they are unopposed. Even when there is opposition, judges can hardly campaign as vigorously as candidates for, say, Congress; or even a local school board. They can hardly make promises, or announce how they intend to decide cases, except in the vaguest, and most general way. This adds to the bloodless character of judicial elections. There are occasional nasty battles; but they are the exception, not the rule. In 1998, not one sitting judge was defeated; and between 1964 and 1998, only fifty-two judges out of some 4,588.42

1 Robert B. Bell, The Philadelphia Lawyer: A History, 1735–1945 (1992), p. 206.

2 The phrase is from Kenneth W. Mack, “A Social History of Everyday Practice: Sadie T. M. Alexander and the Incorporation of Black Women into the American Legal Profession, 1925–1960,” 87 Cornell L. Rev. 1405, 1409 (2002). This is a study of the career of Ms Alexander, who, in 1939, was the “first and only black woman lawyer” in Pennsylvania.

3 On the large law firms, see Robert L. Nelson, Partners with Power: The Social Transformation of the Large Law Firm (1988); Marc Galanter and Thomas Palay, Tournament of Lawyers: The Transformation of the Big Law Firm (1991).

4 This figure, and those on Chicago firms, are from Nelson, Partners with Power, p. 41.

5 Source: National Law Journal, Annual Survey of the Nation’s Largest Law Firms, Nov. 19–26, 2001.

6 Source: National Law Journal, supra.

7 Survey of the Legal Profession, The Second Statistical Report on the Lawyers of the United States (1952), p. 2; Clara N. Carson, The Lawyer Statistical Report: The U.S. Legal Profession in 1995 (1999), p. 10.

8 John P. Heinz and Edward O. Laumann, Chicago Lawyers: The Social Structure of the Bar (1982).

9 Bates v. State Bar of Arizona, 433 U.S. 350 (1977).

10 Jerold S. Auerbach, Unequal Justice: Lawyers and Social Change in Modern America (1976), p. 66.

11 Jerold S. Auerbach, Unequal Justice (1976), p. 238.

12 On the rise and spread of the Langdell system, see William P. LaPiana, Logic and Experience: The Origin of Modern American Legal Education (1994).

13 Gwen Y. Wood, A Unique and Fortuitous Combination: An Administrative History of the University of Georgia School of Law (1998), p. 86.

14 Arthur E. Sutherland, The Law at Harvard: A History of Men and Ideas, 1816–1967 (1967), p. 322.

15 Williston was born in 1861, and died in 1963 at the age of 101.

16 See John Henry Schlegel, American Legal Realism and Empirical Social Science (1995); Laura Kalman, Legal Realism at Yale, 1927–1960 (1986).

17 In the preface, Kalven and Zeisel stated that the study meant to “bring together into a working partnership the lawyer and the social scientist…. To marry the research skills and fresh perspectives of the one to the socially significant problems of the other.” Harry Kalven Jr., and Hans Zeisel, The American Jury (1966), p. v.

18 However, the social study of law outside the law schools continued, and grew, throughout the last decades of the twentieth century.

19 Peter H. Irons, The New Deal Lawyers (1982); Ronen Shamir, Managing Legal Uncertainty: Elite Lawyers in the New Deal (1995). On the legal realists and their movement, see the sources cited in n. 16, supra; see also Robert W. Gordon, “Professors and Policymakers: Yale Law School Faculty in the New Deal and After,” in Anthony Kronman, ed., History of the Yale Law School: The Tercentennial Essays (2004), p. 75.

20 Among Hurst’s many important works were The Growth of American Law: The Law Makers (1950); Law and the Conditions of Freedom in the Nineteenth-Century United States (1956); and Law and Economic Growth: The Legal History of the Lumber Industry in Wisconsin, 1836–1915 (1964).

21 See the discussion of various schools of thought in the world of legal history, in Robert W. Gordon, “Critical Legal Histories,” 36 Stanford Law Review 57 (1984).

22 For a discussion, see Marc Galanter, “Reading the Landscape of Disputes: What We Know and Don’t Know (and Think We Know) about our Allegedly Contentious and Litigious Society,” 31 UCLA Law Review 4 (1983); Lawrence M. Friedman, “Are We a Litigious People?” in Lawrence M. Friedman and Harry N. Scheiber, eds., Legal Culture and the Legal Profession (1996), p. 53.

23 295 U.S. 495 (1935).

24 297 U.S. 1 (1936).

25 290 U.S. 398 (1934).

26 The decision was 5 to 4. Chief Justice Charles Evans Hughes, who wrote the majority opinion, realized quite well the highly charged nature of the issue. An “emergency,” he said, “does not create power”; but he argued that the Contracts clause was not absolute; and that the law in this case was a “reasonable means to safeguard the economic structure” of the country.

27 See William E. Leuchtenberg, The Supreme Court Reborn: The Constitutional Revolution in the Age of Roosevelt (1995).

28 National Labor Relations Board v. Jones & Laughlin Steel Corp., 301 U.S. 1 (1937).

29 One view—which Leuchtenberg, n. 27 supra represents—makes the court-packing plan extremely significant in explaining the “switch.” On the other side, see Barry Cushman, Rethinking the New Deal Court (1998).

30 G. Edward White, The Constitution and the New Deal (2000), pp. 235–36.

31 Bob Woodward and Scott Armstrong, The Brethren: Inside the Supreme Court (1979) became a best-seller, by promising to tell inside secrets of the Court, and to rip aside its curtain of secrecy.

32 See, for example, James L. Gibson, Gregory A. Caldeira, Lester Kenyatta Spence, “The Supreme Court and the US Presidential Election of 2000: Wounds, Self-Inflicted or Otherwise?” 33 British Journal of Political Science 535 (2003).

33 See George L. Watson and John A. Stookey, Shaping America: The Politics of Supreme Court Appointments (1995).

34 531 U.S. 98 (2000).

35 See Gerald Gunther, Learned Hand: The Man and the Judge (1994).

36 On Cardozo, see Richard Polenberg, The World of Benjamin Cardozo: Personal Values and the Judicial Process (1997).

37 See Christopher P. Banks, “Reversals of Precedent and Judicial Policy-Making: How Judicial Conceptions of Stare Decisis in the U.S. Supreme Court Influence Social Change,” 32 Akron L. Rev. 233 (1999).

38 Lawrence M. Friedman et al., “State Supreme Courts: A Century of Style and Citation,” 33 Stan. L. Rev. 773, 790 (1981). Michigan was one of 16 state supreme courts studied; there was tremendous variation among states. Dissent rates actually fell in West Virginia between 1870 and 1970; in the decade of the 1960s, more than 98 percent of the West Virginia high court cases were unanimous. But the number of nonunanimous decisions, in general, doubled in the 16 states during the century between 1870 and 1970.

39 For these figures, see 88 Harvard Law Review 376 (1974); 114 Harvard Law Review 394–395 (2000).

40 A few years later, Rehnquist wrote an article in which he accused the clerks as a whole of “liberal” bias. He warned that they could bias the Court by presenting material in a slanted way. William H. Rehnquist, “Who Writes Decisions of the Supreme Court?” U.S. News and World Report, Dec. 13, 1957, p. 74.

41 This was also true of the whole federal system. District judges have had clerks since 1936, appeals court judges even earlier. Only a handful of federal judges—Richard Posner of the Seventh Circuit is perhaps the most prominent—actually write their own opinions from start to finish.

42 Larry Aspin, “Trends in Judicial Retention Elections, 1964–1998,” 83 Judicature 79 (1999).