A SOCIETY THAT EMBODIES liberal values—that encourages economic ambition and emphasizes individual choice, that espouses the meritocratic route to social mobility and takes for granted the variability of our tastes and allegiances—may be inimical to the values embodied in traditional liberal education. There is a tension between the self-assertion that a modern liberal society fosters and the humility required of someone who tries to immerse herself in the thoughts and sentiments of another writer or another culture; there is perhaps a greater tension still between the thought that some achievements in philosophy, art, or literature will stand for all time and the ambition to use those achievements as stepping-stones to something better. It may be a healthy tension rather than a simple contradiction; renewing the gentlemanly ideal celebrated in Cardinal Newman’s The Idea of a University in a liberal democracy perhaps requires us to live with such a tension. But this is something to be argued for rather than taken on trust.
I begin with a little autobiography because my own education was itself a training in how to live with this tension. I am what British observers of a certain age call a “scholarship boy”: the beneficiary of a meritocratic educational system that plucked boys like myself (girls, too, but less often) from working-class and lower-middle-class backgrounds in order to give us what the clever children of the professional middle classes got automatically—a fiercely academic secondary education, available to London children at schools like St. Paul’s, Westminster, Dulwich College, or the City of London School. I was a suitable case for treatment. My grandfathers were a miner and a truck driver; my parents left school at thirteen to work as a boy clerk and a housemaid. My father was a self-improving sort, though without the hard-driving qualities the label suggests. The family dynamo was my mother. Her fearlessness and organizational drive have been inherited by my sister, who runs one of Britain’s largest community colleges and is the only woman in such a position. In World War II, my father worked as a clerk in the Royal Artillery, and his commander was impressed enough to urge him to get professional qualifications after the war. He duly did so; my first post-1945 memories are of my father reading endless correspondence units for his accountancy examinations.
I benefited more from his efforts than he did. He became the chief financial officer of a midsized company and prospered, but throughout his career, he preferred the company of a book to that of his fellow executives. I got the liberation he hankered after. My North London “council school”—the local public elementary school—was run by two ambitious head teachers on the lookout for clever and energetic children; they got on well with my parents, and together they set my brain in motion. Tracking—what the British call “streaming”—has become unpopular, but I enjoyed going through school at my own speed, helped by teachers who were tough about making me get things right and imaginative about pushing me forward. The post-1945 London County Council, which ran the schools, was a model of old-fashioned social democratic virtue, but it was a liberal education that we received. We visited museums and grand houses; we sang Handel, Arne, and Purcell; and we had Benjamin Britten’s Young Person’s Guide to the Orchestra played to us by the London Symphony Orchestra under Malcolm Sargent. I was also in debt to the librarians at the local public library, who enjoyed dispensing the riches of their Aladdin’s cave.
The London County Council awarded scholarships to various public schools—in the English sense of that expression—and I was awarded one to Christ’s Hospital. The school had been founded in 1552 by the saintly boy king Edward VI as part of his plan for putting redundant monastic buildings scattered about London to charitable use. Christ’s Hospital was meant to rescue poor but honest children, and on the whole it did. (It was one of a group of institutions of which the best known was the older foundation of Bethlehem Hospital, or Bedlam; visitors to London could watch the lunatics in their asylum and the boys of Christ’s Hospital at their meals and prayers.) In 1903, the school moved into the countryside—medieval drainage and the proximity of Newgate Prison bred diphtheria—and set up near Horsham in Sussex. Entry to the school was “means-tested”: if your parents could afford to pay for a private education, you couldn’t go. My family couldn’t afford to pay for my education, so I went. Americans with children at private colleges and universities will be familiar with financial aid given on this basis; in Britain, it still remains an oddity.
During the 1950s, Christ’s Hospital was as meritocratic as my elementary school had been in the late 1940s. Most boys left at sixteen to join the sort of City of London firm that had for two centuries employed them in clerical jobs. Charles Lamb was the most famous of them; he spent his working life in the East India Company and his evenings writing the Essays of Elia and drinking tea with William Godwin. The small number who could tackle academic disciplines to a level that would get them into Oxford and Cambridge stayed on till eighteen; they included Samuel Taylor Coleridge. The purest example of the self-made academic that the school produced was Sir Henry Maine, the great nineteenth-century jurist and legal historian and a man with a lifelong hunger for academic glory and financial security. I am less driven than he, but I know that I have spent my life trying to meet the standards set for me at Christ’s Hospital.
I am vain enough to think the raw material my teachers worked with must have been good enough to inspire such efforts on their part, but my chief sensation is astonishment at my luck at falling in with the people who taught me in Islington, Horsham, and Oxford. For an anxious lower-middle-class child, conscious of the tight budgeting that went on at home and the sacrifice of the present to the future that defines English middle-class life, it was an unspeakable luxury to find this rich and vivid world to which the price of admission was only the desire to join. My brave new world was peopled with writers and my Ellis Island was the school library. There is almost no platitude about the pleasure of having your eyes opened and your mind stretched to which I do not subscribe. At its best, liberal education opens a conversation between ourselves and the immortal dead, gives us voices at our shoulders asking us to think again and try harder—sometimes by asking us not to think but just to look and listen, to try less hard and to wait for the light to dawn. It is not always at its best, and the contrast between what can happen and what more commonly does is not to be blinked at. Even when liberal education is not at its best, however, it is well worth defending against its wilder critics.
I was not special; innumerable students have had the experience I had; and innumerable students still get the care I got. I mention my education to make a general point about the idea of meritocracy and the pursuit of excellence. Liberalism has a natural affinity with meritocracy; it is attracted to an aristocracy of talent and critical of an aristocracy of birth. Liberal education in the conventional sense also rests on the thought that an acquaintance with intellectual, literary, and artistic excellence is in some (rather debatable and hotly debated) fashion good for us, and that one of the ways it is so is in teaching us to measure ourselves against touchstones of cultural and intellectual excellence. My reflexes are meritocratic. Let me take an embarrassing example. The Bell Curve’s claim that measured differences in IQ between black and white Americans reflect different “racial” endowments of native intelligence is entirely ill founded, and the insinuating tone of the book unpleasant.1 Its insistence that people should be selected for jobs, graduate training, university admission, and the like on the basis of measurable competence is, however, impossible to resist. The cliché defense of meritocracy is that none of us wants to be operated on by an incompetent brain surgeon. That suggests a rather narrow idea of merit; the principle applies more widely. Nobody wants Pushkin translated by someone who knows no Russian, nor do we want the Cleveland Orchestra conducted by a tone-deaf lout with no sense of rhythm. The fact that we can debate the merits of different translations of Pushkin and different performances of The Rite of Spring while conceding that all are technically competent makes no difference. Where there is a measurable skill, it should be measured, and the excellent should be preferred to the merely decent. Where standards are debatable, they should be debated. 
The point is well understood by sports fans, but underappreciated in the arts and humanities.
The case for meritocracy is so obvious that it is tempting to forget that there are respectable arguments against it.2 Some have been prominent in American life. Let me mention two. First, what we seek in most areas of life is not “the best” but “the good enough.” And rightly so. The restless search for the excellent automobile that an automotive perfectionist engages in does not increase his driving pleasure; it merely deprives him of the enjoyment that he would have had if he had settled for a merely decent car. Applied to education, the thought is that most students in high school and college will learn enough math and enough writing and reading skills to get a decent living. They need not be made anxious and dissatisfied by having to face the fact that they will never be very good in either field. Nor is there much of an economic argument for pursuing intellectual excellence. The economy needs very few excellent mathematicians, but a lot of averagely numerate workers. There is an economic case for insisting on a competitive marketplace and the development of excellent products and excellent management, but not for insisting on meritocracy—as distinct from competence—in the educational sphere. Experience suggests that this is a sound view: the United States is the most productive country in the world; its popular culture is as attractive to other countries as its technical expertise in aeronautical engineering and computer software. It is neither an intellectually rigorous nor a culturally ambitious society, however; outside major metropolitan areas, there are few bookshops, the radio plays an unending diet of gospel or country-and-western music, and intellectual pretensions are not encouraged. The nation has prospered without inculcating in its young people the cultural and intellectual ambitions that French lycées and German gymnasia inculcate in their students. Why should it change now?
Most Americans are happy to compete in the marketplace on the basis of the excellence of their products, but few wish to be more discriminating, better read, or whatever else than their neighbors. Most people regard what appears to be intellectual discussion as a way of cementing friendly relations among themselves rather than as a way of changing minds or seeking truth. Many conservatives in the 1990s have looked back nostalgically to the 1950s. But in the 1950s, American high schools taught “life adjustment” classes. Among the topics covered in one New York school system described by Richard Hofstadter in Anti-Intellectualism in American Life were “Developing School Spirit,” “My Duties as a Baby-Sitter,” “Clicking with the Crowd,” “What Can Be Done about Acne?” “Learning to Care for My Bedroom,” and “Making My Room More Attractive.” Eighth-grade pupils were given these questions on a true-false test: “Just girls need to use deodorants”; “Cake soap can be used for shampooing.” Women friends of mine were taught how to enter and leave a sports car without allowing their skirts to ride up and expose their underwear. If the 1950s were wonderful, it is not because they were years of universal intellectual excellence.3
A second persuasive objection to meritocracy rests on a related but rather different thought: most people prefer stability, authority, and tradition to uncertainty, freethinking, and openness to the future. Where merit is clearly defined and relevant, persons should be promoted, and ideas accepted, on their merits; otherwise, habits of acceptance should be cherished. This is an old conservative thought, and it is very hard to resist. Karl Popper, the philosopher of science, social theorist, and author of The Open Society and Its Enemies, defended the idea that the policies of governments, like scientific theories and social practices generally, should be accepted strictly on their merits, precisely because he thought that the “normal” condition of mankind was conservative and indeed, in his eyes, “tribal.” This was why the “open society”—a liberal, democratic, changeable, and argumentative society—had so many enemies, from the high-minded philosopher Plato at one extreme to low-minded terrorists such as Hitler and Stalin at the other. Popper’s model of the open society was a community of research scientists. Science is a strikingly artificial activity: scientists have to formulate bold hypotheses about how the world works and then submit these hypotheses to rigorous testing against whatever evidence can be found. Hypotheses may not be protected from refutation by appealing to our own virtues and our critics’ vices, or indeed by appealing to anything but the best available evidence. If The Bell Curve’s hypothesis that IQ is racially determined is to be tested scientifically, we must not try to discredit it by observing that one of its authors was a Jew who naturally liked the idea that science showed that Jews were innately more clever than blacks; and it will not do to defend it by observing that many of its most savage critics were blacks who naturally disliked the same idea.
Conservatives are sometimes criticized for advocating laissez-faire in economics but wanting stability in religious, social, and cultural matters. This seems a mistake; it may not be possible to have what they want, but the combination of stability in our deepest allegiances and quick-footedness in our habits of work and consumption would surely make for a happy and prosperous society. A greater source of anxiety today is that those who demand stability in their deepest beliefs are not the groups who enjoy the pace of change in American economic and social life. Recently, the threat to free speech and free inquiry on American campuses has come—as it has done for most of the century—not from the defenders of private property or the defenders of upper-class respectability but from lower-middle-class groups demanding “respect” for themselves, their opinions, and their culture. Such groups are not defending privileges that have come under attack, but seeking comfort in a world they view as hostile and dangerous. Fundamentalist Christians—almost invariably from rather humble backgrounds—who try to stop colleges and universities from preaching tolerance for the sexual tastes of gays and lesbians genuinely feel as ill-used as the gays and lesbians who want their schools to protect them against the insults of the godly.
The liberal view is hostile to the search for comfort and support, at least partly for meritocratic reasons. All sides are entitled to physical safety, and certainly everyone needs friends, but nobody is entitled to respect—other than the minimal respect that is involved in arguing courteously with one’s opponents rather than beating them up. Or, rather, one is entitled to be treated as a rational adult, but one’s ideas are entitled only to the respect they earn by being properly thought out, factually well grounded, and the like. It is this distinction that many students, particularly students from families with no previous experience of academic life, find hard to deal with. The thought that much of what they have hitherto unhesitatingly believed is false, misguided, or simply one among many options produces a lot of anxiety. The wish to assuage this anxiety runs headlong into the view that we must try to believe only those ideas that are good enough to stand up to criticism. None of this licenses rudeness or brutality; it is no doubt true that many professors are socially inept, and others are authoritarian, and still others are so insecure that they cannot bear any criticism of their opinions. All the same, even if we were all as deft and sensitive as could be imagined, the ideal classroom would not be a cozy place. Part of the object of education is to teach us to treat our own ideas objectively rather than subjectively; we ought not to want to believe what will not stand up to criticism, though we all do, and we can hardly hope to discover which of our beliefs are more and which less reliable without a few moments of discomfort. Bullying and insult are intolerable, no matter who is on the receiving end; but shading the truth is the ultimate academic sin.
My first insights into my own education came through reading John Stuart Mill’s Autobiography and progressing from Matthew Arnold’s “The Scholar-Gipsy” to grappling with Lionel Trilling’s Matthew Arnold. A predictable result of a liberal education is that its beneficiaries behave like the hero of Saul Bellow’s novel Herzog—who spent much of his time composing postcards to the immortal dead. Writing about education is particularly likely to involve such a one-way traffic in postcards. Jaroslav Pelikan recently wrote The Idea of a University: A Reexamination to defend a conservative and traditionalist vision of higher education in homage to and in dialogue with Cardinal Newman. My antipathy to organized religion, to the Oxford movement, and to the personality of Newman himself means that I admire Newman’s prose without much liking the writer. Moreover, Newman wrote The Idea of a University to defend the newly founded University College, Dublin, against fellow Catholics who wished it to provide a sectarian education; and this is hardly our situation. We are more likely either to ignore the education of all but an elite or else to be besotted by the needs of the economy. My own touchstone is a book that nobody wrote—Culture and Anxiety—but the voices in my head are those of Mill, Arnold, Russell, and Dewey, and, among recent writers, Raymond Williams and Richard Hoggart.4 Since they were considerable readers, I eavesdrop on their conversations with Marx, Hegel, Freud, Carlyle, and innumerable others.
Liberalism has for two hundred years suffered from three great anxieties. The first is fear of the culturally estranged condition of what has been variously called the “underclass,” the “unwashed mob,” the lumpen proletariat, or (by Hegel) the Pöbel; the second is unease about “disenchantment,” the loss of a belief that the world possesses a religious and spiritual meaning; the third is fear that the degeneration of the French Revolution between 1789 and 1794 into a regime of pure terrorism was only the harbinger of revolutions to come. These fears often feed on one another. “We must educate our masters,” said the English politician Robert Lowe when he saw that he could no longer resist the Reform Bill of 1867. That legislation gave the vote to most of the adult male inhabitants of Britain’s cities. Lowe was frightened by a familiar scenario: unless the working class was educated, farsighted, and prudent, commercial and industrial change would bring in its wake a democratic revolution that would degenerate into mob rule and end with a guillotine in Hyde Park. In the alternative, there would be no revolution, because the mob would follow the first golden-tongued demagogue who cared to woo them. Arnold feared the mob. Mill did not, but he feared Napoléon III. Their contemporary, the social theorist Alexis de Tocqueville, gave Europeans some understanding of how the Americans had escaped both of these disastrous sequences while the French had not.
These political fears are today antiquated in Britain and the United States—but Britain and the United States remain unusual. The recent civil wars in the former communist state of Yugoslavia are but one of many contemporary instances of the way political and economic disruption leads to irrational, violent, and atavistic behavior. They seem to indicate that the first thing a newly emancipated and politically uneducated people will do is follow a dictator. Franjo Tudjman and Slobodan Milošević are nastier and more uncouth than Napoléon III; but everyone who saw Louis-Napoléon Bonaparte, the nephew of Napoléon and an adventurer of the lowest kind, rise to power after the French Revolution of 1848, on the back of the popular vote, knew how a demagogue could turn the popular vote into a mandate for his dictatorial ambitions.
I want to emphasize the difference between these fears, however. It is clear that we might escape the guillotine but relapse into mindlessness; we might lead culturally vivid lives under the shadow of the guillotine; we might escape both these fates but feel intolerably estranged from the world because a life without strong religious sentiments turns out to be humanly impossible. The three great anxieties are different: the first rests on the idea of a distinctively cultural disaster, what Arnold termed “the brutalisation of the masses”; the second rests on the idea that if religious faith and a sense of community together decay, we shall be “unanchored” in the world; the third is more narrowly the fear of political violence. Nineteenth-century Britain and the United States—where modern liberal education was invented—suffered these anxieties to different degrees and in different forms. The French Revolution did not haunt the American political imagination as it haunted British and European writers; indeed, only with the rediscovery of Edmund Burke’s attacks on the French Revolution by American conservatives such as Russell Kirk and William F. Buckley during the Cold War did the political excesses of the French Revolution become a theme for American political controversy. Conversely, fear of the solvent effect of immigration affected nineteenth-century Britain only in the narrow and highly specific form of a dislike for Irish migrants in the 1840s and for Jewish and other eastern European migrants in the last two decades of the century; but in the United States, the fear that immigrants from anywhere other than England, Scotland, or Protestant northern Europe would erode the existing common culture was one source of the demand for “common schools” as early as the 1830s, and variations on that theme have been heard in American politics ever since.
I call these anxieties “liberal anxieties,” but an obvious objection is that they are everyone’s anxieties. My response is that both anxiety and liberal are to be taken seriously. Liberals have always been on the side of economic, political, and intellectual change; they have hoped that change would culminate in freedom rather than chaos or estrangement, but they have always known that they might unleash forces they could not control. The late eighteenth-century conservative and bitter enemy of the French Revolution, Joseph de Maistre, denounced the liberal philosophers of the eighteenth century for encouraging the aspirations of the common people, and so inciting revolution and bloodshed.
As he observed, it is not the tiger we blame for rending its victim but the man who lets him off the leash. Liberals have wished to raise expectations without being overwhelmed by the consequences, and it is none too clear that it can be done. Tocqueville famously argued that the French Revolution broke out because the population had made enough progress in the years before 1789 to be maddened when progress was not sustained. Americans commenting on the fact that black Americans were more aggressive in demanding their rights after Jim Crow legislation had been overturned always refer to what Adlai Stevenson described as “a revolution of rising expectations.” It is all too plausible that people who have never been able to raise their eyes to new possibilities will remain docile, while those who have seen new possibilities will rebel if they are then denied the chance to seize them. Liberal anxiety responds to the risk of such a revolution.
Liberals know that it is not irrational to bet against the liberal project from the right or the left, or from both sides at once—that it is quite rational to think that change should be approached much more cautiously, or that it must be embraced much more wholeheartedly. In intellectual and cultural matters, indeed, liberals themselves are often conservative and revolutionary simultaneously in just this way. They see that they are the inheritors of traditions they do not themselves wish to overthrow, but they want everyone to explore those traditions for themselves. When they do so, they may reject them or alter them out of all recognition. The liberal can respond only that this tension lies at the heart of all serious intellectual or aesthetic activity. How could a scientist proceed if not by absorbing the techniques and theories and problems of a tradition of inquiry and then launching out into new work; and are not scientists often disconcerted to find their cherished ideas dismissed as old hat by their radical juniors? Is not the same thing true in art and music?
The concepts of “conservative,” “liberal,” and “revolutionary” in intellectual matters are used loosely, of course. The context in which they are used more exactly, and where liberals are habitually beset from left and right, is the political. And here is where the difference between anxiety, fear, and anger is most plainly visible. Conservatives have rightly felt fear in the face of the changes the liberals wanted; but since they wanted to preserve an ancien régime society, creedally based political authority, and the habits of a rural economy, they had no reason to be anxious but much reason to be frightened and angry. Socialists and radicals have rightly felt exasperation and anger at the inadequacy of the changes that liberals have welcomed, and at the evils liberals have left untouched as well as at the new forms of exploitation they have brought with them. Conservatives have disliked the liberal, meritocratic ideal of “the career open to talent” because they wished to preserve a society in which hereditary privilege ruled, and not always because it was in their self-interest; socialists have disliked the liberal, meritocratic ideal because it placed unskilled workers at the mercy of financially or intellectually better-endowed people. Conservatives and socialists have often held in common the belief that stable societies in which people know what to expect of life are happier societies than the shifting, insecure societies that liberalism creates.
Nineteenth-century liberals added to their insecurity when they insisted that reform must come through the efforts of its beneficiaries. A chapter entitled “The Probable Futurity of the Labouring Classes” in Mill’s Principles of Political Economy (1848) set out the argument. Mill took it for granted that benevolent conservatives existed; their flaw was that they wanted to look after the working class. The liberal aim was that the workers should look after themselves; the ultimate aim was a wholly classless society in which individual success depended upon merit. Mill’s characteristically sharp way of putting the point was to insist that he wished to live in a society “with none but a working class.” Other than children and retired people, nobody was to benefit without contributing to the best of his ability. Self-advancement required intelligence and foresight. It demanded control over fertility. It demanded equality between the sexes. It demanded a transformation of relations at work. Mill was certain—wrongly, as it turned out—that educated people would not accept forever a division between managers and workers, or between capitalists and wage earners. The title of the political scientist Benjamin Barber’s recent book, An Aristocracy of Everyone, summarizes Mill’s aspirations: everyone was to make the best of herself. The revolutionary route to such a result was needlessly painful and unlikely to work; education in a broad sense was the only route. When I say education in a broad sense, I mean that Mill wanted to make society and politics generally more intelligent. He rightly had no thought of throwing open an unreformed Oxford and Cambridge to the English working class of the 1840s.
What Mill, like Arnold, and like Emerson on the other side of the Atlantic for that matter, had in mind was the transformation of the entire society into a community that was reflective and broadly cultivated as well as liberally educated in the usual sense. This is why I said earlier that the ideal of an educating society, rather than an educated society, was so important, and why it was the entire society rather than what we nowadays call educational institutions on which discussion focused. What role there was for the ancient English universities in such a vision was not obvious. Until they were reformed by act of Parliament in the 1850s, they could hardly play any role. In the early nineteenth century, they were actually less open to the lower-middle-class or working-class young man than they had been three centuries earlier. What animated reformers was the hope that a more rational society would be governed by a meritocracy; once power was gained by administrative ability and professional expertise rather than by birth, it would both induce the sons of the upper classes to get a decent education and open social and political advancement to their social inferiors. Only when major social institutions gave the stamp of approval to education would ambitious persons seek a decent education. Mill helped govern India from the London offices of the East India Company. The company had set up a school for boys who would go out to govern India—Haileybury College—and there they got a notably modern education, including courses in economics, history, and literature of a kind that Oxford and Cambridge introduced later and reluctantly. The new colleges—such as University College, London, founded in 1828—that were founded in London and the larger provincial cities could more easily than the ancient universities dispense an education both practical and liberal throughout the population.
The wish to be able to boast of the United States or of Britain, as Pericles did of ancient Athens, that our society is a school for all the world is and always has been a utopian ambition. Moreover, it is not an ambition that we can expect everyone to see the point of. The kind of society that sets store by being as self-critical and intellectually ambitious as the society Mill hoped for will not appeal to everyone. Tastes vary. Still, it is easy to imagine that its ideals can readily be realized on a small scale in particular contexts—that liberal arts colleges will form very “Millian” communities, as will many laboratories, some firms, and even some sports teams—and that in many contexts they will not be realized at all. The improvement in working-class well-being in the past hundred and fifty years has moved us both toward and away from that utopian goal. Especially in the United States, the growth of working-class incomes has produced a population that is infinitely less brutal, drunken, ignorant, and alienated than was the urban proletariat of Victorian England, let alone the denizens of New York’s Hell’s Kitchen seventy-five years ago. It is also a population that is emphatically private in its concerns, and in that way quite different from what Mill and even Arnold would have hoped. Only a bare majority of possible voters now go to the polls in the United States, even in presidential elections. Barely 10 percent of citizens can name their local congressman.
It would be wrong to call the citizens of the present-day United States passive or apathetic; but their concerns are domestic, private, and familial. In 1834, Tocqueville saw this retreat into domesticity as one possible American future, and it was one that he and Mill feared. A prosperous but narrowly self-centered society is better than a poor and narrowly self-centered society; but it is not what they wanted. They hoped that education would produce the wish and the ability dramatically to rebuild social, economic, and political institutions. That economic progress would remove the desire for violent revolution by subverting the ambition for anything other than our own private well-being would have seemed a sad bargain. If liberals have less reason than they once had to wonder whether change would not degenerate into mere chaos, they have every reason to wonder whether they were right to think that a free and prosperous society would also be lively, intelligent, and self-improving.
The fear of brutalization took, and still takes, different forms in Britain and America. Arnold’s talk of the brutalization of the masses made perfect sense in a class-divided society; culture was a middle- and upper-class possession. British migrants to colonial America, on the other hand, could more plausibly fear the brutalization of a whole society in a bleak and hostile environment. What led the spiritual leaders of the Massachusetts Bay Colony to establish Harvard College in 1636 had no direct contemporary British counterpart: the feeling of being a small island of Christian culture in a vast wilderness was a physical reality in the New World in a way it could not be in Britain. The feeling was reinforced by the incessant expansion of the country. To establish a college or university—after a church and perhaps a local elementary school—became an outward and visible sign of an intention to cultivate the territory and civilize the citizenry. The process received a great impetus when the American Revolution removed the imperial government’s constraints on expansion; but it had already received almost as much from the revivals that were a feature of American religious life from the early eighteenth century onward.
Harvard was two hundred years old in 1836, and the University of Michigan already nineteen. The College of William and Mary was founded in 1693, but when the elderly Jefferson founded the University of Virginia in 1819, he said it was the proudest achievement of his career. Small liberal arts colleges proliferated in the northeastern United States, beginning with Williams College in 1793. The town of Evanston is today a northern suburb of Chicago, but Northwestern University is a reminder that almost the first thing that devout Methodists in the Northwest Territory did was to build a college, just as Oberlin College speaks to the memory of the Alsatian pastor and educator after whom it was named on its foundation in 1833. After a church and a school, it seemed sometimes that almost the next thing that a respectable town required was its own college; most were evanescent foundations, but some flourished, to our great good fortune.
The cultural environments of Britain and America were less strikingly different than the physical environments. The frontier wilderness was hardly less propitious a setting for high culture than the spiritual wilderness of the British industrial cities in the mid-nineteenth century that so distressed Arnold and many others. Certainly, the British city called out the same response as the American frontier: local worthies and local clergy created civic colleges that eventually turned into the Universities of Manchester, or Bristol, or Leeds. Two differences have always been very marked. The first is the place of established religion, and the second the place of acknowledged class distinctions. The absence of both made the United States the more plausible setting for a democratic intellectual culture; whether they also made the United States a less favorable environment for “high” culture has been argued over for a hundred and fifty years.
The United States was, and Britain was not, committed to the separation of church and state. The U.S. Constitution forbade, while the British continued to accept, hereditary titles of nobility. The fate of England’s two ancient universities was tied to the fortunes of the Church of England and to the most conservative forces in national politics. In colonial and independent America alike, a student might well choose a college on the basis of religious affiliation, but there was no question of his being excluded altogether for religious reasons. Nondenominational state universities sprang up immediately after the Revolution. Their presidents had often been admitted to the ministry, and a generalized piety was expected of them; but the contrast with England was striking. There the two great ancient universities were an Anglican monopoly: nobody could even begin an education at Oxford without swearing allegiance to the Church of England, while nobody could graduate from Cambridge without doing so. In both countries, of course, only a tiny percentage of the population attended college at all: Princeton graduated eight or a dozen students a year for many years after its foundation in 1746, for instance, and in Britain, Oxford was a smaller university for the two centuries after the Civil War of 1642–49 than it had been for the century before the war. The role of colleges in defeating brutalization was understood to be a matter of “trickle down,” whereby the college-educated would diffuse enlightenment either directly, through the ministry and in teaching, or indirectly, through their support of the arts, libraries, museums, and the like.
The Anglican monopoly of access to the ancient English universities was part of the Anglican monopoly of access to the learned professions, Parliament, and political preferment. Dissenters established very small colleges to train their ministers and to provide a serious liberal education to their children. It was these dissenting colleges that eighteenth-century American colleges resembled in their seriousness, their accessibility to the relatively humble, their liveliness, and therefore their democratic potential. Indeed, when Princeton was founded in 1746, it was to the English dissenting colleges that it looked for curricular inspiration. The established English universities were very far from natural breeding grounds for a democratic culture, and in their eighteenth-century torpor were hardly an educational model of any kind. From the 1820s onward, things changed. With the creation in 1828 of what became University College, London—“the godless college in Gower Street”—England acquired its first secular college. King’s College, London, was founded shortly after to ensure that the capital possessed an Anglican college. Outside London, a range of colleges, often founded by Dissenters, started to reach beyond their original, sectarian clientele. The Scots, it has to be said, looked on in some amusement, having enjoyed the benefit of accessible nonsectarian universities in Edinburgh, Glasgow, and Aberdeen for several centuries. It was no wonder that the Edinburgh Review took particular pleasure in tweaking the noses of Oxford conservatives.
The early nineteenth century gave the critics of brutalization plenty to fear. The horrors of newly urbanizing Britain were recited by every observer. Mill was jailed for a day at the age of sixteen for distributing birth-control pamphlets to the working-class houses of the East End; it is said he did so after coming across the corpse of an abandoned baby as he walked to work at East India Company headquarters. The starved and stunted condition of the children who worked in the cotton industry was a commonplace; so was the drunken and brutal behavior of their parents; so was their almost complete alienation from the church; so were their ignorance and illiteracy. Karl Marx, who relied on Friedrich Engels’s The Condition of the Working Class in England; Thomas Carlyle, on whose Past and Present Engels had himself relied a good deal; and both Mill and his father were in substantial agreement about the horrors of the situation. They were also substantially agreed that the better-off had neglected the spiritual and physical welfare of the worse-off. Revolutionary socialists such as Marx insisted that nothing would change until the expropriators were expropriated; conservative reformers such as the English humanitarian Lord Shaftesbury insisted that it was the proper task of a conservative ruling class to make coal owners and manufacturers treat their workers decently, to protect women and children from exploitation, and to secure the conditions of their spiritual growth. Paradoxically enough, it was the reports of the Factory Inspectorate that Shaftesbury and his allies had shamed the British government into establishing in 1833 on which Marx later relied when he was writing Das Kapital.
In the United States and Britain, the phenomenon of brutalization at its crudest is today a phenomenon only of the decayed inner cities. So far from proletarianization being the lot of the working class, the work and consumption habits of what used to be called the “respectable” working class have become universal. But the end of the twentieth century sees no diminution of anxiety about secularization and disenchantment. The fear has one source, but two distinct aspects. The one source is the increasingly dominant position of the physical sciences among the many ways in which we explain and understand the world. The two distinct aspects are, first, that scientific understanding will drive all the poetry out of the world—that color, beauty, sublimity, will vanish and nothing will be left but matter in motion—and, second, that in the absence of transcendental sanctions, mankind will become as the beasts, without shame, without morality, and without ambitions for perfection. The common thread is the fear that what science reveals is that human existence is accidental; the world has no purpose, humans have no special place in the world, whatever they contrive by way of an existence is wholly up to them, and in the absence of a divine ordering of the world, what they may get up to hardly bears thinking about. “If God is dead, everything is permitted”: the dictum is commonly ascribed to Friedrich Nietzsche, though it is in fact a paraphrase of Ivan Karamazov in Dostoevsky’s The Brothers Karamazov. Many observers of the horrors of the twentieth century have thought that the Nazi death camps were a commentary on that claim.
The Enlightenment was an amorphous movement. Not all skeptical, secular philosophers of the eighteenth century thought they belonged to the movement. Nonetheless, by the end of the eighteenth century, the ideas that mankind was morally and intellectually self-sufficient and that the world was not intrinsically mysterious but would yield to scientific investigation and control were understood as the central ideas of the Enlightenment. The great German philosopher Immanuel Kant said that the motto of the Enlightenment was sapere aude, or “dare to know.” Critics of the Enlightenment complained that their enlightened opponents were bent on driving the poetry from the world, that the world described by science was cold and colorless. William Blake thought that the arrival of Newton had been the death of the human world.
This is not to say that science is irreligious. When the sociologist Max Weber introduced the idea of “disenchantment” in The Protestant Ethic and the Spirit of Capitalism, he argued that modern science was a product of the same spiritual impulse as Protestantism. The German word usually translated as “disenchanted” is entzaubert, or “unmagicked.”5 The Protestant distaste for magic was a moral and spiritual distaste. To believe in magic was an insult to God. A serious God would not interfere with his creation in a capricious fashion; he could not be cajoled, bribed, or seduced into doing his worshipers a good turn. The Protestant God was deus absconditus, the God who had created a universe governed by intelligible natural laws and who had then allowed that universe to operate according to those laws. He himself was absent. This absence left the world to be explained by whatever theories the new natural sciences could validate. This austere picture of the universe sustained, and was sustained by, an ideal of self-discipline that repudiated the use of anything other than our own talents and energy to achieve our ends. It bred a decided moral and intellectual toughness in the process.
The “enchanted” world, in contrast, was a world where we were at home. It was not necessarily a world created by, or ruled by, any of the gods of the great world religions; but it was a world where “natural piety” made sense. William Wordsworth’s poetry conveys perhaps more acutely than any philosophical explanation what it was whose loss the critics of the Enlightenment lamented. The Romantic poets had no doubt that what we first encounter is an enchanted world. The child who comes into this world “trailing clouds of glory” needed no teaching or prompting to rejoice in the rainbow or to tremble as the shadow of the mountain stole across the lake. The natural world spoke to him, and he needed only to listen to it. Only when these natural reactions had been suppressed could he think that science could tell him all there is to know about the world. But the suppression of these reactions was a moral and emotional disaster, well captured in the lines “Shades of the prison-house begin to close / Upon the growing Boy.”
One of Matthew Arnold’s more famous essays, “Literature and Science,” is devoted to praising poetry and disparaging science as the basis of a concern for culture. It was written as a response to Darwin’s ally, Thomas Henry Huxley, who had himself been provoked by Arnold’s Culture and Anarchy to write in defense of science as the basis for a liberal education. The quarrel of science with poetry is a running theme in the nineteenth century, and one taken up in an interesting way by John Dewey in the twentieth. Its impact on the liberal theory of education is plain enough. Education is notoriously a solvent of traditional forms of religious belief. It is also likely to promote the belief that what cannot be explained by some kind of scientific explanation cannot be explained at all. That, in turn, is likely to promote a view of poetry—and with it, religion—that denies it any cognitive content and sees it as pure self-expression, a matter of sentiment, not intellect. The thought that poetry is “only” expressive is simply the other face of the view that a strictly scientific understanding of the world is all the understanding that there can be. The fear, then, is that neither the individual nor society can sustain an adequate life without an individual or collective conviction that the world itself is in harmony with our desires and affections. It is the fear that we will find life thin, shallow, and unsatisfying if our individual hopes and fears are not supported by rituals, by festivals, and by what, if backed by a supernatural faith, we would call religious belief, and otherwise might call social poetry.
It is no accident that Arnold looked to poetry to supply what the declining credibility of Christian mythology could not, nor that Mill argued that the religion of humanity could satisfy the needs of the heart while it also reinforced the dictates of rational altruism. Once more, it is the liberal who will experience as anxiety the suspicion that these palliatives may not anchor us in our world; the truly devout unfeignedly believe that the visible world reposes upon something deeper; skeptical conservatives hope people will not ask whether it does. Liberals suffer a self-inflicted wound: they want the emancipation that leads to disenchantment, but want the process that emancipates us to relocate us in the world as well. Nietzsche and Weber are only the most eloquent among the voices that say it cannot be done in the way the liberal wants. The anything but eloquent Dewey is the most philosophically astute of those who say that it can.
The terror induced by the Terror is an oft-told tale. It is not wholly true that the argument between Edmund Burke’s Reflections on the Revolution in France and Tom Paine’s Rights of Man ended in a knockout victory for Burke, but it is certainly true that Burke’s forecast—made in 1790—of the subsequent course of the revolution was unnervingly accurate. The revolution did degenerate into terrorism, dictatorship, and ultimately the arrival of a military government. The fear that opening the gate to popular aspirations would lead inexorably to mob rule, violence, and military dictatorship, together with the ruin of the traditional aristocracy, the spoliation of the established church, and an indefinite Continental war thereafter, was enough to make anyone flinch from reform. The liberal reply was naturally that the disaster occurred because reform had come too little and too late and because good sense had been swamped by ideology. This was what Burke himself said in calmer moments; a society without the means of reform is without the means of its own preservation. The argument rattled back and forth for half a century. The young James Mill edited the Anti-Jacobin Review when he first came to London in 1803, but twenty years later defended the democratization of British politics in terms that led Thomas Macaulay to prophesy that if James Mill had his way, some future visitor from New Zealand would be left to stare at the ruins of St. Paul’s and wonder what had happened to the British.
Macaulay was a Whig, not a Tory. As with the other anxieties of nineteenth-century liberals, the Left and the Right found matters simpler than the liberals did. Macaulay wanted to reform Parliament so that the respectable middle class could play a more active role in British politics; but he feared to go further. The Duke of Wellington and his fellow Tories thought it inconceivable that any change to so perfect a political system could be an improvement, while the Chartists wanted the demands of the People’s Charter (annual parliaments, universal suffrage, and the secret ballot) granted at once. Liberals remained divided among themselves, as they have been ever since. At the time of the fairly mild rioting that preceded the passing of the 1867 Reform Act, John Stuart Mill told the rioters that they should resort to insurrection only if they thought it was absolutely necessary and they were likely to succeed, while Matthew Arnold angrily observed: “As for rioting, the old Roman way of dealing with that is always the right one; flog the rank and file, and fling the ringleaders from the Tarpeian Rock.”
Although Britain and the United States have possessed eminently stable political systems for many years, such arguments persist. In the 1960s, the generally liberal professors of the United States found themselves faced with student insurrection, and reacted with similar ambivalence. As young people who were at risk of being sent off to Vietnam to fight a war they disapproved of made common cause with assorted Maoists, Trotskyites, and even the Deweyan enthusiasts for industrial democracy who had created the Students for a Democratic Society, the old drama repeated itself. Badly needed reforms in teaching and administration threatened to lead to anarchy; and once the Black Panther movement joined the struggle, real violence was never further away than the twitch of a trigger finger. But whatever else was on the agenda, popular insurrection was not. What the state’s role in education should be, what degree of abstention from political involvement should be practiced by universities and their faculties, what the place of formal higher education was in the promotion of high culture, and what degree of openness to “low” culture and its fads was proper to a university—these have been perennial questions. That an alliance of students and workers should form a revolutionary vanguard was at most a passing fancy of Herbert Marcuse.
If these are the anxieties that beset liberals, and their connection with education is now made out—essentially, that it is not only a little learning that is a dangerous thing for social and individual stability and security—it remains to ask what the role of liberal education is in the liberal view of the world. The answer is that there is no one answer. Once that is acknowledged, it becomes clearer that many of the arguments about the expansion of college and university education over the past century have been arguments about the relative weight to be attached to the provision of a liberal education versus research on the one hand and vocational education on the other, while others have been arguments about the content of what everyone agrees to be a liberal education—what, in an American university, would lead to a bachelor’s degree in the college of arts and sciences. A little history may be illuminating as a preface to two famous nineteenth-century arguments, whose echoes rumble on. Before the American Revolution, there were nine colleges: Harvard, founded in 1636, whose doors opened in 1638; William and Mary (1693); Yale (1701); the College of Philadelphia, later the University of Pennsylvania (1740); Princeton (1746); King’s College, subsequently Columbia (1754); the College of Rhode Island, later Brown University (1764); Queen’s College, later Rutgers (1766); and Dartmouth (1769). All had a common purpose, which was well expressed in the words of Princeton’s founders: “Though our great intention was to found a seminary for educating ministers of the gospel, yet we hope it will be a means of raising up men that will be useful in other learned professions—ornaments of the state as well as the church.”
The sharp modern distinction between “public” and “private” universities and colleges was unknown until 1819. Most colleges would have been hard-pressed to survive their early years without assistance from their state governments. But when the State of New Hampshire set out to take over the direction of Dartmouth College and turn it into Dartmouth University, it was rebuffed in the U.S. Supreme Court. State courts had held that the college was established for public purposes of a sort that gave the state legitimate authority over it; but the oratory of Daniel Webster persuaded the Supreme Court by a five to one majority that although it was perfectly proper for state governments to establish publicly financed colleges, they could not simply expropriate a private “eleemosynary institution,” such as the college clearly was. It is often said that the Dartmouth decision set back the creation of state colleges for some decades and, perhaps more importantly, opened the door to a flood of tiny, and often short-lived, private colleges. Francis Oakley observes that one of its most important results was the feature of American higher education that most astonishes Europeans: the sheer variety of institutions in which it is carried on, and the vast differences in size, prosperity, and, above all, intellectual quality.
Today, some fourteen million full- and part-time students attend all sorts of places, from storefront two-year colleges handing out associate degrees to the graduate schools of Cal Tech and MIT. Rather little of what is on offer is liberal education. In the Ivy League and the liberal arts colleges, 90 percent of students get a traditional liberal arts education; in the entire higher-education sector, loosely characterized “business studies” provide two-thirds of the courses. This is a thoroughly modern state of affairs, dating only from the expansion of the 1960s and thereafter. In the first great expansion after the Revolution, there was no diminution of concern for liberal education. It was only later, with the rise of agricultural colleges and vocationally oriented state colleges, that the primacy of liberal education came under threat. It is worth recalling yet again that even after the expansion of college education in the late eighteenth and early nineteenth centuries, not many more than one American in a thousand went to college. Even in so classless a society as the United States, those who went were markedly upper class.
College education was not in the narrow sense vocational, but it was an education for persons whose vocations were tolerably clear—the ministry, law, medicine—or who would be “gentry.” It was not merely decorative, ornamental, or a means of self-expression, but it was not narrowly utilitarian. Learning the craft skills of the preacher, doctor, or lawyer happened elsewhere, commonly on the job. The chance to learn how to comport oneself in no matter what learned profession was something colleges could offer. The students were, as they were at Oxford and Cambridge at this time, very young by the standards of a later day. It was common to graduate at the age of eighteen or nineteen; and although boys entered college young, they frequently took only three years to graduate. The tales of assaults on the faculty, pistols fired at night, and frequent near riots over poor food are reminiscent of what went on at the great English public schools such as Winchester and Westminster in the early years of the nineteenth century, although the religious revivals that swept through American colleges in the early spring were a distinctively American phenomenon. Nothing that we should now recognize as “advanced studies” was possible, nor was it attempted. But this was a society where a man was old at forty-five or fifty, and a boy of fifteen was supposed to comport himself like a man. So it would be wrong to think that colleges like Harvard, Yale, and Princeton were more like modern high schools than they were like modern colleges.
Moreover, in the early and mid-nineteenth century, there began to be the traffic between American (and English) colleges and German universities that opened the eyes of the Anglo-Saxon world to the possibilities of a deeper scholarship than any practiced in England or the United States. And the grip of the English conception of liberal education naturally weakened in a society whose commitments were so self-consciously and explicitly republican. On the other hand, whether the introduction at an early date of such subjects as “navigation” into the curriculum at Yale and Princeton was also evidence of the familiar utilitarianism of American culture may be doubted. Certainly, Ralph Waldo Emerson’s seminal lecture “The American Scholar,” in 1837, urged Americans to strike out on their own behalf and pay less attention to “the courtly muses” of European scholarship; but this was not an argument for a utilitarian, practical, or vocational approach to education. It was an argument for an indigenous philosophy and a self-confident, self-consciously American literature. Ten years earlier, Yale had confronted demands for the elimination of dead languages from the compulsory portion of the undergraduate curriculum. The Yale Report, published in 1828, defended traditional classical education; indeed, it was the most famous defense to be proffered in the pre–Civil War period. Yet the Yale president, Jeremiah Day, did not repudiate modern subjects: besides navigation, Yale taught chemistry, mineralogy, geology, and political economy. What was repudiated was not a particular content but a business-oriented scheme of instruction. Geometry and astronomy had always had a place in a liberal education; at Cambridge, indeed, students were required until the mid-nineteenth century to take the final mathematical exam before they proceeded to study classics. (At Oxford, characteristically, the sequence was the other way around.)
The place where liberal education was—largely—to take place was not much contested. It is accurate enough to think of pre–Civil War American higher education as collegiate and to date the rise of the research university from the foundation of Johns Hopkins in 1876. Preparatory schools got young men ready for college, and in college they got a liberal education. Just what constituted liberal education was another matter. Moreover, when one knew what the curriculum contained—if that was what the question of what constituted a liberal education meant—that did not answer the further question of what it was for. The thought I want to offer, which is only half-original, is that the curriculum embodied an ideal of cultivation that had a clearly religious background and that has since retained a dilutedly religious quality. I do not so much mean that many American colleges set out to supply one or another Protestant sect with a supply of educated ministers, though they certainly did. An education intended to inculcate “liberal and comprehensive views” was not narrowly religious. It was, however, intended to give its beneficiaries something one might call ownership of a distinctively Christian culture. With the retreat of sectarianism and the rise of secular education, the object of devotion was not the truths of biblical Christianity so much as the cultural values embodied in great literature.
Arguments over “the canon” in the last dozen years mimic amusingly the arguments that occurred in the late nineteenth century when modern literature in English was fighting its way into a curriculum in which literary studies had always been dominated by the classics. The question whether to substitute Paradise Lost for the Iliad is not obviously one that must provoke fury, any more than the question whether to save unused bread after the celebration of communion must provoke fury. But churches have remained separated from one another over the reservation of the host, and professors have thought that liberal education stood or fell with the standing of non-vernacular literature. Meanwhile, another argument, elegantly chronicled in W. B. Carnochan’s The Battleground of the Curriculum, was going on, which was essentially an argument over the merits of specialization versus the virtues of a general education.6 This could be, and often was, presented as a choice between a general and literary education on the one hand and a deeper, more specialized, and scientific education on the other. This did not raise only the familiar issue of the place of science versus the place of literature in liberal education. It also raised—as early as the 1840s—a question that still puzzles universities and colleges today. A general, literary education is better given by a scholar than by an ignoramus. A literary scholar is—it has optimistically been thought—naturally and happily a teacher; was it so obvious that a scientist, eager to conduct research, would wish also to teach his subject to unskilled neophytes?
To add more confusion to the confused scene, even before the Civil War there were many voices raised in defense of avowedly nonliberal education; and a more narrowly vocational training was indeed provided by the first state colleges devoted to “agricultural and mechanical science.” Or, rather, they set out to provide such a training, but until the last quarter of the nineteenth century had an extremely checkered career. For one thing, state legislatures were unwilling to finance them at a reasonable level; even the least luxurious provision for the basic chemistry and biology that a serious interest in scientific farming implied was beyond the imagination of legislators and taxpayers. For another, the enterprise suffered from divided aims. Was it to educate scientifically minded farmers and mechanics, or was it to provide a general education to men who would not enter the learned professions but would earn their living on the land or in business? The first goal did not appeal to practically minded critics who thought that the skills of any occupation were best acquired on the job; the second goal did not appeal to critics of liberal education, since it seemed to be the extension of liberal education to people to whom it would do no good at all. Both versions of the enterprise also suffered from the fact that there were so few high schools available for the potential students at such colleges. Preparatory schools existed to prepare young men to go to college, but precious few public high schools did. There were exceptions—Jefferson’s beloved University of Virginia was a state university; it was secular; and it provided a variety of courses of study from which students might choose. But even the University of Virginia had terrible difficulty attracting students who could benefit from higher education, and its intellectual standards were for many years lamentably low.
The state colleges as a class of institution were rescued by a combination of circumstances. One was the availability of money after the Morrill Acts of 1862 and 1890 provided a sufficient amount of federal lands to support state colleges, and especially after the second act provided for their receiving predictable annual appropriations. More importantly, perhaps, they benefited from the general growth of the American economy after the Civil War. It may appear paradoxical, although it is hardly so in fact, that colleges whose origins lay in the need to train scientific farmers and mechanics flourished at just the moment that urban colleges began to flourish, too. In both cases, however, the possibility arose because money became available just at the time that a new clientele arose, one that challenged the traditional belief that college education was only for the elite.
Three other developments are worth noticing before we return to the high theoretical issue of the virtues of science and the virtues of poetry as foundations of a liberal education. The first is the rise of graduate education. Unlike Britain, the United States took to the German model of university education with gusto. What excited American visitors to Germany was the graduate seminar. In the United States, there was nowhere to touch Berlin, Jena, Tübingen, or Heidelberg; nor was there anywhere where advanced research in medicine could be pursued. To Germany, therefore, went a stream of young men who came back to launch American graduate education, including G. S. Hall, who became the first president of Clark University; Charles W. Eliot, who revitalized Harvard from his appointment in 1869; Andrew White, who created Cornell; and Daniel Gilman, who had the greatest impact of anyone because he made such a success of Johns Hopkins from the moment it opened in 1876. Because so much of modern university education takes place in graduate school, the rise of graduate education is not wholly irrelevant to the question of what a modern liberal education is for and where it can be had. But in its origins, it was part of the revolt against the liberally educating liberal arts college. The first nonhonorary PhD was given by Yale in 1861; by 1918, some five hundred a year were being awarded nationally. Late twentieth-century students will probably not wish to know that John Dewey received a PhD from Johns Hopkins after two years of study—the standard timescale—and that his dissertation took only five weeks of his final semester to write. The modern semipermanent graduate career is a very recent development, and one that we may hope is already on the way out.
The second is the rise of education for women. In spite of the fact that Oberlin College was coeducational as early as 1833, there was, in the first half of the nineteenth century, almost no opportunity for women to pursue genuinely postsecondary studies. Most “female seminaries” were essentially high schools, and their intellectual standards unrigorous. Only after the Civil War did matters change, and then quite swiftly. Wellesley, Smith, and Vassar colleges date from the 1860s and 1870s; older establishments such as Mount Holyoke began to raise their standards, and the newly founded Bryn Mawr set exacting ones from the start. Radcliffe and Barnard were established adjacent to Harvard and Columbia, while midwestern and western institutions either became coeducational or began as coeducational institutions from their foundation. Some of the commentary on female education was—appropriately—perfectly hysterical. Men who ought to have known better announced that the effect of excessive study would be neurosis and sterility; racial degeneration would be inevitable if upper-class women undertook studies that would result in their having fewer children than they ought to have for the well-being of the nation.7 Once again, the relevance of women’s education to liberal education is much like that of the movement for state colleges. The public thought that it was pointless to educate either workingmen or women beyond what the needs of their future careers as workers and wives dictated.
The third is the opening of higher education to black Americans. The usual objection to educating black Americans was, of course, that a man whose role in the world was to hoe and plow had no need of an education at all, let alone higher education. Like most American colleges in their early years, only more so, black colleges were chronically underfunded and usually short-lived. Interestingly enough, two black Americans—Edward Jones and John Russwurm—graduated from Amherst and Bowdoin in 1826, some fifteen years before Oberlin awarded the first three bachelor’s degrees to women. Few pre–Civil War black colleges survived into the twentieth century; the best known is Lincoln University in Pennsylvania. After the Civil War, the Freedmen’s Bureau and a number of northern missionary societies set out to establish colleges for the emancipated Negro population—including Fisk, Morehouse, and Howard. As with those who wanted to found colleges for women, the founders of black colleges wanted to provide for blacks an education indistinguishable from the education that whites found acceptable. The reality, however, was that the inadequacy of the secondary education available to black students was so marked that only Fisk and Howard University were able to teach anything resembling the traditional liberal arts syllabus.
As they did over other aspects of black emancipation, Booker T. Washington and W.E.B. Du Bois quarreled over the conclusions to be drawn from this state of affairs. Washington thought it would be easier to gain white support for black education if the education of black Americans was confined to practical subjects. Du Bois thought that this was a cowardly concession to white prejudice. Again, the relevance of this piece of history to our topic is only that it shows a familiar argument in yet another setting. The one final point to be made is that in the last third of the nineteenth century, the demand for higher education gathered strength. By the time the United States entered World War I, there were 350,000 students in some form of higher education. This was only some 4 percent of the age group, but it was three times as high a percentage as in Britain, and this at a time when only 7 percent of the population went to school after the age of fourteen.
I conclude this account of the origins of our contemporary hopes and uncertainties with a last look at the claimed preeminence of literary studies in a liberal education. To do this, I draw on Mill and Arnold, the two preeminent liberals who made “culture” their subject. Mill never had the impact in the United States that he had in Britain and France, but even in an American context, he epitomized reforming, public-spirited, secular, and democratic values. Arnold, the purveyor of “sweetness and light,” was in both countries contrasted with the man whom Charles Eliot Norton described as an “intellectual iceberg.” Those who thought Mill represented “science” and that science stood for progress preferred the iceberg. In the early 1880s, the students of G. S. Morris at the University of Michigan complained of their diet of German idealism and Christian moral uplift, and expressed the suspicion that their teacher did not tackle Mill and Herbert Spencer because he dared not. Arnold reinforced American anxiety about the uncultured quality of American life, about the hostility of self-made men to “college men,” and about the hold of old-fashioned Calvinism on colleges and universities. Both Mill and Arnold had the United States very much in mind while writing about the prospects of late nineteenth-century Britain, and both saw the United States as the place where the compatibility of liberalism and high culture would be put to the crucial test. Of the two, only Arnold ever visited, and he did so several times.
To my earlier question whether the term “liberal” means the same when it qualifies “education” as it does when it qualifies “anxieties,” Mill and Arnold return different answers. Mill’s conception of an adequate liberal education was tailored to his politics. The self-aware, self-creating hero or heroine of On Liberty sets the standard by which liberal education is to be judged. Mill took pains to say that such a creature will appreciate many of the things that reflective conservatives have valued, whether this is an appreciation of the natural world, an affection for traditional forms of behavior, or an acceptance of the importance of authority in cultural and intellectual matters. (These attempts to proffer an olive branch to conservatives fell flat; Mill’s observation in Considerations on Representative Government that the Conservative Party was “by the law of its being, the stupidest party” gave too much offense.)8 Still, just as Mill insisted in Representative Government that democracy had no need of “a party of Order” and “a party of Progress,” because order was no more than a precondition of progress, so the “conservative” elements of character are all ingredients in a life built around the ideal of autonomy.
Arnold is harder to pin down. At times he seems to be opposing the values of culture to the values of liberalism; this is certainly true when culture is understood as a subordination of our own judgments to “the best that has been thought and said in the world,” and liberalism is understood as one of the laissez-faire enthusiasms of mid-Victorian British governments. This is the Arnold to whom late twentieth-century defenders of high culture so frequently appeal. They are quieter than Arnold was about the fact that “sweetness and light” are in opposition to commerce and to Protestant self-abnegation; but they approve of his emphasis on the disciplinary effects of high culture. More importantly, Arnold suggests that it is an inadequate, thin, and ultimately self-destructive liberalism that confines government to matters of economic management—a matter on which he and Mill were as one, though they had different ideas about how governments might express their concern with culture. A concern for the culture of all its people is a proper concern for a liberal political order. In that view, a cultivated liberal is not only a cultivated person but also a better liberal. It still remains true, however, that the connection between education and liberalism is, so to speak, an external one. The cultivation that a liberal education provides corrects and restrains the bleaker and more utilitarian tendencies of the politically liberal mind.
Mill wrote about culture in terms of his famous opposition between Bentham and Coleridge. He painted what he saw as the two main conflicting tendencies of the age as a conflict between two representative figures: the utilitarian philosopher, political theorist, and legal reformer Jeremy Bentham and the poet, philosopher of religion, and cultural critic Samuel Taylor Coleridge. Bentham was, in Mill’s eyes, the man who epitomized the eighteenth-century Enlightenment: analytical, reform minded, critical of existing institutions, contemptuous of what he called “unexamined generalities.” Coleridge epitomized the nineteenth-century Romantic reaction against this excessive rationalism: discursive, historically minded, reform minded too, but in a conservative fashion that involved recalling the English to their own traditions. In what Bentham had dismissed as unexamined generalities, Coleridge found the deep wisdom of the human race waiting to be elicited. It was true, said Mill, that Coleridge’s views on economics were those of an “arrant driveller,” but Coleridge’s understanding of how a society held together, what its people needed to know, and where it might draw its spiritual sustenance was infinitely superior to Bentham’s—indeed, Coleridge understood the subject, and to Bentham it was a blank. What Coleridge offered, as Mill well understood, was a theory of culture. It sustained a theory of education and announced the need for a learned class—the “clerisy,” as Coleridge termed it—that could serve the functions of a learned clergy but on a secular, or at least a nonsectarian, basis.
What Mill drew from Coleridge’s work was the thought that there might exist a form of cultural authority that would transcend political authority in the narrowest sense and yet would sustain it, and that would be simultaneously intellectual and emotional. In part, this was to draw on Shelley’s well-known claim that poets were the “unacknowledged legislators of the world”; in part it was to take up Coleridge’s insistence that the state could not secure a willing and intelligent obedience unless it embodied a national spirit, a sentiment of unity, and attachment to a particular national culture. This was, though Mill never developed the idea, potentially an elegant way around the problems posed by what was later baptized “multiculturalism.” For it allowed a liberal nationalist to acknowledge with pleasure the multiplicity of different cultures in the world and to encourage their expression in a national setting, while also suggesting that a plurality of more local cultures in a society would need some form of appropriate political expression, too. Looking forward from Mill in the 1830s and 1840s to ourselves a century and a half later, it appears that Mill reverses the implications that John Rawls or Ronald Dworkin draw from the fact of cultural plurality. They claim that the state must be “neutral” with respect to all those cultural allegiances that do not themselves amount to attacks on a liberal state. Their reasons are good ones—primarily that state incursions into religious or cultural allegiances cause misery, alienate the victims, and do little good to anyone else. Mill would not have doubted it. He was, however, more ambitious than they, and hoped that an intelligently governed society could weave together what one might term “subnational” cultural attachments into something that sustained a sense of national identity.
Mill’s essays on Bentham and Coleridge were, during the 1950s, part of the canon of exemplary works on the idea of a national culture that F. R. Leavis and, after him, Richard Hoggart and Raymond Williams taught to a generation of mildly left-wing students of English literature. But they never seized the educated imagination in the way that Matthew Arnold’s Culture and Anarchy did. Written in 1869, it was both blessed and cursed by Arnold’s facility with elegant aphorisms. The very title ensures that conservative readers will think the book was written for them rather than for liberals. But Arnold’s intention was essentially liberal and democratic; he wished the blessings of a literary high culture to be extended to the working class. For this to happen, the ruling elite had to become cognizant of what cultivation was, and the middle class had to raise its eyes from its account books. Arnold hoped to convert the barbarians and the philistines for the benefit of the populace. “Barbarian” was his happy label for the English governing classes, who might profess a muscular Christianity but might equally profess nothing more than muscularity. They might collect an empire in a fit of absence of mind and impose a rough-and-ready order on the world, but they could not civilize the estranged proletariat of a modern industrial society, because they had no idea what civilization was. They had no sense of those touchstones of human existence—“the best that is thought and said.” The “philistines” were the middle-class dissenters whose lives rotated around a narrow sense of duty; respectable and legalistic, they distrusted pleasure, beauty, and the inspiration of the senses. 
John Dewey described the Congregational Protestantism of his Vermont childhood in very much these terms, and Arnold himself complained of the impact of dissent on American life in his mean-spirited essay “Civilization in the United States.” Political authority in Britain was slowly slipping from the hands of the barbarians and into those of the philistines—which is to say that an aristocratic politics was being transformed into a middle-class politics. The leading lights in this movement, thought Arnold, were John Bright and Richard Cobden. They were pacifist, free-trading, and insular Little Englanders, who, he thought, had no sense of what it might be to be a citizen of the world, to reckon success by noncommercial measures.
Thus far, Mill and Arnold are natural allies. Both, evidently, think the ideals of liberal education to be even more important for an industrial and commercially minded society than for its simpler predecessors. Against the critics of liberal education in nineteenth-century America, who thought a more utilitarian, practical, and vocational education should replace traditional liberal education, their reply is that just because society offers so many incentives to acquire the vocational and practical skills we require, it is all the more important to balance these pressures by disinterested, noninstrumental, and, in that sense, impractical instruction. This is an ideal of “cultural literacy” in a stronger sense than that of E. D. Hirsch. Applied to a modern university education, and to a modern high school education, it implies, at the very least, the ability to “read” a poem in a fashion that goes beyond merely stringing the words together and to read a novel in a way that goes beyond merely following the story—and by extension, the ability to understand how other societies and traditions of interpretation have thought of such things. And in implying this, it suggests the need for more ambitious programs in history, literature, and languages than most high schools and colleges dare to contemplate.
I want to end by opening up the question of the place of science in a liberal education and by making some last observations on the kind of liberalism a liberal education might sustain. Surprisingly enough, the two topics are related, and Mill and Arnold show why. Mill’s ideal of a liberal education was firmly rooted in an attachment to the classics, as his rectorial address to St. Andrews University insisted. What the classics were to teach was another matter. Mill admired the Athenians for their politics, for the vitality of their citizens’ lives, and for their democratic aspirations. Athenians did not confine their interests to a literary education, and they were not superstitious about the wisdom of their ancestors. In short, a concern for the classics was to feed a concern for a lively democratic politics and for a kind of political and intellectual ambition that Mill thought Victorian Englishmen lacked. It followed that when Mill asked the question whether we should seek an education for citizenship or an education in the classical tradition, he inevitably answered “both,” and when he asked whether such an education ought to be a scientific or a literary education, he unhesitatingly answered “both” once more. These were not Arnold’s politics, nor Arnold’s educational ideals.
In politics, it suffices to remind ourselves that Mill was unbothered by the protests that terrified Arnold at the time of the Second Reform Act. Mill embraced self-assertion in politics in a way that Arnold could not. Mill was a democrat of a kind Arnold could not be. Arnold’s reluctance to embrace the modern world colored their educational differences. Arnold insisted that science could not provide the basis of a liberal education. The capacity for “criticism” to which he thought education should train us up was essentially poetic and literary. It was also, one might complain, very passive. Learning physics or chemistry could hardly be expected to inculcate the kind of sensibility Arnold had in mind. It is all too easy to see why radical professors of literature take Arnold as representative of everything they dislike. If the study of literature amounts to the establishment of a canon of indispensable works to which we are to pay homage, any red-blooded young person will want to dynamite the study of literature so conceived. Arnold, of course, had a much more complicated view of what the study of literature in a university might involve; he had an almost boundless respect for German scholarship and for the historical and philological studies for which German universities were famous. The views for which he is best known applied not to higher—or deeper—scholarship, but to the impact of literature on the non-scholarly. In that sense, it is not unfair to set Arnold’s conception of education against a more aggressive modern conception of a radicalizing education.
More interestingly, and more paradoxically, however, Arnold begat the twentieth-century thought that culture is for us what religion was for our forebears by analyzing religion as essentially poetry. The effect was first to alienate readers who saw that Arnold was announcing the death of literal Christianity and the futility of sectarian differences, and second to persuade his later admirers that culture was religion. Every commentator has observed that the recent culture wars have been fought in the heresy-hunting spirit of the religious wars of the sixteenth and seventeenth centuries. If the point of culture is to save our souls, whether in a transcendental sense or a more secular one, culture wars are just religious wars. In Culture of Complaint, Robert Hughes mocked Americans for subscribing to the view that art is either religion or therapy, and there is something mad about recent arguments. But it is not willful silliness. We are trapped by ideas put into circulation by the Romantic poets, by Matthew Arnold, and later by T. S. Eliot. The residual effect persists; F. R. Leavis’s belief in the potentially redemptive properties of the Cambridge English Tripos (honors examination) is mirrored in every literature department in the United States, where “theory” is nowadays thought to yield insights into the human condition that orthodox philosophy, sociology, or political science cannot.
What can we learn from this quick foray into the past? Perhaps the half-comforting thought that our anxieties and uncertainties are not new. We have been wrestling for a century and a half with the question of what culture a liberal education is to transmit, what the place of science, classical literature, modern literature, history, and linguistic competence in such an education must be. We have for as long been wrestling with the question of how far to sacrifice the pleasures of individual scholarship or deep research to the demands of teaching, just as we have been arguing about what education can achieve, and for whom—whether it is to put a polish on an elite, to open the eyes of the excluded, to permeate society with a secular substitute for the faiths of our ancestors, or to link a new society with an old tradition. The fact that we have been arguing about the same problems for a very long time is only half-comforting, since it suggests that we may not come to anything like a consensus for many years yet. On the other hand, it offers the mild comfort that our condition today is not an especially fallen one, and that even in the absence of general agreement about what we are doing and why, we can do a great deal of good.