INTRODUCTION

KNOWLEDGE is our most important business. The success of almost all our other business depends on it, but its value is not only economic. The pursuit, production, dissemination, application, and preservation of knowledge are the central activities of a civilization. Knowledge is social memory, a connection to the past; and it is social hope, an investment in the future. The ability to create knowledge and put it to use is the adaptive characteristic of humans. It is how we reproduce ourselves as social beings and how we change—how we keep our feet on the ground and our heads in the clouds.

Knowledge is a form of capital that is always unevenly distributed, and people who have more knowledge, or greater access to knowledge, enjoy advantages over people who have less. This means that knowledge stands in an intimate relation to power. We speak of “knowledge for its own sake,” but there is nothing we learn that does not put us into a different relation with the world—usually, we hope, a better relation. As a society, Americans are committed to the principle that the production of knowledge should be uninhibited and access to it should be universal. This is a democratic ideal. We think that where knowledge is concerned, more is always better. We don’t believe that there are things that we would rather not know, or things that only some of us should know—just as we don’t believe that there are points of view that should not be expressed, or citizens who are too wrongheaded to vote. We believe that the more information and ideas we produce, and the more people we make them available to, the better our chances of making good decisions.

Americans therefore make a large social investment in institutions whose purpose is simply the production and dissemination of knowledge—that is, research and teaching. We grant these institutions all kinds of exemptions and protections, and we become worried, sometimes angry, when we suspect that they are not working the way we want them to. Some of our expectations about colleges and universities are unrealistic (and so are some of our expectations about democracy). Teaching is a messy process, an area in which success can be hard to measure or even to define. Research is dicey, too. The price for every good idea or scientific claim is a lot of not-so-good ones. There are more than 4,000 institutions of higher learning in the United States, more than 18 million students, and more than 1 million faculty members.1 We can’t reasonably expect that all of those students will be well educated, or that every piece of scholarship or research will be worthwhile. But we want to believe that the system, as large, as multitasking, and as heterogeneous as it is, is working for us and not against us, that it is enabling us to do the kind of research and teaching that we want to do—that it is not, in itself, an enemy of reform.

There is always a tension between the state of knowledge and the system in which learning and teaching actually take place. The state of knowledge changes much more readily than the system. Institutions are recalcitrant, and the professional conservatism of professors is an ancient source of ridicule. In 1908, the Cambridge classicist F. M. Cornford, in his satirical guide for young academics, Microcosmographia Academica, advised that the basic rule of faculty governance is, “Nothing should ever be done for the first time.”2 In 1963, Clark Kerr, who, as president of the University of California, would soon come to know more than he bargained for about politics and the academy, complained that “few institutions are so conservative as the universities about their own affairs while their members are so liberal about the affairs of others…. The faculty member who gets arrested as a ‘freedom rider’ in the South is a flaming supporter of unanimous prior faculty consent to any change whatsoever on his campus in the North. The door to the faculty club leads both in and out.”3

To people outside the faculty club, the resistance of professors to institutional reform can appear silly or petty. It can appear worse than that to academic administrators, and university presidents have famously broken their heads trying to get faculty to drink the water to which they have been led. Some of the reasons for this resistance are discussed later in the book; they have to do with the belief, central to the academic’s professional self-conception, that the university does not operate like a marketplace. But there is also a practical reason for resistance to reform, which is that any change potentially has a cost. If you require every student to take a certain course, then there is one less elective they will take. If you add a new field of study, then you have to take money from something else to pay for it. When the financial universe was expanding, universities could often add on new things without taking away from old ones, but the universe, as higher education (and everyone else) learned in 2008, can also shrink.

This book is an attempt to answer four questions about American higher education today. Why is it so hard to institute a general education curriculum? Why did the humanities disciplines undergo a crisis of legitimation? Why has “interdisciplinarity” become a magic word? And why do professors all tend to have the same politics? These questions involve ideas; they are essentially intellectual matters that should be amenable to debate and negotiation. They are not, in any significant way, about money. But they are oddly non-transparent, issues that are weirdly and sometimes unpleasantly difficult to discuss or reach agreement about.

My argument is that these issues are all fundamentally systemic—they arise from the way in which institutions of higher education sustain and reproduce themselves—and the most significant fact about American higher education as a system is that it is one hundred years old. The American university is a product of the nineteenth century, and it has changed very little structurally since the time of the First World War. It has changed in many other ways—demographically, intellectually, financially, technologically, and in terms of its missions, its stakeholders, and its scale—and these changes have affected the substance of teaching and research. But the system is still a late nineteenth-century system, put into place for late nineteenth-century reasons. The extraordinary series of transformations of higher education after 1945 has strained it. To the extent that this system still determines the possibilities for producing and disseminating knowledge, trying to reform the contemporary university is like trying to get on the Internet with a typewriter, or like riding a horse to the mall.

One thing about systems, especially systems as old as American higher education, is that people grow unconscious of them. The system gets internalized. It becomes a mind-set. It is just “the way things are,” and it can be hard to recover the reasons why it is the way things are. When academic problems appear intractable, it is often because an underlying systemic element is responsible, but no one quite sees what or where. People who work in the academy, like people in any institution or profession, are socialized to operate in certain ways, and when they are called upon to alter their practices, they sometimes find that they lack a compass to guide them. Some of the reasons why “this is the way things are” in American higher education are still good ones; some are almost certainly obsolete. There are things that academics should probably not be afraid to do differently—their world will not come to an end—but there are also things that are worth preserving, even at a cost, because the system cannot operate without them. To know which things are which, it helps to have some knowledge of how we got here. Knowledge, after all, is what it’s all about.

Like most people who write about American higher education, I focus on what is in reality a very thin slice of the whole—undergraduate and graduate education in the liberal arts and sciences.4 Most of the 4,000 institutions of higher education in the United States are not liberal arts schools: that is, they award fewer than half their degrees in liberal arts fields. Twenty-two percent of college graduates major in business; only 2 percent major in history. Most of what I have to say concerns higher education as it is experienced by the history major, rather than the business major, and most of my examples are taken from elite liberal arts institutions. This is because, historically, the elites have had the resources to innovate and the visibility to set standards for the system as a whole, but there are many institutions for which the problems I discuss are either irrelevant or non-problems. I am a humanist by training and by interest, and some of the issues I write about are more urgent for faculty and students in the humanities than they are for people in other areas. The natural sciences, in particular, are an exception to many of the trends I discuss (though their exceptionalism is also part of the reason for some of those trends).

There are some interesting academic problems today that do not (I think) test the limits of the system. A pressing pedagogical challenge right now is the problem of adapting a linear model for transmitting knowledge—the lecture monologue, in which a single line of thought leads to an intellectual climax after fifty minutes—to a generation of students who are accustomed to dealing with multiple information streams in short bursts. Once, being a professor meant (among other things) possessing, by dint of years immersed in library mineshafts, refinements on knowledge that were effectively inaccessible to the unlearned person. Now, most of that esoterica is available instantly on Wikipedia. Sheer information is no longer a major piece of the value-added of higher education.5 This seems a challenge that the system can meet. The most important intellectual development in the academy in the twenty-first century has to do with the relationship of the life sciences—particularly neurobiology, genetics, and psychology—to fields outside the natural sciences, such as philosophy, economics, and literary studies. So far, contention and collaboration in this area seem robust. The system is doing what it was designed to do. It is helping people think better by helping them think together.

In the case of the four problems I address, though, the system seems not to be so accommodating. For most of the book, I write as a historian. I have not struggled to keep my opinions to myself, but I am not a prescriptivist. My emphasis is on the backstory of present problems. At the end of each chapter, I speculate about what higher education might be like if academics thought of their business differently, since one of the lessons of historical inquiry is that there is no one way that things must be. I do not have an agenda, though. I am in favor of reform when it shakes the system and not when it breaks the system. I do think that intellectual life should involve taking chances.

I became interested in the history of higher education in part because I’m an English professor, and getting interested in the history of higher education, or at least the history of English departments, is something that happened to a lot of English professors after the 1980s. I am interested also because my career has had a slightly askew relation to academic life. This doesn’t give me any special insight into the academic world, but it has made me curious about the reasons most academics do what they do in the way they do it. Finally, I worked on the intellectual history of the nineteenth century for a long time, and anyone who does that is bound to get interested in the history of higher education, because the rise of the modern research university is a big part of that picture. The modern American higher education system was and remains a great social accomplishment. It can handle a few questions.