Chapter 5

Interdisciplinary

Prior to the French Revolution, universities were neither numerous nor very important. Students played no significant role in European politics until the upheavals of 1848. That year was marked not only by the publication of Marx and Engels’ Communist Manifesto, but also by a wave of rebellions by radicals, liberals and especially young nationalists against the dominant conservative Habsburg, Hohenzollern, Romanov and Ottoman empires controlling Central and Eastern Europe. Generally speaking, the contours of intellectual life were shaped by a class structure dominated by clergy and to a lesser extent by the aristocracy. Clergy and especially aristocrats were usually rich and did not need to work to make a living. If they became interested in intellectual study, clergy could do so in a monastery and aristocrats with their own money. The infrastructural costs were not great. The major monasteries had well-stocked libraries, it did not cost a great deal for aristocrats to build a good personal library, and newspapers were fairly cheap. Bourgeois intellectuals, if they did not have adequate private incomes, depended on the patronage of aristocrats. In the eighteenth century, Samuel Johnson’s famous dictionary of the English language was written by hand, and entirely by Johnson himself – something inconceivable today. Universities were often sleepy places.

The big changes really came about with the onset of industrial capitalism and the economic and political rise of the bourgeoisie in Western Europe in the nineteenth century. The rapid progress of industrialization, based on constant scientific and technological innovation, demanded a much more systematic and subdivided study of the hard sciences, which led to the setting up of specialized journals for the exchange of information and ideas. An increasing number of vocabularies had to be developed for physics, chemistry, biology and so forth, and these ‘languages’ quickly became too opaque for everyday intellectuals to keep up with and comprehend. One could say that this was all a result of the general, and increasingly rapid, division of labour in industrial societies. It applied less, however, in the case of what today we call, wishfully, the ‘social sciences’ and the humanities. Well into the twentieth century an educated person could still read important books on economics, sociology, anthropology, history, psychology, politics and even philosophy without too much difficulty.

In the wake of industrialization came the vast expansion of the functions of the modernizing and rationalizing state: ministries of health, education, agriculture, labour, science, culture, information and so on, as well as countless specialized boards for trade, immigration, urban planning and the like. Aristocrats were far too few in number to be able to staff these proliferating bureaucracies, even had they wished to do so. The required influx of bureaucrats therefore had to come from the bourgeoisie or middle class, who needed access to a better and more modern education. Education thus took on a new importance and required serious reformation, for which the state, for the first time in history, assumed a central responsibility. In this process the various German states were in the vanguard, and became a model for much of Europe and eventually the United States. The change took a special form in the latter, since the country had never had a real aristocracy.

The imposition of a ‘rationally ordered’ array of disciplines did not, however, come easily or quickly, especially in the humanities and social sciences. In the UK, for example, the prestige of classical studies remained high until after the Second World War. ‘Gentlemen’ were supposed to know their classics as part of a proper civilized upbringing. But classical studies was a jumble of history, archaeology, literary studies, philosophy, philology and art history. Oriental studies, less prestigious but still important, had the same jumbled content. Literature was divided up unscientifically between English, French, German, Italian and Russian. Anthropology – born out of colonial and folklore studies and eventually supported by Malinowskian fieldwork – arrived late, essentially after the First World War. Sociology, though stronger in Germany and France, did not become fully accepted in UK universities till after 1945. In many places anthropology and sociology were regarded as aspects of a single discipline. Thanks to the prestige of David Hume and Adam Smith, politics and economics were also intertwined. History was divided up by era and by country, categories about which there was nothing scientific, while philosophy was a mix of bits of mathematics, linguistics, intellectual history and politics.

It is significant that in the UK, until quite recently, a PhD was not thought at all necessary for securing a university teaching position or for doing first-class research. When at Cambridge in the mid-1950s, I would quite often hear older teachers ridiculing the American ‘mania’ for PhDs as simply a mindless imitation of German practice. Before the unification of the German states in the latter half of the nineteenth century, they each had their own universities to train future bureaucrats and professors. Those who wanted to teach at these universities needed to obtain doctorates, and thus there were many doctors in Germany. British universities, on the other hand, were usually built on the chair system, where there was only one full professor in a department. Once a new professor was appointed, there was not much need for other members of the department to write dissertations. This was one reason why British scholars tended to look down on German and especially American scholars who considered a PhD as a professional requirement and a means of social mobility. Another reason was that not many scholars really believed that even economics or sociology could be said to be truly ‘scientific’ in a hard sense; they were considered more like practical fields, not too different from, say, Oriental studies. Some people, perhaps nostalgically, claim that scholarship in those days was basically interdisciplinary. But this is arguably anachronistic: for study to be interdisciplinary there have to be disciplines in the first place. Disciplines did not become crucial to scholarship until they were embedded in the institutions and social structures of universities. Three important developments can be identified as operative in this process.

One was the setting up of professional associations and journals which claimed by their very names to ‘represent’ the national plenitude of disciplines: for example, the American Historical Association (1884) had its American Historical Review (1895), the American Economic Association (1885) its American Economic Review (1911), the American Anthropological Association (1902) its American Anthropologist (1888, initially published by the Anthropological Society of Washington), and the American Political Science Association (1903) its American Political Science Review (1906). (Interestingly, my great friend Kato Tsuyoshi tells me that this development and its timing were virtually the same in Japan as in America.) Inevitably, since the distinguished scholars who dominated the editorial boards of these journals had their own prejudices and formed their own cliques, scholars who were excluded or marginalized quickly founded professional journals of their own, in the same discipline but with different prejudices and followers. Since publishing articles in refereed journals was important in deciding whether young professors got tenure and promotion, the number of journals proliferated massively, most with disciplinary claims. A senior colleague and close friend of mine once laughingly calculated that the average readership of an article in a refereed journal was between two and three people.

The second important development was the restructuring of power within universities. The most obvious sign of this was the financial system that gave discipline-based departments by far the largest budgetary allocations. Appointments and tenure decisions were almost exclusively in departmental hands. This power turned out in many cases to have rather conservative, and sometimes amusing, consequences. Within departments power was typically in the hands of elderly professors who had sometimes passed their prime and, when they realized as much, were mistrustful of the work of young scholars with new skills and interests.

Thirdly, the departments were based on the pleasant notion that disciplines were scientific divisions within the broad field of scholarly knowledge, and that what marked each division was a basic common discourse. In fact, this idea is a fiction, since scholarly knowledge changes all the time and in many different directions. For example, when anthropology departments started to be created in the US they included archaeology and evolutionary biology. As archaeology became a highly technical field in which chemistry was an important element, and as ‘the rise of man’ took scholars ever further back in time towards the ‘hominids’ and required a strong grasp of biology, anthropology lost contact with those other disciplines.

Cultural anthropologists had the same problems with evolutionary biology as did archaeologists, and evolutionary biologists with advanced kinship studies and comparative religious systems. They did not usually read each other’s articles, which in any case were published in quite different journals. In effect, where they survived, such departments hung on as mainly administrative and budgetary shells.

An anecdote from my own experience at Cornell may be enlightening here. One day the dean of Arts and Sciences summoned me, along with a nice mathematician whom I did not know, and assigned us to look into a serious problem in the department of psychology. The immediate occasion was the department’s rejection of tenure for a popular and productive young professor, who had appealed against the decision. The dean also informed us that for ten years the department had not granted tenure to anyone. When the two of us investigated, we found a fascinating situation. The tenured faculty was evenly divided between three groups that had almost nothing in common, other than mutual dislike and lack of understanding. The behaviourist psychologists studied mice and rats, and had close ties with the biological sciences. Another group was firmly attached to the theories of the French psychoanalyst Jacques Lacan, and to the legacy of Sigmund Freud. The third group, who called themselves social psychologists, studied such things as why people who witness the same automobile accident have such different stories to tell.

It quickly became clear why no one had been tenured in ten years: any candidate would be vetoed by the two blocs who were not interested in, or despised, the remaining bloc to which the candidate was seen to be attached. Even in my own department it was increasingly clear that those who worked with complex mathematical models and equations and those who studied Plato or Nietzsche simply did not understand what each other wrote and were often not keen to try.

I no longer remember what the dean decided to do. But I have a strong hunch that he promised that if the young social psychologist under review was given tenure, he would give the department two new positions (one for rats, one for Lacan, so to speak). At the same time, the dean understood there would be enormous resistance to splitting the department or moving some faculty members to a different discipline. Institutional inertia, fears of budget cuts, anticipated loss of ‘positions’ in the short and long term, all played a role in the internal struggles.

These problems were magnified by two large social transformations surrounding universities, one quantitative, the other qualitative. In 1900, just under 30,000 bachelor’s degrees were awarded in the US, representing less than 2 per cent of Americans of graduating age. By 2005, the number of awarded BAs had risen to just under one and a half million, and 36 per cent of young Americans had such degrees. But the climb did not take place evenly, decade by decade. Up to the end of the Second World War, a college education was still something enjoyed largely by the children of the rich and the well connected. In the two prosperous decades that followed, however, there was a vast expansion of universities and enrolments (today there are over 1,400 four-year colleges and universities in the country), and a much wider aspiration to the benefits of a college degree. The social force behind this change was the huge number of Americans mobilized during the war, which included for the first time large groups of Blacks and women who had earlier suffered discrimination. The veterans formed a powerful political lobby demanding that their sacrifices for the country be recognized by the provision of massive funding for their college education. The lobbying resulted in the passage of the Servicemen’s Readjustment Act of 1944, informally known as the G.I. Bill.

The immediate consequence of the increasing student enrolment was a rapid expansion of the professoriat. I have described earlier how tiny the Cornell department of government was when I arrived in 1958 – only eight professors, all men. Over the next fifteen years it almost quadrupled in size, and was no longer entirely male.* Still, it was a small department for a top-level university. Equivalent departments in places like Harvard and Berkeley had seventy professors or more. Departmental meetings were thus difficult to manage, and close ties between professors harder to institute and maintain.

Qualitatively, one major response to these quantitative changes was a new ideology of ‘professionalism’, which began to replace the older scholarly traditions derived from Europe. At one level, the shift was marked by big changes in requirements for graduate students. When I first came to America, my fellow students and I had to pass reading-proficiency examinations in French and German (the other traditional world-languages of scholarship) to get our PhDs. By the early 1970s, an alternative option was made available: choosing either French or German or a year-long course in statistics. Eventually no foreign languages were required, except for those students planning fieldwork overseas.

Before leaving for Indonesia in 1961 I had to pass five examinations (in comparative politics, political theory, American politics, American political sociology and Asian politics), set by individual professors, over five consecutive days. Fifteen years later, students took only two examinations in politics, standardized by a committee of professors, and these could be taken months apart. These younger students worked just as hard as we had done, but they were being trained ‘professionally’, i.e. in standardized courses close to those offered in other good universities, with much the same reading lists, and with a strong emphasis on ‘current theory’ (which would soon be replaced by others). I say ‘professionally’ because they were being trained, rather than educated in a general sense, the idea being to make them competitive in what began to be called ‘the academic job market’ after finishing their dissertations. Passing such examinations and gaining a PhD were coming to be regarded as professional qualifications, in the same way that aspiring doctors and lawyers had to pass professional examinations to be licensed to practise medicine and law.

At another level, professionalization and the huge expansion of departments led to a big change in departmental culture. As described earlier, in my early student days, my classmates and I worked every semester as teaching assistants, so we had close contact with both undergraduates and our few professors. We picked our chief advisors on the basis of their interests and expertise. A decade later, graduate students, funded by generous fellowships, had greatly increased in number, and they did much less undergraduate teaching. This was not a matter of laziness or selfishness – they were watching their professors and being acculturated to professionalism.

As departments expanded, the top professors tended to leave the teaching of the big undergraduate courses to junior faculty and concentrate on seminars for graduate students. In turn this process created a striking asymmetry in the choice of chief advisors, who were typically confined to the five or six best-known (elderly) professors. Graduate students calculated that these ‘big names’ would be of great help in getting jobs. Finally, there were no strong incentives for taking courses in other disciplines, which would do little to boost a youngster’s chances in the job market, and might even make him or her look ‘amateurish’.

In spite of all this, there were significant countervailing forces at work. For a long time, these were most prominently represented by area studies, which, as mentioned earlier, both the national government and educationally concerned private foundations supported, financially and otherwise. Already in the 1950s, for example, Cornell had programs for China–Japan, Southeast Asia, South Asia and Latin America; later programs came into being for Western Europe, Eastern Europe, the Middle East and so on. Cornell also had, from prewar days, a small department of Asian studies, mainly housing students and teachers interested in pre-modern Chinese and Japanese history, literatures and religious systems. ‘Literature’ and ‘history’ patently meant those of Europe, and accordingly it was not possible to encompass their Asian variants in the departments of literature or of history. In the UK they were covered in Oriental studies, but in the US they were lumped together in departments of Asian studies.

All the area studies programs mentioned above were cross-disciplinary to varying degrees, and many had their own publications, courses and weekly ‘brown-bag’ lunchtime meetings. What I mean by ‘cross-disciplinary’ here refers to the situation where a program includes professors of different disciplinary backgrounds among its faculty members, and graduate students are allowed to choose three members of their dissertation committee from across these disciplinary divides. It is different from ‘multidisciplinary’, which usually refers to a scholar of a particular disciplinary background incorporating other concepts and disciplines into his or her analyses.

At the national level there were also associations (again with their own journals) such as the Association for Asian Studies, which held large annual conventions with dozens of panels and hundreds of papers. Nonetheless the atmosphere was different to that of the standard disciplinary convention, a key aspect of which was job-hunting – students would expect their chief advisors to introduce them and praise them to influential senior colleagues at other universities, and also hoped to be interviewed as candidates for vacancies. Almost no students went to the AAS convention expecting to be interviewed or to make ‘key contacts’, since area studies programs only rarely had jobs in their own gift. So the atmosphere was less tense, the panels more varied, and the fun livelier. More like a mass annual vacation.

To get what they wanted, the programs were heavily dependent on backing from outside the universities and from intelligent university administrators. Among the area programs themselves, there were also big power differences that changed over time. Up to the American defeat in Indochina, the Southeast Asia programs were quite influential, and also commanded strong undergraduate followings. In the late ‘70s and ‘80s, when the US was briefly alarmed by Japan’s extraordinary economic success, Japan studies did well. China studies, traditionally strong anyway, became very powerful once the country opened up to American scholars. South Asian studies was much weaker, partly because people tended to think of the region as somehow ‘still British’, but mainly because Washington was not much worried about it. India was the ‘biggest democracy in the world’, except during Indira Gandhi’s brief Emergency rule, and thus a fine counterweight to what was then thought of as ‘Red China’. A final factor was that both India and Old Pakistan increasingly imposed restrictions on foreign scholars, especially Americans: visas were harder to get, and more and more topics were declared too sensitive to be investigated.

I do not think that the tension between disciplines and area studies was altogether a bad thing. There was usually room for compromises and accommodations since there was lots of money around till the 1990s, and universities were still expanding. There were plenty of scholars who thrived in both environments. But the prestige of area studies in the end depended on their ability to produce Big Names. China–Japan studies had John Fairbank and Edwin Reischauer; Southeast Asia, Clifford Geertz and George Kahin; South Asia, Susanne Rudolph and her husband.

The area studies programs (especially those concerned with Asia) nonetheless had one important card up their sleeves: ‘foreign students’, who multiplied when what people loosely call ‘globalization’ set in. These students did not include Western Europeans, who were wishfully regarded as ‘just like us’. Rather, as more and more Thais, Latin Americans, Indonesians, Japanese, Filipinos, Koreans, Indians, Sri Lankans, and later Iranians, Africans and Arabs arrived to study, there was at first, as I remember it, a mild nativist reaction. I used to hear some of my colleagues complain that ‘this is an American university for Americans’, and ‘these Asians can’t speak English, don’t understand lectures, are useless as teaching assistants, and won’t think theoretically’. But in time they got used to the foreign students (some of whom did exceptionally well) and even became fond of them. By the late 1980s, my department even hired Asians as professors.

It took longer for Japanese universities to see the benefits of bringing in foreign students; above all, the benefits for Japanese students themselves. In terms of the relationship between disciplines and area studies, postwar Japan offers an interesting contrast. It seems that, from early on, the institutionalization of the disciplines and area studies in Japan took a different form than it did in the US. One could describe it as a process of segregation rather than unequal integration. In the best universities, the institutional power of the disciplines was even greater than in the US, probably because modern Japanese education, initiated in the Meiji era under strong German influence, is, though excellent in many ways, more hierarchically structured than its cousin across the Pacific. Thus it was not easy to establish cross-disciplinary area studies programs. In the face of this, the Ministry of Education’s policy-makers, recognizing the political, economic and foreign policy potential of area studies, decided to set up a congeries of separate institutes, or specialized colleges both inside and outside the existing universities, where area studies people could congregate (even if their prestige was lower than that of professors in the mainstream universities).

Furthermore, in postwar Japan, for a long time there existed no very wealthy and influential foundations comparable to those of Rockefeller, Ford and Mellon, which had provided the money and political support that allowed area studies to be institutionalized in the big American universities. However, the Japanese system had its advantages, of which the most important was real autonomy for area studies scholars. The disadvantage was that since these specialized institutes got their money and power from the Ministry of Education alone, it was sometimes hard for them to resist Ministry pressure to follow policy fads. It also meant that the intellectual cultures of the disciplines and the institutes did not often usefully cross-fertilize each other.

Finally, one of the important effects of the turmoil in American universities during the ‘radical ‘60s’ was the rise of what is today called ‘identity politics’. The pioneers were militant Black students who demanded that university authorities set up Black Studies programs, hire more Black professors, and recruit more Black students. They were quickly followed by militant feminists and gays and lesbians, who convincingly argued that the standard curriculum either ignored or marginalized their historical roles and the centuries-old discrimination they had suffered.

In the 1970s, various ethno-racial minorities joined the tide, including Native Americans and the American-born children of first-generation immigrants from Central and South America as well as many countries in East, Southeast and South Asia. In response to the demands of the latter, and taking into account their relatively small numbers, universities started setting up Asian-American Studies programs and hiring young professors capable of teaching courses adapted to their students’ identity interests. Only a few of these ‘amalgamated’ programs were very successful. Filipino-American students, for instance, shared few interests with Samoan-American, Chinese-American or Thai-American students. They wanted to take courses primarily on their countries of origin.

The expansion at Cornell had already encouraged the department to hire a China specialist before I returned to Ithaca. The year I was made a junior professor (1967) also saw the appointment of a Latin Americanist trained at Yale. A little later arrived a specialist on India, who was also interested in feminist politics. Over the next five years I was too absorbed in developing new courses, managing a program for advising undergraduates, and keeping up with Indonesia under the Suharto regime to get much involved in the department as such.

At the time I came up for tenure review, in 1971–72, it would have been hard to get rid of me, since the Vietnam War was still raging, Kahin was an influential and respected sponsor, and – a key requirement – my dissertation was being published by Cornell University Press. Still, a senior colleague said to me later: ‘I didn’t finish your book, though it looks well done. Isn’t it just history? Where is the theory? But I was interested in your idea of power in Javanese culture, especially as you spoke about Machiavelli, Hobbes, Marx and Weber.’ In fact, no one was much interested except Kahin, and I felt myself to be something of an outsider. Later, I heard from students that a gifted senior colleague said to them: ‘Anderson has a good mind, but he is basically an area studies person’, which meant someone second-class. I didn’t mind this judgement because I too saw myself as basically an area studies person.

When Imagined Communities was published by Verso in London, the curious thing was its contrasting initial reception on opposite sides of the Atlantic. In those distant days the UK still had a ‘quality press’ – meaning that there were good newspapers to which leading intellectuals and scholars regularly contributed, as both critics and essayists. To my surprise and pleasure the book was warmly reviewed by Edmund Leach, Cambridge’s famous anthropologist, the prominent Irish politician and political historian Conor Cruise O’Brien, and the up-and-coming Jamaican Marxist Winston James. Of course, they were all familiar with the long debate on nationalism in the UK and so could ‘situate’ my contribution.

In the US, the book was almost completely ignored. In a way, this was fair enough, since I hadn’t written the book for Americans in the first place. Besides, the US has nothing comparable to a nationwide quality press. However, one old European émigré political scientist, writing for the professional American Political Science Review, did review it, and deemed it worthless apart from its catchy title.

This situation began to change rapidly at the end of the 1980s, with the end of the Cold War and the collapse of the Soviet Union. Like all empires, the American empire needs enemies. ‘Dangerous nationalism’ (which of course did not include American nationalism) emerged to fill the vacuum left by the evaporation of ‘the communist threat’. I vividly remember receiving a frantic telephone call from a high official at the Kennan Institute, one of the key centres for Soviet studies. He begged me to fly down and give a talk at his institute. When I asked why – since I knew very little about the Soviet Union or Russia – he astonished me by saying, ‘Soviet studies are finished, money is not coming in anymore, and our students can’t get jobs. Everything in the former Soviet Union today is about nationalisms, and almost no one here has ever studied them. You are among the few people in the country who can help us get back on our feet.’ I didn’t go.

A second factor was that, mainly by word of mouth, Imagined Communities had caught on in departments of history, sociology, anthropology and, strangely enough, English and comparative literature, and was being widely used as a graduate-level textbook. Political science was the one obvious exception, but eventually it had to yield to student demand for courses on nationalism, which, amazingly enough, did not exist almost anywhere in the US. As a result, in my fifties, I found my position completely changed. Suddenly I became a ‘theorist’, not just an area studies figure. I was even urged to teach a graduate course on the ‘theory of nationalism’, which I had never previously considered doing. To my amusement, the students who took the course came not only from political science, but from history, anthropology, comparative literature and sociology.

It was fun teaching ‘The Theory and Practice of Nationalism’, because I forced the young anthropologists to read Rousseau, political scientists a nineteenth-century Cuban novel, historians Listian economics, and sociologists and literary comparativists Maruyama Masao. I picked Maruyama because he was a political scientist, an Asian/Japanese, and a very intelligent man who read in many fields and had a fine sense of humour and history. Luckily he had been translated into English. It was plain to me that the students had been so professionally trained that they did not really understand each other’s scholarly terminology, ideology or theory. My task as a teacher was thus to break down these barriers to scholarly communication.

The idea of ‘interdisciplinary studies’ started to be talked about at around the same time. In its origins I suspect that this new interest reflected frustrations about the evident misfits between fields of scholarship and the conservative institutional power of departments claiming to represent disciplines. Discipline-based departments tend to have a vested interest in the maintenance of the status quo, yet fields of scholarship may not fit within the existing departmental boundaries because they are likely to change their contours in response to developing historical situations, societal needs or researchers’ academic interests. This is especially so in our age, when rapid social, economic, political and technological changes are everywhere apparent. Hence the misfits arise, and continue to widen. However, there were other signs of increased interdisciplinarity, as bits of different disciplines combined with each other. Interesting fields such as cultural studies and postcolonial studies blossomed. There was also the optimistic idea that interdisciplinary studies would help create bridges between disciplines and area studies. ‘Fashion’ also played an important if short-lived role.

If the general idea of interdisciplinary studies was attractive, it was also vague and open to very different interpretations. The two most basic views could be crudely described as follows. The first took off from the Latin prefix inter-, which was read as meaning ‘in between’; in other words, researchers lodged themselves in the big empty spaces ‘between’ disciplines. If, for example, you wanted to study the elaborate, often poetic slang of Filipino transvestites in its political, social, historical and economic contexts, would there be adequate space in these disciplines for this kind of work to be carried out? Is there a discipline of gender studies that could help you? Why not? People working along these lines produced a lot of interesting and valuable material, borrowing from several disciplines in an ad hoc manner, but the studies themselves were often rambling, anecdotal and intellectually incoherent. For such people ‘cultural studies’ was a useful, prestigious rubric, but some did not fully realize that really good cultural studies are very hard to do.

The second view implied the difficult task of systematically combining the basic frameworks and tools of two or more disciplines. But such an approach required both a mastery of each discipline and a carefully considered supra-framework in which they could be handled. Only really exceptional minds could do this work well. David Laitin’s superior comparative work on the politics of language-policy and everyday language use is a good example of how political science and sociolinguistics can be elegantly combined. Needless to say, the two ‘basic views’ sketched above represent the two ends of a spectrum, and many scholars have worked somewhere in between.

One has also to look at the intellectual culture in which a lot of youthful research is planned and financed. The US is again a good, if extreme, example. The funds to support dissertation research usually come from private foundations and/or governmental bodies. Success in securing funding typically depends on a good proposal, ‘logical, tidy and tightly framed’, since the referees for these institutions are usually prominent ‘disciplinary’ professors. The student grapevine fairly quickly spreads the word about ‘what will work’, which is why, if you sit on such panels of referees, you find that the proposals often look very much like each other.

In political science, students are supposed to come up with a hypothesis to be confirmed or disconfirmed within the coming year. This time limit is a bad idea, since it is too short to attempt anything rather difficult. The demand for a hypothesis is often a bad idea too, because it implies from the start that only two general answers are possible: yes or no. Scale is always a problem. If a student says he wants to study sexual ideology and practice in the Meiji period, he will usually be told something like this: ‘Stick to sexual ideology, find an interesting decade, and confine yourself to Tokyo. Otherwise you will never finish and get a job.’ This kind of advice is not unreasonable, given the real financial and market constraints, but it is not likely to encourage bold or ambitious work.

The ideal way to start interesting research, at least in my view, is to depart from a problem or question to which you do not know the answer. Then you have to decide on the kind of intellectual tools (discourse analysis, theory of nationalism, surveys, etc.) that may or may not be a help to you. But you have also to seek the help of friends who do not necessarily work in your discipline or program, in order to try to have as broad an intellectual culture as possible. Often you also need luck. Finally, you need time for your ideas to cohere and develop. As an illustration, the research that resulted in Imagined Communities began when I asked myself questions to which I had no answers. When and where did nationalism begin? Why does it have such emotional power? What ‘mechanisms’ explain its rapid and planetary spread? Why is nationalist historiography so often mythical, even ridiculous? Why are existing books on the subject so unsatisfactory? What should I be reading instead?

I started out with only two certainties. Firstly, that part of the answer must lie with world-transforming capitalism. But Marx did not pay much attention to print-capitalism, while fine scholars like Elisabeth Eisenstein paid a lot of attention to print but not a lot to capitalism. So? Secondly, that another part of the answer had to involve the rejection of the standard European idea that nationalism developed out of old ethnic groupings, since this idea could not explain either the early nationalisms of the Americas, or the late nationalisms of the Third World anti-colonial movements. Rory advised me to read Lucien Febvre and Henri-Jean Martin’s masterpiece L’apparition du livre, which described brilliantly and in enormous detail the early marriage of capitalism and print, and Jim Siegel kindly gave me a copy. The inspiring work of Victor Turner, particularly his unsettling semi-psychological concept of the ‘pilgrimage’, gave me the clue I was looking for as a key to the mystery of Creole and anti-colonial nationalism.

I had long been in love with Walter Benjamin’s enigmatic ‘Theses on the Philosophy of History’, especially his difficult idea of ‘homogeneous, empty time’. But I wasn’t thinking of using it at all until Jim (again) gave me a copy of Erich Auerbach’s Mimesis: The Representation of Reality in Western Literature. The most fascinating sections were those on antiquity and the Middle Ages, which revealed a conception of time utterly alien to the modern world. This book then led me to the master French historian of the Middle Ages, Marc Bloch, and later to David Landes’ then recent book on time and clocks.

Finally, a complete accident. I was talking casually with an Americanist friend of mine when the conversation turned to the topic of Harriet Beecher Stowe’s Uncle Tom’s Cabin, which was a huge international success. He told me something very instructive about its domestic reception. Pro-slavery critics had mercilessly attacked the book as sheer fiction, if not pure lies. Mrs Stowe was so stung by these criticisms that she published a huge book containing all the documents on which she had relied for writing the novel. But very few people had any interest in buying it. This in turn made me think of Emile Zola’s Germinal, Ivan Turgenev’s Fathers and Sons, Eduard Douwes Dekker’s Max Havelaar and a few other novels which had an enormous political impact when they were first published. They are still read today, and yet no one other than a professional historian is eager to read about the ‘facts’ on which these grand fictions were based.

Was there then a sense in which one could think of fictions as being more real than reality? If so, then how could they seem so super-real? Was it only because of their content, or did it have something to do with the novel’s inner form? Out of these odd influences I finally saw how Benjamin’s notion of homogeneous, empty time might help me. The paradox of super-real fiction made it possible to think about nationalism along the same lines. So, a German political economist (Marx), three French historians (Bloch, Febvre and Martin), a British anthropologist (Turner), a German philologist (Auerbach), an American novelist (Stowe) and a German philosopher and literary critic (Benjamin) – all were crucial to the formation of Imagined Communities, yet none of them was particularly interested in nationalism. But in them collectively I found the tools I needed to solve (so I thought) the problem I had originally been incapable of grasping.

Can it properly be said that my book is interdisciplinary? Marx, Benjamin and Stowe, all long dead, were not professors, and I am not sure how far the three Frenchmen and Auerbach, all professors, thought of themselves as representing disciplines, even if Turner, in all probability, did. But Imagined Communities makes no systematic attempt at building a supra-disciplinary perspective (though Marxism is always there). Does the book then belong within one discipline? It certainly doesn’t belong to history, since it is not based on archival or other primary sources. Political science? Only one or two political science books are mentioned in the bibliography. Nonetheless, it is all about a single political force, and the underlying framework comes directly out of my training in comparative politics.

There is still another way of thinking about interdisciplinary studies, which has been hinted at already. All disciplines, simply to be disciplines, have to think of themselves as having boundaries and certain kinds of internal rules, even if these change over time. In doing so, they follow the much larger logic of the ever-expanding division of labour in industrial and post-industrial societies. In principle there is nothing wrong with boundary formation and the creation of internal rules and standards, so long as they are consciously seen as practices pragmatically devised to further the whole field of scholarly endeavour.

The analogy with sports is clear: If you play tennis, you use a round ball and a net, and there are rules about the size of the former and the height of the latter, as well as demarcated spaces in which you can gain points. You are not allowed to hit the ball with your arms, legs or head. If you play football, the ball has to be much bigger, and you need to have goalposts of a specific, arbitrarily decided height; you may use your head and legs, but not your hands. The space in which you play is much larger than in tennis, is differently demarcated, and the rules governing ‘scoring’ are quite detailed. But these rules have also changed over time. If you like playing both tennis and football you have to know the different formats and rules. No one thinks of playing ‘intersports’, and everyone knows when he or she is no longer playing the game.

This kind of consciousness is much less common in academia, because academic life is supposed to be about seeking truth rather than having fun (the boundaries and rules are set up for this purpose). When I first suggested to my colleagues that we should offer a course on the history of political science, and found that no one thought it a good idea, I interpreted the resistance in practical terms. Perhaps they thought we had no one who could devise and teach such a course? It turned out that this was not necessarily the case. The problem was how to interpret the relationship between ‘political’ and ‘science’. If one emphasized ‘political’ and bracketed ‘science’, then the course would have to start with Plato and continue through to, say, Fukuyama. But if one did the reverse, the history would not go back much more than a hundred years, to when the term was invented in the context of a very American merger between public administration and constitutional law. The department would have found it difficult to come to an agreement on this. In spite of the complete failure of my proposal, I think all disciplines should offer at least one really good course on their histories, however conceived, to make students thoroughly aware of the origins and zigzag development of the intellectual walls that largely define them.

Of course there are alternative methods for breaking down disciplinary fences. One is to introduce into the graduate curriculum, forcibly if necessary, fine works in other disciplines or even outside all standard disciplines, especially if these are written by foreigners. The students will then not only pick up some different technical vocabulary and learn new concepts, but will have a chance to look at their own (nationally inflected) disciplines from the outside and in a comparative manner. Another method is to try to develop courses that will attract students from different disciplines and, if possible, nationalities. In my experience, students often learn as much from discussions and arguments among themselves as they do from listening to professors. Nothing is more likely to get students to stop thinking creatively than a combination of national egotism and disciplinary myopia.

And what of audience, style and creativity? It is obvious that graduate students start their training by writing papers for their teachers. Prior to that, their writing may be clear and even elegant, or clumsy and muddled, depending partly on talent but mostly on what they have learned in high school and as undergraduates. They are not yet inside the discipline, and they usually write, however naively, as persons. Anyone can read what they compose. But graduate students in the disciplines, especially if professionalism is well advanced, change their writing style fundamentally. As they proceed in their studies, they discover some key things about their future readerships. They are typically told that they are supposed to write primarily for other members of their disciplines, colleagues, editors of disciplinary journals, potential employers and eventually their own students. Their prose should reveal immediately the guild to which they belong.

The influence of this environment can be very strong, and is most visible in the use of (current) disciplinary jargon, excessive citations of previous works in the discipline which do not enlighten the reader but simply perform the rites of membership, and conformity to a kind of impoverished standardized language. Writing for a large, generally educated public, so they are often told, inevitably entails simplification, ‘popularization’ and lack of technical sophistication (that is, it is too easily comprehensible). They also learn that whenever possible the books they eventually write should be published by university rather than commercial presses, since this will ensure that their pre-publication reviewers will be people like themselves, not unpredictable outsiders. Hence, consciously or unconsciously, they are encouraged to employ a prose style which is often much worse than the one they used in high school or as undergraduates. Many continue to write in this way until they retire.

Furthermore, in most universities the everyday power of the disciplinary departments encourages members to take themselves very seriously, such that you feel that the word ‘discipline’ – whose history goes back to the self-punishing rigours of medieval monks intent on subjugating the body as the enemy of the soul – should really always be spelled with a capital D. ‘Frivolity’ and irrelevant digressions are therefore frowned upon. I learned this lesson quite soon after I arrived at Cornell. Still thinking like an undergraduate, in my early papers I included jokes and sarcasms in the main text and, in the footnotes, anecdotes and digressions I had enjoyed in my reading, as well as personal comments. In a friendly way, my teachers warned me to stop writing like this: ‘You are not at Cambridge now, and you are not writing a column for a student magazine. Scholarship is a serious enterprise, anecdotes and jokes rarely have scholarly value, and no one will be interested in your “personal opinions”.’ It was really hard for me to accept this advice, as in previous schools I had always been told that, in writing, ‘dullness’ was the thing to be avoided at all cost. Later I sometimes frivolously thought: ‘Now I understand what traditional Chinese foot-binding must have felt like.’ But eventually, at least after gaining tenure, I escaped. Java in a Time of Revolution (respectably published by Cornell University Press) has no jokes, few digressions and not many ‘personal comments’. But Imagined Communities (published ‘commercially’ by Verso) is full of them.

The obvious point is that breaking down unnecessarily high disciplinary walls usually improves a scholar’s prose, decreases dullness, and opens the way to a much wider potential readership. This does not mean ‘dumbing down’. Books by great stylists like Joseph Schumpeter, Marc Bloch, Maruyama Masao, Eric Hobsbawm, Ruth Benedict, Theodor Adorno, Louis Hartz and many others are often difficult, but they are always a pleasure to read.

To the last page of this chapter, my friend Yoshi adds the following comment:

Whether we are novelists or scholars, we think and express ourselves through language. Between the two, novelists, or generally speaking artists, are usually more innovative and creative than scholars because they are supposed to break out of conventional ideas and expressions. In contrast, scholars tend to become complacent in their world, surrounded and protected by their disciplinary jargons. Jargons can be a blessing and a curse at the same time. Their use facilitates communication among scholars and certifies the professional credentials of their users. But they may also become a prison which constrains the way scholars conceive and express ideas. Thus the question of audiences and prose style goes beyond the simple question of not being dull; it is closely connected with creativity and innovation. It is in this context that the significance of interdisciplinary studies must be appreciated.

_________________

* In 1969 women in the US held 17.3 per cent of professorships, by 2008 the figure was almost 40 per cent, according to the New York Times (3 July 2008).