Introduction

David H. Richter

During the 1960s, when I was doing my degrees in English, literary theory was primarily studied as a set of historical topics, in which scholars investigated Aristotle’s notion of mimesis, or Corneille’s doctrine of the Three Unities, or the source of Edmund Burke’s theory of the sublime. My own interest in theory as an ongoing as well as a historic concern, in quirky thinkers like Northrop Frye, Kenneth Burke, and Walter Benjamin, seemed a harmless oddity to my colleagues in the English Department at Queens College, who warned that I was wasting my time with theory because there was absolutely no future in it. By then, of course, the revolution was well underway that would end by making literary theory the roiling pivot point of my profession. The turbulence and clash of ideas had begun decades before on the Continent, but those of us in the provinces, who read French and German haltingly and Russian not at all, did not experience the explosion of theory until the mid‐1970s, when Russian formalism, structuralism and semiotics, deconstruction, Lacanian psychoanalysis, Althusserian Marxism, and reception theory rode successive waves into our awareness. A profession that had been preoccupied with close and closer readings of canonical texts was now lit up with a rush of ideas, a dozen disparate systems with enormous philosophical reach and scope. Many of those systems were also capable of informing and channeling the social imperatives of women and minorities seeking an ideological manifestation of their desire for greater freedom and power. And even teachers like me, without any social imperative of our own, could become enthralled by the magnificent conversation going on around us.

This was a revolution that was reshaping our sense of intellectual history, forcing us to broaden our horizons and to read deeply, as well as broadly. Anglo‐American feminist thought, like that of Elaine Showalter and Sandra Gilbert, needed to be read against the backdrop of Germaine de Staël, Virginia Woolf, and Simone de Beauvoir, forebears who served either as antagonists or as sources of inspiration. To read Derrida we needed to understand not only the structuralist theories against which he had reacted but philosophers like Plato, Kant, Hegel, Nietzsche, and Heidegger, most of whom were comparative strangers to traditional literary criticism courses. Meanwhile, the New Criticism, which for the most part had generated our close readings, could be seen as a single strand within an international formalism that also included disparate theorists like Victor Shklovsky and R. S. Crane.

Intellectual revolutions too have their Thermidors, and by 1990, it became clear that the Era of Grand Theory was coming to an end. Theory had moved into a period of consolidation, when it was being explored not for its own sake but to make possible a new sort of encounter with a text or a group of related texts. Critical practices that had emerged since the beginning of the revolution, such as gender studies (including queer studies), New Historicism, and, broadest of all, cultural studies, began to dominate the graduate and undergraduate approaches to literature. People began to engage in loose talk about the arrival of a post‐theoretical age, and Terry Eagleton, who had cashed in on the critical revolution with Literary Theory: An Introduction (1983), published in 2003 a book titled After Theory.

But theory had by no means disappeared. The newer critical discourses that had generated our encounter with texts were so thoroughly imbued with theory that they were essentially incomprehensible in isolation from their theoretical origins. When you read New Historicist essays on Shakespeare by Stephen Greenblatt, you couldn’t really understand them properly without unpacking them, and you couldn’t do that without reading the theorists who had influenced him – philosophers of history like Hayden White and Michel Foucault and cultural anthropologists like Clifford Geertz. And to do things properly you would also have to read the theorists who had most influenced them: not only Clifford Geertz on the semiotics of culture, but also Max Weber and Emile Durkheim and Claude Lévi‐Strauss; not only Hayden White on the tropics of history, but also Jacques Derrida and Ludwig Wittgenstein; not only Michel Foucault on the genealogies of power/knowledge, but also Martin Heidegger and the later Nietzsche. The underlying sources for gender studies and cultural studies would be even more diverse.

The process of consolidating and simplifying the elaborate and difficult Grand Theories into workable critical practices involved creating a pidgin, in much the same way people manage to communicate across language barriers by forming a lingua franca for trade and barter during interludes between hostilities. This critical pidgin was encouraged by the way universities in the United States avoided the creation of “schools” of like‐minded thinkers such as those we find on the Continent, and instead filled slots so as to create the greatest possible diversity. The tendency to isolate individuals using a particular theoretical vocabulary from one another had the consequence that, while they could speak their chosen critical language in all its purity at conferences, they had to use some other sort of discourse to talk with their colleagues. The result was a carnival of jostling jargons, in which purity of rhetoric took second place to the pragmatics of discourse. The discourse of one important postcolonial theorist, Gayatri Spivak, was an intricately modulated combination of deconstruction, Marxism, and feminism. And a gender theorist like Judith Butler could derive her notions about sex and society from Foucault, though her rhetorical moves were taken from Derrida and J. L. Austin, and never mind that these thinkers might otherwise be strange bedfellows.

From the 1990s up to the present day, these syncretic trends have continued to proliferate, as the study of literature has become just one area in a widening arena of textual criticism. The critical tools we developed for studying literature are now being applied to other artistic and cultural productions like film and television, radio plays and comic books, painting and photography—and of course the influence flows in both directions. The analytic approaches to narrative originally used to study novels and short stories have found application to memoirs and biographies, to medical case histories, and to the narratives judges create in writing legal decisions. Historical movements in architecture and home furnishing, such as the eighteenth‐century vogue in England and France for chinoiserie—once considered capricious episodes of fashion—are now seen as part of a larger cultural plenum shared with other fine and useful arts, determined by changes in trading relationships and other economic and social trends. Cultural studies has, in effect, turned back upon itself in ecocriticism, which attempts to understand how Culture comes to define its opposite, Nature, and to explore the changing relationship between civilization and the wild. Science studies, legal studies, business studies: newly developed fields like these attempt to interrogate the paradigms of knowledge taught to and accepted by professionals in these areas. Most eclectic of all, perhaps, is the field of globalization studies, which uses every resource of the social sciences and humanities to analyze how the international forces of military power, finance, and consumer culture have shaped a planet that began to coalesce into one world when the European voyages of discovery set out more than five hundred years ago. The result of all this syncretism has been that, although institutional structures within academe have remained more or less stable—most professors still teach and most students still earn degrees within departments—my own research projects and those of most of my doctoral students, colleagues, and friends have become ever more interdisciplinary.

One other clear change since the turn of the century has been the slow disappearance of the traditional literary canon as a basis for the humanities curriculum. The persistent attacks on the traditional canon as a gentlemen’s club for dead white European males provoked culture wars that began in the 1980s, but those wars are long over now. Ongoing research on the history of literary evaluation revealed that, apart from the general agreement on the significance of Homer and the Bible, the canon of the vernacular literatures had always been in flux. Since it was a presentist illusion that there actually existed a permanent list of what Matthew Arnold had called “the best that has been known and thought in the world,” the job of the humanities would need to be redefined. What we have actually been doing, if we are honest about it, is teaching the most interesting ways of reading the texts that have the greatest cultural importance today. The emphasis on the contemporary and the postmodern did not mean eliminating all the old favorites—indeed, Shakespeare and Jane Austen probably have as many followers as they ever had, and many more than they had in their lifetimes. But the culture of the university had approved so many new writers, and so many new areas of study, that it became clear that undergraduate and graduate students could never study more than a small selection of them, and it would be irrational to feel guilty about what got left out. Nevertheless, living as they do in a postmodern culture that insistently recycles the cultural icons of the past, our students need to read Defoe’s Robinson Crusoe not merely for its historical importance in the development of the European novel, but in order to understand J. M. Coetzee’s Foe and Michel Tournier’s Vendredi, and Charlotte Brontë’s Jane Eyre in order to understand Jean Rhys’s Wide Sargasso Sea.

With contemporary cultural value taking clear precedence over other versions of merit, the curriculum began to give greater attention to ethnic literatures, particularly by writers of African American, Asian, and Latino/Hispanic descent, and to the contemporary anglophone literature of Africa, South Asia, and the Caribbean, where so much of the most innovative poetry and fiction since the 1980s has been written. With this shift, postcolonial theory has become a major growth area. Originating in the politics of nation‐states carved out of former European empires, postcolonial theory can equally be applied to American literature, because even without an overseas empire the United States was formed by a process of internal colonization, absorbing into itself territories inhabited by indigenous populations. The theory behind contemporary and historical ethnic studies has tended to borrow and adapt from postcolonial theory and its sources. And, of course, such a program cannot be limited to the contemporary: it can be read back onto the past. It can be applied even to biblical texts, where the Israelites appear first as enslaved immigrants, then as the conquering hegemons of Canaan, and finally as a conquered people at risk of cultural absorption by the Eastern empires of Babylon and Persia.

Having spoken of this period as an era of consolidation in the realm of theory, I would have to add, by way of correction, that this has also been an age of proliferation, during which theory has divided in order to multiply. Queer studies, which emerged in the early 1990s, building on the work of Michel Foucault and Eve Sedgwick, was stimulated by the rise of feminist women’s studies but developed in partial opposition to it, and is now most often referred to as LGBTQ (lesbian‐gay‐bisexual‐transgender‐queer) studies. This acronym recognizes the fact that sexual attraction and behavior have many variations, historically and at present, and that the chromosomes one is born with do not determine one’s preferred partner, one’s sexual behavior, or even one’s gender. Further, so as not to leave out the men, gender studies has come to include historical and sociological studies of maleness and masculinity.

Similar proliferation has developed in the areas of race and ethnicity. Africana studies, which can trace its history back to the late nineteenth century, and which was given a strong theoretical basis by Houston Baker and Henry Louis Gates, among many others, in the 1980s, has given rise both to a general area of theory, usually called Critical Race Theory, and to numberless specific “studies” programs analyzing the literature and culture of other racial and ethnic groups that have been marginalized in various Western societies. And just as feminism ultimately spawned “masculinity studies,” the ethnic and racial minority studies programs have generated “Whiteness Studies” in a spirit of critique, analyzing the defensive response of a powerful majority group that already sees itself as threatened with becoming, at some future moment, a marginalized minority.

These forms of identity politics have extended to the disabled, a set of disparate groups we will all join some day, if we are lucky, and to the traumatized, whose experience is less of having a specific identity than of losing its stability. Identity, in sum, has become a multidimensional vector space—of race, gender identity, sexuality, nationality, ethnicity, religion, and dis/ability—through which our imaginary individualities are determined. “Intersectionality” was the term Kimberlé Crenshaw coined in 1989 when she argued that certain combinations of vectors were more deeply discriminated against than others—as when Barbara Smith complained in “Toward a Black Feminist Criticism” (1977) that as a woman, a lesbian, and an African American, she had been triply marginalized. Ultimately, a coherent intersectionality theory will need to be developed to make better sense of our multiply determined selves.

If “intersectionality” is one overarching concept that helps us explain some recent developments in literary studies, another is “consilience.” Coined originally by the Victorian polymath William Whewell, “consilience” was used by biologist E. O. Wilson as the title of his 1998 book, which speculated that the sciences and the humanities would ultimately converge in their explanations of social and cultural phenomena. Whether or not such a genuine convergence fully occurs, several quite recent developments in literary theory are clearly inflected by the hard sciences of biology, physics, and chemistry and not merely by sociology, politics, and economics.

Evolutionary literary criticism explores the hypothesis that creating and consuming literature is not simply a delightful pastime but part of the reason the Old World apes that became Homo sapiens succeeded and became dominant as a species. Telling stories was how our hominid ancestors communicated and bonded with each other as cooperative hunting and gathering societies living in competitive tribal groups, where the forces of both natural selection and sexual selection favored those who did it well and wiped out those who failed. It is clear that we are still telling stories, and evolutionary literary theorists would argue that narratives continue to have the same function: they enhance our abilities to survive by guiding the decisions we make about the work we do for a living, the friends we trust, and the prospective mates we select. Pride and Prejudice is a masterpiece of wit and irony, but it is also a primer on the dangers to nubile women of succumbing to the charms of superficially attractive young men, and on the rewards of seeking a mate whose solid worth may be obscured by defensive shyness—and this may be one reason so many subsequent romance novels have taken the bare bones of its plot as their model.

If evolutionary theory appears oriented toward the distant past, posthumanist literary theory is oriented toward the future, as our minds merge with the machine‐minds that we have learned to create to assist our own. The cyborg—abbreviation for cybernetic organism—appeared originally in science fiction, but the merger has already occurred; we are already posthuman. We find ourselves helplessly dependent on the tablets and smartphones we carry about with us, but at the same time we are practically omniscient, with vast libraries of information available with a few clicks on a keyboard or taps on a touch screen. Household robots remain a theme of science fiction, but many of us own an invisible digital servant: asked nicely, the disembodied Siri or Alexa will dial our friends, call us a taxi, turn on the lights or other appliances in our home, give us our precise global position in relation to the street grid, or tell us about the coming weather. Virtual reality technology allows us to “be” places we are not, with a 360° view of our surroundings in stereophonic sound. Posthumanist theory investigates the psychology and the politics of our immersion in the collective world of the internet, and its consequences for the social structures of our world.

Meanwhile, cognitive psychology seeks a new center within, exploring the functions and activities of the human brain and mind. It probably got its start when Noam Chomsky conjectured that natural languages are too similar in their deep structures to be the random product of culture, and are learned too quickly to be entirely the behaviorist result of the verbal stimuli children receive. Chomsky argued that human children are born with a “Language Acquisition Device” hard‐wired into their brains. While this theory is still contested, the controversy sparked widespread investigations into the relationship of mind and brain, which made it clear that, whether the tracks are hard‐wired from birth or laid down by experience, the brain processes language in very specific sites. Neurologists examining patients with aphasias caused by brain lesions had long ago discovered that we store people’s names in a different site from common nouns, and that certain lesions prevented people from understanding metaphor while others blocked metonymy. Through advances in neuroscience, cognitive theory has enabled us, without creating brain lesions in healthy subjects, to correlate specific thought processes with activity in specific areas of the brain by mapping which sites demand greater blood flow or show greater electrical activity. Since we store short‐term memories in different places from long‐term memories, it is suspected that the vivid dreams we experience while asleep may be, contrary to what Freud thought, an artifact of the process of sorting and then “dumping” the data of the previous day. Philology could reveal the poem and its patterns, and rhetoric could give us some inkling of how audiences reacted, but until recently the key aesthetic moment of reader response was a mystery, a “black box” whose workings were hidden from us. Experimental cognitive psychology, however, has begun to shine light on both mind and brain, explaining how literary tropes (such as metaphor) are involved in all cognition, how empathy with fictional characters occurs, and how literary texts both engage and occasionally test the limits of cognitive functioning.

Digital humanities, finally, describes a wildly diverse group of projects that depend on the digital representation of texts and other data, and their distribution through the internet. Students of literature routinely consume the product of text digitization when they access both primary texts and criticism published in learned journals via the internet links provided by their university libraries. Digital images of rare or unique books make it possible for us to examine the manuscript of Beowulf while sitting at our desk; to view and read the printed versions of Shakespeare quartos; to compare the individually water‐colored copies of William Blake’s poems and prophetic books; and to view the poems and paintings on which Dante Gabriel Rossetti was simultaneously working.

Going beyond mere access, digital analysis of linguistic features has been used to test the ascription of texts published anonymously or under pseudonyms, like the Spectator essays and the Letters of Junius; scholars have also concluded that the first of the three narrative digressions in Henry Fielding’s Joseph Andrews was probably written by his sister Sarah. A project at the University of Nebraska analyzes Jane Austen’s use of free indirect discourse, with an eye towards specifying the linguistic features that mark its presence.
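
For readers curious about what such attribution work involves at the level of method, the following is a minimal sketch, not the procedure of any of the projects just mentioned: it compares the relative frequencies of common function words (a classic stylometric signal) in a disputed text against samples of two candidate authors’ known prose, using only the Python standard library. The file names and the short word list are hypothetical stand‑ins.

```python
import re
from collections import Counter

# Ten common English function words; real stylometric studies use far larger lists.
FUNCTION_WORDS = ["the", "of", "and", "to", "in", "that", "it", "with", "but", "upon"]

def profile(text):
    """Return the relative frequency of each function word in a text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words) or 1
    return [counts[w] / total for w in FUNCTION_WORDS]

def distance(p, q):
    """Manhattan distance between two frequency profiles."""
    return sum(abs(a - b) for a, b in zip(p, q))

def likelier_author(disputed, sample_a, sample_b):
    """Attribute the disputed text to whichever candidate's profile lies closer."""
    d = profile(disputed)
    return "A" if distance(d, profile(sample_a)) <= distance(d, profile(sample_b)) else "B"

# Hypothetical usage: the file names below are placeholders, not real datasets.
# with open("disputed_text.txt") as f1, open("known_author_a.txt") as f2, open("known_author_b.txt") as f3:
#     print(likelier_author(f1.read(), f2.read(), f3.read()))
```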

The availability of large corpora of texts from earlier centuries has theoretical implications that have been explored by scholars like Franco Moretti, who has advocated a “distant reading” to discover features and trends in a literature that by the eighteenth century had become far too massive for any single scholar to read more than a small fraction of it. Other critics like Sharon Marcus and Stephen Best have argued for a “surface reading” of texts to recover obvious features that have been temporarily obscured by psychoanalytical or Marxian searches for deeper meanings or latent content. All these manifestations of theory have taken us far beyond the search for close and closer readings that obsessed literary criticism in the 1950s.
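
As a rough indication of what “distant reading” can mean computationally, here is a small, hedged sketch rather than Moretti’s own method: it tallies how often a chosen term appears per decade across a directory of dated plain‑text files, the kind of aggregate trend no individual reader of the whole corpus could reliably report. The directory layout and file‑naming convention are assumptions made for the example.

```python
import re
from collections import defaultdict
from pathlib import Path

def term_by_decade(corpus_dir, term):
    """Count occurrences of a term per decade across files named like '1741_pamela.txt'."""
    term = term.lower()
    counts = defaultdict(int)
    for path in Path(corpus_dir).glob("*.txt"):
        match = re.match(r"(\d{4})_", path.name)  # expect a four-digit year prefix
        if not match:
            continue
        decade = int(match.group(1)) // 10 * 10
        text = path.read_text(encoding="utf-8", errors="ignore").lower()
        counts[decade] += len(re.findall(rf"\b{re.escape(term)}\b", text))
    return dict(sorted(counts.items()))

# Hypothetical usage: "novels/" is a placeholder corpus directory.
# print(term_by_decade("novels/", "sensibility"))
```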

The Blackwell Companion to Literary Theory brings together three dozen original essays, all by noted scholars in their fields, designed to introduce the general reader to the literary and cultural theory of the last half century, focusing on the ideas that are still alive today. We have grouped the chapters for the reader’s convenience into seven sections, but many of the chapters speak to more than a single aspect of theory. The chapter on Digital Humanities, for example, has been placed with the other essays on “The Task of Reading,” but it might equally have been situated with “Scientific Inflections.” Contributors writing about theoretical movements whose heyday lay primarily in the past, such as the New Critics and the Chicago Formalists, were asked to discuss what was dead and what was still living about their group of theorists. Those writing about fields that were new or emergent were asked to trace the pre‐history as well as the current flowering of their area. Our aim was to allot proper space to all the areas of theory most relevant today, arranged by topic rather than chronology, in order to highlight the relationships between the earlier and the most recent theoretical projects.