In 1975, when I was a sophomore at Brown, my aunt was friends with and—the way he once pushed upon me some sort of reference book he compiled—must also have been dating a man who for years had been an editor of the New York Times Magazine, so I knew a bit before everyone else at school that the Times Magazine was planning a long article about the New Curriculum (no distribution requirements, optional pass/fail, fewer total courses required to graduate, the freedom to direct your own education, the encouragement to go deep rather than wide), which by then was quite old. When a reporter and photographer showed up one week that fall, their presence was immediately known and remarked upon. On the one hand, they were treated with almost a parody of blasé disinterest, whose purpose was to demonstrate how far we’d progressed beyond such mundane considerations as school spirit and self-promotion. On the other hand, from freshman orientation (which consisted primarily of people comparing Yale wait-list stories) through senior commencement ceremonies (which, according to those who knew, lacked the rowdy irreverence of Columbia’s), never, ever in my life have I encountered a more self-conscious and insecure group of individuals than the Brown class of ’78. Then again, maybe it was just me, insanely alert to rumors of my own inadequacy. There was, in any case, ambivalence toward the Times coverage—a certain feigned but felt indifference mixed with a childish hope that our parents back in the suburbs would read about the alarmingly bright Brown student body.
By the time the article came out, we’d forgotten all about it, and like any newspaper story whose subject you know well, it got things wrong and it was unbelievably boring. (My mother was a lifelong fan of Israel Shenker and S. J. Perelman; still, when she read Shenker’s portrait of Perelman in the Times, she said, “Like every news story I’ve ever known anything about personally, it was a bummer: inept, too cute, biased, incomplete. It made me wonder about all those stories I’ve admired.”) I don’t remember much about the article except that it appeared to have been written, in general, by the Brown admissions office, dwelling as it did upon the precipitous rise in the number of applications. I do remember, however, the photographer taking picture after picture of my Latin Lit classmate Theodore “Tad” Kinney III and me as we talked about who knows what outside Wayland Arch—probably how much better Tad Kinney was at Petronius than I was. I remember opening up the magazine and seeing the picture of Tad talking to…no one. I still had such acute acne that every night I had to mix and then wear a complex formula my dermatologist back in the Bay Area had prescribed for me; I didn’t embody Brown’s “new popularity” in the way that Theodore “Tad” Kinney III, with his tortoiseshell glasses and Exeter jaw, did. The point isn’t that the Elephant Man always has some slight to complain about; the photo editor probably did me a favor by cropping me out of the picture. The point is that for the purposes of the accompanying photograph, I couldn’t exist.
In the work of a striking number of creative artists who are Brown grads, I see a skewed, complex, somewhat tortured stance: antipathy toward the conventions of the culture and yet a strong need to be in conversation with that culture. (You can’t deconstruct something that you’re not hugely interested in the construction of in the first place.) Although these impulses are obviously not unique to former or current residents of Providence, Rhode Island, I’m curious to what degree, if any, Brown can be seen as an incubator for American postmodernism. Paula Vogel, who taught playwriting at Brown for twenty years before leaving for Yale, says, “I think Brown is a strong incubator, based on (1) our actual location; (2) our history as a school (the anti-Harvard, anti-Yale); and (3) the non-integration of artists into the curriculum here: we’re still on the margins and, therefore, artists who also teach a history/theory/literary curriculum are the artists who come to Brown.”
Is there an analogous Harvard or Williams or Oberlin or Stanford or Amherst or Cornell or Yale or Berkeley aesthetic, and, if so, how is it different, and, if not, why does Brown have such a thing while other, “similar” institutions don’t? These schools are somehow more secure in what they are and aren’t (the University of Chicago probably isn’t obsessed with the fact that it isn’t Princeton), whereas Brown is helplessly, helpfully trapped in limbo—Brown’s productively flawed, tragicomic, self-conscious relation to power/prestige/privilege. In 2004, Women’s Wear Daily named Brown “the most fashionable Ivy”—bobo clothes made (expensively) to look like the thrifty alternative to expensive trends. Embarrassing recent poll result: Brown is the “happiest Ivy.” Brown is Ivy, but it’s, crucially, not Harvard, Princeton, Yale. Brown students affirm a discourse of privilege at the same time they want to undermine such a hierarchy. Brown: We’re #14 (according to a recent U.S. News & World Report ranking); we try not necessarily harder but differently.
The result, in the arts: a push-pull attitude toward the dominant narrative. Todd Haynes, pitching his 2007 film I’m Not There to Bob Dylan, its anti-subject: “If a film were to exist in which the breadth and flux of a creative life could be experienced, a film that could open up as opposed to consolidating what we think we already know walking in, it could never be within the tidy arc of a master narrative. The structure of such a film would have to be a fractured one, with numerous openings and a multitude of voices, with its prime strategy being one of refraction, not condensation. Imagine a film splintered between seven separate faces—old men, young men, women, children—each standing in for spaces in a single life.” Documentarian Ross McElwee (whose film Sherman’s March changed my writing life): “Any attempt at some pure form of objectivity always seemed to me impossible and, at least in my attempts, dishonest, in some ways. In all of the hue and cry about objectivity and truth being captured by a camera at twenty-four frames per second, I’ve missed the idea of subjectivity. Somehow melding the two—the objective data of the world with a very subjective, very interior consciousness, as expressed through voice-over and on-camera appearances—seemed to give me the clay from two different pits to work with in sculpting something that suited me better than pure cinéma vérité.” Ira Glass, producer and host of This American Life: “I was a middle-class kid who didn’t know what he believed. My religion became semiotics, which was the conspiracy theory to beat all conspiracy theories. It wasn’t just that authority figures of various sorts did things that were questionable. It was that language itself was actually a system designed to keep you in your place, which, when you’re nineteen or twenty, is pretty much exactly what you’re ready to hear. Semiotics was how I defined myself. To a large extent, it still is. Most of what I understand about how to make radio is all filtered through what I learned in semiotics at Brown. There are certain things I learned from Robert Scholes—about, say, the way to structure a narrative to produce the most anticipation and pleasure—that I think of every day. Honestly, I wouldn’t have my job now without it.”
Rick Moody on exactly the kind of radio Glass was reacting against: “Humanism is a worthy goal for the literature and arts. Of course. It’s indisputable. The assertion of the essential dignity and value of humankind—who can argue with it? Certainly not I. The question, however, is if the goal of humanism, the assertion thereof, can survive the problem of its representation in the medium of audio. As with contemporary literature, contemporary radio has apparently found that it has to construct a certain rigid notion of humanism, in order to effect this humanist epiphany in you and me. And yet as soon as the construction becomes predictable, homogenized, devoid of surprise, I for one no longer hear the humanism at all. In fact, it starts to sound manipulative, controlling, condescending, perhaps even a little sinister. It’s like a piece of music that has been so compressed in the studio that the dynamic variation has been entirely squeezed out of it.”
Boston Globe, 2004: “From its founding as a fledgling program in 1974 to its morphing into a full Department of Modern Culture and Media in 1996, Brown semiotics has produced a crop of creators that, if they don’t exactly dominate the cultural mainstream, certainly have grown famous sparring with it.” Emphasis on “sparring”: over the last thirty-five years, Brown semiotics majors and others have tended to produce work with, as Moody says, an unmistakable tendency to “infiltrate and double-cross.” In the late 1960s, Scholes was invited, on the strength of his book The Nature of Narrative, to a semiotics conference in Italy; he’d never heard the term before. He joined the Brown faculty in 1970 and by 1974 had founded the semiotics program. Scholes says he chose the word “semiotics” because of its lack of meaning. “It didn’t have a lot of baggage. It was almost a blank signifier.” (“Semiotics?” I remember my mother saying. “What the hell is that?”) In the immediate wake of the New Curriculum, Brown could not have been more open to “interrogat[ing] certain ideological assumptions attendant upon bourgeois notions of pleasure,” according to film scholar Michael Silverman, one of Scholes’s first recruits. The irony being, of course, that as novelist Samantha Gillison says, “Semiotics was an exclusive, self-contained puzzle for super-smart, super-rich kids.”
“I see two distinct ‘schools’ of Brown writing with regard to contemporary culture,” Elizabeth Searle (author of the collection Celebrities in Disgrace, whose title novella she adapted into an opera, Tonya and Nancy—about Tonya Harding and Nancy Kerrigan) wrote to me. “Some, such as Moody, engage in a full-frontal assault, perhaps influenced as I was by Robert Coover and his ahead-of-the-curve A Night at the Movies. Other Brown alums follow more of a lone-Hawkesian flight path regarding popular culture. Jack Hawkes [the novelist John Hawkes, who taught at Brown for thirty-five years and died in 1998] seemed to soar above the whole computerized wasteland of contemporary culture with blissful indifference. Students of his such as Joanna Scott and Mary Caponegro seem to me to have followed suit, creating their unique takes from the vantage point of distant worlds (as in Scott’s Arrogance) or by conjuring up worlds wholly of their own making (as when Caponegro, in The Star Café, concocts a sexual funhouse in which a warped mirror forces lovers to view the strange, twisted postures of sex with only their own body and not the body of their partner reflected for view). What the two ‘schools’ seem to me to have in common is—for want of a better, fresher metaphor—an insistent, outside-the-box mentality that both Coover and Hawkes, our founding fathers, shared.”
Joanna Scott, describing the subjects of her own work, adumbrates many of Hawkes’s topoi as well: “Character and the motion of thought; the effects of varied narrative form; contradictory perceptions of time and place; the idiosyncrasies of voice; mystery and the impact of disclosure; beauty and ugliness; comedy, temptation, collapse, and recovery; the elusive potential of imagination.”
Brown, the seventh-oldest college in America, was founded in 1764—the Baptist answer to Congregationalist Harvard and Yale, Presbyterian Princeton, and Episcopalian Columbia and Penn. At the time, Brown was the only school that welcomed students of all religious persuasions. For nearly two centuries, it was what it was: a decent-to-good regional school, though also, I gather, a bit dandyish, gentlemanly, slightly or not so slightly second tier—among the least prestigious of the universities with which it wished to associate itself. In 1954, the Ivy League athletic conference was formed; Brown was/is last or nearly last among equals (with, for instance, by far the lowest endowment, currently three billion dollars to Harvard’s thirty-six). Brown still, I think, suffers from a massive superiority/inferiority complex (We’re anti-Ivy, but we’re Ivy!), proud of the club it belongs to and anxious about its status within that club. Saith Groucho Marx, two of whose films, Monkey Business and Horse Feathers, were cowritten by S. J. Perelman (Perelman attended Brown but didn’t graduate), “I’d never join a club that would have me as a member.” At once rebel (We’re more interesting than you are) and wannabe (We got 1390 on our SATs rather than 1520), we’re like Jews in upper-middle-class America: We’re in the winner’s circle but uncertain whether we really belong. In general, Brown is (perceived to be) not the best of the best but within shouting distance of the best of the best—which creates institutional vertigo, a huge investment in and saving irony toward prestige; ambivalence toward cultural norms; and among artists, a desire to stage that ambivalence, to blur boundaries, to confuse what’s acceptable and what’s not.
In 1850, Brown’s fourth president, Francis Wayland, argued for greater openness in the undergraduate curriculum: Every student should be able to “study what he chose, all that he chose, and nothing but what he chose.” In 1969, the New Curriculum reanimated Wayland’s charge. The Brown aesthetic is a very loose translation, I would argue, of the New Curriculum: more loose limbed, more playful, more interdisciplinary, harder to define, at its worst silly (in 1974, my freshman roommate attended a lecture by Buckminster Fuller about the spiritual properties of the geodesic dome and spent all of November chanting in a teepee) and at its best mind-bending, life-altering, culture-challenging. Harvard runs the world; Brown changes it. The New Curriculum made, brilliantly, a virtue of necessity—took what was less traditionally pedantic and hypertensive about Brown and made it the very emblem of off-center experimentation and excitement, of off-axis cultural contributions. Brown is “branded” with a specificity that is surely the envy of other schools. Self-fulfilling prophecy: a certain kind of student now applies to Brown.
Amy Hempel, whose Collected Stories carries a foreword by Moody, says, “The smart dog obeys. The smarter dog knows when to disobey.” Dave Eggers, the recipient of an honorary degree in 2005, said at Commencement, “Man, did I want to go here. Brown was the number one school I wanted to go to when I applied to college. I wanted to go somewhere without any rules, but I was brutally rejected.”
It’s important to acknowledge, of course, that such self-reflexive, genre-bending, fourth-wall-shattering gestures are hardly exclusive to Brunonians. Brown didn’t invent American postmodernism. So, too, many extremely successful Brown grads in the arts are working a fairly traditional vein. As Beth Taylor, director of the Nonfiction Writing Program, says, “Certainly our Literary Arts Program has defined itself as experimental and against-the-grain since Hawkes and Coover brought it to national prominence in the 1970s. And perhaps, post-1969 New Curriculum, more Brown students have tended toward comfort with dissent than pre-1969 alums. Over the years, though, I’ve seen as many writers go off to mainstream publications as to alternative ones. And their stances have ranged from flip/skeptical to documentarian/fact-checked journalism.” Doug Liman’s The Bourne Identity does not seek to alter the face of an art form. Nathaniel Philbrick’s In the Heart of the Sea: The Tragedy of the Whaleship Essex, which won a National Book Award, is an estimable work of traditional nonfiction. The historical novelist Thomas Mallon has positioned himself in direct opposition to pomo relativism; he credits Mary McCarthy’s “premodernist” sensibility with inspiring him to become a writer, and he says about his collection of essays In Fact that “its prevailing moods and enthusiasms remain more retroverted and conservative than the academic and media cultures in which they were experienced.” Susan Minot is a direct descendant of the literary Episcopalianism of Henry James, Edith Wharton, Shirley Hazzard, John Updike. Deborah Garrison’s poetry is old-fashionedly accessible, as the irresistible title of her first book, A Working Girl Can’t Win, would suggest. Alfred Uhry’s Driving Miss Daisy is straight down Broadway. Kermit Champa, professor of the history of art and architecture until his death in 2004, believed in Gesamtkunstwerk—the “absolute aesthetic fullness of art.” Comp Lit professor Arnold Weinstein is an ardent advocate for traditional high-modernist fiction. (I should know; I house-sat for him one summer forty years ago, and I read all the marginalia in the books on the shelves: he really, really, really doesn’t like Beckett.)
The literary critic Stanley Fish, describing seventeenth-century “masterworks” such as Milton’s Paradise Lost, coined the term “self-consuming artifact”—a perfect phrase for a lot of what is, to me, the most exciting artistic work done by Brown faculty, grads, dropouts. Eurydice Kamvisseli, the author of the novel F/32, “rewrote Beckett and Homer” as a very young child. “My father still has the books in his library. I erased parts of the books and added, with my childish handwriting, better versions of the plot. I didn’t like anything mild.” So, too, see nearly every sentence S. J. Perelman ever wrote. Pretty much everything Nathanael West wrote as well, especially Miss Lonelyhearts (many of the letters in which were lifted from letters West steamed open when he was working as a desk clerk in a hotel) and A Cool Million (a large portion of which was appropriated without attribution from a Horatio Alger novel). The exquisitely self-conscious cartoons of Edward Koren, who is now retired after teaching in the Brown art department for decades. Richard Kostelanetz’s In the Beginning consists of the alphabet, in single- and double-letter combinations, unfolding over thirty pages. Shelley Jackson, who describes herself as a “student in the art of digression,” once published a story in tattoos on the skin of 2,095 volunteers. Andrew Sean Greer, whose second novel, The Confessions of Max Tivoli, is told in the voice of a man who appears to age backward, says about working with Coover, “He encouraged us to write anything except conventional narrative.” Even Coover’s titles are self-consuming: The Adventures of Lucky Pierre (Directors’ Cut); The Public Burning; The Universal Baseball Association, Inc., J. Henry Waugh, Prop.; The Origin of the Brunists; A Night at the Movies, or, You Must Remember This. Coover on Alison Bundy’s story collection A Bad Business: “In these elegant tales—not so much of adventure, comedy, and romance as their residue—Bundy summons up the world’s distance with bright, paradoxical immediacy that is sometimes almost magical. She is a poet to the prose line born, playing with the possibilities of plot as though it were a metrical system, rhymed with thought’s assonantal drift. This is rich comic writing, delicate and sure, touched at times by a wistful longing as a kiss might be touched by irony. Or life’s violence by the tenderness of dream.” Paul Grellong’s plays Manuscript and Radio Free Emerson. John Krasinski’s (well-intentioned and abysmally bad) film adaptation of David Foster Wallace’s Brief Interviews with Hideous Men. At the end of Michael Showalter’s short film Pizza, in the midst of Showalter’s weeping on the floor, he languidly orients himself toward the camera and poses—that nervous self-awareness that never turns off. Edwin Honig, who founded Brown’s creative writing program, is the author of Shake a Spear with Me, John Berryman. Note again, and everywhere, that note of self-consciousness, of self-reflexivity.
Will Oldham, aka Bonnie “Prince” Billy, an actor and playwright turned country musician and founder of the Palace Records label, stayed only a semester, but he must have absorbed the ions in the air pretty damn quickly: “I didn’t want to record under my own name, but also not under an implied group name. Assuming that a voice and guitar implies confession or self-expression doesn’t seem like a very productive line of thinking. I suggest that a song is no guarantee of its singer’s honesty, wit, sensitivity, or politics. I will always rewrite a song that seems like it’s too connected to a real event, because the intention is always to create the hyperreal event, so that—ideally—more people can relate to it.”
The Brown literary aesthetic: consciousness-drenched. Jaimy Gordon, author of the National Book Award–winning Lord of Misrule and many other difficult-to-categorize works, says, “This will sound odd, but I like having a mind. I like thinking, though I’m aware that I think eccentrically and often ridiculously, so that my thoughts threaten to isolate me, even though they take shape in the common tongue. I have confidence that what goes on in my mind, including but by no means featuring its review of personal experience, can be turned into something made of language that will be arresting to those who are susceptible to splendors of rhetoric.” Which could and should serve as the epigraph to Nancy Lemann’s giddily hall-of-mirrors novels, especially my favorite, Sportsman’s Paradise. Or Maya Sonenberg’s fascinatingly self-canceling story “Throwing Voices.” Brian Evenson, who is former chair of the Literary Arts Program at Brown and whose first work of fiction, Altmann’s Tongue, got him dismissed from BYU for its violation of Mormon tenets, says, “I’m fairly aware of philosophy and am especially interested in questions of epistemology, particularly theories that suggest the impossibility of knowing.” Evenson’s story “Prairie” ends with a character literally unsure whether he’s alive or dead; his novella The Sanza Affair gives each detail of a story two or three times (conflictingly) and cites each version with the name of a character (in parentheses) who believes that version to be true. Jeffrey Eugenides says about his Pulitzer-winning novel, Middlesex, “My narrator is not entirely reliable. He’s inventing the past as much as he’s telling it. The bottom line is that you can’t really know much about what you really don’t know. There are very old-fashioned narrative techniques deployed in the book as well, but postmodernism is always recuperating old styles of narration.” Hawkes, without whose encouragement I would never have become a writer (he told me that the first short story I ever wrote was about “the agony of love without communication and in the context of violence”), famously said, “I began to write fiction on the assumption that the true enemies of the novel were plot, character, setting, and theme, and having abandoned these familiar ways of thinking about fiction, I realized totality of vision or structure was really all that remained.”
For quite a while I wrote in a fairly traditional manner—two linear, realistic novels and dozens of conventionally plotted stories. I’m not a big believer in major epiphanies, especially those that occur in the shower, but I had one, about twenty years ago, and it occurred in the shower: I had the sudden intuition that I could take various fragments of things—aborted stories, outtakes from novels, journal entries, lit crit—and build a story out of them. I really had no idea what the story would be about; I just knew I needed to see what it would look like to set certain shards in juxtaposition to other shards. Now I have trouble working any other way, but I can’t emphasize enough how strange it felt at the time, working in this new mode. The initial hurdle (and much the most important one) was being willing to follow this inchoate intuition, yield to the prompting, not fight it off, not retreat to SOP. I thought the story probably had something to do with obsession; I wonder where I got that idea—rummaging through boxes of old papers, riffling through drawers and computer files, crawling around on my hands and knees on the living room floor, looking for bits and pieces I thought might cohere if I could just join them together. Scissoring and taping together paragraphs from previous projects, moving them around in endless combinations, completely rewriting some sections, jettisoning others, I found a clipped, hard-bitten tone entering the pieces. My work had never been sweet, but this new voice seemed harsher, sharper, even a little hysterical. That tone is, in a sense, the plot of the story. I thought I was writing a story about obsession. I was really writing a story about the hell of obsessive ego. It was exciting to see how part of something I had originally written as an exegesis of Joyce’s “The Dead” (for Scholes’s course in post-structuralism) could now be turned sideways and used as the final, bruising insight into someone’s psyche. All literary possibilities opened up for me with this story. The way my mind thinks—everything is connected to everything else—suddenly seemed transportable into my writing. I could play all the roles I want to play (reporter, fantasist, autobiographer, essayist, critic). I could call on my writerly strengths, bury my writerly weaknesses, be as smart on the page as I wanted to be. I’d found a way to write that seemed true to how I am in the world. My “experimentalism” isn’t Hawkes’s, but his influence on me is the same as Coover’s on Greer.
The Guardian, 2006: At the opening of Sudden Glory: Sight Gags and Slapstick in Contemporary Art, a recent group show at the Wattis Institute in San Francisco, Ralph Rugoff suddenly found himself dangling upside down in the air. The six-foot-seven conceptual artist Martin Kersels had lumbered up behind Rugoff and swept him off his feet, suspending the curator by the ankles. “It was such an insane experience. My whole world turned inside out and I didn’t know what was going on for a second,” says Rugoff. “I was very thankful to Martin for doing that.”…At the Wattis Institute, where Rugoff served as director for almost six years prior to moving to the Hayward [Gallery in London], he curated a survey of invisible art that included paintings rendered in evaporated water, a movie shot with a film-less camera, and a pedestal once occupied by Andy Warhol. “Ralph is pretty experimental; he doesn’t follow the herd,” says Matthew Slotover, codirector of the Frieze annual contemporary art fair.
Which brings me to the Fuck You Factor (crucial to Brown’s overdog/underdog ethos): Lois Lowry’s Newbery-winning and frequently banned children’s book, The Giver, presents a dystopian view of a future society in which history is hidden, people are conditioned not to see colors, and those who do not fit within society’s narrow definition of acceptability are “released” (killed). Brown’s 1925 yearbook on S. J. Perelman: “He was a quiet and ingenious lower-classman, but he has since fought the good fight against Babbittism, Sham, Hypocrisy, and Mediocrity. All are supposed to quail before the vicious shakes of his pen and pencil.” Moody published a burn-all-bridges essay in the Believer in which he defended (as if defense were needed) the unconventional, uncommercial works of fiction—rather than the usual suspects—his committee chose as the five finalists for the 2004 National Book Award. C. D. Wright, a National Book Critics Circle Award winner who taught at Brown for twenty-five years, until her death in 2016, said, “It’s a function of poetry to locate those zones inside us that would be free, and declare them so. I’ve enjoyed the promise and limitations of several monikers but have never claimed them for myself. I’ve never been invited formally or otherwise to join an identifiable group, which doesn’t mean I’m opposed to their existence. It’s energizing to have an enemies list; it’s useful for other reasons as well, including the sharpening business.” In his senior year, James Rutherford directed a production of Sartre’s The Flies in which he bred and released forty thousand fruit flies into the room; audience members had to sign a statement before seeing the play. Leslie Bostrom, a professor of visual art, describes her current work as “anti-landscapes with incorporation of industrial components.” Vogel, whose Pulitzer-winning play, How I Learned to Drive, was attacked by some feminists for its investigation of the connection between women and pain, says, “For me, being a feminist does not mean showing a positive image of women. Being a feminist means looking at things that disturb me, looking at things that hurt me as a woman. Wherever there is confusion or double, triple, and quadruple standards, that is the realm of theater. Drama lives in paradoxes and contradictions.” Searle: “Walter Abish told our workshop, ‘The single most important thing in writing is to maintain a playful attitude toward your material.’ I liked the freeing, what-the-hell sound of that. I like the sense—on the page—that I’m playing with fire. I know I’m onto something when I think two things simultaneously: ‘No, I could never do that’ and ‘Yes, that’s exactly what I’m going to do.’ ”
Coover: “Sometimes the revolution of form seems almost accidental. Disparate elements are somehow juxtaposed, in art or life or both, creating a kind of dissonance, and an artist comes along who resolves that dissonance through the creation of a new form—a Chrétien de Troyes, for example, who secularized the monkish appetite for allegory and raised the fairy tale to an art form by enriching it with metaphor, design, fortuitous mistranslation, and exegetical tomfoolery, thereby inventing the chivalric romance, a form that dominated the world for five centuries. Something similar might be said of Cervantes as well, who brought an end to the tradition of chivalric romance by submitting its artifices to the realities of the picaroons and conquistadors of his day. Or of that eclectic ‘Redactor’ of Genesis, for that matter, who pressed the folk and priestly voice of his tribe into a contiguous relationship so profound that I myself am still affected by it, twenty-some centuries later. At other times—as with Ovid or Kafka or Joyce—this new form is clearly the conscious invention of a creative artist, pursuing his own peculiar, even mischievous, vision with the intransigence of a seer or an assassin. For these writers, the ossified ideologies of the world, imbedded in the communal imagination, block vision, and as artists they respond not by criticism from without but by confrontation from within. In mythopoeic dialogue, they challenge these old forms to change or die. And it is this sort of artist—whether intuitive or intellectual—that the PEN/Faulkner Award [which Coover judged in 1984] in its brief life has sought to honor. It is not surprising to me that its selections have sometimes seemed outside the ‘tradition.’ Its judges are prepared—obliged, even—to celebrate artistic genius wherever they find it, of course, but at its writerly heart this award belongs to the rebel, the iconoclast, the transformer. The mainstream.” (Emphasis mine.)
Thalia Field, a professor at Brown, resisting conventional notions of and defenses of literary and artistic realism, says,
For me it is “realistic” to be paradoxical, polyvocal, cacophonous. Stories where everything is tidy and psychologically or symbolically closed seem totally unlike lived experience. Whose universe is that? Recently at a performance in Denmark, someone asked me, “If you’re a Buddhist, why is your work so difficult?” There are eighty-four thousand Buddhist tenet teachings because though truths are very simple, our neuroses are so manifestly complicated it takes eighty-four thousand teachings to begin to penetrate them. In historical terms, realism is a vast subject whose meanings have shifted drastically over the course of one hundred and fifty years. When we speak of “realism” in drama or prose, we mostly mean a proscenium naturalism in which the audience observes the “contents” as one would observe another planet. This godlike perspective, the omniscience (the unobstructed view), the ability to contain closure and entire dramatic arcs within unit “sets” (of landscapes, time frames, characters, conflicts, etc.) allow the writer and reader to believe they are invisible. We think we need consistent and immutable selves; otherwise, the world of all our opinions, careers, likes, dislikes, borders, friends, enemies will fall apart. So if we’re the most traditional naturalist or the headiest philosopher, we try desperately to find coherence, whether defined through a period of a time, a theory, a series of events, even the word ‘she’ sitting in a few sentences. Characters are whom we love and lose, and their momentary appearance and unexplained passing are part of the ongoing drama. How rigidly we reveal or ignore this flickering ephemerality is for me a mark of “realism.”
We were/are taught at Brown to question ourselves rather than naïvely and vaingloriously celebrate ourselves—to turn ourselves inside out rather than (easily) inward or outward, to mock ourselves, to simultaneously take ourselves utterly seriously and demolish ourselves. Don’t you finally want to get outside yourself? Isn’t that finally what this has to be about, getting beyond the blahblahblah of your endless—yes, yes, a thousand times yes. I mean no. Or, rather, yes and no. I want to get past myself, of course I do, but the only way I know how to do this is to ride along on my own nerve endings; the only way out is deeper in. I’m drawn to writers who appear to have Schrödinger’s Cat Paradox tattooed across their forehead: The perceiver by his very presence alters the nature of what’s being perceived. I admire Hilton Als’s The Women, W. G. Sebald’s The Emigrants, and V. S. Naipaul’s A Way in the World—books in which the chapters, considered singly, are relatively straightforwardly biographical but, read as a whole and tilted at just the right angle, refract brilliant, harsh light back upon the author. So, too, for instance, Nicholson Baker’s meditation on Updike, Geoff Dyer’s on D. H. Lawrence, Beckett’s on Proust, Nabokov’s on Gogol: One writer attempts to write a book about another writer but, trapped in the circuitry of his own consciousness, winds up writing a book that is at least as much autobiography as biography. I’m just trying to be honest here: the only portraits I’m really interested in are self-portraits as well. I like it when a writer makes the arrow point in both directions—outward toward another person and inward toward his own head.
Several years ago I served on the nonfiction panel for the National Book Awards. One of the other panelists, disparaging a book (Down the Nile, by Rosemary Mahoney, who lives in Providence) I strongly believed should be a finalist, said, “The writer keeps getting in the way of the story.” What could this possibly mean? The writer getting in the way of the story is the story, is the best story, is the only story. We semiotics concentrators (although I actually wound up changing my major to British and American literature, due to my mother’s ongoing criticism of semiotics) knew that on day one.
Mary Caponegro’s story collection Five Doubts—the perfect Brown title: There’s certainty and there’s doubt; I’ll take doubt(s). Scholes: “My project has been to look into critical terminology and explore the confusions and contradictions lurking there, hoping, among other things, to recover the middle that they exclude.” Richard Foreman, the avant-garde playwright and director, says the goal of the Ontological-Hysteric Theater, which he founded, is a “disorientation massage.” He sees his work as driven by misunderstanding instead of (more Aristotelianly) conflict. A. J. Jacobs’s books The Know-It-All and The Year of Living Biblically are, simultaneously and respectively, arias to and desecrations of the encyclopedia and the Bible. (White) rapper Paul Barman, dissing a rapper appearing on the same track with him: “Your talents are bite-size / It’s no surprise you rhyme with white guys.” Hawkes: “There is only one subject—failure.”
When black students occupied the administration building my freshman year, the president of Brown was Donald Hornig, who earned his B.S. and Ph.D. from Harvard; before his presidency at Brown, he had been chair of the chemistry department at Princeton. He said, “If the private university is to continue as an important social and intellectual force, it must remain firmly in the storm center. It may mean controversy and conflict, and it may mean discomfort and dissent. Frontiers are dangerous places. The front edge of change is dangerously sharp, but it is where a great university belongs.” However, when, in response to a suggestion, he said, “Well, we’re not just going to drop out of the Ivy League,” most of the students hooted him down—to my ear, completely unconvincingly.
A football game in Providence in 1976: At halftime, the Yale band asked, “What is Brown?” and came back with various rude rejoinders (Governor Moonbeam, the color of shit, etc.), but the only answer that really stung was the final one: “Safety school.”
Anne Fadiman, in At Large and At Small, goes out of her way to avoid mentioning what college she attended—a very nearly universal gesture, I’ve noticed, among Harvard alumni: a pride beyond pride—the school that will not be named. Instead, she mentions the name of her dorm, as if Harvard is so famous that the dorm itself will suffice. Exactly the same gesture occurs in Sarah Manguso’s The Two Kinds of Decay.
The Brown-Harvard game for the 1974 Ivy League championship: We so hoped to experience, through our boys banging shoulder pads, an image of our own excellence. Our sense of anguished consolation depended upon it. We needed to know we were stronger than the geniuses in Cambridge. We were, in point of fact, both bigger and stronger than they were, but they were faster than we were. On a fall afternoon, we lost the big game. Lopsidedly. They moved the ball better than we did; they had a superior kicking game; they deserved to win. Alternatively, they got all the breaks. We were too tight. We wanted it too badly. Our identity depended too transparently upon our performance. No late-afternoon ringing of bells on campus. My friend Alan, who was at Harvard, said, years later (when he was my friend), referring to the game, “It really did seem as if the Harvard players were simply smarter, don’t you think?” He said this without hauteur or, on the other hand, irony or embarrassment. At Iowa, when each of us was asked where we’d gone to college, Alan said, “Harvard. University. In Cambridge.” Here he pointed, vaguely, in a northeasterly direction. “Massachusetts.” Which only had the effect, of course, of underlining the fact that Alan had attended the one school in America that exists in everyone’s imagination.
My junior year an essay appeared in Fresh Fruit, the extremely short-lived and poorly named weekly arts supplement to the Brown Daily Herald. A Brown student, writing about the cultural clash at a basketball game between Brown and the University of Rhode Island, referred in passing to Brown students as “world-beaters.” I remember thinking, Really? World-beaters? More like world-wanderers and -wonderers.
Jason Tanz, a senior editor at Wired: “I feel like my classmates at Brown didn’t necessarily think of themselves as natural inheritors of our nation. This is a huge overgeneralization, but my sense was always that friends at, say, Princeton knew the world was theirs and they were just biding their time until they got a job at their dad’s law firm; smart, snide Harvard grads had that pipeline into the TV-comedy industry. The Brown lines of influence and access weren’t quite as defined, so therefore maybe we were a bit more circumspect about our place in the world.”
Harvard: government; sketch comedy (same thing?). Yale: Wall Street; judiciary (same thing?). Princeton: physics; astrophysics (same thing?). Brown: art; freedom (same thing?). A myth is an attempt to reconcile an intolerable contradiction.