EVERY GENERALIZATION IS FALSE. WE LIVE IN AN AGE of hope and transformation. We also live in an age of resignation, routine, and perhaps alarm. We anticipate the world will get better; we fear it will get worse. We exist amid incredible riches and paralyzing poverty. We conduct our lives in peace and we are surrounded by violence. The wealthy in spacious suburbs worry about keeping their shiny SUVs scratch free. The poor in dusty byways dream of clean water, the refugees in endless civil wars of four walls and a roof. On the outskirts of Johannesburg, the wretched seize land with the idea that “with all the space here, you can make a toilet.”1 Today little can span these realities.
This may, however, speak to both worlds: for both the prosperous and the destitute utopian ideas are as dead as doornails. They are irrelevant for the affluent and immaterial for the hungry—and dangerous for many intellectuals, to boot. To the desperate, utopian ideas seem meaningless; to the successful, they lack urgency or import; to the thinking classes, they lead to a murderous totalitarianism. Yet something must be stated at the outset: the choice we have is not between reasonable proposals and an unreasonable utopianism. Utopian thinking does not undermine or discount real reforms. Indeed, it is almost the opposite: practical reforms depend on utopian dreaming—or at least utopian thinking drives incremental improvements.
Edward Bellamy’s 1888 Looking Backward not only sketched a future society beyond selfishness and inequality but spurred political groups devoted to practical reform. His best-selling novel might be dismissed as airy speculation of a future America, but this would be misleading. It gave rise to a political association, the Nationalist Clubs, and it accelerated reforms as prosaic as the construction of good sewers. With Bellamy at their head calling for “evolution, not revolution,” and “orderly and progressive development,” the Nationalist Clubs pushed for reforms in voting, labor, and municipal services.2 The Bellamyites supported the city of Chicago in extending publicly sponsored electrical service, for instance. Chicago demonstrated that elected municipal authorities could provide electricity “cheaper and better than by private corporations,” proclaimed the Club newspaper. “In this ‘practical’ age men demand fact and not theory.” In Chicago they were getting it. In Boston, they were receiving the contrary lesson of private electrical service that was expensive and dangerous.3
Nor is Looking Backward the exception. History is replete with utopias that spurred reforms and utopians who advanced concrete improvements. Consider the marquis de Condorcet, the eighteenth-century French utopian, who dreamt of “the true perfection of mankind” living in complete equality unsullied by “greed, fear or envy.”4 He also served as the veritable director of the Société des Amis des Noirs (Society of the Friends of Blacks), the first French organization devoted to the abolition of slavery.5 Condorcet forcefully denounced the scandal of slavery, but far from demanding utopian measures, he proposed a series of moderate reforms leading to black emancipation. He feared the call for immediate freedom would stir too much opposition and checkmate any progress.6 He authored a founding document of the society that detailed how the group would meet, its yearly costs of membership, and its cautious goals. “Since we intend to concentrate on useful work, we need to repel in advance anyone who attempts to sow suspicion by accusing us of having no fixed aims” or “by presenting us as a dangerous institution.”7
Or take Enfantin, the nineteenth-century follower of the utopian Saint-Simon, who looked forward to a future Golden Age. Enfantin possessed more mystical goals than his mentor—and more practical ones. He wanted to link East and West, the female and male principles. He divined how to do this. In 1833 he traveled to Alexandria, Egypt, with a crew of engineers with the idea of building a canal to connect the two realms. It is easy to mock Enfantin’s florid language and metaphysical goals, observes Zachary Karabell in his recent history of the Suez Canal, but he shared an idiom and sense of destiny with many late-eighteenth-century visionaries, including founders of the United States. Enfantin worked on plans for the canal and assembled work crews to excavate it. After three years of intermittent progress, he quit Egypt, but not before he made his most “avid convert,” Ferdinand de Lesseps, who saw the building of the Suez Canal through to completion.8
Down-to-earth reforms or feasible social changes coexist with utopianism and are often fed by it. At the beginning of modern utopianism Thomas More described an island community without money or private property. Yet the first section of his 1516 Utopia protested injustices of the day; he damned England for its endemic poverty, the theft it gave rise to, and the executions that ensued. Thieves were being hanged “all over the place,” sometimes “twenty on a single gallows.” Why? Because they stole out of hunger. “In this respect you English,” comments the reporter from Utopia,
remind me of incompetent schoolmasters, who prefer caning their pupils to teaching them. Instead of inflicting these horrible punishments, it would be far more to the point to provide everyone with some means of livelihood, so that nobody’s under the frightful necessity of becoming first a thief and then a corpse.9
Was this utopianism—this call to provide citizens with “some means of livelihood”? Little sounds more reasonable.
Over a fifty-year period (1805–1855), almost a hundred utopian communities were founded in the United States. Their founders and members did not generally run from but rather toward society; they saw themselves creating and promoting viable models for how people could live better. This was the belief, for instance, of Victor Considerant, the French founder of a Texas community. How will old Europe gain from our community? he asked. He saw his association as “the nucleus of the new society” that will lead to “thousands of analogous organizations.” “It is not the desertion of society that is proposed to you, but the solution of the great social problem on which depends the actual salvation of the world.”10
Or listen to Nathaniel Hawthorne, who fictionalized his brief experience at one utopian community, Brook Farm. “We had left the rusty iron frame-work of society behind us; we had broken through many hindrances … we had stept down from the pulpit; we had flung aside the pen; we had shut up the ledger; we had thrown off that sweet, bewitching, enervating indolence.” And for what? “It was our purpose—a generous one, certainly, and absurd … to give up whatever we had heretofore attained, for the sake of showing mankind the example of a life governed by other than the false and cruel principles on which human society has all along been based.”11
To be sure, the intention of “showing mankind the example” rarely ended well. The history of utopian communities is largely a history of failure. John Humphrey Noyes, himself a founder of a utopian community (Oneida), marveled that these associations “started so gaily and failed so soon.”12 Yet it wrongs history to ignore failure, as if nothing positive or humane comes out of it. Conversely, victory can testify to the configuration of force or power, rather than to truth or validity. This may seem obvious, but it runs against deep-seated beliefs or prejudices. Success needs no defense; it is its own advertisement. However, the questions tabled by success may be decisive: Success succeeds, but for how long and at what cost? To study only the world’s victors keeps thought locked to a narrow reality. Out of defeat emerge ideas, changed people, and new movements.
Even when they failed, utopian communities radically altered people and perceptions. In nineteenth-century America, Hawthorne was but one of the literary and political figures who took away lessons from utopian experiments. Frederick Law Olmsted, for instance, the landscape architect credited with New York’s Central Park, visited a New Jersey Fourierist community, the North American Phalanx. He was struck by “the advantages of cooperation” in labor and culture. While he admitted that he was “not a Fourierist for myself,” he came away with a belief in making “knowledge, intellectual and moral culture, and esthetic culture more easy—popular.” He learned the force of “democratizing religion, refinement and information.” In such an association, he believed, all people would “live more sensibly, be happier and better.”13
Today, however, the utopian vision has flagged; it sparks little interest. At best, “utopian” is tossed around as a term of abuse; it suggests that someone is not simply unrealistic but prone to violence. I offer at least three reasons for the fate of utopian thought: the collapse of the communist states beginning in 1989; the widespread belief that nothing distinguishes utopians and totalitarians; and something more difficult to pinpoint, but essential: an incremental impoverishment of what might be called Western imagination.
I can add little to the story of the fall of communism. To many observers, Soviet Marxism and its knockoffs symbolized the utopian project. The failure of Soviet communism entailed the end of utopia. Who can challenge the verdict of history? Of course, over its lifetime Stalinism engendered generations of critics who protested the identification of the Soviet system with human emancipation. But when the Soviet ship went down, it also capsized, willy-nilly, the row boats of dissenters paddling in its wake. It seems unjust that the Victor Serges, Emma Goldmans, Gustav Reglers, and even Leon Trotskys, who fought against authoritarian communism—and suffered the consequences—should share its fate, as if no distinction could be drawn between the accuser and accused. When Soviet communism thrived it silenced critics by its putative success. When it failed, it silenced critics by disappearing. Those who resisted the spell of Soviet success have been unable to escape the pull of its collapse.14
This is unfair, but who says the judgment of history is fair? It consists not of an anonymous Weltgeist but of countless individuals—writers, scholars, politicians, and ordinary people. Today they more or less agree; utopian thinking is finished. The sixteenth century gave us a new term, “utopia,” and the twentieth gave us “dystopia,” or negative utopia, the universe of Huxley’s Brave New World or Orwell’s 1984, where utopia has gone amuck. Perhaps this says it all. The movement from utopia to dystopia ratifies history.
The word “utopia,” coined by Thomas More, breathed of possibility, spurred by the recent “discovery” of the New World. “I don’t know, Madam,” said the narrator in Fontenelle’s eighteenth-century Conversations on the Plurality of Worlds, “if you grasp the surprise of these Americans” as they encountered the ship-borne explorers. “After that,” what is not possible? “I’ll bet … against all reason, that some day there might be communication between the Earth and the Moon.” “‘Really,’ said the Marquise, staring at me. ‘You are mad.’”15 This optimism and excitement found its way into utopian visions. Their willingness to learn, reported More’s Raphael of the utopians, is the reason they are “so much ahead of us politically and economically.” This news warmed his convivial audience. “In that case, my dear Raphael, for goodness’ sake tell us some more about the island in question.” They break to dine and return in fine spirits to hear his tale.16
Almost five centuries later the world has grown weary. We have come and gone to the moon. In the mid-twentieth century, J. Max Patrick, a coeditor of an anthology of utopian writings, coined the term “dystopia” as the contrary of utopia.17 He referred to a satirical utopia as the “opposite of eutopia, the ideal society: it is a dystopia, if it is possible to coin a word.”18 Without doubt, the twentieth-century dystopias look and smell very different from classic utopias, even those created as recently as the end of the nineteenth century. “My first feeling,” reported the voyager as he awakes in William Morris’s 1890 utopian News from Nowhere, “was a delicious relief caused by the fresh air and pleasant breeze.”19 An opening sentence of Orwell’s 1984 reads: “The hallway smelt of boiled cabbage and old rag mats.”20
Yet a critical problem arises. Is dystopia the opposite of utopia—in the same way that slavery is the opposite of freedom or cold is the opposite of hot—or does dystopia grow out of utopia? The epigraph by Nicolas Berdyaev that Huxley used for Brave New World puts it well: “We used to pay too little attention to utopias, or even disregarded them altogether, saying with regret that they were impossible of realization.” Things have changed. “Now, indeed, they seem to be able to be brought about far more easily than we supposed, and we are actually faced by an agonizing problem of quite another kind: how can we prevent their final realization?”21 For Berdyaev it is utopias themselves that are the threat.
Few would claim that freedom leads to slavery or that frigid water will boil, but many do argue that utopia leads to dystopia—or that little distinguishes the two in the first place. The blurred border between utopia and dystopia compresses the historical judgment. Dystopia does not relate to utopia as dyslexia does to reading or dyspepsia to digestion. The other “dys-” words, derived from a Greek root meaning diseased or faulty, are disturbed forms of something healthy or desirable, but dystopia is judged less as an impaired than as a developed utopia. Dystopias are commonly viewed not as the opposite of utopias but as their logical fulfillment. No one suggests that dyslexia signifies we should renounce reading, but many believe dystopias invalidate utopias.
Why? The short answer has to do with the blood bath of communism—Stalinism, Maoism, Pol Pot, and the rest—and alludes, again, to the great twentieth-century dystopian novels that apprehend that experience. Fair enough—or is it? This judgment raises questions about the popular, not the scholarly, reading of texts. From Brave New World or 1984, generations of high school and college students learn the lesson that utopias in general, and communism in particular, are not only doomed, but destructive. Yet the twentieth-century dystopic novels were not emphatically anti-utopian—and certainly their authors were not. Years after Brave New World, Huxley wrote Island, a novel rarely assigned to students but that praises a utopian society based loosely on Buddhism and cooperative living. “We’re not interested in turning out good party members; we’re only interested in turning out good human beings,” the island guide informs the visitor, who finds the utopians both happy and healthy.22
Nor was Huxley anticommunist or antisocialist. Like H. G. Wells, another utopian, he was obsessed by the promise and threat of science.23 The visitor to Island asks, who owns everything? “Are you capitalists or state socialists?” “Neither,” comes the reply. “We’re co-operators.”24 In his 1958 reconsiderations of Brave New World, twenty-seven years after he wrote it, Huxley approved the redistribution of property. It is, he wrote, “a political axiom” that “power follows property.” Now “the means of production are fast becoming the monopolistic property of Big Business and Big Government. Therefore, if you believe in democracy, make arrangements to distribute property as widely as possible.”25 Are these the comments of a dystopic thinker? Today they sound utopian.
While Huxley does allude to Soviet communism in Brave New World, which features a character named Lenina, neither communism nor Nazism much bothered him. “Brave New World was written in 1931,” he recalled later, “before the rise of Hitler … and when the Russian tyrant had not yet got into stride.”26 The fetish of youth, the dangers of consumerism, the manipulations of the human psyche: these worried him, especially as he observed them in America of the 1920s.27 After all, the American auto manufacturer Henry Ford, who pioneered mass production, pervaded Brave New World. The leaders were called Fords; the “T” (from the Model T automobile) was a sacred sign; dates were marked “A.F.” (After Ford); Ford’s Day was celebrated; and a slogan attributed to Ford (“History is bunk.”) was honored. Huxley feared a technological and Americanized future. The dystopia of Brave New World is less a rejection of utopian paths than a rejection of mass marketing and standardization.
Orwell’s 1984 targets Soviet communism much more directly. Yet even this book, and certainly its author, cannot be simply classified as anti-utopian. Orwell retained a belief in a socialist future.28 “Every line of serious work that I have written since 1936,” he declared, “has been written, directly or indirectly, against totalitarianism and for democratic Socialism.”29 He protested that Animal Farm and 1984 were being read as anti-utopian or antisocialist tracts. He had intended to highlight destructive tendencies surfacing in the Soviet Union, England, and Nazi Germany—not reconcile people with the status quo. In the preface to the Ukrainian edition of Animal Farm, Orwell explained that many in England retained illusions about the Soviet Union. “Indeed,” he wrote, “nothing has contributed so much to the corruption of the original idea of Socialism as the belief that Russia is a Socialist country and that every act of its rulers must be excused, if not imitated.” Orwell believed “that the destruction of the Soviet myth was essential if we wanted a revival of the Socialist movement.”30
At the end of his abbreviated life the misinterpretations of 1984 increasingly agitated Orwell—to the point that he dictated a press release to his publisher to clarify his intentions.31 He saw a danger “in the structure imposed on Socialist and on Liberal capitalist communities by the necessity to prepare for total war.” He feared the “totalitarian outlook” and worried that the capitalist and communist superstates would line up against each other. Obviously, the Anglo-American “block” would not be called communist, but something like a “hundred per cent Americanism.”32 In response to an inquiry from an American union member, he wrote that 1984 “is NOT intended as an attack on Socialism or on the British Labour Party (of which I am a supporter) but as a show-up of the perversions to which a centralized economy is liable and which have already been partly realized in Communism and Fascism.”33
Indeed, to read 1984 as a straightforward attack on utopia or socialism takes some doing. Many elements bespeak capitalist Britain, not communist Russia. The “proles,” or the workers, live in dingy suburbs and pass their time—apart from labor—in gambling, films, and football; they play darts and watch films “oozing with sex.” They also read trashy newspapers filled with crime stories, astrology, and sports. None of this rang true of the Soviet working class. Isaac Deutscher, an acquaintance of Orwell, noted, “Orwell knew well that newspapers of this sort did not exist in Stalinist Russia, and that the faults of the Stalinist press were of an altogether different kind.”34
Moreover, at the intellectual climax of the book, when O’Brien, representing the Party, interrogates Winston, what stands exposed is not communism but a system that has left far behind even a pretence of justice or emancipation. O’Brien, who punishes Winston for incorrect answers, demands to know, what is the “motive” of the Party? “Why should we want power?” Winston thought he knew. “The Party did not seek power for its own ends, but only for the good of the majority,” who were too weak and incapable to govern themselves.
This was the wrong answer, however; it demonstrated that Winston still imagined the Party retained a progressive commitment to happiness or freedom. O’Brien corrected Winston. The Party seeks power for its own end, simply to wield power. “The object of power is power.” Why? To cause suffering. “Do you begin to see, then, what kind of world we are creating?” asks O’Brien. “It is the exact opposite of the stupid hedonistic Utopias that the old reformers imagined.” The Party aims for “a world of fear and treachery and torment…. If you want a picture of the future, imagine a boot stamping on a human face—forever.”35 1984 implicitly defends the “stupid hedonistic Utopias” that “old reformers” like Orwell continue to believe in.
Before 1984 stood Yevgeny Zamyatin’s dystopic novel We, which Orwell had read, reviewed, and, to some degree, imitated.36 Written in the early 1920s, We prompted Zamyatin’s exile from the Soviet Union. With its “One State,” an omniscient “Benefactor,” the “Bureau of Guardians,” a “Table of Hours” that regulates all activities including sex, and a story of love and subversion, the book anticipated many elements of 1984. Zamyatin intended We as a savage attack on Soviet communism—and it was read that way. “Everything here is untrue,” wrote a Soviet apologist. “Communism does not strive to subjugate and keep society under the heel of a single state.” This guardian of orthodoxy cautioned that Zamyatin was following “a very dangerous and inglorious path”—not quite as dangerous as this critic, who disappeared in Stalin’s purges.37
It takes nothing away from the book to note that Zamyatin’s concerns went beyond Soviet communism. He rejected not revolution or transformation but the idea that history had stopped. The new revolutionaries forgot that each revolution must be succeeded by another. “My dear,” asks the subversive comrade of her irresolute lover, who is a state mathematician, “Name me a final number … the ultimate, the largest.” “That’s preposterous!” he replies. “How can there be a final number?” Her point exactly. “Then how can there be a final revolution? There is no final one; revolutions are infinite.”38 On this turned Zamyatin’s objections to utopias: they ended history and change. “Utopia is always static,” he wrote in an essay on H. G. Wells.39
The love of routine and repetition upset Zamyatin, who had spent several years in England. Prior to We he satirized an English obsession with mechanical precision and timetables.40 The Vicar in his short story “The Islanders” schedules his eating, walking, repenting, alms giving—and love making with his wife. When she sits on his lap in an unprescribed moment, he upbraids her. “My dear,” intones the Vicar, “you remember … life must become an harmonious machine and with mechanical inevitability lead us to the desired goal…. If the functioning of albeit a small wheel is disturbed…. Well, but you understand.”41
Orwell understood. He had been searching for a copy of We for some years and finally located a French translation. He recognized that the book was more than an anticommunist tract. In 1922 Zamyatin could hardly be charging the Soviet system with creating a boring life, Orwell argued. Rather, Zamyatin targeted “not any particular country but the implied aims of industrial civilization. It is in effect a study of the Machine.”42
The briefest inquiry reveals that the key dystopic books of the twentieth century were not anti-utopian; they did not deride utopian ventures as much as they mocked authoritarian communism or a technological future. They did not link utopia and dystopia; they damned contemporary society by projecting into the future its worst features. Herein lies the difference between utopia and dystopia. Utopias seek to emancipate by envisioning a world based on new, neglected, or spurned ideas; dystopias seek to frighten by accentuating contemporary trends that threaten freedom.
The common wisdom that utopias inexorably lead to dystopias not only derives from texts, it appeals to history to make its case. New words help make the argument. Like “dystopia,” the term “genocide” belongs to the twentieth century. Inevitably these new terms seem related; they seem to address kindred experiences. Raphael Lemkin, a Polish-Jewish refugee, coined “genocide” in 1944 “to denote an old practice in its modern development”—the annihilation of a national or ethnic group. He believed the Nazi practices occasioned a new word.43 While Lemkin worked tirelessly to spread the news about genocide—with few rewards44—he did not associate it with either utopia or dystopia.45
Yet scholarly and conventional opinion today consistently links genocide and utopia and bills the blood bath of the twentieth century to “utopians” such as Stalin, Hitler, and Mao. From Hannah Arendt’s 1951 Origins of Totalitarianism to Martin Malia’s 1994 Soviet Tragedy—its last chapter is titled “The Perverse Logic of Utopia”—scholars have thrown communism, Nazism, and utopia into one tub. Prestigious savants like Isaiah Berlin and Karl Popper have persuasively argued that utopia leads to totalitarianism and mass murder. “We must beware of Utopia,” wrote Ralf Dahrendorf. “Whoever sets out to implement Utopian plans will in the first instance have to wipe clean the canvas, on which the real world is painted. This is a brutal process of destruction”; it leads to hell on earth.46
To question this approach requires asking what utopias are actually about—and why, for instance, Nazism should not be deemed a utopian enterprise. Even the vaguest description of utopia as a society inspired by notions of happiness, fraternity, and plenty would apparently exclude Nazism with its notion of Aryans dominating inferiors in a Thousand Year Reich. What connects Thomas More’s Utopia and Hitler’s Mein Kampf? Virtually nothing.47
More, a saint in the Catholic Church, offered a vision of a world where “everyone gets a fair share, so there are never any poor men or beggars. Nobody owns anything, but everyone is rich—for what greater wealth can there be than cheerfulness, peace of mind, and freedom from anxiety?” He dreamt of a place where man can “live joyfully and peacefully.” War was despised “as an activity fit only for beasts,” and tolerance extended to various religions. The leader of Utopia called it “arrogant folly” to enforce religious conformity by way of “threats or violence.” If “fighting and rioting” decide religious controversies, the best men will succumb to the worst “like grain choked out of a field by thorns and briars.”48
Hitler dreamt of a Reich resurrected on the basis of a “great, unifying idea of struggle,” anti-Semitism. He believed that the gassing of “twelve or fifteen thousand” Jews would have changed the outcome of World War One. He called for a race war to protect the bloodline since “all the human culture, all the results of art, science and technology … are almost exclusively the creative product of the Aryan.” The “mightiest counterpart to the Aryan is represented by the Jew,” who poisons and corrupts the pure Nordic blood. The “contamination of our blood … is carried out systematically by the Jew today … these black parasites of the nation defile our inexperienced young blond girls and thereby destroy something which can no longer be replaced.”49 In 1939 Hitler threatened to eliminate Jews from Europe. “If international finance Jewry … succeeds in precipitating the nations into a world war, the result will not be … the victory of Jewry, but the annihilation of the Jewish race in Europe.”50 This does not sound like Thomas More.
Careful scholars, such as Jost Hermand in Old Dreams of a New Reich: Volkish Utopias and National Socialism, have explored utopian writings that preceded and accompanied Nazism. Yet the term “utopian” in his study takes on a peculiar cast, which he half acknowledges by frequently putting it in quotes, as if having the words “Nazi” and “utopia” in one sentence violates sense, which it does. He cited an example of a pre-Nazi “utopian” novel from 1913 that sketches a future ice age that eliminates inferior peoples and opens the way for a more powerful Germanic race, “tall in stature, blond, blue-eyed.” In the new Germany, “totally absent are the racial vermin, the small in stature, the squat, the stocky, the black-haired…. The Ice Age exterminated them…. All that remained was the Germanic people, who, liberated from all Celtic, Mediterranean, and Oriental parasites, could now breathe free.”51 The themes of racial superiority studded with violence and mysticism permeated “utopian” fiction throughout the Third Reich, but that literature contained little of the brotherhood and harmony that marked classical utopias.
Hans Mommsen, a distinguished historian of Germany, titled an essay “The Realization of the Utopian: The ‘Final Solution of the Jewish Question’ in the ‘Third Reich.’” Did the annihilation of the Jews really fulfill a utopian vision? Mommsen barely says. He is as explicit as the following: “The utopian dream of exterminating the Jews could become reality only in the half-light of unclear orders and ideological fanaticism.”52 Yet this “dream” does not partake of a utopianism that from Hesiod to Bellamy imagined a world at peace. The fact that Mommsen used the term “utopian” sparingly or misleadingly seems to have struck his translators. They translated this essay, “Die Realisierung des Utopischen,” into “The Realization of the Unthinkable.”53
Yet numerous observers continue to tack together utopianism, totalitarianism, and Nazism.54 This is the wisdom of our age. A well-visited and -reviewed exhibit on utopia, subtitled “The Search for the Ideal Society in the Western World,” held in New York and Paris, included Nazi items and paraphernalia such as anti-Semitic posters and Hitler’s Mein Kampf. The exhibit displayed photos of a Nazi concentration camp and an Israeli kibbutz, as if both presented comparable faces of utopianism. Apparently, genocide and humanism both illustrate “the search for the ideal society.” In an essay, “Utopia and Totalitarianism,” from the exhibit’s catalog, a French scholar admits that “utopias do not all forecast totalitarian regimes to the same degree.” Yet the “differences” between them shrink before the similarities. “Precisely because of their utopian aims,” he writes, they are all “harbingers of totalitarianism.”55 This is inexact, but typical. Does Fourier’s notion of lavish dinners—“more exquisite than the best kings can obtain”—foreshadow totalitarianism?56 Historical precision surrenders to a liberal anti-utopian animus.
A recent book by the historian Eric D. Weitz, A Century of Genocide: Utopias of Race and Nation, seconds this approach. He argues that utopias undergird twentieth-century genocidal regimes. Taking up Stalin’s Soviet Union of the 1930s, Hitler’s Germany of the 1940s, the Khmer Rouge of 1970s Cambodia, and Miloševic’s Serbia of the 1990s, Weitz writes that “all articulated powerful visions of the future. Each of them promised to create utopia in the here and now.”57 To discover utopianism in all his cases, however, Weitz stretches the term to incoherence. What was utopian in Miloševic’s violent efforts to create a greater Serbia? Weitz identifies a belligerent Serbian nationalism “imbued with a sense of aggrievement” and a Serbian hatred for Muslims, but where is the utopianism in this? He tosses about and finally offers that “every so often” “glowing” images of the future appeared in the rhetoric of Serb nationalists. For instance, a Serbian bishop declared that Serbia “had become the largest state in heaven.” Aggressive Serbian nationalism hardly confirms a utopian-genocide linkage.
He also scrambles to find a utopian note in Nazism. Anti-Semitic, racist, xenophobic, nationalist, authoritarian, but utopian? Weitz emphasizes the mystical, agrarian, and communitarian features of Nazism that might at first glance seem utopian. He tells us that the Nazis promoted whole-grain breads and “greater consumption of fresh fruits and vegetables,” as if this demonstrated a nefarious utopianism. Indeed, he informs us that “whole-grain bread was called the ‘final solution’ to the bread question.” He might as well dub the current movement for healthy school lunches utopian and genocidal. The Nazis did champion a “new” man and woman, but it was less the utopianism of this concept than its racism that proved lethal. The Nazi case does not establish a utopian-genocidal tie.58
In classifying Nazism as a utopian venture, scholars ratify the anti-utopian bias. They clinch the case against utopianism. The casualty list for utopian enterprises lengthens and any lingering sympathy for it vanishes. Utopians stand condemned by the blood they have shed. Yet where is the evidence? The question Hegel asked still hangs over us, “But even regarding History as the slaughter-bench at which the happiness of peoples, the wisdom of States, and the virtue of individuals have been victimized—the question involuntarily arises—to what principle, to what final aim these enormous sacrifices have been offered.”59 Can we say today that utopians are responsible? Do mass deaths largely derive from crazed, or indeed sane, utopians?
To assess this argument requires entering the morgue of history, not only to count the corpses but to determine the cause of death. More than a strong constitution and medical skills are required. Politics saturates the task. From the numbers of deaths (and their causes) in the New World after the European “discovery” to those killed in twentieth-century Rwanda, experts and partisans squabble. How many and why? A temptation to come up with larger and larger numbers—as if numbers themselves clinch an argument—overwhelms many inquiries. To attribute mass deaths to utopian striving is standard.
In the 1930s scholarly opinion estimated the population of the Western Hemisphere at the time of Columbus’s arrival at about 8 million. Fifty years later many scholars posit 150 million inhabitants. But as David Henige notes in his Numbers from Nowhere, between 1930 and 1980 “there was no change in the evidence at all.” The “High Counters,” as he calls them, “find numbers, believe them, multiply them.”60 However, the numbers form only the subtext of the debate; the cause and responsibility for the American Indian depopulation constitute the real issue. With larger numbers comes more blame.
The classic account here is David Stannard’s 1992 American Holocaust, which examines the devastation of Native Americans after Columbus. Stannard provided many figures and much evidence, but it is primarily the numbers that bolster his case. “The destruction of the Indians of the Americas,” writes Stannard, “was, far and away, the most massive act of genocide in the history of the world.” He estimates one hundred million dead—over at least a century and leveled mainly by disease.61 Yet Stannard does not look for any utopian mission among the slaughterers—for good reason. If anything, the Native Americans appeared the utopians, living communally in peace and ease. As Peter Martyr stated in his sixteenth-century account of the New World peoples:
The land belongs to everybody, just as does the sun or the water. They know no difference between meum and tuum, that source of all evils. It requires so little to satisfy them, that in that vast region there is always more land to cultivate than is needed. It is indeed a golden age, neither ditches, nor hedges, nor wall to enclose their domains; they live in gardens open to all…. Their conduct is naturally equitable.62
The Native Americans, wrote Hoxie Neal Fairchild in his classic study of the Noble Savage, were frequently “represented as a virtuous and mild people, beautiful, and with a certain intelligence, living together in nakedness and innocence, sharing their property in common.”63
No one presented the Spaniards (or the English) in these terms. Familiar and pedestrian goals fired them. “The Spaniards’ mammoth destruction of whole societies generally was a by-product of conquest and native enslavement, a genocidal means to an economic end,” writes Stannard.64 He could have cited the sixteenth-century denunciation The Devastation of the Indies, by the Spanish priest Bartolomé de las Casas, who put it concisely. “The reason for killing and destroying such an infinite number of souls,” wrote Las Casas in 1552, “is that the Christians have an ultimate aim, which is to acquire gold, and to swell themselves with riches in a very brief time…. It should be kept in mind that their insatiable greed and ambition, the greatest ever seen in the world, is the cause of their villainies.”65 Las Casas, writing just several decades after More’s Utopia, found no hint of utopianism in these mass deaths.
To be sure, assessing the intentions of the slaughterers and estimating the numbers killed in any century is a dark business, and few have essayed it. The opening sentence of Gil Eliott’s neglected Twentieth Century Book of the Dead reads: “The number of man-made deaths in the twentieth century is about one hundred million.” The major terrains of violence he considers—the book was published in 1972—include World War I, China (mainly the Sino-Japanese war), the Russian Civil War, the Soviet state, the Jews of Europe, and World War II.66 Only a portion of these deaths, about one-fifth or one-quarter, could be chalked up to utopians, even loosely conceived.
One might counterpose to Eliott a recent book by French scholars, The Black Book of Communism. Stéphane Courtois, its main editor, comes up with the same figure as Eliott—but just for those killed by communists in the twentieth century, which is also the number that Stannard computes for Native American deaths. “The intransigent facts demonstrate that Communist regimes have victimized approximately 100 million people in contrast to the approximately 25 million victims of the Nazis,” writes Courtois, although he is vague as to how he derived his figures. For Courtois, the numbers imply a “similarity” between Nazism and Communism and justify extending the term “genocide” to the communist system. He identifies utopianism as a root cause. “The real motivation for the terror,” writes Courtois in his conclusion, “becomes apparent: it stemmed from Leninist ideology and the utopian will.”67
Those who refer to Nazi and Stalinist totalitarianism as the death knell of utopia might give more attention to World War I, a bloodletting that directly spurred the Russian Revolution and, indirectly, Nazism. Scholars have never found a shred of utopianism either in the events leading to its outbreak or in its four years of hostilities. World War I, writes Enzo Traverso in his Origins of Nazi Violence, was “the founding act” of twentieth-century violence. It introduced the world to “the bombing of towns, the internment of nationals of enemy countries and the deportation and forced labor of civilians.” It marked “a new threshold in the escalation of violence.”68 “Bolshevism and Fascism are the children of World War I,” stated the historian François Furet. If that is so, then what do we know of the parents? Unfortunately very little. World War I “remains one of the most enigmatic events of modern history.” It is almost beyond recall or sense, noted Furet. “It is very difficult for us to imagine the nationalist passions that led the peoples of Europe to kill each other for four full years.”69 Yet isn’t this often the pedestrian story of modern violence? Not utopianism, but plain old nationalism? Even though he tried to classify as “utopian” the movement for a greater Serbia, Weitz in his Century of Genocide admits it was mainly driven by the “desperate efforts of an old elite” that mobilized people by nationalism.70
A recent effort to survey war deaths over three centuries concludes that “the most frequent objectives” were “territory or independence.” In other words, nationalism spurred these conflicts, although the twentieth century showed a sharp increase in “civil wars.”71 For the period from the end of World War II to the end of the twentieth century, perhaps the most comprehensive accounting of war deaths has been assembled by Milton Leitenberg of the Center for International and Security Studies at the University of Maryland. To assess only the conflicts that caused the greatest loss of life—above 500,000—communist “utopians” certainly held their own, mainly in Asia, where many millions of deaths are ascribed to the Pol Pot regime in Cambodia (1975–1978), perhaps 2 million; and many more are attributed to Chinese policies of land reform (1949–1954), 4.5 million; their suppression of counterrevolutionaries (1949–1954), 3 million; and the cultural revolution (1965–1975), 2.1 million.72
Numbers like these are impossible to apprehend. But the morgue of history allows no easy exit. To get to the door requires a walk through cavernous rooms to count the corpses and inspect the death certificates. For instance, the Chinese Civil War (1946–1950), which preceded these blood baths, cost the country 6.2 million lives. To step back a bit further—before 1945—total deaths in the Sino-Japanese war, which began with the invasion of China by Japan and ended with Japan’s defeat by the United States, reached 10 million.73 Or, to remain in Asia for the post–World War II years, we find 1.5 million deaths in the civil war in Bangladesh (1971); almost a million in the partition of India (1946–1948); and perhaps another million in Indonesia, largely communist party (PKI) members or sympathizers, following the unsuccessful coup of 1965. A CIA study, not usually guilty of hyperbole, concluded, “in terms of the numbers killed, the anti-PKI massacres in Indonesia rank as one of the worst mass murders of the twentieth century.” (The study added that unlike Nazi genocide or Stalin’s purges, these deaths have gone virtually unnoticed.)74 Or to look at some numbers from sub-Saharan Africa: we find a series of civil wars, ethnic conflicts, and independence struggles, each with tolls of about a million or more in the post–World War II years: Angola, Ethiopia, Mozambique, Nigeria (two million in the 1967–1970 civil war), Rwanda, Sudan, Uganda.
What can be gleaned from this melancholy listing? Perhaps nothing. Or this: the human community has much more reason to fear those with an ethnic, religious, or nationalist agenda than it has to fear those with utopian designs. Indeed, this jumps out more forcefully in scanning contemporary lethal conflicts. “The last decade of the twentieth century,” writes Samantha Power, “was one of the most deadly in the grimmest century on record.”75 She is referring to Rwanda, but internecine hatreds marked virtually all recent conflicts: the Ethiopia-Eritrean War, Sudan, Algeria, Sri Lanka, Kosovo, Bangladesh, Kashmir, Afghanistan, Congo.76 The toll in a few of these—the Congo and the Sudan—reaches into the millions. But where are the utopians?
In We Wish to Inform You That Tomorrow We Will Be Killed with Our Families, Philip Gourevitch tells of meetings that preceded the Rwandan genocide in which hatreds were fanned and payoffs promised. Local leaders “described Tutsis as devils—horns, hoofs, tails and all—and gave the order to kill them…. The local authorities consistently profited from the massacres, seizing slain Tutsis’ land and possessions … and the civilian killers, too, were usually rewarded.”77 This seems a far cry from utopianism. In an increasingly rational and scientific world, primal attachments of blood, clan, and religion enflame global slaughter. This is not exactly ignored; on the other hand, neither is it given the same gravity as the misdeeds of utopians.
Imagination nourishes utopianism. Zamyatin intuited that imagination, far from leading to a totalitarian society, threatens it. The authorities in the One State of his We discover a medical operation to surgically remove the subversive imagination. “Rejoice!” proclaims the One State Gazette, science can now overcome imagination, the “last barricade on our way to happiness.” Imagination has been a “worm” that gnaws at people, causing widespread unhappiness. “The latest discovery of State Science is the location of the center of imagination—a miserable little nodule in the brain…. Triple-X ray-cautery of this nodule—and you are cured of imagination—Forever.” You become perfect and anxiety free. “Hurry!” command the authorities. Line up and get the “Great Operation.”78
If imagination sustains utopian thinking, what sustains imagination? As a historical problem, not a psychological or philosophical one, this is rarely pursued.79 Hundreds of books and articles consider imagination in literature, myth, cognition, perception, understanding, and philosophy, but skirt its history. Eva Brann’s encyclopedic The World of Imagination covers imagination from Plato to Bergson; she follows it through cognitive psychology and visual perception and surveys it in logic, poetry, and religion. She even takes up imagination and utopia. Yet in all her 800 pages she does not even allude to imagination as a historical entity, as something that shifts over the decades or centuries.80 The issue here is not so much its role in thinking, remembering, or learning but its social and historical configurations. How does the shape of imagination change over time? Does it evolve or weaken? Was “the imagination” the same in 1800, 1950, and 2000?
To ask is not to answer. It seems reasonable to assume that the woof and warp of imagination register historical changes. However, to get at the specifics of imagination requires a series of conjectures. Imagination probably depends on childhood—and conversely, childhood depends on imagination. To be sure, this was a notion cherished by romantics like Rousseau and Wordsworth, who idolized the child as a creature of imagination and spontaneity.81 Yet nowadays a facile historicism about the “construction” or “invention” of childhood (or family or anything) often ends in a more facile relativism, suggesting that what is invented or constructed cannot be sturdy or desirable. However, buildings are constructed; they also stand up and, sometimes, dazzle. To put it differently: The romantics might have idolized childhood as the domain of imagination; this does not invalidate imagination. Nor does the fact that the “romantic child” was “narrowly confined to an elite” render it spurious.82 What is restricted to a few is not for that reason illegitimate; it may be the reverse. Literacy was once limited to an elite, but who would argue that it should remain so?
If childhood nourishes imagination, what nourishes childhood? Much historical work has explored the shifting dimensions, indeed, almost the discovery, of childhood. The classic study, Philippe Ariès’s Centuries of Childhood, published over forty years ago, argued that childhood itself belongs to the modern period.83 Employing pictorial evidence, statistics on child mortality, and the custom of sending children off for service or work, he concluded that childhood went unrecognized till the sixteenth and seventeenth centuries; until then children were seen as miniature adults. While this idea has proved resonant, historians have challenged most, if not all, of it. “Ariès’s views were mistaken,” writes Nicholas Orme in his recent Medieval Children, “not simply in detail but in substance. It is time to lay them to rest.”84 The danger, however, is switching them around and finding that nothing has changed. The historian Keith Thomas has concluded that the continuity of childhood far surpasses any changes in it. “Since the development of the child’s mind and body is essentially a biological constant, it is not surprising that there should be great similarities between the ways of children in early modern England and their ways today.”85
This seems clearly to exaggerate the continuities. At least in the modern period, a series of factors have shaped, if not revolutionized, childhood. Family size, mortality, child-rearing practices, labor, schooling, and play assume major roles.86 None of these permit easy generalizations as to how they have changed in the last two centuries. Some elements can be more easily documented than others. The establishment of compulsory education and restrictions on child labor can be tracked, for instance, and indeed were often linked. The first French child-labor legislation of 1841 required children to attend school in order to be employed (and limited the work day to eight hours for children between the ages of eight and twelve!).87 These laws did not always exist and became more comprehensive over the years. Eventually, they kept more children out of the workforce or shortened the workday and created a prolonged reserve for learning, growing, and playing.88 At the same time “the decline in family size from an average of six children in the 1860s to three in the 1900s and two in the 1920s” and a rising standard of living may have allowed increased care and attention to each child.89
With what consequence for imagination? Things can sometimes be best glimpsed in their decline. As the waters ebb, the old depth lines catch the eye. A “fall of childhood” literature has emerged that posits a thinning of the emotional and psychic space that enveloped the growing child. A protective zone—always delicate—succumbs to marketing forces. “The modern make-up of society,” wrote Max Horkheimer in the 1940s, “sees to it that the utopian dreams of childhood are cut short in earliest youth.”90 If this happens, then the “classical” notion and the reality of childhood turn out to be not only fragile but obsolete, arising in the late eighteenth century and being eclipsed in the late twentieth century. But does it happen? The “decline” literature itself may be situated historically as a product, as one critic put it, of a “peculiarly contemporary malaise … of panic and nostalgia.”91
Yet this same critic, David Buckingham, admits the contours of leisure have shifted for children in the last decades. Not only have merchandisers targeted children, but due to increased affluence and anxiety about external dangers, “the principal location of children’s leisure has moved from public spaces (such as the street) to private spaces (the bedroom).” Playing outside has been “steadily displaced by domestic entertainment (particularly via television and computers) and—especially among more affluent classes—by supervised leisure activities such as organized sports, music lessons and so forth.”92 This tallies with Neil Postman’s observation in Disappearance of Childhood of an erosion of games that children play by themselves. “Except for the inner city, where games are still under the control of the youths who play them, the games of American youth have become increasingly official, mock-professional, and extremely serious.”93
In a fifty-year period, the time children spend before television screens and computers has jumped from zero to at least three or four hours daily.94 During the same stretch, the advertising budget of toy makers has escalated from zero to billions. Advertisers target children, who have become a major market. This has brought about what Juliet B. Schor has called the “reshaping” or “commodification” of childhood.95 In 1955 a major toy manufacturer with $50 million in sales spent a few hundred dollars on advertising to children.96 Today, the television-advertising budget alone directed to children is about a billion dollars. “The average American child watches more than thirty thousand commercials a year, or roughly 82 a day.”97
Can it be doubted that those hours and those advertisements affect children? Is it possible that unstructured playtime that gives space to imagination has diminished? While boredom has hardly disappeared, “boredom” as a dreamy Sunday afternoon with nothing to do may be short-circuited by channel surfing and computer games. “Boredom is the dream bird that hatches the egg of experience,” wrote Walter Benjamin in 1936. And he added, “his nesting places—the activities that are intimately associated with boredom—are already extinct in the cities and are declining in the country as well.”98
Does boredom, an unstructured zone of inactivity and purposelessness, allow imagination to develop? And is boredom itself a product of time and place? “Boredom does have a history,” writes the historian Peter Burke, “in the sense that the occasions of boredom and also what might be called the ‘boredom threshold’ are subject to change over time.” He finds that boredom was fairly common among the intellectual elite in early modern Europe; in their villas and life the elite sought to pass the time to escape from tedium.99 Of course it is possible to go back to the Greeks, Romans, and early Christians. Acedia, or sloth, is as old as the Greeks, although it is not until the early Christian Fathers that it becomes a major concern. Sloth threatened pious souls, especially monks, but also marked a stage in spiritual growth. To attain oneness with God, it might be necessary to risk acedia. In Foundations of Coenobitic Life, acedia ranks as the sixth of eight temptations.100 As sloth, tedium, or ennui, boredom has a long history.101
As with many historical inquiries, the dispute turns on the accents placed on continuity and discontinuity. How new is boredom, or how different is modern boredom? The term itself may be instructive. “The word ‘boredom’ dates from the nineteenth century,” writes Patricia Spacks in her book on boredom. “The verb ‘to bore’ as a psychological term comes from the mid-eighteenth century.”102 Etiquette manuals regularly counseled how to avoid being a “bore,” although their advice frequently went unheeded. “Only eleven o’clock, is it?” recounted Mrs. Humphry Ward in her account of a mid-nineteenth-century country dinner party. “I thought it was at least one—that Lady Broadlands is such a stupid, proud fool, and he such a bore.”103
Yet what might be an elite phenomenon and a personal failing undergoes two changes after World War II, argues the historian Peter Stearns. “Being bored began to be much more important than doing the boring.” To claim “boredom” was a legitimate complaint. Second, the arena of boredom shifts to children. “I’m bored,” as expressed by a child, is not a fact but an accusation; it means “entertain me.” Stearns adduces various parental columns and books that address the danger of boredom among children. “Quite simply, boredom increasingly mutated, after the late 1940s, from being an attribute of personality … to being an inflicted state that demanded correction by others…. Children were easily bored,” and the fault lay with parents and society.104
While Stearns does not cite it, he could have referred to a 1957 best seller to bolster his case. The title itself, “Where Did You Go?” “Out” “What Did You Do?” “Nothing,” expressed a new anxiety about boredom—or more exactly a protest against it. Robert Paul Smith, its author, tells about what he did as a kid—and what his children and their friends do. As a kid he didn’t know the word “bored”: “We never thought that a day was anything but a whole lot of nothing interrupted occasionally by something.” Much of his book describes those somethings. “We sat in boxes…. We stood on boards over excavations…. We looked at things like knives and immies and pig nuts and grasshoppers and clouds and dogs and people. We skipped and hopped and jumped. Not going anywhere—just skipping and hopping and jumping and galloping.” What he means is that “we did a lot of nothing…. But now, for some reason, we’re ashamed of it.” Now if a grownup sees a child doing “nothing,” we “want to know what’s the matter.”105 As Smith explained elsewhere, “I think it is a dirty shame society won’t let kids alone. Children should learn they don’t have to be doing something every minute. It’s fine just doing nothing.”106
When did “doing nothing” become unacceptable to parents, and perhaps to children as well? Why the shift? New parental anxieties about delinquency and school failure; suburbanization that insulated children from other children; smaller families that meant fewer siblings available—and those around were closer in age and more rivalrous; more mothers working: into this vacuum stepped manufactured products: comics, movies, television, toys. They answered a need, and they were needed. Parents felt an imperative to keep their children busy. “The need to keep children entertained,” writes Stearns, “and, often, to buy things or services in order to do so is a fruit not only of growing marketing aimed at children but also of growing parental commitment to the provision of fun.”107
Of course, the vast enlargement of the toy market is part of the story. Toys are obviously not new, but their incessant selling to children is. “The most obvious impact of toy-marketing,” writes Steven Kline in Out of the Garden, “is that it makes children want more toys.”108 Indeed, “the children’s market is huge,” over $200 billion in 1997 in the United States, both in direct purchases and family purchases influenced by children.109 Such purchases pervade play and fantasy. Toys and video games, made by adults, replace toys and street games, made by children. “We are seeing something new,” observes John Holt. The old games of house and hide-and-seek succumb to games patterned on television superheroes. Now children “have most of their daydreams made for them.”110 Or, as Gary Cross put it in his history of toys, Kids’ Stuff, “Captain Action replaced Hopalong Cassidy. Barbie bested Betsy Wetsy…. New playthings embodied dreams of growing up fast to a glamorous world of consumption or a heroic realm of power and control.”111
Is this really true? The classic study by Iona and Peter Opie, Children’s Games in Street and Playground, finds little change. Their book, which is subtitled “Chasing, Catching, Seeking, Hunting, Racing, Duelling, Exerting, Daring, Guessing, Acting, Pretending,” details with great gusto a host of children’s games still played in streets and parks; the authors scoff at the notion of decline. “The belief that traditional games are dying out is itself traditional.” They note that children’s street games take place in the sides and corners of towns, where adults barely see them. As we age, “we no longer have eyes for the games, and not noticing them suppose them to have vanished.”112 Rather like Jane Jacobs in The Death and Life of Great American Cities, they believe that unplanned spaces in cities support games and play, now and in the future.
Yet even the Opies note those spaces are changing, and more children live by schedules and play in teams or in school. Something disappears when games require coaches, uniforms, and equipment. Parks departments now advertise for “Play Leaders.” The Opies wonder if manicured lawns and renovated parks undermine playing. The authorities “invade” parks and organize play areas. “The center of our own home town possessed, miraculously, until two years ago, a small dark wood,” a natural habitat for playing children. No longer, or no longer in the same way: “Now the trees have been cut down, the ground leveled, a stream canalized, and the area flooded with asphalt.”113 Moreover, their study was completed decades ago, in the England and Scotland of the 1960s. Even the sanguine Opies doubt that children’s games persist unchanged.114
If unstructured childhood sustains imagination, and imagination sustains utopian thinking, then the eclipse of the first entails the weakening of the last. To be sure, historical causes cannot be neatly marshaled, as if A causes B that causes C. Moreover, the subject at hand—the vitality of imagination—cannot be simply circumscribed or dissected. Despite these uncertainties, it seems likely that the colonization of children’s space and time undermines an unfettered imagination. Children have more to do, more done for them, and less inclination—and perhaps fewer resources—for utopian dreaming.
Robert Paul Smith’s book begins as he turns to a bunch of kids, none of whom “seemed to know what to do for the next fifteen minutes.” “I said to them, ‘How about a game of mumblypeg?’ And can you believe that not one of these little siblings knew spank the baby from Johnny jump the fence?” Nor do they play “immies” (marbles), nor stoop ball. When they play baseball it is “in something called The Little League and have a covey of overseeing grownups hanging around and bothering them and putting catcher’s masks on them and making it so bloody important.” When we played we just played with kids on the block with no interfering adults. “Kids, as far as I can tell, don’t do things like that any more.”115
The great scholar of Jewish mysticism, Gershom Scholem, once wrote that Jewish messianism can be described “as a kind of anarchic breeze.”116 He alludes to the “profound truth” that “a well-ordered house is a dangerous thing.” In that house, “a window is open through which the winds blow in, and it is not quite certain just what they bring in with them.” He added that it is “easy to understand the reticence and misgivings” of traditional housekeepers.117
At least in part, this book hopes to constitute an “anarchic breeze” in the house of utopia. In the utopian tradition, virtually all attention is focused on what may be called the “blueprint” school of utopianism. From Thomas More to B. F. Skinner, the blueprint utopians have detailed what the future will look like; they have set it out; they have elaborated it; they have demarcated it. Sometimes these particulars have been inspired, sometimes pedantic, sometimes mad. “The people dress in undershirts of white linen over which they wear a suit combining jacket and trousers in one piece,” wrote Tommaso Campanella of his seventeenth-century “City of the Sun.” “This has no folds, but there are splits on the side and beneath, which are closed with buttons. The trouser part reaches to the heels, which are covered with heavy socks and shoes.” Four times a year, when the sun enters Cancer, Capricorn, Aries, and Libra, “the Keeper of the Wardrobe” distributes new outfits.118
Such details can bestow on utopian speculation a certain weight and plausibility. This is how people shall work or dine or play, they suggest. It is feasible, they imply. The utopian blueprinters give the size of rooms, the number of seats at tables, the exact hours at which to arise and retire. Yet the strength of the blueprinters is also their weakness. The plans betray, and sometimes celebrate, a certain authoritarianism. They say: this is the way people must dress; this is the hour they must eat. In his reconsideration of his Story of Utopias, Lewis Mumford reflected that, looking for fresh ideas in the history of utopias, he had stumbled upon far too many dictatorial schemes: “These rigid virtues,” he wrote, “these frozen institutions, these static and self-limiting ideals did not attract me.”119
To be sure, these unattractive features, Mumford explained, do not constitute the whole of utopias. If so, he would have “dropped [his] investigation” immediately. Beyond the authoritarian rules, he found a largesse of spirit and imagination woefully lacking in contemporary society. “Utopian thinking … then, was the opposite of one-sidedness, partisanship, partiality, provinciality, specialism.”120 This is true—but not true enough. The blueprints not only appear repressive, they also rapidly become dated. Even with the best of wills, the blueprinters tether the future to the past. In outfitting utopia, they order from the catalog of their day. With their schedules and seating arrangements, their utopias stand condemned not by their capaciousness but by their narrowness, not by their extravagance but their poverty. History soon eclipses them.
The blueprint tradition constitutes only a part, albeit the main part, of utopianism. Less noticed and less easily defined are the anti-blueprint utopians, who could be called the iconoclastic utopians. Rather than elaborate the future in precise detail, they longed, waited, or worked for utopia but did not visualize it. The iconoclastic utopians tapped ideas traditionally associated with utopia—harmony, leisure, peace, and pleasure—but rather than spelling out what could be, they kept, as it were, their ears open toward it. Ears and eyes are apposite, for insofar as they did not visualize the future, they listened for it. They did not privilege the eye, but the ear. Many of these thinkers were Jews, and explicitly or implicitly, they obeyed the commandment prohibiting graven images. God, the absolute, and the future defied visual representation. Like the future, God could be heard but not seen. “Hear, O Israel!” begin the Jewish prayers.
The iconoclastic utopians were also Jews drenched in German romanticism. If they were against images of the future, they sought hints of it in music, poetry, and mystical moments. “In the years before 1914,” recalls Hans Kohn, who is linked with several of the iconoclastic utopians, “German intellectuals rediscovered romanticism and mysticism.”121 Iconoclastic utopianism drew not only upon Jewish sources but also upon the romantic idiom of fin de siècle Germany. Ideas of “Geist,” “society,” “experience,” “unity,” and “life” permeated the romanticism of the early twentieth century; the romantics specialized in spiritual transcendence. The “Neue Gemeinschaft” (New Community) group, for instance, founded by Heinrich and Julius Hart, sought to overcome “the spirit of disunion, of duality.”122 Jewish utopian thinkers borrowed this romantic idiom and gave it a sharper edge; they translated a largely mystical and individualist tongue into a political language. They fashioned a utopianism committed to the future but reserved about it. Against the dominant tradition of blueprints, they offered an imageless utopianism laced with passion and spirit.
Iconoclastic utopianism did not entail a puritanical severity. On exactly this issue it frequently differed from conventional utopianism and from a socialism infused with notions of purity and obedience. Heine may best capture this sensuous utopianism in his presentation of German thought to the French. “Don’t be angry with us,” he beseeched the French, “you virtuous Republicans.”
We are fighting not for the human rights of the people, but for the divine rights of mankind. In this and in many other things we differ from the men of the Revolution. We do not want to be sansculottes, nor simple citizens, nor venal presidents; we want to found a democracy of gods, equal in majesty, in sanctity, and in bliss. You demand simple dress, austere morals, and unspiced pleasures, but we demand nectar, ambrosia, crimson robes, costly perfumes, luxury and splendor, the dancing of laughing nymphs, music and comedies.123
What Adorno wrote about Heine could be said of any of the iconoclastic utopians. “In contrast to socialism he held fast to the idea of uncurtailed happiness in the image of a just society, an idea quickly enough disposed of in favor of slogans like ‘Anyone who doesn’t work won’t eat.’”124 The renunciation of images of the future protected the idea of the end of renunciation.
The classic book of the Jewish iconoclasts is Ernst Bloch’s The Spirit of Utopia from 1918. Bloch called it a work of “revolutionary Romanticism” or “revolutionary gnosis.”125 The book explores inwardness, music, and soul. Not a sentence addresses the size of the sleeping quarters. Yet “elective affinities”—Goethe’s term, employed by Michael Löwy in his Redemption and Utopia—can be found among a series of central European Jewish thinkers and writers—from Martin Buber and Gustav Landauer to Walter Benjamin and T. W. Adorno, and perhaps Kafka.126 These were iconoclastic utopians without precise maps, yet utopians nonetheless. Scholem highlighted their utopianism—and, indeed, he was a close friend of Benjamin. For Jews, said Scholem, redemption was never a purely inward event.
After all, that restitution of all things to their proper place which is Redemption reconstructs a whole which knows nothing of such separation between inwardness and outwardness. The Utopian element of messianism which reigns so supreme in Jewish tradition concerned itself with the whole and nothing but this whole.
Scholem found this utopianism alive and well in Bloch, Benjamin, Adorno, and Herbert Marcuse, “whose acknowledged or unacknowledged ties to their Jewish heritage are evident.”127
While the prohibition on graven images has spurred innumerable commentators, no one has argued that the refusal to visualize the deity diminishes him. On the contrary: this refusal was conceived as an act of piety. For the fiercely monotheistic Jews, the prohibition on graven images, and the corresponding reluctance to write the name of God, did not belittle but honored God. It suggested the vast gap between God and humankind. God, said the twelfth-century Jewish philosopher Maimonides, could not be described positively; every positive attribute limits the deity. He can only be approached indirectly—“by negation.” “For whatever we utter with the intention of extolling and of praising Him, contains something that cannot be applied to God, and includes derogatory expressions; it is therefore more becoming to be silent, and to be content with intellectual reflection.”128 For the same reason God could not be painted or portrayed; visual details define and confine.
An Ariadne’s thread runs from this negative theology of the twelfth century to the iconoclastic or negative utopians of the twentieth. The refusal to describe God transmutes into the refusal to describe utopia, which can only be depicted in negative terms. Yet like the resistance to naming God, the reluctance to depict utopia does not diminish but exalts it. It bespeaks the gap between now and then. It refuses to reduce the unknown future to the well-known present, the hope to its cause.
“The gate to justice is learning,” wrote Benjamin in his essay on Kafka. “And yet Kafka does not dare attach to this learning the promises which tradition has attached to the study of the Torah. His assistants are sextons who have lost their house of prayer; his students are pupils who have lost their Holy Writ.”129 Benjamin understood the “negative inversion” of Jewish categories in Kafka, according to Scholem: “The teaching no longer conveys a positive message, but offers only an absolutely Utopian—and there as yet undefinable—promise of a post-contemporary world.”130 Michael Löwy, who sought to tease out the Jewish and anarchist dimensions of Kafka, titled his chapter on him “‘Theologia negativa and Utopia negativa.’”131
“Utopia negativa” is not exactly a rallying cry. Yet the positive utopian tradition of blueprints for the communal kitchens of the future has atrophied. It has suffered too many reversals; it has been eclipsed by too much history; and its imaginative sources have been drained. In an age of triumphalism and self-promotion, to advertise the future only adds to the clutter. Another utopian blueprint looks like just another billboard or video. The future, perhaps, can be heard, not envisioned. The iconoclastic utopians knew this. They approached it as they approached the absolute—with open hearts and ears.